Logging
Kora uses slf4j-api as the logging facade for the entire framework; it is expected that an implementation based on Logback will be used.
Usage¶
Loggers are required to be provided through the SLF4J factory.
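For example, a logger obtained through the standard SLF4J factory (SomeService here is a hypothetical class used only for illustration):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public final class SomeService {

    // Logger is provided through the SLF4J factory
    private static final Logger log = LoggerFactory.getLogger(SomeService.class);

    void doWork() {
        log.info("Doing work");
    }
}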
Configuration¶
Logging levels are described in the LoggingConfig class:
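An illustrative sketch in HOCON; the exact key names are defined by LoggingConfig, so the logging.levels key below is an assumption and the logger names are placeholders:

logging {
  levels {
    "ru.tinkoff.kora": "INFO"
    "my.service.package": "DEBUG"
  }
}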
Logging configuration parameters are described in the modules that provide logging, e.g. HTTP server, HTTP client, etc.
Module¶
Enabling/disabling logging of particular modules is specified in the configuration of the modules themselves.
Logging of all modules is disabled by default; for convenience, a configuration that enables logging for most modules is shown below, first in HOCON and then in YAML format.
db.telemetry.logging.enabled = true //(1)!
cassandra.telemetry.logging.enabled = true //(2)!
grpcServer.telemetry.logging.enabled = true //(3)!
httpServer.telemetry.logging.enabled = true //(4)!
scheduling.telemetry.logging.enabled = true //(5)!
grpcClient.SomeGrpcServiceName.telemetry.logging.enabled = true //(6)!
soapClient.SomeSoapServiceName.telemetry.logging.enabled = true //(7)!
SomePathToConfigHttpClient.telemetry.logging.enabled = true //(8)!
SomePathToConfigKafkaConsumer.telemetry.logging.enabled = true //(9)!
SomePathToConfigKafkaProducer.telemetry.logging.enabled = true //(10)!
1. Database JDBC / R2DBC / Vertx
2. Database Cassandra
3. gRPC server
4. HTTP server
5. Scheduler
6. gRPC client (specified for a specific service)
7. SOAP client (specified for a specific service)
8. HTTP client (specified for a specific client)
9. Kafka consumer (specified for a specific consumer)
10. Kafka producer (specified for a specific producer)
The same configuration in YAML format:
db.telemetry.logging.enabled: true #(1)!
cassandra.telemetry.logging.enabled: true #(2)!
grpcServer.telemetry.logging.enabled: true #(3)!
httpServer.telemetry.logging.enabled: true #(4)!
scheduling.telemetry.logging.enabled: true #(5)!
grpcClient.SomeGrpcServiceName.telemetry.logging.enabled: true #(6)!
soapClient.SomeSoapServiceName.telemetry.logging.enabled: true #(7)!
SomePathToConfigHttpClient.telemetry.logging.enabled: true #(8)!
SomePathToConfigKafkaConsumer.telemetry.logging.enabled: true #(9)!
SomePathToConfigKafkaProducer.telemetry.logging.enabled: true #(10)!
1. Database JDBC / R2DBC / Vertx
2. Database Cassandra
3. gRPC server
4. HTTP server
5. Scheduler
6. gRPC client (specified for a specific service)
7. SOAP client (specified for a specific service)
8. HTTP client (specified for a specific client)
9. Kafka consumer (specified for a specific consumer)
10. Kafka producer (specified for a specific producer)
Logback¶
The module provides a logging implementation based on Logback and adds support for structured logs and the ability to configure logging levels via the configuration file.
Dependency¶
Dependency build.gradle:
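A likely declaration; the artifact coordinate ru.tinkoff.kora:logging-logback is an assumption (the version is expected to come from the Kora BOM):

implementation "ru.tinkoff.kora:logging-logback"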
Module:
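A sketch of plugging the module into the application, assuming the module interface is ru.tinkoff.kora.logging.logback.LogbackModule:

@KoraApp
public interface Application extends LogbackModule { }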
Dependency build.gradle.kts:
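The same assumed dependency in the Kotlin DSL:

implementation("ru.tinkoff.kora:logging-logback")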
Module:
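And the module in Kotlin, under the same assumption about the interface name:

@KoraApp
interface Application : LogbackModule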
Configuration¶
It is assumed that Logback will be configured via logback.xml and that only logging levels will be specified in the Kora configuration. An example logback.xml:
<configuration debug="false">
    <statusListener class="ch.qos.logback.core.status.NopStatusListener" />

    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <charset>UTF-8</charset>
            <pattern>%d{HH:mm:ss.SSS} %-5level [%thread] %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <appender name="ASYNC" class="ru.tinkoff.kora.logging.logback.KoraAsyncAppender">
        <appender-ref ref="STDOUT"/>
    </appender>

    <root level="WARN">
        <appender-ref ref="ASYNC"/>
    </root>
</configuration>
Other implementation¶
Kora uses slf4j-api as the logging facade, so you can plug in any compatible custom implementation. The base module adds support for structured logs and the ability to configure logging levels via the configuration file.
Dependency¶
A generic logging implementation will need to be connected:
Dependency build.gradle:
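A likely declaration; the artifact coordinate ru.tinkoff.kora:logging-common is an assumption, chosen because the StructuredArgument and MDC classes referenced below live in the logging-common packages:

implementation "ru.tinkoff.kora:logging-common"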
Module:
Dependency build.gradle.kts:
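The same assumed dependency in the Kotlin DSL:

implementation("ru.tinkoff.kora:logging-common")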
Module:
Usage¶
When using your own implementation, you need to provide an implementation of LoggingLevelApplier that handles setting the logging level and resetting it.
The implementation will also need to support StructuredArgument, StructuredArgumentWriter and MDC on its own if they are to be used.
Structured Logs¶
You can pass structured data to a log record in two ways:
- Marker
- Parameter
The marker and parameter methods also take Long, Integer, String, Boolean and Map<String, String> as arguments.
Marker¶
You can pass structured data to the log via a marker:
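A minimal sketch, assuming StructuredArgument (from the logging-common module) provides a static marker(key, value) factory returning an SLF4J marker; the String overload used here is one of the argument types listed above:

// "dealId" and its value are placeholder data
var marker = StructuredArgument.marker("dealId", "112358");
log.info(marker, "Deal created");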
Parameter¶
You can pass structured data to the log via parameters:
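A similar sketch, assuming a static StructuredArgument.arg(key, value) factory whose result is passed as a regular logging argument:

// the structured argument is appended to the record alongside the message
log.info("Deal created", StructuredArgument.arg("dealId", "112358"));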
MDC¶
Structured data can be attached to all records within a context using the ru.tinkoff.kora.logging.common.MDC class:
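A minimal sketch, assuming MDC exposes a static put(key, value) method (the String overload; exact signatures may differ):

// ru.tinkoff.kora.logging.common.MDC; the value is attached to every record in this context
MDC.put("requestId", "42");
log.info("Request accepted"); // the record also carries requestId=42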
If you are using an AsyncAppender to ship logs, you need to use ru.tinkoff.kora.logging.logback.KoraAsyncAppender for MDC parameters to be passed correctly; it hands its delegate a ru.tinkoff.kora.logging.logback.KoraLoggingEvent that contains, among other things, the structured MDC.