Parsing the Log4j2 configuration and sending logs to Kafka (XML/YAML)

mac · 2025-09-16

Advantages of Log4j2

Log4j2 brings a large performance improvement in asynchronous logging and throughput. For an explanation of why Log4j2 performs so well, see https://www.jianshu.com/p/359b14067b9e
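One common way to get those asynchronous gains is to make all loggers async by selecting the async context selector before Log4j2 initializes. A minimal sketch (assumes the LMAX Disruptor, `com.lmax:disruptor`, is also on the classpath, which async loggers require):

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class AsyncDemo {
    public static void main(String[] args) {
        // Must be set before the first Logger is created; equivalent to
        // passing -Dlog4j2.contextSelector=... on the JVM command line.
        System.setProperty("log4j2.contextSelector",
                "org.apache.logging.log4j.core.async.AsyncLoggerContextSelector");

        Logger logger = LogManager.getLogger(AsyncDemo.class);
        // The caller returns quickly; a background thread does the actual I/O.
        logger.info("logged via the async logger path");
    }
}
```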

Maven

```xml
<!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-api -->
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-api</artifactId>
    <version>2.11.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-core -->
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.11.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.logging.log4j/log4j-slf4j-impl -->
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j-impl</artifactId>
    <version>2.11.1</version>
</dependency>
```

Keep log4j-api, log4j-core and log4j-slf4j-impl at the same version, and note that the slf4j binding is needed at runtime, so it should not be restricted to test scope.

Gradle

```groovy
compile 'org.apache.logging.log4j:log4j-api:2.11.1'
compile 'org.apache.logging.log4j:log4j-core:2.11.1'
compile 'org.apache.logging.log4j:log4j-slf4j-impl:2.11.1'
```

The relationship between log4j and slf4j

slf4j is one of the standard logging facades, and log4j is one of its implementations. Programming to an interface matters here: if you code against a concrete logging framework directly, upgrading or switching frameworks is awkward, and the jars you pull in may each use a different logging framework, which can make them unusable together. Hence the need for a unified logging facade that is compatible with the various logging frameworks: that is slf4j.

slf4j provides the API for application code to use, but no implementation; in your own project you choose and configure the logging framework you want. As long as all the jars you depend on log through slf4j, no compatibility problems arise.

Concretely, a bridge sits between slf4j and the chosen logging framework: it implements slf4j's SPI on one side and calls the concrete logging framework on the other.
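With log4j-slf4j-impl on the classpath, application code depends only on the slf4j API and the bridge routes every call into Log4j2. A minimal sketch:

```java
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Slf4jDemo {
    // Code is written against the slf4j facade only; the
    // log4j-slf4j-impl bridge forwards these calls to Log4j2.
    private static final Logger log = LoggerFactory.getLogger(Slf4jDemo.class);

    public static void main(String[] args) {
        // {} placeholders avoid string concatenation when the level is disabled
        log.info("user {} logged in", "alice");
    }
}
```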

For details, see https://www.jianshu.com/p/370ed25cb7c4

Log4j2 configuration files

Log4j2 supports several configuration formats: XML, JSON, YAML, and properties files. XML remains the most common. Place the configuration file on the classpath; with Maven or Gradle, that means the resources folder.
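By default Log4j2 scans the classpath for a file such as log4j2.xml or log4j2.yml; you can also point it at an explicit file via the `log4j2.configurationFile` system property. A sketch (the path here is a made-up example):

```java
public class ConfigLocationDemo {
    public static void main(String[] args) {
        // Hypothetical path, for illustration only. Equivalent to passing
        // -Dlog4j2.configurationFile=config/log4j2.xml on the JVM command line;
        // it must be set before Log4j2 initializes.
        System.setProperty("log4j2.configurationFile", "config/log4j2.xml");
        System.out.println(System.getProperty("log4j2.configurationFile"));
    }
}
```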

XML format

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration status="WARN" monitorInterval="30">
    <!-- appenders: custom log output destinations -->
    <appenders>
        <!-- console output -->
        <console name="Console" target="SYSTEM_OUT">
            <!-- log output format -->
            <PatternLayout pattern="[%d{yyyy-MM-dd HH:mm:ss:SSS}] [%p] - %l - %m%n"/>
        </console>
        <!-- This file receives all messages and is truncated on every run,
             controlled by the append attribute, which makes it handy for ad-hoc testing.
             append="true" adds messages to the existing file; "false" overwrites
             the file's contents. The default is true. -->
        <File name="log" fileName="./logs/test.log" append="false">
            <PatternLayout pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} %-5level %class{36} %L %M - %msg%xEx%n"/>
        </File>
        <!-- Rolls over to a new log file; filePattern is the backup path used
             once the file exceeds the configured size -->
        <RollingFile name="RollingFileInfo" fileName="./logs/info.log"
                     filePattern="./logs/backups/$${date:yyyy-MM}/info-%d{yyyy-MM-dd}-%i.log.gz">
            <!-- minimum level this appender accepts -->
            <ThresholdFilter level="info" onMatch="ACCEPT" onMismatch="DENY"/>
            <PatternLayout pattern="[%d{yyyy-MM-dd HH:mm:ss:SSS}] [%p] - %l - %m%n"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
                <!-- size threshold for rollover -->
                <SizeBasedTriggeringPolicy size="5 MB"/>
            </Policies>
        </RollingFile>
        <RollingFile name="RollingFileWarn" fileName="./logs/warn.log"
                     filePattern="./logs/backups/$${date:yyyy-MM}/warn-%d{yyyy-MM-dd}-%i.log">
            <ThresholdFilter level="warn" onMatch="ACCEPT" onMismatch="DENY"/>
            <PatternLayout pattern="[%d{yyyy-MM-dd HH:mm:ss:SSS}] [%p] - %l - %m%n"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
                <SizeBasedTriggeringPolicy size="5 MB"/>
            </Policies>
            <DefaultRolloverStrategy max="20"/>
        </RollingFile>
        <RollingFile name="RollingFileError" fileName="./logs/error.log"
                     filePattern="./logs/backups/$${date:yyyy-MM}/error-%d{yyyy-MM-dd}-%i.log">
            <ThresholdFilter level="error" onMatch="ACCEPT" onMismatch="DENY"/>
            <PatternLayout pattern="[%d{yyyy-MM-dd HH:mm:ss:SSS}] [%p] - %l - %m%n"/>
            <Policies>
                <TimeBasedTriggeringPolicy/>
                <SizeBasedTriggeringPolicy size="5 MB"/>
            </Policies>
        </RollingFile>
    </appenders>
    <!-- loggers: set up a default root logger -->
    <loggers>
        <root level="all">
            <appender-ref ref="Console"/>
            <appender-ref ref="RollingFileInfo"/>
            <appender-ref ref="RollingFileWarn"/>
            <appender-ref ref="RollingFileError"/>
        </root>
    </loggers>
</configuration>
```
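With the XML configuration above, the root logger fans every event out to the console and the three rolling files, and each appender's ThresholdFilter then keeps or drops the event. A small sketch of exercising it:

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class XmlConfigDemo {
    private static final Logger logger = LogManager.getLogger(XmlConfigDemo.class);

    public static void main(String[] args) {
        // root level="all": every event reaches the root's appenders;
        // each appender's ThresholdFilter then decides whether to write it.
        logger.info("goes to Console, test.log and info.log");
        logger.warn("additionally accepted by RollingFileWarn");
        logger.error("additionally accepted by RollingFileError");
    }
}
```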


YAML format

To use the YAML format you also need to add the jackson-dataformat-yaml jar:

Maven:

```xml
<dependency>
    <groupId>com.fasterxml.jackson.dataformat</groupId>
    <artifactId>jackson-dataformat-yaml</artifactId>
    <version>2.10.1</version>
</dependency>
```

Gradle:

```groovy
compile group: 'com.fasterxml.jackson.dataformat', name: 'jackson-dataformat-yaml', version: '2.10.1'
```

```yaml
Configuration:
  Properties:
    Property:
      - name: log-path
        value: "logs"
      - name: charset
        value: "UTF-8"
      - name: compact
        value: false
      - name: eventEol
        value: true
      - name: kafka-topic
        value: test
      - name: bootstrap-servers
        value: 127.0.0.1:9092
      - name: complete
        value: false
      - name: stacktraceAsString
        value: true
      - name: log.pattern
        value: "%d{yyyy-MM-dd HH:mm:ss.SSS} -%5p ${PID:-} [%X{tracking_id}] [%15.15t] %-30.30C{1.} : %m%n"
  Appenders:
    Console:
      name: CONSOLE
      target: SYSTEM_OUT
      PatternLayout:
        pattern: ${log.pattern}
    RollingFile:
      - name: REQUEST_LOG
        fileName: ${log-path}/request.log
        filePattern: "${log-path}/historyLog/info-%d{yyyy-MM-dd}-%i.log.gz"
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: warn
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: info
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            KeyValuePair:
              - key: tags
                value: REQUEST_LOG
        Policies:
          TimeBasedTriggeringPolicy:
            interval: 1
            modulate: true
        DefaultRolloverStrategy:
          max: 100
      - name: SERVICE_LOG
        fileName: ${log-path}/service.log
        filePattern: "${log-path}/historyLog/service-%d{yyyy-MM-dd}-%i.log.gz"
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: info
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            objectMessageAsJsonObject: true
            KeyValuePair:
              - key: tags
                value: SERVICE_LOG
        Policies:
          TimeBasedTriggeringPolicy:
            interval: 1
            modulate: true
        DefaultRolloverStrategy:
          max: 100
      - name: ERROR_LOG
        fileName: ${log-path}/error.log
        filePattern: "${log-path}/historyLog/error-%d{yyyy-MM-dd}-%i.log.gz"
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            KeyValuePair:
              - key: tags
                value: ERROR_LOG
        Policies:
          TimeBasedTriggeringPolicy:
            interval: 1
            modulate: true
        DefaultRolloverStrategy:
          max: 100
    # Alternative: the same three RollingFile appenders with PatternLayout
    # instead of JsonLayout (kept commented out)
    # RollingFile:
    #   - name: REQUEST_LOG
    #     fileName: ${log-path}/request.log
    #     filePattern: "${log-path}/historyLog/info-%d{yyyy-MM-dd}-%i.log.gz"
    #     PatternLayout:
    #       charset: ${charset}
    #       pattern: ${log.pattern}
    #     Filters:
    #       ThresholdFilter:
    #         - level: error
    #           onMatch: DENY
    #           onMismatch: NEUTRAL
    #         - level: warn
    #           onMatch: DENY
    #           onMismatch: NEUTRAL
    #         - level: debug
    #           onMatch: ACCEPT
    #           onMismatch: DENY
    #     Policies:
    #       TimeBasedTriggeringPolicy:
    #         interval: 1
    #         modulate: true
    #     DefaultRolloverStrategy:
    #       max: 100
    #   - name: SERVICE_LOG
    #     fileName: ${log-path}/service.log
    #     filePattern: "${log-path}/historyLog/service-%d{yyyy-MM-dd}-%i.log.gz"
    #     PatternLayout:
    #       charset: ${charset}
    #       pattern: ${log.pattern}
    #     Filters:
    #       ThresholdFilter:
    #         - level: info
    #           onMatch: ACCEPT
    #           onMismatch: DENY
    #     Policies:
    #       TimeBasedTriggeringPolicy:
    #         interval: 1
    #         modulate: true
    #     DefaultRolloverStrategy:
    #       max: 100
    #   - name: ERROR_LOG
    #     fileName: ${log-path}/error.log
    #     filePattern: "${log-path}/historyLog/error-%d{yyyy-MM-dd}-%i.log.gz"
    #     PatternLayout:
    #       charset: ${charset}
    #       pattern: ${log.pattern}
    #     Filters:
    #       ThresholdFilter:
    #         - level: error
    #           onMatch: ACCEPT
    #           onMismatch: DENY
    #     Policies:
    #       TimeBasedTriggeringPolicy:
    #         interval: 1
    #         modulate: true
    #     DefaultRolloverStrategy:
    #       max: 100
    Kafka:
      - name: KAFKA_REQUEST_LOG
        topic: ${kafka-topic}
        Property:
          name: bootstrap.servers
          value: ${bootstrap-servers}
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: warn
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: info
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            KeyValuePair:
              - key: tags
                value: INFO_LOG
      - name: KAFKA_SERVICE_LOG
        topic: ${kafka-topic}
        Property:
          name: bootstrap.servers
          value: ${bootstrap-servers}
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: DENY
              onMismatch: NEUTRAL
            - level: info
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            objectMessageAsJsonObject: true
            KeyValuePair:
              - key: tags
                value: SERVICE_LOG
      - name: KAFKA_ERROR_LOG
        topic: ${kafka-topic}
        Property:
          name: bootstrap.servers
          value: ${bootstrap-servers}
        Filters:
          ThresholdFilter:
            - level: error
              onMatch: ACCEPT
              onMismatch: DENY
        JsonLayout:
          - charset: ${charset}
            compact: ${compact}
            complete: ${complete}
            stacktraceAsString: ${stacktraceAsString}
            eventEol: ${eventEol}
            properties: true
            KeyValuePair:
              - key: tags
                value: ERROR_LOG
  Loggers:
    AsyncRoot:
      level: debug
      # include caller location in async logging
      includeLocation: true
      AppenderRef:
        - ref: CONSOLE
    AsyncLogger:
      - name: REQUEST_LOG
        AppenderRef:
          - ref: REQUEST_LOG
      - name: SERVICE_LOG
        AppenderRef:
          - ref: SERVICE_LOG
      - name: ERROR_LOG
        AppenderRef:
          - ref: ERROR_LOG
```
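Note that the Loggers section only references the file appenders; for events to actually reach Kafka, the KAFKA_* appenders would also need to appear in the AppenderRef lists (and the kafka-clients jar must be on the classpath). A sketch of logging through the named loggers, assuming the AppenderRef lists are extended accordingly:

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class KafkaLogDemo {
    // Logger names match the AsyncLogger entries in the YAML config,
    // so each event is routed by name to that logger's appenders.
    private static final Logger requestLog = LogManager.getLogger("REQUEST_LOG");
    private static final Logger errorLog = LogManager.getLogger("ERROR_LOG");

    public static void main(String[] args) {
        // With "- ref: KAFKA_REQUEST_LOG" / "- ref: KAFKA_ERROR_LOG" added
        // under the matching AppenderRef lists, these events would be written
        // both to the rolling files and to the configured Kafka topic.
        requestLog.info("request received");
        errorLog.error("something failed");
    }
}
```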

