Flink on YARN: logging conflict problems
Problem 1: the project produces no log output
The classpath contains multiple SLF4J bindings (StaticLoggerBinder.class). SLF4J is a logging facade: it works by binding to one concrete logging implementation found on the classpath. In this case there are three different SLF4J bindings:
- dsp-flink-1.0.0.jar bundles a StaticLoggerBinder.
- slf4j-log4j12-1.7.15.jar contains a StaticLoggerBinder.
- slf4j-reload4j-1.7.36.jar contains a StaticLoggerBinder.
LogType:jobmanager.err
LogLastModifiedTime:Tue Dec 10 17:53:49 +0800 2024
LogLength:1722
LogContents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/tmp/hadoop/nodemanager/usercache/root/appcache/application_1732245980801_0042/filecache/16/dsp-flink-1.0.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/tmp/hadoop/nodemanager/usercache/root/appcache/application_1732245980801_0042/filecache/10/slf4j-log4j12-1.7.15.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/hadoop-3.4.0/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
ERROR StatusLogger Unrecognized format specifier [d]
ERROR StatusLogger Unrecognized conversion specifier [d] starting at position 16 in conversion pattern.
ERROR StatusLogger Unrecognized format specifier [thread]
ERROR StatusLogger Unrecognized conversion specifier [thread] starting at position 25 in conversion pattern.
ERROR StatusLogger Unrecognized format specifier [level]
ERROR StatusLogger Unrecognized conversion specifier [level] starting at position 35 in conversion pattern.
ERROR StatusLogger Unrecognized format specifier [logger]
ERROR StatusLogger Unrecognized conversion specifier [logger] starting at position 47 in conversion pattern.
ERROR StatusLogger Unrecognized format specifier [msg]
ERROR StatusLogger Unrecognized conversion specifier [msg] starting at position 54 in conversion pattern.
With multiple bindings present, SLF4J cannot tell which one is intended and ends up using whichever binding the JVM class loader happens to find first; per the SLF4J documentation this choice is effectively arbitrary, so it may not be the framework your configuration targets. The project's logging configuration file is log4j2.xml, so the configuration file no longer matches the logging framework that was actually bound, and loading the configuration fails (the StatusLogger errors above). The fix is to make the logging jars under /data/hadoop-3.4.0/share/hadoop/common/lib and the ones the project depends on consistent:
Option 1: replace the jars under Hadoop with the same ones the project depends on. Simply swap in the corresponding logging jars, but this requires restarting Hadoop; if that would affect production services, Option 2 is recommended.
Option 2: change the project's logging dependencies to match Hadoop's:
log4j.properties file:
# Root log level and appenders
log4j.rootLogger=INFO, console, file

# Console appender
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.Target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c - %m%n

# File appender
log4j.appender.file=org.apache.log4j.FileAppender
log4j.appender.file.File=/data/tmp/flink-logs/flink.log
log4j.appender.file.Append=true
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c{1} - %m%n

# Per-logger levels
log4j.logger.org.apache.flink=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache=INFO
Add the same logging jars Hadoop uses as dependencies in the project's pom.xml:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.36</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.slf4j/slf4j-reload4j -->
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-reload4j</artifactId>
    <version>1.7.36</version>
</dependency>
<!-- https://mvnrepository.com/artifact/ch.qos.reload4j/reload4j -->
<dependency>
    <groupId>ch.qos.reload4j</groupId>
    <artifactId>reload4j</artifactId>
    <version>1.2.22</version>
</dependency>
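Adding the reload4j binding alone is not enough if the fat jar still bundles the old bindings seen in the error log (slf4j-log4j12 and the Log4j2 binding inside dsp-flink-1.0.0.jar); those also have to be kept out of the shaded jar. A sketch using standard Maven `<exclusions>`; the parent dependency below is a placeholder, so attach the exclusion to whichever of your dependencies actually pulls the binding in transitively (check with `mvn dependency:tree`):

```xml
<dependency>
    <!-- placeholder: the dependency that transitively drags in the binding -->
    <groupId>com.example</groupId>
    <artifactId>some-transitive-parent</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <!-- the conflicting binding reported in the SLF4J warning -->
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

After rebuilding, the jobmanager.err log should report a single binding and the StatusLogger errors should disappear.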