Configuring log4j Logging in Spark
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to ERROR. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=ERROR

# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=ERROR
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=ERROR
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=ERROR
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
Note: create a log4j.properties file in the project's resources directory and place the logging configuration above in it; Spark will pick it up from the classpath.
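
Alternatively, the log level can be set at runtime without editing any properties file, using Spark's SparkContext.setLogLevel, which overrides the root logger level configured in log4j.properties for the current application. Below is a minimal Scala sketch; the object name QuietLoggingExample and the local[*] master are illustrative choices, not part of the original configuration.

import org.apache.spark.sql.SparkSession

object QuietLoggingExample {
  def main(args: Array[String]): Unit = {
    // App name and master here are placeholders for local testing.
    val spark = SparkSession.builder()
      .appName("QuietLoggingExample")
      .master("local[*]")
      .getOrCreate()

    // Override the rootCategory level for this application only.
    // Valid levels: ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN.
    spark.sparkContext.setLogLevel("ERROR")

    spark.range(10).show()  // runs without the usual INFO chatter
    spark.stop()
  }
}

The properties-file approach is still preferable when you want the quieter defaults to apply to every application (and to the driver's startup output, which is emitted before setLogLevel runs).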
