I have already gone through the following page looking for an answer to my problem, but it did not solve it:
https://www.javacodegeeks.com/2016/03/log-apache-spark.html
We are running Spark in standalone mode, not on YARN. I have configured a log4j.properties file on both the driver and the executors that defines a custom logger, "myLogger". The log4j.properties file I copied to the driver and the executors is as follows:
log4j.rootLogger=INFO, Console_Appender, File_Appender
log4j.appender.Console_Appender=org.apache.log4j.ConsoleAppender
log4j.appender.Console_Appender.Threshold=INFO
log4j.appender.Console_Appender.Target=System.out
log4j.appender.Console_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.Console_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
log4j.appender.File_Appender=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.File_Appender.Threshold=INFO
log4j.appender.File_Appender.File=/opt/spark_log/app_log.txt
log4j.appender.File_Appender.RollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.File_Appender.TriggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.File_Appender.RollingPolicy.FileNamePattern=/opt/spark_log/app_log.%d{MM-dd-yyyy}.%i.txt.gz
log4j.appender.File_Appender.RollingPolicy.ActiveFileName=/opt/spark_log/app_log.txt
log4j.appender.File_Appender.TriggeringPolicy.MaxFileSize=1000
log4j.appender.File_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.File_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c - %m%n
log4j.logger.myLogger=INFO,File_Appender
# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark-project.jetty=WARN
log4j.logger.org.spark-project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
In my Java application, I obtain the logger with the following line:
private static Logger logger = LogManager.getLogger("myLogger");
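For completeness, that declaration sits in the driver class roughly as follows (a minimal sketch: the class name SparkApp matches the --class argument below, everything else is illustrative, and the imports assume the log4j 1.x API bundled with Spark 2.4):

import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;

public class SparkApp {
    // "myLogger" resolves against the log4j.logger.myLogger entry above
    private static Logger logger = LogManager.getLogger("myLogger");

    public static void main(String[] args) {
        logger.info("starting up"); // driver-side messages like this appear correctly
        // ... SparkSession setup and job logic ...
    }
}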
I am launching the application with the following command:
spark-submit --driver-java-options "-Dlog4j.configuration=file:///opt/spark/spark-2.4.4/conf/log4j.properties" --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/spark-2.4.4/conf/log4j.properties" --class com.test.SparkApp file:///opt/test/cepbck/test.person.app-0.0.7.jar
When I run the application on the cluster, the log statements in the main driver class show up fine in both the console and the log file. But as soon as control enters a UDF, nothing is logged: I have opened the executors' log files as well, and they do not contain any of the log statements I added either. A sketch of the kind of UDF involved is shown below. Please help me with this.
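For reference, the UDFs in question have roughly the following shape (a minimal sketch, not my exact code; the UDF name and column types are hypothetical). The logger is looked up inside call() so that it resolves on the executor rather than being serialized from the driver:

import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.spark.sql.api.java.UDF1;

// Hypothetical UDF: normalizes a name column and logs each call.
public class NormalizeNameUdf implements UDF1<String, String> {
    @Override
    public String call(String name) {
        Logger logger = LogManager.getLogger("myLogger"); // executor-side lookup
        logger.info("normalizing: " + name);              // never shows up in the executor logs
        return name == null ? null : name.toUpperCase();
    }
}

It is registered with spark.udf().register("normalizeName", new NormalizeNameUdf(), DataTypes.StringType) and invoked from a column expression. The driver-side logger works fine, but nothing logged inside call() ever reaches /opt/spark_log/app_log.txt on the executors.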