Scala: writing logs to a file with log4j
I am trying to build a Scala-based jar file in Eclipse that uses log4j for logging. The logs print to the console perfectly, but when I try to write them to a log file via a log4j.properties file, nothing happens.
The project structure is as follows.

loggerTest.scala:

package scala.n*****.n****

import org.apache.log4j.Logger

object loggerTest extends LogHelper {
  def main(args: Array[String]) {
    log.info("This is info");
    log.error("This is error");
    log.warn("This is warn");
  }
}

trait LogHelper {
  lazy val log = Logger.getLogger(this.getClass.getName)
}

log4j.properties:

# Root logger option
log4j.rootLogger=WARN,stdout,file

# Redirect log messages to console
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

# Redirect log messages to a log file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/home/abc/test/abc.log
log4j.appender.file.encoding=UTF-8
log4j.appender.file.MaxFileSize=2kB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>loggerTest</groupId>
  <artifactId>loggerTest</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <name>loggerTest</name>
  <description>loggerTest</description>

  <repositories>
    <repository>
      <id>scala-tools.org</id>
      <name>Scala-tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </repository>
    <repository>
      <id>maven-hadoop</id>
      <name>Hadoop Releases</name>
      <url>https://repository.cloudera.com/content/repositories/releases/</url>
    </repository>
    <repository>
      <id>cloudera-repos</id>
      <name>Cloudera Repos</name>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>

  <pluginRepositories>
    <pluginRepository>
      <id>scala-tools.org</id>
      <name>Scala-tools Maven2 Repository</name>
      <url>http://scala-tools.org/repo-releases</url>
    </pluginRepository>
  </pluginRepositories>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
  </properties>

  <build>
    <plugins>
      <!-- mixed scala/java compile -->
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <executions>
          <execution>
            <id>compile</id>
            <goals>
              <goal>compile</goal>
            </goals>
            <phase>compile</phase>
          </execution>
          <execution>
            <id>test-compile</id>
            <goals>
              <goal>testCompile</goal>
            </goals>
            <phase>test-compile</phase>
          </execution>
          <execution>
            <phase>process-resources</phase>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <addClasspath>true</addClasspath>
              <mainClass>fully.qualified.MainClass</mainClass>
            </manifest>
          </archive>
        </configuration>
      </plugin>
    </plugins>
    <pluginManagement>
      <plugins>
        <!-- This plugin's configuration is used to store Eclipse m2e settings only.
             It has no influence on the Maven build itself. -->
        <plugin>
          <groupId>org.eclipse.m2e</groupId>
          <artifactId>lifecycle-mapping</artifactId>
          <version>1.0.0</version>
          <configuration>
            <lifecycleMappingMetadata>
              <pluginExecutions>
                <pluginExecution>
                  <pluginExecutionFilter>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                    <versionRange>[2.15.2,)</versionRange>
                    <goals>
                      <goal>compile</goal>
                      <goal>testCompile</goal>
                    </goals>
                  </pluginExecutionFilter>
                  <action>
                    <execute></execute>
                  </action>
                </pluginExecution>
              </pluginExecutions>
            </lifecycleMappingMetadata>
          </configuration>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.5.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.5.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.5.0</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.10.4</version>
    </dependency>
  </dependencies>
</project>

When I run this as a Maven build, it produces a jar file in the target folder. I copy the jar to /home/abc/test and run it through spark-submit with:

$ spark-submit --class scala.n*****.n*****.loggerTest loggerTest-0.0.1-SNAPSHOT.jar

The console output is fine, but the log file that it is supposed to write under /home/abc is never created.

Can anyone help?

Solution
This is a deployment problem: when you submit the application, you need to tell both the driver and the executors where the log4j configuration file is, like this:
spark-submit --class MAIN_CLASS \
  --driver-java-options "-Dlog4j.configuration=file:PATH_OF_LOG4J" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:PATH_OF_LOG4J" \
  --master MASTER_IP:PORT \
  JAR_PATH

For more details and a step-by-step solution, see https://blog.knoldus.com/2016/02/23/logging-spark-application-on-standalone-cluster/
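Applied to the question's setup, the command might look like the sketch below. This is only an illustration: the location of log4j.properties (assumed to have been copied next to the jar in /home/abc/test) and the master URL (spark://master:7077) are assumptions, not values from the question, and the file: path must be readable on the driver and on every executor node.

# Hypothetical example; adjust the log4j.properties path, class name and master URL to your environment.
spark-submit \
  --class scala.n*****.n*****.loggerTest \
  --driver-java-options "-Dlog4j.configuration=file:/home/abc/test/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/home/abc/test/log4j.properties" \
  --master spark://master:7077 \
  loggerTest-0.0.1-SNAPSHOT.jar

With the configuration picked up this way, the RollingFileAppender from the question's log4j.properties should create /home/abc/test/abc.log on whichever machine the driver (and each executor) runs.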