scala – How do I suppress Spark logging in unit tests?
So, thanks to easily googleable blog posts, I tried this:
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import org.specs2.mutable.Specification

class SparkEngineSpecs extends Specification {
  sequential

  // Set the given level on each named logger, returning the previous levels
  def setLogLevels(level: Level, loggers: Seq[String]): Map[String, Level] =
    loggers.map(loggerName => {
      val logger = Logger.getLogger(loggerName)
      val prevLevel = logger.getLevel
      logger.setLevel(level)
      loggerName -> prevLevel
    }).toMap

  setLogLevels(Level.WARN, Seq("spark", "org.eclipse.jetty", "akka"))

  val sc = new SparkContext(new SparkConf().setMaster("local").setAppName("Test Spark Engine"))

  // ... my unit tests
}

But unfortunately it doesn't work; I still get a lot of Spark output, for example:

14/12/02 12:01:56 INFO MemoryStore: Block broadcast_4 of size 4184 dropped from memory (free 583461216)
14/12/02 12:01:56 INFO ContextCleaner: Cleaned broadcast 4
14/12/02 12:01:56 INFO ContextCleaner: Cleaned shuffle 4
14/12/02 12:01:56 INFO ShuffleBlockManager: Deleted all files for shuffle 4

Solution
Add the following to a log4j.properties file in the src/test/resources directory; create the file/directory if it does not exist:
# Change this to set Spark log level
log4j.logger.org.apache.spark=WARN

# Silence akka remoting
log4j.logger.Remoting=WARN

# Ignore messages below warning level from Jetty, because it's a bit verbose
log4j.logger.org.eclipse.jetty=WARN

When I run my unit tests (I'm using JUnit and Maven), I only receive WARN-level logs; in other words, the output is no longer cluttered with INFO-level logs (though they can sometimes be useful for debugging). I hope this helps.
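For reference, a programmatic equivalent of the properties file above, as a minimal sketch assuming log4j 1.x is on the test classpath (which is what Spark 1.x ships with); TestLogging and silenceSpark are hypothetical names used only for illustration. Note the fully qualified logger name org.apache.spark rather than just spark:

import org.apache.log4j.{Level, Logger}

// Minimal sketch: silence the same loggers as the log4j.properties above,
// programmatically, before any SparkContext is created in the test suite.
object TestLogging {
  def silenceSpark(): Unit =
    Seq("org.apache.spark", "Remoting", "org.eclipse.jetty", "akka")
      .foreach(name => Logger.getLogger(name).setLevel(Level.WARN))
}

Calling TestLogging.silenceSpark() at the top of a shared test base class, before the SparkContext is constructed, gives a similar effect when adding a test resources file is not an option.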