scala – What do the WARN messages mean when starting spark-shell?
Published: 2020-12-16 19:21:31 | Category: Security | Source: compiled from the web
When starting my spark-shell, I get a bunch of WARN messages, but I can't make sense of them. Are there any important issues here that I should pay attention to? Have I missed some configuration? Or are these WARN messages normal?
cliu@cliu-ubuntu:Apache-Spark$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66)
Type in expressions to have them evaluated.
Type :help for more information.
15/11/30 11:43:54 WARN Utils: Your hostname, cliu-ubuntu resolves to a loopback address: 127.0.1.1; using xxx.xxx.xxx.xx (`here I hide my IP`) instead (on interface wlan0)
15/11/30 11:43:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/11/30 11:43:55 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Spark context available as sc.
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:43:58 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:11 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/11/30 11:44:11 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
15/11/30 11:44:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:14 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/11/30 11:44:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/11/30 11:44:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.

scala>

Solution
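For the loopback-address warning, the log itself names the fix: set SPARK_LOCAL_IP before launching the shell. A minimal sketch of what that looks like (the address below is a placeholder, not the poster's hidden IP, and should be replaced with the machine's real interface address):

```shell
# Bind Spark to a concrete interface address instead of the
# loopback 127.0.1.1 that the hostname resolves to.
# 192.168.1.10 is a placeholder -- use your machine's actual address.
export SPARK_LOCAL_IP=192.168.1.10
spark-shell
```

The same variable can also be set permanently in conf/spark-env.sh so every Spark process picks it up.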
This log output is completely normal. Here BoneCP is trying to bind a JDBC connection, which is why you see those warnings. In any case, if you want to manage the logging, you can set the logging levels by copying <spark-path>/conf/log4j.properties.template to <spark-path>/conf/log4j.properties and configuring it. Finally, a similar answer about logging levels can be found here.
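The copy-and-configure step above can be sketched as follows. This is an illustrative sequence, not the only way to do it: the path is a placeholder for your actual Spark installation, and the sed edit simply lowers the root logger from INFO to ERROR in the Spark 1.x template:

```shell
# Create a concrete log4j config from the shipped template
# (replace /path/to/spark with your actual <spark-path>).
cd /path/to/spark
cp conf/log4j.properties.template conf/log4j.properties

# Quiet the console by raising the root logger threshold,
# i.e. turn "log4j.rootCategory=INFO, console" into ERROR.
sed -i 's/^log4j.rootCategory=INFO, console/log4j.rootCategory=ERROR, console/' conf/log4j.properties
```

Alternatively, as the startup banner itself notes, you can change the level at runtime from inside the shell with sc.setLogLevel("ERROR"), which affects only the current session and needs no file edits.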