scala – Uncaught exception in thread SparkListenerBus
I am trying to run a simple project with Apache Spark. Here is my code, SimpleApp.scala:
/* SimpleApp.scala */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/home/hduser/spark-1.2.0-bin-hadoop2.4/README.md" // Should be some file on your system
    // val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext("local", "Simple Job", "/home/hduser/spark-1.2.0-bin-hadoop2.4/")
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("hadoop")).count()
    val numBs = logData.filter(line => line.contains("see")).count()
    println("Lines with hadoop: %s, Lines with see: %s".format(numAs, numBs))
  }
}

When I submit this job to Spark manually from the command line:

/home/hduser/spark-1.2.0-hadoop-2.4.0/bin/spark-submit --class "SimpleApp" --master local[4] target/scala-2.10/simple-project_2.10-1.0.jar

it runs successfully. If I instead run it with sbt run while the Apache Spark service is running, it also succeeds, but at the end of the log it prints the following errors:

15/02/06 15:56:49 ERROR Utils: Uncaught exception in thread SparkListenerBus
java.lang.InterruptedException
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.doAcquireSharedInterruptibly(AbstractQueuedSynchronizer.java:996)
    at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1303)
    at java.util.concurrent.Semaphore.acquire(Semaphore.java:317)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:48)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)

15/02/06 15:56:49 ERROR ContextCleaner: Error in cleaning thread
java.lang.InterruptedException
    at java.lang.Object.wait(Native Method)
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply$mcV$sp(ContextCleaner.scala:136)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    at org.apache.spark.ContextCleaner$$anonfun$org$apache$spark$ContextCleaner$$keepCleaning$1.apply(ContextCleaner.scala:134)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1460)
    at org.apache.spark.ContextCleaner.org$apache$spark$ContextCleaner$$keepCleaning(ContextCleaner.scala:133)
    at org.apache.spark.ContextCleaner$$anon$3.run(ContextCleaner.scala:65)

Is there something wrong with my code? Thanks in advance.
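For reference, the sbt build definition that produces target/scala-2.10/simple-project_2.10-1.0.jar would look roughly like the Spark quick-start build.sbt below. This is a reconstruction from the jar name, not the asker's actual file, and the exact Scala point version is an assumption:

// build.sbt (hypothetical, reconstructed from the jar name in the question)
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4" // any 2.10.x matches the _2.10 suffix in the jar name

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"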
Solution

According to this mail archive:
the error can be safely ignored.
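That said, if you want sbt run to finish without this noise, a common approach is to stop the SparkContext explicitly before main returns, so that the listener bus and ContextCleaner threads exit in an orderly way instead of being interrupted at JVM shutdown. Below is a minimal sketch of the same program with an explicit sc.stop(); it also switches to the SparkConf-based construction that the asker left commented out:

/* SimpleApp.scala -- sketch with an explicit shutdown */
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application").setMaster("local")
    val sc = new SparkContext(conf)
    try {
      val logFile = "/home/hduser/spark-1.2.0-bin-hadoop2.4/README.md"
      val logData = sc.textFile(logFile, 2).cache()
      val numAs = logData.filter(line => line.contains("hadoop")).count()
      val numBs = logData.filter(line => line.contains("see")).count()
      println("Lines with hadoop: %s, Lines with see: %s".format(numAs, numBs))
    } finally {
      // Stopping the context lets SparkListenerBus and ContextCleaner shut
      // down cleanly rather than being interrupted at JVM exit, which is
      // what produces the InterruptedException entries in the log above.
      sc.stop()
    }
  }
}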