scala – Unable to delete temporary files after running spark-submit on Windows 7
I am using the code from this example to run a scala program with spark. The program executes fine, but when the StreamingContext tries to stop, I get this error:
java.io.IOException: Failed to delete: ..\AppData\Local\Temp\spark-53b87fb3-1154-4f0b-a258-8dbeab6601ab
  at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010)
  at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
  at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
  at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
  at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
  at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
  at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
  at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
  at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
  at scala.util.Try$.apply(Try.scala:192)
  at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
  at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
  at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)

I did not change the code. I just cloned it to my local file system, ran the sbt assembly command to generate the .jar file, and then ran the program with spark-submit. Also, I am running the Windows cmd as administrator, so I don't think this is a privileges problem.

Any clue as to what is causing this error? Any help is appreciated!

Solution
I think the spark app creates temporary staging files on your local system (probably when checkpointing is invoked), and when the context stops it tries to clean up those temporary files and fails to delete them. There are two possibilities: either the files have already been deleted, or the process lacks permission to delete them.
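A minimal sketch of one way to act on the permissions angle, assuming a Spark Streaming app run in local mode: redirect Spark's scratch space to a directory the current Windows user fully controls via the spark.local.dir setting, and stop the context explicitly and gracefully. The object name TempDirWorkaround and the path C:/spark-tmp are illustrative assumptions, not part of the original program.

    import scala.collection.mutable
    import org.apache.spark.SparkConf
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object TempDirWorkaround {
      def main(args: Array[String]): Unit = {
        // Point Spark's scratch space at a directory the current user
        // fully controls; "C:/spark-tmp" is an assumed example path.
        val conf = new SparkConf()
          .setAppName("TempDirWorkaround")
          .setMaster("local[2]")
          .set("spark.local.dir", "C:/spark-tmp")

        val ssc = new StreamingContext(conf, Seconds(1))

        // Minimal self-contained stream so the context has an output
        // operation and can start.
        val queue = mutable.Queue[RDD[Int]](ssc.sparkContext.makeRDD(1 to 10))
        ssc.queueStream(queue).print()

        ssc.start()
        // Run briefly, then stop; stopGracefully lets in-flight batches
        // finish before the shutdown hooks try to clean the temp dirs.
        ssc.awaitTerminationOrTimeout(5000)
        ssc.stop(stopSparkContext = true, stopGracefully = true)
      }
    }

Note that the exception in question is thrown from a shutdown hook (see ShutdownHookManager in the stack trace), i.e. after the job itself has finished, so even when it persists it typically does not affect the program's results.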