I am running a Scala program with Spark, using the code from the example. The program executes fine, but when the StreamingContext tries to stop, I get this error:

java.io.IOException: Failed to delete: ..\AppData\Local\Temp\spark-53b87fb3-1154-4f0b-a258-8dbeab6601ab
        at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1010)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:65)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1$$anonfun$apply$mcV$sp$3.apply(ShutdownHookManager.scala:62)
        at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
        at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
        at org.apache.spark.util.ShutdownHookManager$$anonfun$1.apply$mcV$sp(ShutdownHookManager.scala:62)
        at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:216)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1951)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(ShutdownHookManager.scala:188)
        at scala.util.Try$.apply(Try.scala:192)
        at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
        at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
        at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
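For context, the shutdown path that runs this cleanup is the standard one. Below is a minimal sketch of that sequence (an illustration only, not the exact example code; the object name is a placeholder):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Minimal lifecycle sketch: the temp-dir delete in the stack trace
    // happens in Spark's shutdown hook, after awaitTermination returns
    // and the JVM begins to exit.
    object StreamingLifecycleSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("StreamingLifecycleSketch")
          .setMaster("local[2]")
        val ssc = new StreamingContext(conf, Seconds(1))

        // ... the streaming computation from the example goes here ...

        ssc.start()
        ssc.awaitTermination()
        // On JVM exit, ShutdownHookManager tries to delete the spark-*
        // scratch directory under the temp directory -- that is the
        // delete that fails here on Windows.
      }
    }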

I haven't changed any code. I just cloned it to my local file system, ran the sbt assembly command to generate the .jar file, and then ran the program with spark-submit.

Also, I am running the Windows cmd as administrator, so I don't think it is a permission issue.

Any clue as to what is causing this error?

Thanks for the help!

1 Answer

I think the Spark application creates temporary staging files on your local system (possibly when checkpointing is invoked), and when the context stops it tries to clean those temporary files up and fails to delete them. There are two possibilities: either the files have already been deleted, or the process does not have permission to delete them.
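
If it is the permission case, one thing worth trying is pointing Spark's scratch space at a directory the process can reliably delete, via the standard spark.local.dir setting. A minimal sketch under that assumption (the paths and object name are placeholders, not from the original code):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object LocalDirWorkaround {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("LocalDirWorkaround")
          .setMaster("local[2]")
          // Keep scratch/staging files out of %TEMP% and in a directory
          // the user fully controls (example path -- adjust as needed).
          .set("spark.local.dir", "C:/spark-temp")

        val ssc = new StreamingContext(conf, Seconds(1))
        // Putting checkpoint data in the same controlled location makes
        // it easier to see what the shutdown hook is trying to delete.
        ssc.checkpoint("C:/spark-temp/checkpoints")

        // ... rest of the streaming job ...
      }
    }

Either way, note that the failed delete happens in a shutdown hook after the job has finished, so it should not affect the job's output.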

Answered 2017-03-09T13:56:42.407