I am new to Apache Spark and am trying to run the sample Pi calculation application on my local Spark setup (using a standalone cluster). The master, slave, and driver all run on my local machine.

What I notice is that the Pi calculation completes successfully, but in the slave logs I see that the worker/executor is killed with exitStatus 1. Otherwise, I see no errors or exceptions logged to the console. I tried searching for help on similar issues, but most search hits refer to exitStatus 137 and the like (e.g. Spark application kills executor).

I am miserably failing to understand why the worker is killed instead of completing execution with an 'EXITED' state. I suspect it has something to do with how I execute the application, but I am not quite sure what I am doing wrong. Could someone guide me in finding the root cause?

Given below is the code I use for the Pi calculation, along with the master, slave, and driver logs.

Pi calculation application

package sparky

import org.apache.spark.scheduler._
import org.apache.spark.sql.SparkSession

import scala.math.random

object Application {
  def runSpark(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("Spark Pi")
      .getOrCreate()

    // MyListener is a custom SparkListener (implementation not shown here).
    spark.sparkContext.addSparkListener(new MyListener())

    // Monte Carlo estimation: the fraction of random points in the unit
    // square that land inside the unit circle approximates pi / 4.
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.sparkContext.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / (n - 1))
    spark.stop()
  }

  def main(args: Array[String]): Unit = {
    Application.runSpark(args)
  }
}
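
For reference, the MyListener implementation is not included above; a minimal sketch of such a listener, assuming it simply prints the application end time (which would match the bare timestamp line in the driver output below):

import java.text.SimpleDateFormat
import java.util.Date

import org.apache.spark.scheduler.{SparkListener, SparkListenerApplicationEnd}

// Hypothetical stand-in for the MyListener used above; the actual
// implementation is not shown in this question.
class MyListener extends SparkListener {
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd): Unit = {
    // Format and print the application end time supplied by Spark.
    val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.SSS")
    println(format.format(new Date(applicationEnd.time)))
  }
}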

Master console output

C:\Servers\apache-spark\2.2.0\bin
λ start-master.cmd -h 0.0.0.0
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.master.Master
18/01/25 09:01:30,099 INFO Master: Started daemon with process name: 14900@somemachine
18/01/25 09:01:30,580 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:30,680 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:30,681 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:30,682 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:30,683 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:30,684 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(someuser); groups with view permissions: Set(); users  with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:31,711 INFO Utils: Successfully started service 'sparkMaster' on port 7077.
18/01/25 09:01:31,829 INFO Master: Starting Spark master at spark://0.0.0.0:7077
18/01/25 09:01:31,833 INFO Master: Running Spark version 2.2.0
18/01/25 09:01:31,903 INFO log: Logging initialized @2692ms
18/01/25 09:01:31,960 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:32,025 INFO Server: Started @2816ms
18/01/25 09:01:32,057 INFO AbstractConnector: Started ServerConnector@106ca013{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
18/01/25 09:01:32,058 INFO Utils: Successfully started service 'MasterUI' on port 8080.
18/01/25 09:01:32,087 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@41cc88b{/app,null,AVAILABLE,@Spark}
18/01/25 09:01:32,088 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1c63bda6{/app/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,089 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@45ae273f{/,null,AVAILABLE,@Spark}
18/01/25 09:01:32,090 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7a319c60{/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,098 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@23510beb{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:32,099 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@462c632c{/app/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:32,101 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@436ef27b{/driver/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:32,104 INFO MasterWebUI: Bound MasterWebUI to 0.0.0.0, and started at http://192.168.56.1:8080
18/01/25 09:01:32,119 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:32,130 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6f7d1cba{/,null,AVAILABLE}
18/01/25 09:01:32,134 INFO AbstractConnector: Started ServerConnector@3f9e9637{HTTP/1.1,[http/1.1]}{0.0.0.0:6066}
18/01/25 09:01:32,134 INFO Server: Started @2925ms
18/01/25 09:01:32,134 INFO Utils: Successfully started service on port 6066.
18/01/25 09:01:32,135 INFO StandaloneRestServer: Started REST server for submitting applications on port 6066
18/01/25 09:01:32,358 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7b3e5adb{/metrics/master/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,362 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@139cbe00{/metrics/applications/json,null,AVAILABLE,@Spark}
18/01/25 09:01:32,399 INFO Master: I have been elected leader! New state: ALIVE
18/01/25 09:01:41,225 INFO Master: Registering worker 192.168.56.1:48591 with 4 cores, 14.4 GB RAM
18/01/25 09:01:53,510 INFO Master: Registering app Spark Pi
18/01/25 09:01:53,515 INFO Master: Registered app Spark Pi with ID app-20180125090153-0000
18/01/25 09:01:53,569 INFO Master: Launching executor app-20180125090153-0000/0 on worker worker-20180125090140-192.168.56.1-48591
18/01/25 09:02:00,262 INFO Master: Received unregister request from application app-20180125090153-0000
18/01/25 09:02:00,269 INFO Master: Removing app app-20180125090153-0000
18/01/25 09:02:00,323 WARN Master: Got status update for unknown executor app-20180125090153-0000/0
18/01/25 09:02:00,338 INFO Master: 127.0.0.1:48625 got disassociated, removing it.
18/01/25 09:02:00,345 INFO Master: 192.168.56.1:48620 got disassociated, removing it.

Slave console output

C:\Servers\apache-spark\2.2.0\bin
λ start-slave.cmd -h 0.0.0.0
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.worker.Worker spark://0.0.0.0:7077
18/01/25 09:01:38,054 INFO Worker: Started daemon with process name: 14532@somemachine
18/01/25 09:01:38,546 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:38,644 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:38,645 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:38,646 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:38,647 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:38,648 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(someuser); groups with view permissions: Set(); users  with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:39,655 INFO Utils: Successfully started service 'sparkWorker' on port 48591.
18/01/25 09:01:40,521 INFO Worker: Starting Spark worker 192.168.56.1:48591 with 4 cores, 14.4 GB RAM
18/01/25 09:01:40,526 INFO Worker: Running Spark version 2.2.0
18/01/25 09:01:40,527 INFO Worker: Spark home: C:\Servers\apache-spark\2.2.0\bin\..
18/01/25 09:01:40,586 INFO log: Logging initialized @3430ms
18/01/25 09:01:40,636 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:40,657 INFO Server: Started @3503ms
18/01/25 09:01:40,787 WARN Utils: Service 'WorkerUI' could not bind on port 8081. Attempting port 8082.
18/01/25 09:01:40,797 INFO AbstractConnector: Started ServerConnector@24c54ec4{HTTP/1.1,[http/1.1]}{0.0.0.0:8082}
18/01/25 09:01:40,797 INFO Utils: Successfully started service 'WorkerUI' on port 8082.
18/01/25 09:01:40,832 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6e86345{/logPage,null,AVAILABLE,@Spark}
18/01/25 09:01:40,833 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@43dbfd42{/logPage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,834 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@768b7729{/,null,AVAILABLE,@Spark}
18/01/25 09:01:40,836 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@382e7183{/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,844 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@459d7b70{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:40,845 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5bf4fc9c{/log,null,AVAILABLE,@Spark}
18/01/25 09:01:40,849 INFO WorkerWebUI: Bound WorkerWebUI to 0.0.0.0, and started at http://192.168.56.1:8082
18/01/25 09:01:40,853 INFO Worker: Connecting to master 0.0.0.0:7077...
18/01/25 09:01:40,885 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4e93ba9d{/metrics/json,null,AVAILABLE,@Spark}
18/01/25 09:01:40,971 INFO TransportClientFactory: Successfully created connection to /0.0.0.0:7077 after 82 ms (0 ms spent in bootstraps)
18/01/25 09:01:41,246 INFO Worker: Successfully registered with master spark://0.0.0.0:7077
18/01/25 09:01:53,621 INFO Worker: Asked to launch executor app-20180125090153-0000/0 for Spark Pi
18/01/25 09:01:53,661 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:53,663 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:53,664 INFO SecurityManager: Changing view acls groups to:
18/01/25 09:01:53,668 INFO SecurityManager: Changing modify acls groups to:
18/01/25 09:01:53,669 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(someuser); groups with view permissions: Set(); users  with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:53,695 INFO ExecutorRunner: Launch command: "C:\Platforms\Java\jdk1.8.0_65\bin\java" "-cp" "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" "-Xmx1024M" "-Dspark.driver.port=48620" "org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" "spark://CoarseGrainedScheduler@192.168.56.1:48620" "--executor-id" "0" "--hostname" "192.168.56.1" "--cores" "4" "--app-id" "app-20180125090153-0000" "--worker-url" "spark://Worker@192.168.56.1:48591"
18/01/25 09:02:00,297 INFO Worker: Asked to kill executor app-20180125090153-0000/0
18/01/25 09:02:00,303 INFO ExecutorRunner: Runner thread for executor app-20180125090153-0000/0 interrupted
18/01/25 09:02:00,305 INFO ExecutorRunner: Killing process!
18/01/25 09:02:00,323 INFO Worker: Executor app-20180125090153-0000/0 finished with state KILLED exitStatus 1
18/01/25 09:02:00,336 INFO ExternalShuffleBlockResolver: Application app-20180125090153-0000 removed, cleanupLocalDirs = true
18/01/25 09:02:00,340 INFO Worker: Cleaning up local directories for application app-20180125090153-0000

Driver console output

9:01:47 AM: Executing task 'submitToSpark'...

C:\Applications\scala\sparky\app\build\libs\sparky-app-0.0.1.jar
:app:compileJava NO-SOURCE
:app:compileScala UP-TO-DATE
:app:processResources NO-SOURCE
:app:classes UP-TO-DATE
:app:jar UP-TO-DATE
:runner:submitToSpark
C:\Platforms\Java\jdk1.8.0_65\bin\java -cp "C:\Servers\apache-spark\2.2.0\bin\..\conf\;C:\Servers\apache-spark\2.2.0\bin\..\jars\*" -Xmx1g org.apache.spark.deploy.SparkSubmit --master spark://localhost:7077 C:\Applications\scala\sparky\app\build\libs\sparky-app-0.0.1.jar 
18/01/25 09:01:51,111 INFO SparkContext: Running Spark version 2.2.0
18/01/25 09:01:51,465 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/25 09:01:51,677 INFO SparkContext: Submitted application: Spark Pi
18/01/25 09:01:51,711 INFO SecurityManager: Changing view acls to: someuser
18/01/25 09:01:51,712 INFO SecurityManager: Changing modify acls to: someuser
18/01/25 09:01:51,712 INFO SecurityManager: Changing view acls groups to: 
18/01/25 09:01:51,713 INFO SecurityManager: Changing modify acls groups to: 
18/01/25 09:01:51,714 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(someuser); groups with view permissions: Set(); users  with modify permissions: Set(someuser); groups with modify permissions: Set()
18/01/25 09:01:52,639 INFO Utils: Successfully started service 'sparkDriver' on port 48620.
18/01/25 09:01:52,669 INFO SparkEnv: Registering MapOutputTracker
18/01/25 09:01:52,695 INFO SparkEnv: Registering BlockManagerMaster
18/01/25 09:01:52,699 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
18/01/25 09:01:52,700 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
18/01/25 09:01:52,712 INFO DiskBlockManager: Created local directory at C:\Users\someuser\AppData\Local\Temp\blockmgr-f9908c61-a91a-43d5-8d24-e0fd86d55d1c
18/01/25 09:01:52,740 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
18/01/25 09:01:52,808 INFO SparkEnv: Registering OutputCommitCoordinator
18/01/25 09:01:52,924 INFO log: Logging initialized @3539ms
18/01/25 09:01:53,009 INFO Server: jetty-9.3.z-SNAPSHOT
18/01/25 09:01:53,038 INFO Server: Started @3654ms
18/01/25 09:01:53,067 INFO AbstractConnector: Started ServerConnector@21a5fd96{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/25 09:01:53,067 INFO Utils: Successfully started service 'SparkUI' on port 4040.
18/01/25 09:01:53,099 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@40bffbca{/jobs,null,AVAILABLE,@Spark}
18/01/25 09:01:53,100 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6c4f9535{/jobs/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,100 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@30c31dd7{/jobs/job,null,AVAILABLE,@Spark}
18/01/25 09:01:53,101 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@c1fca1e{/jobs/job/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,102 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@344344fa{/stages,null,AVAILABLE,@Spark}
18/01/25 09:01:53,103 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70e659aa{/stages/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,103 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@285f09de{/stages/stage,null,AVAILABLE,@Spark}
18/01/25 09:01:53,105 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@48e64352{/stages/stage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,106 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4362d7df{/stages/pool,null,AVAILABLE,@Spark}
18/01/25 09:01:53,106 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1c25b8a7{/stages/pool/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,107 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@750fe12e{/storage,null,AVAILABLE,@Spark}
18/01/25 09:01:53,108 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3e587920{/storage/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,108 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@24f43aa3{/storage/rdd,null,AVAILABLE,@Spark}
18/01/25 09:01:53,109 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1e11bc55{/storage/rdd/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,110 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70e0accd{/environment,null,AVAILABLE,@Spark}
18/01/25 09:01:53,112 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6ab72419{/environment/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,112 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@4fdfa676{/executors,null,AVAILABLE,@Spark}
18/01/25 09:01:53,113 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@5be82d43{/executors/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,114 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@345e5a17{/executors/threadDump,null,AVAILABLE,@Spark}
18/01/25 09:01:53,115 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@443dbe42{/executors/threadDump/json,null,AVAILABLE,@Spark}
18/01/25 09:01:53,125 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@1734f68{/static,null,AVAILABLE,@Spark}
18/01/25 09:01:53,125 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@31c269fd{/,null,AVAILABLE,@Spark}
18/01/25 09:01:53,127 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@47747fb9{/api,null,AVAILABLE,@Spark}
18/01/25 09:01:53,128 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@70eecdc2{/jobs/job/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:53,129 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@7db0565c{/stages/stage/kill,null,AVAILABLE,@Spark}
18/01/25 09:01:53,133 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.56.1:4040
18/01/25 09:01:53,174 INFO SparkContext: Added JAR file:/C:/Applications/scala/sparky/app/build/libs/sparky-app-0.0.1.jar at spark://192.168.56.1:48620/jars/sparky-app-0.0.1.jar with timestamp 1516888913174
18/01/25 09:01:53,318 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://localhost:7077...
18/01/25 09:01:53,389 INFO TransportClientFactory: Successfully created connection to localhost/127.0.0.1:7077 after 42 ms (0 ms spent in bootstraps)
18/01/25 09:01:53,554 INFO StandaloneSchedulerBackend: Connected to Spark cluster with app ID app-20180125090153-0000
18/01/25 09:01:53,577 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 48642.
18/01/25 09:01:53,578 INFO NettyBlockTransferService: Server created on 192.168.56.1:48642
18/01/25 09:01:53,582 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
18/01/25 09:01:53,590 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,595 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:48642 with 366.3 MB RAM, BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,600 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,601 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.56.1, 48642, None)
18/01/25 09:01:53,667 INFO StandaloneAppClient$ClientEndpoint: Executor added: app-20180125090153-0000/0 on worker-20180125090140-192.168.56.1-48591 (192.168.56.1:48591) with 4 cores
18/01/25 09:01:53,668 INFO StandaloneSchedulerBackend: Granted executor ID app-20180125090153-0000/0 on hostPort 192.168.56.1:48591 with 4 cores, 1024.0 MB RAM
18/01/25 09:01:53,901 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@74fef3f7{/metrics/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,026 INFO StandaloneAppClient$ClientEndpoint: Executor updated: app-20180125090153-0000/0 is now RUNNING
18/01/25 09:01:55,096 INFO EventLoggingListener: Logging events to file:///C:/Dustbin/spark-events/app-20180125090153-0000
18/01/25 09:01:55,127 INFO StandaloneSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
18/01/25 09:01:55,218 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/C:/Applications/scala/sparky/runner/spark-warehouse/').
18/01/25 09:01:55,219 INFO SharedState: Warehouse path is 'file:/C:/Applications/scala/sparky/runner/spark-warehouse/'.
18/01/25 09:01:55,228 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@50a691d3{/SQL,null,AVAILABLE,@Spark}
18/01/25 09:01:55,228 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@3b95d13c{/SQL/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,229 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@54d901aa{/SQL/execution,null,AVAILABLE,@Spark}
18/01/25 09:01:55,230 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@573284a5{/SQL/execution/json,null,AVAILABLE,@Spark}
18/01/25 09:01:55,233 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@507b79f7{/static/sql,null,AVAILABLE,@Spark}
18/01/25 09:01:56,232 INFO StateStoreCoordinatorRef: Registered StateStoreCoordinator endpoint
18/01/25 09:01:56,609 INFO SparkContext: Starting job: reduce at Application.scala:29
18/01/25 09:01:56,636 INFO DAGScheduler: Got job 0 (reduce at Application.scala:29) with 2 output partitions
18/01/25 09:01:56,637 INFO DAGScheduler: Final stage: ResultStage 0 (reduce at Application.scala:29)
18/01/25 09:01:56,638 INFO DAGScheduler: Parents of final stage: List()
18/01/25 09:01:56,640 INFO DAGScheduler: Missing parents: List()
18/01/25 09:01:56,654 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[1] at map at Application.scala:25), which has no missing parents
18/01/25 09:01:56,815 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 1800.0 B, free 366.3 MB)
18/01/25 09:01:56,980 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 1168.0 B, free 366.3 MB)
18/01/25 09:01:56,984 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.56.1:48642 (size: 1168.0 B, free: 366.3 MB)
18/01/25 09:01:56,988 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1006
18/01/25 09:01:57,016 INFO DAGScheduler: Submitting 2 missing tasks from ResultStage 0 (MapPartitionsRDD[1] at map at Application.scala:25) (first 15 tasks are for partitions Vector(0, 1))
18/01/25 09:01:57,018 INFO TaskSchedulerImpl: Adding task set 0.0 with 2 tasks
18/01/25 09:01:58,617 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Registered executor NettyRpcEndpointRef(spark-client://Executor) (192.168.56.1:48660) with ID 0
18/01/25 09:01:58,661 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, 192.168.56.1, executor 0, partition 0, PROCESS_LOCAL, 4829 bytes)
18/01/25 09:01:58,665 INFO TaskSetManager: Starting task 1.0 in stage 0.0 (TID 1, 192.168.56.1, executor 0, partition 1, PROCESS_LOCAL, 4829 bytes)
18/01/25 09:01:59,242 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.56.1:48678 with 366.3 MB RAM, BlockManagerId(0, 192.168.56.1, 48678, None)
18/01/25 09:01:59,819 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.56.1:48678 (size: 1168.0 B, free: 366.3 MB)
18/01/25 09:02:00,139 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 1500 ms on 192.168.56.1 (executor 0) (1/2)
18/01/25 09:02:00,142 INFO TaskSetManager: Finished task 1.0 in stage 0.0 (TID 1) in 1478 ms on 192.168.56.1 (executor 0) (2/2)
18/01/25 09:02:00,143 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool 
18/01/25 09:02:00,150 INFO DAGScheduler: ResultStage 0 (reduce at Application.scala:29) finished in 3.109 s
18/01/25 09:02:00,156 INFO DAGScheduler: Job 0 finished: reduce at Application.scala:29, took 3.546255 s
Pi is roughly 3.1363756818784094
18/01/25 09:02:00,168 INFO AbstractConnector: Stopped Spark@21a5fd96{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/01/25 09:02:00,170 INFO SparkUI: Stopped Spark web UI at http://192.168.56.1:4040
18/01/25 09:02:00,247 INFO StandaloneSchedulerBackend: Shutting down all executors
18/01/25 09:02:00,249 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
18/01/25 09:02:00,269 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/01/25 09:02:00,300 INFO MemoryStore: MemoryStore cleared
18/01/25 09:02:00,301 INFO BlockManager: BlockManager stopped
18/01/25 09:02:00,321 INFO BlockManagerMaster: BlockManagerMaster stopped
18/01/25 09:02:00,328 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/01/25 09:02:00,353 INFO SparkContext: Successfully stopped SparkContext
2018-01-25 09:02:00.353
18/01/25 09:02:00,358 INFO ShutdownHookManager: Shutdown hook called
18/01/25 09:02:00,360 INFO ShutdownHookManager: Deleting directory C:\Users\someuser\AppData\Local\Temp\spark-ac6369a0-abb8-476e-a527-91e0a8011302

BUILD SUCCESSFUL in 13s
3 actionable tasks: 1 executed, 2 up-to-date
9:02:01 AM: Task execution finished 'submitToSpark'.
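
For completeness, the submitToSpark Gradle task above shells out to org.apache.spark.deploy.SparkSubmit; an equivalent direct invocation would be something like the following (assuming the application jar's manifest declares sparky.Application as the main class; otherwise --class sparky.Application would need to be passed explicitly):

C:\Servers\apache-spark\2.2.0\bin\spark-submit.cmd --master spark://localhost:7077 C:\Applications\scala\sparky\app\build\libs\sparky-app-0.0.1.jar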