I am running code with PyCharm in a dbconnect conda environment, and some code ran successfully. But when I then run a larger project that is supposed to execute on a remote Databricks cluster, I get this weird error, even though the dependency files are in my project. Why am I getting this error? The error appears at the end of the output; I get it for every jar, but I am only posting the first one.
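For context, the session is set up roughly like this (a minimal sketch, assuming the jars are listed as relative paths via the standard spark.jars option; the application name and jar names are taken from the log below, the rest is a reconstruction, not my exact project code):

from pyspark.sql import SparkSession

# Hypothetical reconstruction of the session setup: the jar paths are
# relative, matching the "..\dependencies\..." paths in the log below.
spark = (
    SparkSession.builder
        .appName("dbconnect_session")
        .config(
            "spark.jars",
            ",".join([
                "../dependencies/udfs-1.0-jar-with-dependencies.jar",
                "../dependencies/urlz-1.2-20211108.112618-928-jar-with-dependencies.jar",
                "../dependencies/gremlin-1.0-20211108.112604-912-jar-with-dependencies.jar",
            ]),
        )
        .getOrCreate()
)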

22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\na\PycharmProjects\..\dependencies\udfs-1.0-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\name\PycharmProjects\..\dependencies\urlz-1.2-20211108.112618-928-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 WARN DependencyUtils: Local jar C:\Users\name\PycharmProjects\..\dependencies\gremlin-1.0-20211108.112604-912-jar-with-dependencies.jar does not exist, skipping.
22/01/01 15:14:27 INFO SparkContext: Running Spark version 3.0.1-SNAPSHOT
22/01/01 15:14:27 INFO ResourceUtils: ==============================================================
22/01/01 15:14:27 INFO ResourceUtils: Resources for spark.driver:

22/01/01 15:14:27 INFO ResourceUtils: ==============================================================
22/01/01 15:14:27 INFO SparkContext: Submitted application: dbconnect_session
22/01/01 15:14:27 INFO SecurityManager: Changing view acls to: name
22/01/01 15:14:27 INFO SecurityManager: Changing modify acls to: name
22/01/01 15:14:27 INFO SecurityManager: Changing view acls groups to: 
22/01/01 15:14:27 INFO SecurityManager: Changing modify acls groups to: 
22/01/01 15:14:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(name); groups with view permissions: Set(); users  with modify permissions: Set(name); groups with modify permissions: Set()
22/01/01 15:14:30 INFO Utils: Successfully started service 'sparkDriver' on port 61506.
22/01/01 15:14:30 INFO SparkEnv: Registering MapOutputTracker
22/01/01 15:14:30 INFO SparkEnv: Registering BlockManagerMaster
22/01/01 15:14:30 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
22/01/01 15:14:30 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
22/01/01 15:14:30 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
22/01/01 15:14:30 INFO DiskBlockManager: Created local directory at C:\Users\name\AppData\Local\Temp\blockmgr-963a7542-d01d-4f70-a0fe-ee430175cc27
22/01/01 15:14:30 INFO MemoryStore: MemoryStore started with capacity 366.3 MiB
22/01/01 15:14:30 INFO SparkEnv: Registering OutputCommitCoordinator
22/01/01 15:14:30 INFO AsyncProfiler: Cannot create AsyncProfiler instance (unsupported platform)
22/01/01 15:14:30 WARN MetricsSystem: Using default name SparkStatusTracker for source because neither spark.metrics.namespace nor spark.app.id is set.
22/01/01 15:14:31 INFO Utils: Successfully started service 'SparkUI' on port 4040.
22/01/01 15:14:31 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://kubernetes.docker.internal:4040
22/01/01 15:14:31 ERROR SparkContext: Failed to add ../dependencies/udfs-1.0-jar-with-dependencies.jar to Spark environment
java.io.FileNotFoundException: Jar C:\Users\name\PycharmProjects\..\dependencies\udfs-1.0-jar-with-dependencies.jar not found
    at org.apache.spark.SparkContext.addLocalJarFile$1(SparkContext.scala:2040)
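Note that the failing path in the log contains a literal ..\ segment, so the relative jar paths are being resolved against the current working directory. A quick way to check where such a path actually points (a minimal sketch; the jar path is the first one from the log, and the working directory is assumed to be C:\Users\name\PycharmProjects as the log suggests):

import os

# The first relative jar path from the spark.jars list above.
rel_jar = "../dependencies/udfs-1.0-jar-with-dependencies.jar"

# Assuming the working directory is C:\Users\name\PycharmProjects, this
# prints C:\Users\name\dependencies\udfs-1.0-jar-with-dependencies.jar.
print(os.path.abspath(rel_jar))

# False here would match the "does not exist, skipping" warning above.
print(os.path.exists(rel_jar))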