I'm running another local Scala code execution against a remote Spark cluster on Databricks and got this:
Exception in thread "main" com.databricks.service.DependencyCheckWarning: The java class <something> may not be present on the remote cluster. It can be found in <something>/target/scala-2.11/classes. To resolve this, package the classes in <something>/target/scala-2.11/classes into a jar file and then call sc.addJar() on the package jar. You can disable this check by setting the SQL conf spark.databricks.service.client.checkDeps=false.
I've tried re-importing, cleaning, and recompiling the sbt project, but to no avail.
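From the message, I gather the suggested workaround would look roughly like the sketch below (the jar path is just a placeholder for whatever `sbt package` produces in my project), but I'd rather understand why this is suddenly needed:

```scala
import org.apache.spark.sql.SparkSession

object AddJarSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().getOrCreate()
    val sc = spark.sparkContext

    // Option 1 (as the message suggests): run `sbt package` and register the
    // resulting jar so the classes are shipped to the remote cluster.
    // The path below is a placeholder, not my actual artifact name.
    sc.addJar("target/scala-2.11/my-project_2.11-0.1.jar")

    // Option 2 (also from the message): disable the dependency check entirely.
    spark.conf.set("spark.databricks.service.client.checkDeps", "false")
  }
}
```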
Does anyone know how to deal with this?