I have been trying out Spark using spark-shell. All of my data is in a MySQL database.
I used to include the external jar with the --jars flag:

    /bin/spark-shell --jars /path/to/mysql-connector-java-5.1.23-bin.jar --master spark://sparkmaster.com:7077

I also added it to the classpath by modifying the bin/compute-classpath.sh file, and I was running jobs successfully with this configuration.
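For context, this is roughly the kind of access I do from spark-shell. It is a minimal sketch; the connection URL, credentials, table, and column names are placeholders, not my actual setup:

    import java.sql.{DriverManager, ResultSet}
    import org.apache.spark.rdd.JdbcRDD

    // Placeholder connection details, not the real setup.
    val url = "jdbc:mysql://sparkmaster.com:3306/mydb"

    // Loading the driver class works here because the connector jar
    // is on the spark-shell classpath via --jars / compute-classpath.sh.
    Class.forName("com.mysql.jdbc.Driver")

    val rdd = new JdbcRDD(
      sc,  // the SparkContext provided by spark-shell
      () => DriverManager.getConnection(url, "user", "password"),
      // JdbcRDD requires two '?' placeholders for the partition bounds.
      "SELECT id, name FROM people WHERE id >= ? AND id <= ?",
      1, 1000, 3,  // lowerBound, upperBound, numPartitions
      (rs: ResultSet) => (rs.getInt("id"), rs.getString("name"))
    )
    println(rdd.count())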
Now, when I run a standalone job through the jobserver, I get the following error message:
    result: {
      "message": "com.mysql.jdbc.Driver",
      "errorClass": "java.lang.ClassNotFoundException",
      "stack": [.......]
    }
I have included the jar file in my local.conf file, as shown below:

    context-settings {
      .....
      dependent-jar-uris = ["file:///absolute/path/to/the/jarfile"]
      ......
    }
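For reference, the standalone job submitted to the jobserver is shaped roughly like this sketch; the object name, connection details, and query are illustrative placeholders, not the exact failing code:

    import com.typesafe.config.Config
    import java.sql.DriverManager
    import org.apache.spark.SparkContext
    import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

    // Illustrative placeholder job, not the exact code that fails.
    object MySqlJob extends SparkJob {
      override def validate(sc: SparkContext, config: Config): SparkJobValidation =
        SparkJobValid

      override def runJob(sc: SparkContext, config: Config): Any = {
        // This is the lookup that throws java.lang.ClassNotFoundException
        // when the MySQL connector jar is not visible to the job's classloader.
        Class.forName("com.mysql.jdbc.Driver")
        val conn = DriverManager.getConnection(
          "jdbc:mysql://sparkmaster.com:3306/mydb", "user", "password")
        try {
          val rs = conn.createStatement().executeQuery("SELECT COUNT(*) FROM people")
          rs.next()
          rs.getLong(1)
        } finally {
          conn.close()
        }
      }
    }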