> sc <- spark_connect(master = "local")
Error in sparkapi::start_shell(master = master, spark_home = spark_home, :
Failed to launch Spark shell. Ports file does not exist.
Path: /Users/XXX/Library/Caches/spark/spark-1.6.1-bin-hadoop2.6/bin/spark-submit
Parameters: --jars, '/Users/XXX/Library/R/3.3/library/sparklyr/java/sparklyr.jar', --packages, 'com.databricks:spark-csv_2.11:1.3.0','com.amazonaws:aws-java-sdk-pom:1.10.34', sparkr-shell, /var/folders/dy/jy43zcgd7gv27qc0mzlxxvd1qt7rhg/T//RtmptbAxW4/file357f67d0745a.out
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
I am trying to start sparklyr in R and get the message above. Are there any solutions or suggestions to resolve this error? I am using Mac OS X; the Session Info details are below. The same error occurs even with:
sc <- spark_connect(master = "local", config = list())
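Since the `NoClassDefFoundError` for `org.apache.spark.launcher.Main` suggests the cached Spark distribution itself may be incomplete or corrupted, would re-downloading it via sparklyr's install helpers be a reasonable fix? A minimal sketch of what I am considering (this assumes the version/hadoop arguments match the cached 1.6.1/Hadoop 2.6 build shown in the error path):

```r
library(sparklyr)

# List the Spark builds sparklyr currently knows about in its cache.
spark_installed_versions()

# Re-download Spark 1.6.1 for Hadoop 2.6; if the earlier download was
# truncated, this should restore the missing launcher classes.
spark_install(version = "1.6.1", hadoop_version = "2.6")

# Retry the local connection.
sc <- spark_connect(master = "local")
```

(Alternatively, I could delete the `~/Library/Caches/spark/spark-1.6.1-bin-hadoop2.6` directory by hand and let `spark_install()` fetch a fresh copy.)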
> sessionInfo()
R version 3.3.1 (2016-06-21)
Platform: x86_64-apple-darwin13.4.0 (64-bit)
Running under: OS X 10.11.6 (El Capitan)
locale:
[1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] dplyr_0.5.0 sparklyr_0.2.28
loaded via a namespace (and not attached):
[1] Rcpp_0.12.5 sparkapi_0.3.15 digest_0.6.9 withr_1.0.2 assertthat_0.1
[6] rappdirs_0.3.1 R6_2.1.2 DBI_0.4-1 git2r_0.15.0 magrittr_1.5
[11] httr_1.2.1 curl_0.9.7 config_0.1.0 devtools_1.12.0 tools_3.3.1
[16] readr_0.2.2 parallel_3.3.1 yaml_2.1.13 memoise_1.0.0 tibble_1.1