I have installed the SparkR package and can run other compute jobs, such as the pi estimation or word-count examples from the documentation. But when I try to start a SparkR SQL job, it fails with an error. Can anyone help me? I am using R version 3.2.0 and Spark 1.3.1.

```
> library(SparkR)
> sc1 <- sparkR.init(master="local")
Launching java with command  /usr/lib/jvm/java-7-oracle/bin/java   -Xmx1g -cp '/home/himaanshu/R/x86_64-pc-linux-gnu-library/3.2/SparkR/sparkr-assembly-0.1.jar:' edu.berkeley.cs.amplab.sparkr.SparkRBackend /tmp/Rtmp0tAX4W/backend_port614e1c1c38f6 
15/07/09 18:05:51 WARN Utils: Your hostname, himaanshu-Inspiron-5520 resolves to a loopback address: 127.0.0.1; using 172.17.42.1 instead (on interface docker0)
15/07/09 18:05:51 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/07/09 18:05:52 INFO Slf4jLogger: Slf4jLogger started
15/07/09 18:05:54 WARN SparkContext: Using SPARK_MEM to set amount of memory to use per executor process is deprecated, please use spark.executor.memory instead.
> sqlContext <- sparkRSQL.init(sc1)
Error: could not find function "sparkRSQL.init"
```
1 Answer

Your SparkR version is wrong: sparkr-assembly-0.1.jar does not yet include sparkRSQL.init. That jar comes from the old standalone amplab SparkR package (note the edu.berkeley.cs.amplab.sparkr class in your launch command); sparkRSQL.init is part of the SparkR that ships with the Spark distribution itself.
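For reference, here is a minimal sketch of how a SQL context is initialized with the SparkR package bundled inside a Spark 1.4+ distribution, where sparkRSQL.init exists. The SPARK_HOME path below is an assumption; adjust it to your installation:

```r
# Point R at the SparkR library that ships with Spark 1.4+
# (the path is an assumption for illustration).
Sys.setenv(SPARK_HOME = "/opt/spark-1.4.0")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)

sc <- sparkR.init(master = "local")
sqlContext <- sparkRSQL.init(sc)   # available in the bundled SparkR

# Sanity check: build a Spark DataFrame from a local R data.frame
df <- createDataFrame(sqlContext, faithful)
head(df)
```

Alternatively, launching the interactive shell via bin/sparkR from a Spark 1.4+ distribution loads the matching SparkR package automatically.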

Answered 2015-07-15T03:42:27.693