
I am trying to install Spark using sparklyr, and when I run

    spark_install

I get the following error.

    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Cannot use compressed or remote archives
    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Error is not recoverable: exiting now
    running command 'tar.exe -zxf "C:\Users\MyPC\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7.tgz" -C "C:/Users/LeviVM/AppData/Local/rstudio/spark/Cache"' had status 2
    'tar.exe -zxf "C:\Users\MyPC\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7.tgz" -C "C:/Users/LeviVM/AppData/Local/rstudio/spark/Cache"' returned error code 2
    Installation complete.
    cannot open file 'C:\Users\MyPc\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7/conf/log4j.properties': No such file or directory
    Failed to set logging settings
    cannot open file 'C:\Users\MyPc\AppData\Local\rstudio\spark\Cache/spark-2.0.1-bin-hadoop2.7/conf/hive-site.xml': No such file or directory
    Failed to apply custom hive-site.xml configuration

Then I downloaded Spark from the web and used

    spark_install_tar

which gave me the same error:

    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Cannot use compressed or remote archives
    C:\dsvm\tools\UnxUtils\usr\local\wbin\tar.exe: Error is not recoverable: exiting now

Any suggestions?

Thanks in advance.


2 Answers


When I upgraded sparklyr using

    devtools::install_github("rstudio/sparklyr")

the problem went away.
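
In full, the sequence looks roughly like this (a sketch, assuming a fresh R session; devtools may already be installed, and the version numbers below are taken from the error log in the question):

    # install devtools if it is not already available
    install.packages("devtools")
    # upgrade sparklyr to the development version from GitHub
    devtools::install_github("rstudio/sparklyr")
    # retry the Spark install that failed before
    library(sparklyr)
    spark_install(version = "2.0.1", hadoop_version = "2.7")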

answered 2016-10-26T19:32:44.997

    spark_install_tar(tarfile = "path/to/spark_hadoop.tar")

If you still get the error, extract the tar archive manually and point the SPARK_HOME environment variable at the extracted spark_hadoop path, for example as sketched below.
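
A minimal sketch, from within R (the path below is a placeholder for wherever you actually extracted the archive):

    # point SPARK_HOME at the extracted Spark directory (placeholder path)
    Sys.setenv(SPARK_HOME = "C:/spark/spark-2.0.1-bin-hadoop2.7")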

Then try executing the following commands in the R console:

    library(sparklyr)
    sc <- spark_connect(master = "local")

answered 2017-05-11T05:21:38.590