
I am trying to install apache spark-1.6.1 in standalone mode. I followed the guide at "https://github.com/KristianHolsheimer/pyspark-setup-guide". However, after executing

$ sbt/sbt assembly

I tried

$ ./bin/run-example SparkPi 10"

However, it gave an error:

./bin/run-example: line 26: /home/dasprasun/opt/spark/bin/load-spark-env.sh: No such file or directory
Failed to find Spark examples assembly in /home/dasprasun/opt/spark/lib or /home/dasprasun/opt/spark/examples/target
You need to build Spark before running this program

After completing all the steps, I ran the following command in ipython:

In [1]: from pyspark import SparkContext

It gave the following error:

ImportError                               Traceback (most recent call last)
<ipython-input-1-47c4965c5f0e> in <module>()
----> 1 from pyspark import SparkContext
ImportError: No module named pyspark

I do not understand what is going on. Please help me fix this.
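
For reference, this is roughly how I understood the guide's intent for making pyspark importable from ipython. The Spark path comes from the error messages above; the py4j zip file name is only my assumption (Spark 1.6.x ships one under python/lib), so it should be checked against what is actually there:

import os
import sys

# Assumption: Spark was unpacked at the path shown in the errors above.
SPARK_HOME = "/home/dasprasun/opt/spark"
os.environ["SPARK_HOME"] = SPARK_HOME

# pyspark is not installed as a normal site package; it ships under
# $SPARK_HOME/python together with a bundled py4j zip in python/lib.
# The exact zip name (py4j-0.9-src.zip) is my assumption for 1.6.x.
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib", "py4j-0.9-src.zip"))

from pyspark import SparkContext

# Simple smoke test: a local context and a trivial job.
sc = SparkContext("local[2]", "import-check")
print(sc.parallelize(range(10)).sum())  # expected: 45
sc.stop()

Even with this, the import still fails for me, which is why I suspect something went wrong earlier in the build.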
