
I am learning to use Spark and have been following this article up to this point. When I try to import pyspark, I get the following error, even though the file accumulators.py does exist under pyspark.

>>> import os
>>> import sys
>>> os.environ['SPARK_HOME'] = "E:\\spark-1.2.0"
>>> sys.path.append("E:\\spark-1.2.0\\python")
>>> from pyspark import SparkContext
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\spark-1.2.0\python\pyspark\__init__.py", line 41, in <module>
    from pyspark.context import SparkContext
  File "E:\spark-1.2.0\python\pyspark\context.py", line 30, in <module>
    from pyspark.java_gateway import launch_gateway
  File "E:\spark-1.2.0\python\pyspark\java_gateway.py", line 26, in <module>
    from py4j.java_gateway import java_import, JavaGateway, GatewayClient
ImportError: No module named py4j.java_gateway
>>> sys.path.append("E:\\spark-1.2.0\\python\\build")
>>> from pyspark import SparkContext
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\spark-1.2.0\python\pyspark\__init__.py", line 41, in <module>
    from pyspark.context import SparkContext
  File "E:\spark-1.2.0\python\pyspark\context.py", line 25, in <module>
    from pyspark import accumulators
ImportError: cannot import name accumulators

How can I resolve this error? I am on Windows 7 with Java 8, and my Python version is Python 2.7.6 :: Anaconda 1.9.2 (64-bit).


2 Answers


I hit the same problem following the same article, and was able to solve it by changing the 00-pyspark-setup.py script to add the SPARK_HOME/python/lib path to Python's sys.path, rather than SPARK_HOME/python directly.

My complete 00-pyspark-setup.py script now looks like this:

import os
import sys

# Configure the environment
#if 'SPARK_HOME' not in os.environ:
#    os.environ['SPARK_HOME'] = '/srv/spark'

# Create a variable for our root path
SPARK_HOME = os.environ['SPARK_HOME']

# Add the PySpark/py4j to the Python Path
sys.path.insert(0, os.path.join(SPARK_HOME, "python", "lib"))
sys.path.insert(0, os.path.join(SPARK_HOME, "python"))
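
Once this script has run, the import that failed in the question should resolve (a minimal sanity check, assuming SPARK_HOME is set in the environment and py4j is importable from SPARK_HOME/python/lib):

>>> from pyspark import SparkContext   # no ImportError expected now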
Answered 2016-02-15T15:10:12.783

Try adding E:\spark-1.2.0\python\lib\py4j-0.8.2.1-src.zip to your PYTHONPATH.
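
Python can import packages straight from a zip archive on the path, so pointing at the zip itself is enough. A minimal sketch, mirroring the session from the question (the py4j version in the file name depends on your Spark build; SPARK_HOME/python must already be on sys.path, as it was above):

>>> import sys
>>> # make the bundled py4j sources importable directly from the zip
>>> sys.path.append("E:\\spark-1.2.0\\python\\lib\\py4j-0.8.2.1-src.zip")
>>> from pyspark import SparkContext

From a Windows command prompt, the equivalent is to run set PYTHONPATH=%PYTHONPATH%;E:\spark-1.2.0\python\lib\py4j-0.8.2.1-src.zip before starting Python.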

Answered 2015-02-09T20:32:35.110