
I'm getting started with PySpark. I installed it via Anaconda on Windows 10. I copied an example, and when I run the code I get this error:

Traceback (most recent call last):
  File ".\testingSpark.py", line 7, in <module>
    spark = SparkSession.builder.master("local").getOrCreate()
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\sql\session.py", line 173, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 331, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "D:\Windows\Anaconda3\lib\site-packages\pyspark\context.py", line 188, in _do_init
    self._javaAccumulator = self._jvm.PythonAccumulatorV2(host, port)
TypeError: 'JavaPackage' object is not callable

I've read up on this error, but I couldn't find anything that fixes it. Please help!
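For context, `TypeError: 'JavaPackage' object is not callable` at this point in the traceback generally means the `pyspark` Python package could not load matching Spark classes on the JVM side, which is often caused by a missing or mismatched Spark/Java installation or unset environment variables. The snippet below is a stdlib-only diagnostic sketch (the function name `spark_env_report` is my own, not part of any library) that prints the environment variables PySpark typically relies on:

```python
import os
import sys

def spark_env_report():
    """Collect environment details commonly involved in PySpark startup
    failures. A value of None means the variable is not set."""
    report = {}
    for var in ("JAVA_HOME", "SPARK_HOME", "HADOOP_HOME", "PYSPARK_PYTHON"):
        report[var] = os.environ.get(var)
    report["python"] = sys.version.split()[0]  # interpreter version in use
    return report

if __name__ == "__main__":
    for key, value in spark_env_report().items():
        print(f"{key}: {value}")
```

If `JAVA_HOME` is unset or points at an incompatible JDK, or `SPARK_HOME` does not match the installed `pyspark` version, the JVM gateway can start without the expected classes and produce exactly this `'JavaPackage' object is not callable` error.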
