I want to add the graphframes library. Normally this library would be added with (for example):
pyspark --packages graphframes:graphframes:0.7.0-spark2.4-s_2.11
and you should then get something like:
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
Using Python version 3.7.0 (default, Sep 25 2018 18:19:16)
SparkSession available as 'spark'.
>>>
and I can then import graphframes:
>>> import graphframes
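As far as I understand, the same coordinates can also be passed to spark-submit, along these lines (I have not confirmed that this changes anything in my setup):
spark-submit --packages graphframes:graphframes:0.7.0-spark2.4-s_2.11 test.py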
The problem appears when I run spark-submit test.py, where test.py is:
import time  # needed for time.time() below
import numpy as np
from operator import add
from pyspark.sql import *
from pyspark.sql import SparkSession
from pyspark import SparkContext, SparkConf
from pyspark.sql.types import StructType, StructField
from pyspark.sql.types import DoubleType, StringType, IntegerType
from pyspark.sql.functions import udf
from pyspark.sql.types import *
from pyspark.sql import SQLContext
import graphframes
from graphframes import *

if __name__ == "__main__":
    print("==================================================================")
    print("| main |")
    print("==================================================================")

    # Create the Spark session with the desired configuration
    # (the master comes from SparkConf: local[*])
    conf = (SparkConf()
            .setMaster("local[*]")
            .setAppName("test")
            .set('spark.executor.memory', '60G')
            .set('spark.driver.memory', '60G')
            .set('spark.driver.maxResultSize', '10G'))
    spark = SparkSession.builder.config(conf=conf).getOrCreate()
    start_time = time.time()
    sc = spark.sparkContext
    sqlContext = SQLContext(sparkContext=sc)

    # Create a vertex DataFrame with a unique ID column "id"
    v = spark.createDataFrame(
        [("a", "Alice", 34), ("b", "Bob", 36), ("c", "Charlie", 30)],
        ["id", "name", "age"])
    v.show()

    # Create an edge DataFrame with "src" and "dst" columns
    e = spark.createDataFrame(
        [("a", "b", "friend"), ("b", "c", "follow"), ("c", "b", "follow")],
        ["src", "dst", "relationship"])

    # Create a GraphFrame
    g = GraphFrame(v, e)

    # Query: get the in-degree of each vertex
    g.inDegrees.show()

    # Query: count the number of "follow" connections in the graph
    g.edges.filter("relationship = 'follow'").count()
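For reference, if the imports resolved, I would expect the in-degree query on this toy graph to print something like (row order may differ):
+---+--------+
| id|inDegree|
+---+--------+
|  b|       2|
|  c|       1|
+---+--------+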
I get the following exception:
import graphframes
ModuleNotFoundError: No module named 'graphframes'
I think the problem is related to --packages: it does not seem to make the Python package available to / loadable by the Spark client/driver.
I also think graphframes needs to be added to the PYTHONPATH.
1- How can I solve this problem?
2- How do I apply the solution proposed in 1) on Windows and on Linux?
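To make 1) concrete, the kind of thing I have in mind is shipping the Python side of graphframes explicitly with the job, roughly like this (untested; graphframes.zip here would be a zip of the Python package, such as the one I build below):
spark-submit \
  --packages graphframes:graphframes:0.7.0-spark2.4-s_2.11 \
  --py-files graphframes.zip \
  test.py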
I tried the following:
- downloaded the graphframes jar
- extracted the contents of the jar
- navigated to the "graphframes" directory and zipped its contents (roughly the commands sketched below)
- copied the resulting zip to my home directory: /home/tam/
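Concretely, it was roughly this (the jar file name is my reconstruction from the --packages coordinates above, so treat it as approximate):
# extract the Python package that ships inside the graphframes jar
unzip graphframes-0.7.0-spark2.4-s_2.11.jar -d extracted
cd extracted/graphframes
zip -r graphframes.zip .
cp graphframes.zip /home/tam/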
In my .bashrc I set:
export PYSPARK_PYTHON=/home/tam/.local/easybuild/software/2017/Core/miniconda3/4.3.27/envs/testEnv/bin/python
export PYTHONPATH=${PYTHONPATH}:/home/tam/graphframes.zip
When I run spark-submit test.py, I get the following error:
Traceback (most recent call last):
  File "/home/tam/test.py", line 3, in <module>
    import graphframes
  File "/home/tam/graphframes/__init__.py", line 2, in <module>
    from .graphframe import GraphFrame
  File "/home/tam/graphframes/graphframe.py", line 26, in <module>
    from graphframes.lib import Pregel
  File "/home/tam/graphframes/lib/__init__.py", line 3, in <module>
    from .pregel import Pregel
  File "/home/tam/graphframes/lib/pregel.py", line 24, in <module>
    from pyspark.ml.wrapper import JavaWrapper, _jvm
  File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/spark/2.3.0/python/lib/pyspark.zip/pyspark/ml/__init__.py", line 22, in <module>
  File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/spark/2.3.0/python/lib/pyspark.zip/pyspark/ml/base.py", line 24, in <module>
  File "/cvmfs/soft.computecanada.ca/easybuild/software/2017/Core/spark/2.3.0/python/lib/pyspark.zip/pyspark/ml/param/__init__.py", line 26, in <module>
ModuleNotFoundError: No module named 'numpy'
log4j:WARN No appenders could be found for logger (org.apache.spark.util.ShutdownHookManager).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
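For completeness, this is how I would check whether the interpreter I point PYSPARK_PYTHON at actually has numpy (just a sanity check; I am not sure this is the root cause):
/home/tam/.local/easybuild/software/2017/Core/miniconda3/4.3.27/envs/testEnv/bin/python -c "import numpy; print(numpy.__version__)"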