I am using Spark 1.6.0 in local mode. I have created an IPython pyspark profile, so the pyspark kernel starts inside Jupyter Notebook. All of this works fine.
I would like to use the spark-csv package in the Jupyter notebook. I tried editing the file ~/.ipython/profile_pyspark/startup/00-pyspark-setup.py
and placing --packages com.databricks:spark-csv_2.11:1.4.0
after the pyspark-shell
command, but without success. I still get this error message:
Py4JJavaError: An error occurred while calling o22.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at http://spark-packages.org
I have also tried [this solution][2] and many others... none of them worked.
Do you have any suggestions?
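For reference, this is roughly what my edit to 00-pyspark-setup.py looked like; setting PYSPARK_SUBMIT_ARGS is the standard way to pass spark-submit options to a pyspark startup script (the exact ordering of the arguments is my reconstruction of what I tried):

```python
import os

# Pass extra options to the pyspark launcher via the standard env var.
# The spark-csv coordinates are the ones from the question above.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages com.databricks:spark-csv_2.11:1.4.0 pyspark-shell"
)
```

Note that the string must still end with `pyspark-shell`, which is why I tried to place the `--packages` option around it in the startup file.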