The following SO question, How to run script in Pyspark and drop into IPython shell when done?, tells how to launch a pyspark script:
%run -d myscript.py
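(For reference, a minimal sketch of the setup that makes %run available, assuming an older Spark 1.x install where the IPYTHON environment variable switches the driver shell to IPython; newer releases use PYSPARK_DRIVER_PYTHON=ipython instead:)

IPYTHON=1 pyspark      # start the PySpark shell under IPython
%run -d myscript.py    # then, at the IPython prompt, run the script under the debugger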
But how do we access the existing spark context?
Simply creating a new one doesn't work:
----> sc = SparkContext("local", 1)
ValueError: Cannot run multiple SparkContexts at once; existing
SparkContext(app=PySparkShell, master=local) created by <module> at
/Library/Python/2.7/site-packages/IPython/utils/py3compat.py:204
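(A hedged aside: if the PySpark version at hand is 1.4 or later, SparkContext.getOrCreate is a way to pick up the already-registered context instead of tripping this check; a minimal sketch:)

from pyspark import SparkContext

# getOrCreate() returns the context that is already registered, or
# creates a fresh one if none exists (classmethod, PySpark 1.4+).
sc = SparkContext.getOrCreate()
print sc.appName  # prints PySparkShell inside the pyspark shell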
But what about trying to use the existing one .. well, where is the existing one?
In [50]: for s in filter(lambda x: 'SparkContext' in repr(x[1]) and len(repr(x[1])) < 150, locals().iteritems()):
   ....:     print s
   ....:
('SparkContext', <class 'pyspark.context.SparkContext'>)
i.e. no variable holds a SparkContext instance; only the SparkContext class itself shows up.
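(Two hedged notes on this, since the question is still open: IPython's %run -i variant runs the script inside the shell's own namespace, so the sc that the pyspark shell created stays visible to the script; and PySpark records its singleton on a private class attribute, the same slot the "Cannot run multiple SparkContexts" check consults. A sketch relying on that undocumented attribute, which may change between versions:)

from pyspark import SparkContext

# _active_spark_context is the private slot PySpark checks before
# refusing to build a second context; it is None if no context is live.
existing = SparkContext._active_spark_context
if existing is not None:
    print existing.appName  # e.g. PySparkShell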