
I have an external custom jar that I would like to use with Azure HDInsight Jupyter notebooks; the Jupyter notebooks in HDI use Sparkmagic and Livy.

Within the first cell of the notebook, I'm trying to use the jars configuration:

%%configure -f
{"jars": ["wasb://$container$@$account#.blob.core.windows.net/folder/my-custom-jar.jar"]}

But the error message I receive is:

Starting Spark application
The code failed because of a fatal error:
    Status 'shutting_down' not supported by session..

Some things to try:
a) Make sure Spark has enough available resources for Jupyter to create a Spark context. For instructions on how to assign resources see http://go.microsoft.com/fwlink/?LinkId=717038
b) Contact your cluster administrator to make sure the Spark magics library is configured correctly.
Current session configs: {u'jars': [u'wasb://$container$@$account$.blob.core.windows.net/folder/my-custom-jar.jar'], u'kind': 'spark'}
An error was encountered:
Status 'shutting_down' not supported by session.

I'm wondering if I'm just not understanding how Livy works in this case as I was able to successfully include a spark-package (GraphFrames) on the same cluster:

%%configure -f
{ "conf": {"spark.jars.packages": "graphframes:graphframes:0.3.0-spark2.0-s_2.11" }}



2 Answers


Oh, I was able to figure it out and forgot to update my question. This works if you put the jar in the default storage account of your HDI cluster.
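
For example, with the jar uploaded to a folder in the default container, the wasb:/// shorthand (which resolves against the cluster's default storage account) should work; this is a minimal sketch, and the folder and jar names are placeholders:

%%configure -f
{"jars": ["wasb:///folder/my-custom-jar.jar"]}

A follow-up Scala cell running sc.listJars() should then show the jar attached to the Spark context.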

HTH!

answered 2017-03-29T15:33:33.430

In case people come here looking to add jars on EMR:

%%configure -f
{"name": "sparkTest", "conf": {"spark.jars": "s3://somebucket/artifacts/jars/spark-avro_2.11-2.4.4.jar"}}

Contrary to the documentation, using the "jars" field directly won't work; you have to set spark.jars under "conf" instead.
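
If you need more than one jar, spark.jars accepts a comma-separated list (a sketch; both S3 paths here are placeholders):

%%configure -f
{"name": "sparkTest", "conf": {"spark.jars": "s3://somebucket/artifacts/jars/spark-avro_2.11-2.4.4.jar,s3://somebucket/artifacts/jars/another-lib.jar"}}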

answered 2021-03-23T18:40:52.663