I am using a Python script that sets up a PySpark environment in a Jupyter notebook. The kernel is Azure ML 3.6.
# Locate the Spark installation via the SPARK_HOME environment variable
import findspark
findspark.init()

import os
import pyspark
# These reference the jars mentioned on the Snowflake documentation page
os.environ['PYSPARK_SUBMIT_ARGS'] = '--packages net.snowflake:snowflake-jdbc:3.6.24,net.snowflake:spark-snowflake_2.11:2.4.12-spark_2.3,com.microsoft.ml.spark:mmlspark_2.11:0.18.0 pyspark-shell'
from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext, SparkSession
# Use the SparkSession API for datasets and dataframe API
spark = (
    SparkSession.builder
    .master('local')
    .appName('test')
    .config('spark.driver.memory', '6G')
    .config('spark.driver.maxResultSize', '4G')
    .config('spark.num.executors', '8')
    .config('spark.executor.cores', '8')
    .config('spark.executor.memory', '14G')
    .config('spark.worker.instances', '2')
    .getOrCreate()
)
But I am getting this error: Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM
I don't understand why. This code ran fine yesterday, but today I get this error.
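For context, this particular Py4JError is commonly reported when the pip-installed pyspark package and the Spark installation that findspark points SPARK_HOME at disagree on version (getEncryptionEnabled exists only in newer Spark releases). A minimal sketch of the comparison I tried, with hypothetical version strings standing in for pyspark.__version__ and the value read from $SPARK_HOME/RELEASE:

```python
# Sketch: check whether two Spark version strings agree on major.minor.
# The version values below are hypothetical examples, not real output.

def same_minor(v1: str, v2: str) -> bool:
    """Return True if both versions share the same major.minor pair."""
    return v1.split(".")[:2] == v2.split(".")[:2]

pip_pyspark = "2.4.0"        # e.g. what pyspark.__version__ might report
spark_home_release = "2.3.2"  # e.g. what $SPARK_HOME/RELEASE might contain

# A mismatch here would be consistent with the getEncryptionEnabled error.
print("versions match:", same_minor(pip_pyspark, spark_home_release))
```

If the two minor versions differ, pinning the pip pyspark to the same release as the local Spark (or vice versa) is the usual first thing to rule out.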