If you are using an "Analytics Engine" Spark cluster instance, you can increase the default value through the Ambari console. The link and credentials for the Ambari console are available from your IAE instance at console.bluemix.net. In the Ambari console, add a new property under

Spark2 -> "Custom spark2-defaults" -> Add Property -> spark.driver.maxResultSize = 2GB

and make sure the spark.driver.maxResultSize value stays below the driver memory configured under

Spark2 -> "Advanced spark2-env" -> content -> SPARK_DRIVER_MEMORY
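If you control how the Spark session is created (for example in a standalone script submitted to the cluster), the same properties can usually be passed at session creation time instead of through Ambari. A minimal sketch, assuming the session has not been started yet; the 4g/2g values are placeholders, and in an already-running notebook kernel driver memory normally has to be set before the driver JVM starts:

from pyspark.sql import SparkSession

# Sketch only: placeholder values; maxResultSize must stay below driver memory.
spark = (
    SparkSession.builder
    .appName("large-result-example")
    .config("spark.driver.memory", "4g")
    .config("spark.driver.maxResultSize", "2g")
    .getOrCreate()
)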
If you are just trying to create a single CSV file and do not want to change the Spark conf values because you don't know in advance how large the final file will be, another suggestion is to use a function like the one below, which uses hdfs getmerge to produce a single csv file, much as pandas would.
import os
import tempfile

def writeSparkDFAsCSV_HDFS(spark_df, file_location, file_name, csv_sep=',', csv_quote='"'):
    """
    Write a large spark dataframe as a csv file without running
    into memory issues while converting to a pandas dataframe.
    It first writes the spark df to a temp hdfs location and uses getmerge to create
    a single file. After adding a header, the merged file is moved to hdfs.

    Args:
        spark_df (spark dataframe) : Data object to be written to file.
        file_location (String)     : Directory location of the file.
        file_name (String)         : Name of file to write to.
        csv_sep (character)        : Field separator to use in the csv file.
        csv_quote (character)      : Quote character to use in the csv file.
    """
    # define temp and final paths
    file_path = os.path.join(file_location, file_name)
    temp_file_location = tempfile.NamedTemporaryFile().name
    temp_file_path = os.path.join(temp_file_location, file_name)

    print("Create directories")
    # create directories if they do not exist, both locally and in hdfs
    !mkdir $temp_file_location
    !hdfs dfs -mkdir $file_location
    !hdfs dfs -mkdir $temp_file_location

    # write to temp hdfs location
    print("Write to temp hdfs location : {}".format("hdfs://" + temp_file_path))
    spark_df.write.csv("hdfs://" + temp_file_path, sep=csv_sep, quote=csv_quote)

    # merge the part files from hdfs into a single local file (same path name locally)
    print("Merge and put file at {}".format(temp_file_path))
    !hdfs dfs -getmerge $temp_file_path $temp_file_path

    # add a header line to the merged file; remove the .crc checksum file created by getmerge
    header = ",".join(spark_df.columns)
    !rm $temp_file_location/.*crc
    line_prepender(temp_file_path, header)

    # move the final file to hdfs
    !hdfs dfs -put -f $temp_file_path $file_path

    # clean up temp locations
    print("Cleanup..")
    !rm -rf $temp_file_location
    !hdfs dfs -rm -r $temp_file_location
    print("Done!")