When I try this:
from pyspark import SparkConf
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType

cfg = SparkConf().setAppName('MyApp')
spark = SparkSession.builder.config(conf=cfg).getOrCreate()
lines = spark.readStream.load(format='socket', host='localhost', port=9999,
                              schema=StructType(StructField('value', StringType, True)))
words = lines.groupBy('value').count()
query = words.writeStream.format('console').outputMode("complete").start()
query.awaitTermination()
Then I get this error:
AssertionError: dataType should be DataType
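As far as I can tell, the same assertion can be triggered in isolation, without any of the streaming code (a minimal sketch, constructing the field the same way as above):

from pyspark.sql.types import StructField, StringType

# build the field exactly as in the schema above, passing StringType as-is
field = StructField('value', StringType, True)
# -> AssertionError: dataType should be DataType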
I searched the source code, and line 403 of ./pyspark/sql/types.py has:
assert isinstance(dataType, DataType), "dataType should be DataType"
But StringType is derived from AtomicType rather than DataType:
class StringType(AtomicType):
    """String data type.
    """
    __metaclass__ = DataTypeSingleton
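Although, when I check the hierarchy directly (a quick sketch in a local REPL, assuming a standard PySpark install), AtomicType itself seems to extend DataType:

from pyspark.sql.types import DataType, StringType

# full inheritance chain of StringType
print(StringType.__mro__)
# (<class 'pyspark.sql.types.StringType'>, <class 'pyspark.sql.types.AtomicType'>,
#  <class 'pyspark.sql.types.DataType'>, <class 'object'>)

print(issubclass(StringType, DataType))  # True, via AtomicType

So it looks like a StringType value should satisfy the isinstance check, yet the assertion still fires for my schema.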
So is this a bug?