
I created a Spark RDD table that I am trying to query, but the result is not the expected value. Any idea what is going wrong?

In [8]:people.take(15)
Out[8]:
[Row(num1=u'27477.23', num2=u'28759.862564'),
 Row(num1=u'14595.27', num2=u'4753.822798'),
 Row(num1=u'16799.17', num2=u'535.51891148'),
 Row(num1=u'171.85602', num2=u'905.14'),
 Row(num1=u'878488.70139', num2=u'1064731.4136'),
 Row(num1=u'1014.59748', num2=u'1105.91'),
 Row(num1=u'184.53171', num2=u'2415.61'),
 Row(num1=u'28113.931963', num2=u'71011.376036'),
 Row(num1=u'1471.75', num2=u'38.0268375'),
 Row(num1=u'33645.52', num2=u'15341.160558'),
 Row(num1=u'5464.95822', num2=u'14457.08'),
 Row(num1=u'753.58258673', num2=u'3243.75'),
 Row(num1=u'26469.395374', num2=u'38398.135846'),
 Row(num1=u'4709.5768681', num2=u'1554.61'),
 Row(num1=u'1593.1114983', num2=u'2786.4538546')]

The schema is encoded in a string.

In [9]:
schemaString = "num1 num2"
In [10]:

from pyspark.sql.types import StructType, StructField, StringType

fields = [StructField(field_name, StringType(), True) for field_name in schemaString.split()]
schema = StructType(fields)
In [11]:

# Apply the schema to the RDD
schemaPeople = sqlContext.applySchema(people, schema)

Register the SchemaRDD as a table.

In [12]:
schemaPeople.registerTempTable("people")

SQL can be run over SchemaRDDs that have been registered as a table.

In [14]:
results = sqlContext.sql("SELECT sum(num1) FROM people")
In [18]:
results
Out[18]:
MapPartitionsRDD[52] at mapPartitions at SerDeUtil.scala:143

1 Answer

Just like transformations on a plain RDD, a Spark SQL query is only a description of the desired operation. If you want the result, you have to trigger an action:

>>> results.first()
Row(_c0=1040953.1831101299)
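
Any other action would trigger the evaluation in the same way; for example, collect() on the same results variable from the session above (a brief usage note, not part of the original answer):

>>> results.collect()   # a single-row result: [Row(_c0=1040953.1831101299)]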

Just to be clear, it is better to cast the data explicitly rather than rely on implicit conversions:

>>> result = sqlContext.sql("SELECT SUM(CAST(num1 AS FLOAT)) FROM people")
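
As with the first query, this one is also lazy; calling an action such as first() actually runs the cast-and-sum job. A brief sketch reusing the result variable just defined (indexing into the Row is only illustrative):

>>> row = result.first()   # action: triggers the job
>>> row[0]                 # the summed value as a Python float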
Answered 2015-08-27T20:10:16.287