How can I make IoT Analytics create a new row in the data store for each element of a received JSON array?
The simplest way is probably to use a Lambda activity in your pipeline and have it parse the single JSON payload into the desired structure. How exactly depends somewhat on the "raw" structure of the messages sent to the Channel.
So, for example, we can send data to the Channel via the CLI with batch-put-message, like this:
aws iotanalytics batch-put-message \
    --channel-name sample_channel \
    --messages '[{"messageId": "message1", "payload": "{\"array\": [{\"Field1\": \"Value1\", \"Field2\": \"Value2\", \"Field3\": \"Value3\"},{\"Field1\": \"AnotherValue1\", \"Field2\": \"AnotherValue2\", \"Field3\": \"AnotherValue3\"}]}"}]'
The Channel will then hold a single message with the following structure:
{
    "messageId": "message1",
    "payload": {
        "array": [
            {
                "Field1": "Value1",
                "Field2": "Value2",
                "Field3": "Value3"
            },
            {
                "Field1": "AnotherValue1",
                "Field2": "AnotherValue2",
                "Field3": "AnotherValue3"
            }
        ]
    }
}
If your pipeline has a Lambda activity, the messages from the Channel will be passed to your Lambda function in the event parameter.
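For reference, such a pipeline can be created programmatically. Below is a minimal boto3 sketch; sample_channel and sample_lambda are the names used in this answer, while the pipeline and data store names (sample_pipeline, sample_datastore), the activity names, and the batchSize value are placeholders you would adapt:

import boto3

iota = boto3.client('iotanalytics')

# A minimal three-step pipeline: Channel -> Lambda activity -> Data store.
# batchSize controls how many messages are handed to the Lambda per invocation.
iota.create_pipeline(
    pipelineName='sample_pipeline',  # placeholder name
    pipelineActivities=[
        {
            'channel': {
                'name': 'channel_activity',
                'channelName': 'sample_channel',
                'next': 'lambda_activity',
            }
        },
        {
            'lambda': {
                'name': 'lambda_activity',
                'lambdaName': 'sample_lambda',
                'batchSize': 10,  # placeholder value
                'next': 'datastore_activity',
            }
        },
        {
            'datastore': {
                'name': 'datastore_activity',
                'datastoreName': 'sample_datastore',  # placeholder name
            }
        },
    ]
)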
I created a simple Lambda function (using Python 3.7) with the AWS Lambda console inline editor and named it sample_lambda:
import json
import sys
import logging

# Configure logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
streamHandler = logging.StreamHandler(stream=sys.stdout)
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)


def lambda_handler(event, context):
    # This can be handy to see the raw structure of the incoming event;
    # it will log to the matching CloudWatch log group:
    # /aws/lambda/<name_of_the_lambda>
    # logger.info("raw event: {}".format(event))

    parsed_rows = []
    # Depending on the batchSize setting of the Lambda Pipeline Activity,
    # you may receive multiple messages in a single event
    for message_payload in event:
        if 'array' in message_payload:
            for row in message_payload['array']:
                parsed = {}
                for key, value in row.items():
                    parsed[key] = value
                parsed_rows.append(parsed)
    return parsed_rows
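To sanity-check the parsing logic before wiring the function into the pipeline, the handler can be invoked locally with a hand-built event shaped like the batch the Lambda activity delivers (a list of message payloads). This is just an illustrative test, not part of the deployed function:

# Quick local check of the handler: the event mirrors the Channel
# message above, minus the messageId/payload envelope.
if __name__ == '__main__':
    sample_event = [
        {
            "array": [
                {"Field1": "Value1", "Field2": "Value2", "Field3": "Value3"},
                {"Field1": "AnotherValue1", "Field2": "AnotherValue2", "Field3": "AnotherValue3"},
            ]
        }
    ]
    print(lambda_handler(sample_event, None))
    # Expected output: a flat list with one dict per array element, e.g.
    # [{'Field1': 'Value1', ...}, {'Field1': 'AnotherValue1', ...}]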
I added the appropriate permission so that IoT Analytics is allowed to invoke the Lambda function, via the CLI:
aws lambda add-permission \
    --function-name sample_lambda \
    --statement-id statm01 \
    --principal iotanalytics.amazonaws.com \
    --action lambda:InvokeFunction
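To push the messages already stored in the Channel through the updated pipeline, you can start a reprocessing run. A minimal boto3 sketch, again assuming the placeholder pipeline name sample_pipeline:

import boto3

iota = boto3.client('iotanalytics')

# Re-run the Channel's stored messages through the pipeline so the
# Lambda activity can split them into individual rows.
response = iota.start_pipeline_reprocessing(pipelineName='sample_pipeline')
print(response['reprocessingId'])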
After reprocessing the Pipeline, the parsed rows are placed in the Data store; executing the data set, I get this as the net result:
"array","field1","field2","field3","__dt"
,"Value1","Value2","Value3","2019-04-26 00:00:00.000"
,"AnotherValue1","AnotherValue2","AnotherValue3","2019-04-26 00:00:00.000"