AWS Firehose was released today. I'm playing around with it and trying to figure out how to put data into the stream using the AWS CLI. I have a simple JSON payload and a corresponding Redshift table with columns that map to the JSON attributes. I've tried various combinations, but I can't seem to pass the JSON payload via the CLI.

What I've tried:

aws firehose put-record --delivery-stream-name test-delivery-stream --record '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record { "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data='{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --record Data={ "attribute": 1 }

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json '{ "attribute": 1 }'

aws firehose put-record --delivery-stream-name test-delivery-stream --cli-input-json { "attribute": 1 }

I've looked at the CLI help, which was no help. This article was published today, but it looks like the command they use is already outdated, since the argument --firehose-name has been replaced by --delivery-stream-name.

5 Answers

Escape the double quotes around the keys and values inside the blob:

aws firehose put-record --delivery-stream-name test-delivery-stream --record '{"Data":"{\"attribute\":1}"}'
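A quick way to produce that escaped blob without hand-writing the backslashes is to let a JSON library do the double encoding; a minimal Python sketch (the payload is the hypothetical one from the question):

```python
import json

# Inner payload: the JSON document to be delivered to Redshift/S3.
payload = {"attribute": 1}

# --record takes {"Data": "<payload as a string>"}, so the payload is
# JSON-encoded twice; json.dumps handles the inner-quote escaping.
record = json.dumps({"Data": json.dumps(payload)})
print(record)  # {"Data": "{\"attribute\": 1}"}
```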

answered 2016-05-02T03:28:24.067
This should work. Escape all the quotes. Replace strem_name with your stream name.

aws firehose put-record --cli-input-json "{\"DeliveryStreamName\":\"strem_name\",\"Record\":{\"Data\":\"test data\"}}"
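If escaping by hand gets error-prone, the whole --cli-input-json body can be generated instead; a small Python sketch (the stream name is a placeholder):

```python
import json

# Full request body for --cli-input-json; stream name is hypothetical.
request = json.dumps({
    "DeliveryStreamName": "stream_name",
    "Record": {"Data": "test data"},
})
print(request)
```

Wrapping the result in single quotes on the shell avoids having to backslash-escape every double quote.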
answered 2016-07-29T09:13:37.750
I had issues with my credentials and region, but this syntax at least got me past the parse errors:

aws firehose put-record --cli-input-json '{"DeliveryStreamName":"testdata","Record":{"Data":"test data"}}'

answered 2015-10-09T21:16:29.657
This is what I tried, and it worked.

Below are examples of sending JSON records with a single column and with multiple columns.

Single Value in the Data:

Example: sending a single column that is an integer.

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute\":1}"'

Multiple column values in the data:

Example: sending integer and string values via put-record

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\"}"'

Example: sending integer, string, and float values via put-record

aws firehose put-record --delivery-stream-name test-delivery-stream --record='Data="{\"attribute_0\":1,\"attribute_1\":\"Sample String Value\",\"attribute_2\":\"14.9\"}"'
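The escaping in the commands above can also be generated rather than typed by hand; a Python sketch (stream and column names are the hypothetical ones from this answer), using shlex.quote so the blob survives the shell:

```python
import json
import shlex

def make_put_record_command(stream_name, payload):
    """Build a shell-safe `aws firehose put-record` command line;
    the Data blob is JSON-encoded and quoted automatically."""
    record = json.dumps({"Data": json.dumps(payload)})
    return ("aws firehose put-record --delivery-stream-name "
            + shlex.quote(stream_name) + " --record " + shlex.quote(record))

print(make_put_record_command(
    "test-delivery-stream",
    {"attribute_0": 1, "attribute_1": "Sample String Value", "attribute_2": "14.9"},
))
```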

Acknowledgement of success:

When the record is sent successfully, Kinesis acknowledges it with a RecordId similar to the one below.

{
  "RecordId": "fFKN2aJfUh6O8FsvlrfkowDZCpu0sx+37JWKJBRmN++iKTYbm/yMKE4dQHdubMR4i+0lDP/NF3c+4y1pvY9gOBkqIn6cfp+1DrB9YG4a0jXmopvhjrXrqYpwo+s8I41kRDKTL013c65vRh5kse238PC7jQ2iOWIqf21wq4dPU9R5qUbicH76soa+bZLvyhGVPudNNu2zRyZwCCV0zP/goah54d/HN9trz"
}

This indicates that the put-record command has succeeded.

Streamed Record on S3:

This is how the record looks in S3 after Kinesis has delivered it.

{"attribute":1}
{"attribute_0":1,"attribute_1":"Sample String Value"}
{"attribute_0":1,"attribute_1":"Sample String Value","attribute_2":"14.9"}

Note: In S3, the records are written to one or more files depending on the rate at which the put-record command is issued.

Please do try and comment if this works.

Thanks & Regards, Srivignesh KN

answered 2017-07-18T22:09:15.403
A few things:

  • Have you created the delivery stream?
  • From reading the docs, it looks like you should do --cli-input-json '{"Data":"blob"}' or --record 'Data=blob'
  • Try --generate-cli-skeleton with firehose put-record on the CLI to see an example
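For reference, the skeleton that --generate-cli-skeleton prints for put-record has roughly this shape (field names are from the Firehose API; the empty/null values are placeholders to fill in):

```json
{
    "DeliveryStreamName": "",
    "Record": {
        "Data": null
    }
}
```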
answered 2015-10-08T04:37:14.347