**I am trying to stream data from Kafka and convert it into a dataframe, following this link.

But when I run the producer and the consumer applications at the same time, this is the output on my console:**

(0,[B@370ed56a) (1,[B@2edd3e63) (2,[B@3ba2944d) (3,[B@2eb669d1) (4,[B@49dd304c) (5,[B@4f6af565) (6,[B@7714e29e)

This is actually the output from the Kafka producer; the topic was empty before the messages were pushed.
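As far as I can tell, `[B@370ed56a` is just Java's default `toString()` for a byte array (the `[B` type tag plus a hash code), so the payload itself is probably intact. A minimal Scala sketch of how a payload could be decoded back for inspection, assuming the same `EVENT_SCHEMA` and Bijection setup as in the producer below:

import scala.util.{Failure, Success}
import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import com.twitter.bijection.Injection
import com.twitter.bijection.avro.GenericAvroCodecs

// Rebuild the same Injection the producer uses (EVENT_SCHEMA as defined there).
val schema: Schema = new Schema.Parser().parse(EVENT_SCHEMA)
val injection: Injection[GenericRecord, Array[Byte]] = GenericAvroCodecs.toBinary[GenericRecord](schema)

// invert() returns a Try[GenericRecord]; GenericRecord.toString is readable JSON.
def render(bytes: Array[Byte]): String = injection.invert(bytes) match {
  case Success(record) => record.toString
  case Failure(e)      => s"not a valid Avro payload: ${e.getMessage}"
}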

Here is the producer code snippet:

import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import com.twitter.bijection.Injection;
import com.twitter.bijection.avro.GenericAvroCodecs;

Properties props = new Properties();
props.put("bootstrap.servers", "##########:9092");
props.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
props.put("value.serializer",
        "org.apache.kafka.common.serialization.ByteArraySerializer");
// "producer.type" is a legacy (old Scala producer) setting; the new KafkaProducer ignores it
props.put("producer.type", "async");

// Build a Bijection codec that converts GenericRecord <-> byte[] via Avro binary encoding
Schema.Parser parser = new Schema.Parser();
Schema schema = parser.parse(EVENT_SCHEMA);
Injection<GenericRecord, byte[]> records = GenericAvroCodecs.toBinary(schema);

KafkaProducer<String, byte[]> producer = new KafkaProducer<String, byte[]>(props);
for (int i = 0; i < 100; i++) {
    GenericData.Record avroRecord = new GenericData.Record(schema);
    setEventValues(i, avroRecord);
    byte[] messages = records.apply(avroRecord);
    ProducerRecord<String, byte[]> producerRecord = new ProducerRecord<String, byte[]>(
            "topic", String.valueOf(i), messages);
    // Printing the record shows "[B@..." because byte[] has no readable toString()
    System.out.println(producerRecord);
    producer.send(producerRecord);
}
// Flush buffered records and release resources; without this the async sends
// may be dropped when the JVM exits, leaving the topic empty.
producer.close();
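Since `send()` is asynchronous, a record can be buffered locally and still never reach the broker. A hedged sketch (Scala, using the standard `KafkaProducer` callback API) of the same send with a delivery acknowledgement; `producer` and `producerRecord` stand for the objects built in the loop above:

import org.apache.kafka.clients.producer.{Callback, RecordMetadata}

// The callback fires once the broker acknowledges (or rejects) the record,
// which rules out silent message loss.
producer.send(producerRecord, new Callback {
  override def onCompletion(metadata: RecordMetadata, exception: Exception): Unit =
    if (exception != null) exception.printStackTrace()
    else println(s"delivered to ${metadata.topic()}-${metadata.partition()}@${metadata.offset()}")
})
producer.flush() // push anything still buffered before shutdown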

Its output is:

key=0, value=[B@680387a key=1, value=[B@32bfb588 key=2, value=[B@2ac2e1b1 key=3, value=[B@606f4165 key=4, value=[B@282e7f59

And here is my consumer code snippet, written in Scala:

"group.id" -> "KafkaConsumer",
"zookeeper.connection.timeout.ms" -> "1000000"

val topicMaps = Map("topic" -> 1)
val messages = KafkaUtils.createStream[String, Array[Byte], StringDecoder, DefaultDecoder](ssc, kafkaConf, topicMaps, StorageLevel.MEMORY_ONLY_SER)
messages.print()
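What I am ultimately after is turning these byte payloads into a dataframe. A rough sketch of the direction I have in mind, assuming Spark 1.5+ (for `SQLContext.getOrCreate`) and a hypothetical string field `"id"` in `EVENT_SCHEMA`:

import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.spark.sql.SQLContext
import com.twitter.bijection.avro.GenericAvroCodecs

// Decode inside mapPartitions so the Injection is built on the executors,
// sidestepping closure-serialization issues with the codec.
val decoded = messages.mapPartitions { iter =>
  val schema = new Schema.Parser().parse(EVENT_SCHEMA)
  val injection = GenericAvroCodecs.toBinary[GenericRecord](schema)
  iter.map { case (_, bytes) => injection.invert(bytes).get }
}

decoded.foreachRDD { rdd =>
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._
  // "id" is a placeholder; extract whatever fields EVENT_SCHEMA actually defines.
  val df = rdd.map(r => r.get("id").toString).toDF("id")
  df.show()
}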

I have already tried both StringDecoder and DefaultDecoder in createStream(). I am sure that the producer and the consumer agree on the format (String keys, byte-array values). Any help from anyone?
