Spring Cloud Stream Kafka: KTable as input is not working
EventSink.java
import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.Input;

public interface EventSink {
    @Input("inputTable")
    KTable<?, ?> inputTable();
}
MessageReceiver.java
import org.apache.kafka.streams.kstream.KTable;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

@EnableBinding(EventSink.class)
public class MessageReceiver {

    @StreamListener
    public void process(@Input("inputTable") KTable<String, Event> eventTable) {
        // The code below is just for illustration; I need to do a lot more once I have this KTable.
        eventTable.toStream()
                  .foreach((key, value) -> System.out.println(value));
    }
}
application.yml
server:
  port: 8083
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            application-id: kafka-stream-demo
            configuration:
              default:
                key:
                  serde: org.apache.kafka.common.serialization.Serdes$StringSerde
                value:
                  serde: org.springframework.kafka.support.serializer.JsonSerde
          bindings:
            inputTable:
              materialized-as: event_store
        binder:
          brokers: localhost:9092
      bindings:
        inputTable:
          destination: nscevent
          group: nsceventGroup
I get the following error:
Exception in thread "kafka-stream-demo-1e64cf93-de19-4185-bee4-8fc882275010-StreamThread-1" org.apache.kafka.streams.errors.StreamsException: Deserialization exception handler is set to fail upon a deserialization error. If you would rather have the streaming pipeline continue after a deserialization error, please set the default.deserialization.exception.handler appropriately.
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:80)
at org.apache.kafka.streams.processor.internals.RecordQueue.addRawRecords(RecordQueue.java:97)
at org.apache.kafka.streams.processor.internals.PartitionGroup.addRawRecords(PartitionGroup.java:117)
at org.apache.kafka.streams.processor.internals.StreamTask.addRecords(StreamTask.java:677)
at org.apache.kafka.streams.processor.internals.StreamThread.addRecordsToTasks(StreamThread.java:943)
at org.apache.kafka.streams.processor.internals.StreamThread.runOnce(StreamThread.java:831)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:767)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:736)
Caused by: java.lang.IllegalStateException: No type information in headers and no default type provided
at org.springframework.util.Assert.state(Assert.java:73)
at org.springframework.kafka.support.serializer.JsonDeserializer.deserialize(JsonDeserializer.java:370)
at org.apache.kafka.streams.processor.internals.SourceNode.deserializeValue(SourceNode.java:63)
at org.apache.kafka.streams.processor.internals.RecordDeserializer.deserialize(RecordDeserializer.java:66)
... 7 more
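The root cause suggests the JsonSerde has no default target type to fall back on when the record headers carry no type information. If that is the issue, I assume something along these lines would have to go into the Streams binder configuration (the spring.json.value.default.type key comes from spring-kafka's JsonDeserializer; its placement here and the package of my Event class are my guesses, and I have not confirmed this is the fix):

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              # com.example.Event is a placeholder for the fully qualified name of my Event class
              spring.json.value.default.type: com.example.Event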
Can someone please advise what the problem is? With KStream as input it works, but as a KTable it does not. Thanks in advance.
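For comparison, the KStream variant that works for me looks roughly like this (a from-memory sketch; the binding name inputStream and the stripped-down handler body are illustrative only):

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;

// Binding contract for the KStream case (the name "inputStream" is illustrative).
public interface EventStreamSink {
    @Input("inputStream")
    KStream<?, ?> inputStream();
}

// In a separate file:
@EnableBinding(EventStreamSink.class)
public class StreamMessageReceiver {

    @StreamListener
    public void process(@Input("inputStream") KStream<String, Event> events) {
        // Same Event payload and the same binder settings as above; this version consumes without errors.
        events.foreach((key, value) -> System.out.println(value));
    }
}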