Spring StreamBridge and CloudEvents adding non-Avro compliant headers
I am trying to use StreamBridge to publish messages to GCP Pub/Sub in CloudEvents format, using Avro for schema validation. Here is the message-building snippet:
Message<Person> ce = CloudEventMessageBuilder
.withData(person)
.setId(UUID.randomUUID().toString())
.setSource(URI.create("http://localhost"))
.setType("person_add")
.setDataContentType("application/cloudevents+json")
.setHeader(GcpPubSubHeaders.ORDERING_KEY, "personKey")
.build();
The headers for the received message look like this:
{
  "ce_datacontenttype": "application/cloudevents+json",
  "ce_id": "aeb44aa1-524a-4032-b4fc-8013428c214a",
  "ce_source": "http://localhost",
  "ce_specversion": "1.0",
  "ce_type": "person_add",
  "contentType": "application/json",
  "message-type": "cloudevent",
  "target-protocol": "kafka"
}
Notice the added non-CloudEvents headers: contentType, message-type, and target-protocol.
I tried to develop an Avro schema for this, but GCP rejected the schema definition because hyphens ("-") are not valid in Avro names.
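For reference, Avro names (records and fields) must match [A-Za-z_][A-Za-z0-9_]*, so a hyphen is rejected outright. Any schema covering these headers would have to use the underscored form — a minimal sketch (the record name and field selection are illustrative, not from my actual schema):

```json
{
  "type": "record",
  "name": "PersonCloudEvent",
  "fields": [
    { "name": "ce_id", "type": "string" },
    { "name": "ce_source", "type": "string" },
    { "name": "ce_specversion", "type": "string" },
    { "name": "ce_type", "type": "string" },
    { "name": "ce_datacontenttype", "type": "string" }
  ]
}
```

A field could be declared as "message_type" the same way, but "message-type" itself can never appear as an Avro name, which is why the extra headers break validation.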
Debugging the code, I discovered that CloudEventMessageBuilder.build() post-processes the message before publishing and renames any header prefixed "ce-" to "ce_". Should it also convert these added headers? Is there a way to stop these additional headers from being added?
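In the meantime, one workaround I'm considering is renaming any offending header keys myself before handing the message to StreamBridge, mirroring what build() already does for the "ce-" prefix. A minimal sketch (HeaderSanitizer is a hypothetical helper of mine, not part of Spring Cloud Stream or the CloudEvents SDK):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class HeaderSanitizer {

    // Returns a copy of the header map with every '-' in a key replaced
    // by '_', so all keys satisfy Avro's [A-Za-z_][A-Za-z0-9_]* name rule.
    // Values are carried over untouched.
    public static Map<String, Object> sanitize(Map<String, Object> headers) {
        Map<String, Object> out = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : headers.entrySet()) {
            out.put(e.getKey().replace('-', '_'), e.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> h = new LinkedHashMap<>();
        h.put("message-type", "cloudevent");
        h.put("target-protocol", "kafka");
        System.out.println(sanitize(h));
    }
}
```

The sanitized map would then be merged back into the outbound message headers before publishing; this renames the headers rather than suppressing them, so it only helps if their presence (not their names) is acceptable.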