Kinesis Firehose Kafka headers

Is there an option to collect the Kafka headers when consuming a Kafka topic with Kinesis Firehose? If not, I think it would be a great improvement.

Asked 8 months ago · Viewed 307 times
2 answers

Well, per the documentation, you can set the serialization settings once you choose MSK as the source and S3 as the destination:

"An Apache Kafka record carries three fields: value, key and optional headers. With Amazon MSK as the source of the delivery stream, Kinesis Data Firehose ingests the three Kafka record fields - value, key and headers - and delivers them to Amazon S3 as separate files. The Kafka record value is delivered by default. Kafka key and Kafka header are optional and delivered only if you select them. "

Honestly, I haven't tested this feature before.
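To make the quoted behavior concrete, here is a small pure-Python sketch (no AWS calls) of the three fields a Kafka record carries and how they would land as separate S3 objects. The prefixes and the on-disk encoding of the key and headers below are purely illustrative assumptions, not the actual Firehose layout:

```python
import json

# A Kafka record carries three fields: value, key, and optional headers.
# Header values are raw bytes in Kafka; they are decoded below for display.
record = {
    "value": b'{"order_id": 42}',
    "key": b"customer-7",
    "headers": [("trace-id", b"abc-123"), ("source", b"checkout")],
}

# With MSK as the Firehose source and S3 as the destination, each selected
# field is delivered as its own object. Paths and encoding are made up here.
objects = {
    "value/part-0000": record["value"],
    "key/part-0000": record["key"],
    "headers/part-0000": json.dumps(
        [(k, v.decode("utf-8")) for k, v in record["headers"]]
    ).encode("utf-8"),
}

for path, body in objects.items():
    print(path, body)
```

The value is delivered by default; the key and headers objects only exist if you opt in to them when configuring the stream.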

AWS
Answered 8 months ago
  • I will have a look. Honestly, I love the Glue Kafka connection approach, which adds a field to the message body when you enable "includeHeaders".

  • By the way, this is probably not supported from the console. I cannot see any selectable option to request key and header delivery to S3. Do you?


If you mean something like dynamic partitioning, it is still not available for the MSK source. I hope this can be added in the future. Alternatively, this could possibly be achieved through a Lambda function attached to the Firehose stream.
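A minimal sketch of the Lambda route: a standard Firehose transformation handler that decodes each record, pulls a field out of the JSON payload, and returns it as a dynamic-partitioning key. The `recordId`/`result`/`data`/`metadata.partitionKeys` shape is the documented Firehose transformation contract; the `department` field is a made-up example of whatever your messages actually carry:

```python
import base64
import json

def lambda_handler(event, context):
    """Firehose transformation handler: emit a partition key per record."""
    output = []
    for record in event["records"]:
        # Firehose delivers each record's payload base64-encoded.
        payload = json.loads(base64.b64decode(record["data"]))

        # "department" is an illustrative field name, not something
        # Firehose provides; substitute a field from your own messages.
        partition_key = payload.get("department", "unknown")

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                json.dumps(payload).encode("utf-8")
            ).decode("utf-8"),
            "metadata": {"partitionKeys": {"department": partition_key}},
        })
    return {"records": output}
```

Note the caveat for this thread: the transform Lambda only sees the record payload Firehose hands it, so whether the original Kafka headers are available to it depends on whether they were embedded in the message value upstream.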

Check these resources that may contain additional details:

  1. https://docs.aws.amazon.com/firehose/latest/dev/writing-with-msk.html
  2. https://aws.amazon.com/blogs/aws/amazon-msk-introduces-managed-data-delivery-from-apache-kafka-to-your-data-lake/
AWS
Answered 8 months ago
  • Dynamic partitioning could be handy, but what I mean is collecting the message headers from Kafka. This is already supported when streaming Kafka topics from Glue, for example. I also need to collect/store those headers.
