Kinesis Firehose Kafka headers


Is there an option to collect the Kafka headers when consuming a Kafka topic in kinesis firehose? If not I think it would be a great improvement.

2 Answers

Well, according to this documentation, you can configure the serialization settings once you choose MSK as the source and S3 as the destination:

"An Apache Kafka record carries three fields: value, key and optional headers. With Amazon MSK as the source of the delivery stream, Kinesis Data Firehose ingests the three Kafka record fields - value, key and headers - and delivers them to Amazon S3 as separate files. The Kafka record value is delivered by default. Kafka key and Kafka header are optional and delivered only if you select them. "

Honestly, I haven't tested this feature before.

AWS
Answered 8 months ago
  • I will have a look. Honestly, I love the Glue Kafka connection approach, which adds a field to the message body when you enable "includeHeaders".

  • Btw, this is probably not supported from the console. I cannot see any selectable option for delivering the key and headers to S3. Do you?
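For reference, the Glue approach mentioned above is driven by the Kafka connection options passed to the streaming read. This is a minimal sketch of such an options dictionary; the connection and topic names are hypothetical, and "includeHeaders" is the flag that asks Glue to surface the Kafka headers as a field on each record:

```python
# Hypothetical connection options for an AWS Glue streaming read from Kafka.
# Names of the connection and topic are illustrative, not from the thread.
kafka_options = {
    "connectionName": "my-kafka-connection",  # hypothetical Glue connection
    "topicName": "my-topic",                  # hypothetical topic
    "startingOffsets": "earliest",
    # Ask Glue to include the Kafka record headers in the message body.
    "includeHeaders": "true",
}

# Typically passed to something like:
#   glueContext.create_data_frame.from_options(
#       connection_type="kafka",
#       connection_options=kafka_options)
```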


If you mean something like dynamic partitioning, it is still not available for the MSK source. I hope this can be added in the future. Alternatively, this can possibly be achieved through a Lambda function attached to the Firehose stream.
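To illustrate the Lambda route, here is a minimal sketch of a Firehose transformation handler. It assumes the record body is JSON (the Kafka record value) and derives a hypothetical partition key from a "source" field in the body; the field name and fallback value are assumptions, not something from the thread:

```python
import base64
import json


def lambda_handler(event, context):
    """Sketch of a Firehose data-transformation Lambda.

    Firehose invokes the function with base64-encoded records; each
    output record must echo the recordId, a result status, and the
    (re-encoded) data. The "metadata.partitionKeys" entry is how a
    transformation Lambda can feed dynamic partitioning.
    """
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(json.dumps(payload).encode()).decode(),
            # Hypothetical partition key taken from a field in the body.
            "metadata": {
                "partitionKeys": {"source": payload.get("source", "unknown")}
            },
        })
    return {"records": output}
```

Note that whether the Kafka headers are visible to such a Lambda depends on what Firehose ingests from MSK in the first place; this only shows the transformation plumbing.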

Check these resources, which may contain additional details:

  1. https://docs.aws.amazon.com/firehose/latest/dev/writing-with-msk.html
  2. https://aws.amazon.com/blogs/aws/amazon-msk-introduces-managed-data-delivery-from-apache-kafka-to-your-data-lake/
AWS
Answered 8 months ago
  • Dynamic partitioning could be handy, but what I mean is collecting the message headers from Kafka. This is already supported when streaming Kafka topics from Glue, for example. I need to collect/store those headers as well.
