Kinesis Firehose Kafka headers


Is there an option to collect the Kafka headers when consuming a Kafka topic in Kinesis Firehose? If not, I think it would be a great improvement.

2 Answers

Well, according to this documentation, you can set the serialization settings once you choose MSK as the source and S3 as the destination:

"An Apache Kafka record carries three fields: value, key and optional headers. With Amazon MSK as the source of the delivery stream, Kinesis Data Firehose ingests the three Kafka record fields - value, key and headers - and delivers them to Amazon S3 as separate files. The Kafka record value is delivered by default. Kafka key and Kafka header are optional and delivered only if you select them. "

Honestly, I haven't tested this feature before.
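For anyone who wants to try it, here is a minimal sketch of creating an MSK-sourced Firehose stream to S3 with boto3. All names and ARNs below are placeholders, and note that the sketch does not show where the optional key/header delivery is selected, since that exact parameter is not confirmed here:

```python
# Minimal sketch: Firehose stream with Amazon MSK as the source and
# S3 as the destination (boto3). All names/ARNs are placeholders.
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="msk-to-s3",  # hypothetical name
    DeliveryStreamType="MSKAsSource",
    MSKSourceConfiguration={
        "MSKClusterARN": "arn:aws:kafka:region:account:cluster/placeholder",
        "TopicName": "my-topic",  # hypothetical topic
        "AuthenticationConfiguration": {
            "RoleARN": "arn:aws:iam::account:role/placeholder",
            "Connectivity": "PRIVATE",
        },
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::account:role/placeholder",
        "BucketARN": "arn:aws:s3:::my-bucket",  # hypothetical bucket
    },
)
# Where the optional key/header delivery is selected in this API is
# not confirmed here; check the current Firehose API reference.
```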

AWS
answered 8 months ago
  • I will have a look. Honestly, I love the Glue Kafka connection approach, which adds a field to the message body when you set "includeHeaders" (see the sketch after these comments).

  • Btw, this is probably not supported from the console. I cannot see any selectable option for delivering the key and headers to S3. Do you?
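For context on the Glue approach mentioned in the first comment, here is a minimal sketch of a Glue streaming read, assuming a hypothetical Glue connection and topic name. The "includeHeaders" connection option is what makes Glue attach the Kafka record headers to the output:

```python
# AWS Glue streaming job (PySpark) -- minimal sketch, not a full job.
from pyspark.context import SparkContext
from awsglue.context import GlueContext

glue_context = GlueContext(SparkContext.getOrCreate())

# "includeHeaders": "true" tells the Glue Kafka connection to include
# the Kafka record headers with each row of the streaming data frame.
data_frame = glue_context.create_data_frame_from_options(
    connection_type="kafka",
    connection_options={
        "connectionName": "my-kafka-connection",  # hypothetical connection
        "topicName": "my-topic",                  # hypothetical topic
        "startingOffsets": "latest",
        "classification": "json",
        "includeHeaders": "true",
    },
)
```

With "includeHeaders" set to "true", the headers should arrive as an additional field on each row, which is the behavior the comment refers to.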


If you mean something like dynamic partitioning, it is still not available for the MSK source; I hope it can be added in the future. Alternatively, this could possibly be achieved through a transformation Lambda attached to the Firehose stream (see the sketch below).
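As a rough illustration of that Lambda alternative, here is a minimal sketch of a Firehose data-transformation Lambda in Python. It follows the standard transformation contract (base64-encoded records in; records with a recordId, result, and data out). Whether Kafka headers are actually exposed to the Lambda for an MSK source would need to be verified against the Firehose documentation:

```python
# Minimal sketch of a Kinesis Data Firehose transformation Lambda.
# It decodes each record, enriches it, and returns it in the
# response format Firehose expects.
import base64
import json


def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))

        # Hypothetical enrichment step: attach extra metadata here.
        # Whether Kafka headers are available on the incoming record
        # for an MSK source must be checked against the docs.
        payload["enriched"] = True

        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(
                json.dumps(payload).encode("utf-8")
            ).decode("utf-8"),
        })
    return {"records": output}
```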

Check these resources that may contain additional details:

  1. https://docs.aws.amazon.com/firehose/latest/dev/writing-with-msk.html
  2. https://aws.amazon.com/blogs/aws/amazon-msk-introduces-managed-data-delivery-from-apache-kafka-to-your-data-lake/
AWS
answered 8 months ago
  • Dynamic partitioning could be handy, but what I mean is collecting the message headers from Kafka. This is already supported when streaming Kafka topics from Glue, for example. I need to collect/store those keys as well.
