
When using the Kinesis Producer Library to aggregate records, send them to a Kinesis Data Stream, and run a Kinesis Data Analytics application, how is de-aggregation handled?


The Kinesis Producer Library aggregates records before sending them to a Kinesis Data Stream.
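To make the setup concrete, here is a minimal producer-side sketch, assuming the Java KPL; the stream name, region, and payloads are placeholders:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

import com.amazonaws.services.kinesis.producer.KinesisProducer;
import com.amazonaws.services.kinesis.producer.KinesisProducerConfiguration;

public class AggregatingProducer {
    public static void main(String[] args) {
        // Aggregation is enabled by default; set explicitly here for clarity.
        KinesisProducerConfiguration config = new KinesisProducerConfiguration()
                .setRegion("us-east-1")          // placeholder region
                .setAggregationEnabled(true);

        KinesisProducer producer = new KinesisProducer(config);

        // Many small user records may be packed into one Kinesis record on the wire.
        for (int i = 0; i < 100; i++) {
            ByteBuffer payload = ByteBuffer.wrap(
                    ("record-" + i).getBytes(StandardCharsets.UTF_8));
            producer.addUserRecord("my-stream", "partition-key-" + i, payload);
        }

        // Block until buffered records have been sent.
        producer.flushSync();
    }
}
```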

When consuming records from that Data Stream with the Kinesis Client Library (KCL), record de-aggregation is automatic.

For consuming records from that Data Stream with a Kinesis Data Analytics application, how is de-aggregation handled?

  • Does the Data Analytics application de-aggregate records automatically?
  • Does this case require running the Kinesis Client Library inside a Data Analytics application?
  • Something else?
1 Answer

This is an "it depends" question. The older Flink Kinesis producer (org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer) did use KPL aggregation by default, I believe, but there was a Flink configuration setting to turn it off. The newer producer does not use aggregation. I think the version of Kinesis Data Analytics running today uses the older Flink producer. https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/datastream/kinesis/
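If I remember right, the aggregation behaviour of the older producer is controlled through the KPL properties you pass in. A minimal sketch, assuming the legacy FlinkKinesisProducer and treating the stream name, region, and the "AggregationEnabled" key name as placeholders/assumptions:

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisProducer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;

public class ProducerJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> output = env.fromElements("a", "b", "c"); // placeholder source

        Properties producerConfig = new Properties();
        producerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1"); // placeholder region
        // KPL pass-through setting (assumed key name); aggregation stays on unless turned off here.
        producerConfig.put("AggregationEnabled", "false");

        FlinkKinesisProducer<String> producer =
                new FlinkKinesisProducer<>(new SimpleStringSchema(), producerConfig);
        producer.setDefaultStream("my-stream"); // placeholder stream
        producer.setDefaultPartition("0");

        output.addSink(producer);
        env.execute("kpl-aggregation-example");
    }
}
```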

As for consumers receiving these aggregated records, we use Lambdas written in Go to read and de-aggregate the records and write them to our data store. We used this library in our consumer to help pull the individual records out: https://github.com/awslabs/kinesis-aggregation
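For the original question about consuming inside a Kinesis Data Analytics (Flink) application rather than a Lambda, my understanding is that the FlinkKinesisConsumer splits KPL-aggregated records back into user records for you. A minimal sketch, with the stream name and region as placeholders:

```java
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;
import org.apache.flink.streaming.connectors.kinesis.config.ConsumerConfigConstants;

public class ConsumerJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties consumerConfig = new Properties();
        consumerConfig.put(AWSConfigConstants.AWS_REGION, "us-east-1"); // placeholder region
        consumerConfig.put(ConsumerConfigConstants.STREAM_INITIAL_POSITION, "LATEST");

        // KPL-aggregated records arriving on the stream are split back into
        // individual user records by the connector before deserialization.
        DataStream<String> records = env.addSource(
                new FlinkKinesisConsumer<>("my-stream", new SimpleStringSchema(), consumerConfig));

        records.print(); // placeholder sink
        env.execute("deaggregating-consumer-example");
    }
}
```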

Answered 2 years ago; reviewed by an Expert 5 months ago.
