Transfer data periodically from Redshift to DynamoDB


Hi Team,

I have a table in my Redshift cluster, and I want to create a script that runs every hour, scans this table to get some data based on certain conditions, and dumps the results to a table in DynamoDB.

I was checking the source and target options under the AWS Glue jobs section, but when I select Redshift as the source, there is no DynamoDB option for the target.

Is there any way to achieve this?

Asked 2 years ago · 356 views
1 Answer
Accepted Answer

Not sure how big the data set is, but here are some options:

  1. A simple Lambda function plus an S3 UNLOAD driven by a scheduled Redshift query can do this. Then load the data from S3 into DynamoDB; you can see some sample code in this blog: https://aws.amazon.com/blogs/database/implementing-bulk-csv-ingestion-to-amazon-dynamodb/. You could also use AWS Data Pipeline. A minimal sketch of the S3-to-DynamoDB step is shown after this list.
  2. You can also use the Glue connection type for DynamoDB ("connectionType": "dynamodb" with the ETL connector as the sink), reference: https://docs.aws.amazon.com/glue/latest/dg/aws-glue-programming-etl-connect.html#aws-glue-programming-etl-connect-dynamodb. A Glue script sketch is shown after this list.
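
For option 1, here is a minimal sketch of the Lambda half, assuming the scheduled Redshift query has already UNLOADed a CSV file to S3 and the Lambda is triggered by the S3 object-created event. The bucket, table, and column names are placeholders, not anything from your setup:

```python
# Sketch only: loads a CSV unloaded by Redshift from S3 into DynamoDB.
# Assumes the scheduled Redshift query ran something like:
#   UNLOAD ('SELECT id, status, updated_at FROM my_table WHERE status = ''ACTIVE''')
#   TO 's3://my-export-bucket/exports/my_table_'
#   IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
#   FORMAT CSV HEADER;
import csv
import io

import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-target-table")  # hypothetical DynamoDB table name


def lambda_handler(event, context):
    # Triggered by the S3 ObjectCreated notification for the unloaded file.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    reader = csv.DictReader(io.StringIO(body))

    # batch_writer batches the puts and retries unprocessed items for you.
    with table.batch_writer() as writer:
        for row in reader:
            writer.put_item(Item=row)

    return {"imported_from": f"s3://{bucket}/{key}"}
```

The blog post linked above shows a more complete version of this pattern, including the CloudFormation pieces.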
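
For option 2, here is a rough Glue PySpark sketch that reads from Redshift and writes to DynamoDB with the "dynamodb" connection type as the sink. The Glue connection name, Redshift table, temp S3 path, and DynamoDB table name are all placeholders you would replace with your own:

```python
# Sketch only: Glue job that copies rows from Redshift into DynamoDB.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from Redshift through a Glue connection (placeholder names).
source = glue_context.create_dynamic_frame.from_options(
    connection_type="redshift",
    connection_options={
        "useConnectionProperties": "true",
        "connectionName": "my-redshift-connection",
        "dbtable": "public.my_table",
        "redshiftTmpDir": "s3://my-glue-temp-bucket/redshift-temp/",
    },
)

# Apply your filter conditions here, e.g. source.filter(...) or Spark SQL.

# Write to DynamoDB using the DynamoDB ETL connector as the sink.
glue_context.write_dynamic_frame.from_options(
    frame=source,
    connection_type="dynamodb",
    connection_options={
        "dynamodb.output.tableName": "my-target-table",
        "dynamodb.throughput.write.percent": "0.5",
    },
)

job.commit()
```

You can then put the job on an hourly Glue trigger (or an EventBridge schedule) to match the "every hour" requirement.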

Thanks

AWS
Expert
Answered 2 years ago
