I forgot to update this post with the approach I took. I created a Lambda function that **lists the data from a source table**, transforms it into `PutRequest` batches of 25, and uses the `batchWriteItem` method of DynamoDB to upload the data into a destination table.
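For reference, a minimal sketch of that approach in Python with boto3 (the poster likely used the JavaScript SDK, but the batching logic is the same; the table names below are placeholders):

```python
BATCH_SIZE = 25  # batch_write_item accepts at most 25 put/delete requests per call


def chunk(items, size=BATCH_SIZE):
    """Split a list of items into lists of at most `size` elements."""
    return [items[i:i + size] for i in range(0, len(items), size)]


def copy_table(source_table, dest_table):
    """Scan every item from source_table and batch-write it into dest_table."""
    import boto3  # imported lazily so the batching helper is testable offline

    client = boto3.client("dynamodb")
    paginator = client.get_paginator("scan")
    for page in paginator.paginate(TableName=source_table):
        for batch in chunk(page["Items"]):
            request = {
                dest_table: [{"PutRequest": {"Item": item}} for item in batch]
            }
            response = client.batch_write_item(RequestItems=request)
            # Writes can be throttled; retry anything DynamoDB did not accept.
            while response.get("UnprocessedItems"):
                response = client.batch_write_item(
                    RequestItems=response["UnprocessedItems"]
                )


if __name__ == "__main__":
    copy_table("SourceTable", "DestinationTable")
```

Note that both tables must already exist, and a full scan consumes read capacity on the source table, so this is best for small-to-medium tables.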
Based on your situation, you have two options to import the data without having to write any code:
- **DynamoDB Import from S3** (newly released)
  - Using this approach you can import your data stored on S3 in DynamoDB JSON, Ion, or even CSV format.
  - The cost of running an import is based on the uncompressed size of the source data in S3, multiplied by a per-GB cost, which is $0.15 per GB in the US East (Northern Virginia) Region. Tables with one or more global secondary indexes (GSIs) defined incur no additional cost, but the size of any failed records adds to the total cost.
  - More reading here: blog
- **Restore from AWS Backup**
  - Using this approach you can easily restore your backed-up data from the AWS Backup service.
  - The cost of restoring from a warm backup is $0.15 per GB in the US East (Northern Virginia) Region.
Thank you for the information. This creates a new table in DynamoDB, right? I am new to AWS and am currently building a web app in Amplify. Do you know how to edit the AppSync schema or CloudFormation template to point to this new restored table?
I've tried deleting a table after backing it up and then restoring it with the same name, but
`amplify push`
will not work and returns errors like 'Resource is not in the state stackUpdateComplete'.

DynamoDB Import from S3 does not let you import data into an existing DynamoDB table. You have to create a new table. It would be great if they let you import into an existing table too.
You can restore from a backup. Take a look at this: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Restore.Tutorial.html
Can we do this to replicate data to a new, empty DynamoDB table created in another account? (IaC)