Using AWS Glue to export ~500TB of DynamoDB table to S3 bucket


We have a use case where we want to export ~500TB of DynamoDB data to S3. One possible approach I found was using an AWS Glue job. While exporting the data to S3, we also need to perform certain transformations on the DynamoDB data, for which we need to make a service call to a Java package (the transformation logic isn't heavy and should complete in milliseconds). Is AWS Glue a good approach for exporting ~500TB of data?

asked 3 months ago · 229 views
2 Answers

You can use AWS Batch to export and transform the 500TB of data from DynamoDB to an S3 bucket:

  • Start by using the native export functionality of DynamoDB to export your data directly to an S3 bucket. This approach is highly efficient for large datasets and does not impact the performance of your DynamoDB table.
  • Develop a Docker container with your transformation logic and upload it to Amazon ECR. Configure an AWS Batch compute environment with the necessary resources, and define job definitions that describe how jobs run using your container. Finally, submit transformation jobs to AWS Batch to read the exported data from S3 and write the transformed data back to S3 or another location.
  • Optionally, use AWS Step Functions to manage the workflow, particularly if the process involves multiple steps.
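The first two steps above can be sketched with boto3. This is a sketch only: the table ARN, bucket, and Batch job names are placeholders, and point-in-time recovery (PITR) must already be enabled on the table for the native export to work.

```python
def build_export_request(table_arn: str, bucket: str, prefix: str) -> dict:
    """Build the parameters for DynamoDB's ExportTableToPointInTime API.
    DYNAMODB_JSON is the default export format; ION is also supported."""
    return {
        "TableArn": table_arn,
        "S3Bucket": bucket,
        "S3Prefix": prefix,
        "ExportFormat": "DYNAMODB_JSON",
    }


def run_export_and_transform():
    """Kick off the native export, then submit the Batch transformation job.
    All resource names below are placeholders."""
    import boto3  # requires AWS credentials at call time

    dynamodb = boto3.client("dynamodb")
    export = dynamodb.export_table_to_point_in_time(
        **build_export_request(
            "arn:aws:dynamodb:us-east-1:123456789012:table/my-table",  # placeholder
            "my-export-bucket",  # placeholder
            "ddb-export/",
        )
    )

    batch = boto3.client("batch")
    # Submit the transformation job; here the export ARN is passed to the
    # container via an environment variable (a design choice, not required).
    batch.submit_job(
        jobName="transform-ddb-export",     # placeholder
        jobQueue="transform-queue",         # placeholder
        jobDefinition="transform-job-def",  # placeholder
        containerOverrides={
            "environment": [
                {
                    "name": "EXPORT_ARN",
                    "value": export["ExportDescription"]["ExportArn"],
                }
            ]
        },
    )
```

In practice you would call `run_export_and_transform()` from your orchestration code (or a Step Functions task), and poll the export status before submitting the Batch job, since the export runs asynchronously.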

If this has resolved your issue or was helpful, accepting the answer would be greatly appreciated. Thank you!

answered 3 months ago


Yes, Glue is the ETL service on AWS for such tasks: it lets you process/transform the data as you export it from DynamoDB to S3.
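For reference, a minimal Glue (PySpark) job along these lines might look like the sketch below. The table ARN, buckets, and the `transform_record` body are placeholders; in the real job the per-record transformation would call into the Java package mentioned in the question (for example by writing the Glue job in Scala, or by putting the Java logic behind a small service).

```python
def transform_record(record: dict) -> dict:
    """Placeholder for the real transformation, which the question says is a
    quick (millisecond-level) call into a Java package."""
    record["processed"] = True
    return record


def run_glue_job():
    # Glue-only imports; these are available inside the Glue runtime.
    from awsglue.context import GlueContext
    from awsglue.transforms import Map
    from pyspark.context import SparkContext

    glue_context = GlueContext(SparkContext.getOrCreate())

    # The "ddb" export connector triggers a native DynamoDB export to S3 and
    # reads the exported files, so the table itself is not scanned and no
    # read capacity is consumed -- important at the ~500TB scale.
    dyf = glue_context.create_dynamic_frame.from_options(
        connection_type="dynamodb",
        connection_options={
            "dynamodb.export": "ddb",
            "dynamodb.tableArn": "arn:aws:dynamodb:us-east-1:123456789012:table/my-table",  # placeholder
            "dynamodb.s3.bucket": "my-export-bucket",  # placeholder
            "dynamodb.s3.prefix": "ddb-export/",
            "dynamodb.unnestDDBJson": True,
        },
    )

    # Apply the per-record transformation across the dataset.
    transformed = Map.apply(frame=dyf, f=transform_record)

    glue_context.write_dynamic_frame.from_options(
        frame=transformed,
        connection_type="s3",
        connection_options={"path": "s3://my-output-bucket/transformed/"},  # placeholder
        format="parquet",
    )
```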

Here is a good article detailing how to do it:



AWS
answered 3 months ago
