1 Answer
You'll have to update the items to trigger the stream events again, and the update has to actually mutate each item, since a write that doesn't change anything doesn't emit a stream event.
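A minimal sketch of that "touch every item" approach, assuming a boto3 script, a table named `my-table` with a single partition key `pk`, and a hypothetical `last_touched` attribute used purely to force a real change (adjust names for your schema):

```python
import time

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("my-table")  # placeholder table name


def touch_all_items():
    """Scan the whole table and write a changing attribute to every item."""
    scan_kwargs = {"ProjectionExpression": "pk"}  # fetch only the key
    while True:
        page = table.scan(**scan_kwargs)
        for item in page["Items"]:
            # The value must actually change, or no stream record is emitted.
            table.update_item(
                Key={"pk": item["pk"]},
                UpdateExpression="SET last_touched = :now",  # hypothetical attribute
                ExpressionAttributeValues={":now": int(time.time() * 1000)},
            )
        if "LastEvaluatedKey" not in page:
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]


touch_all_items()
```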
Depending on why you're asking for this, you might be better served to do a full export to S3 and have a copy of the data there. People often do this to bootstrap a downstream system, then use streams (or incremental exports) to keep it current.
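If you go the export route, it's a single API call (the table needs point-in-time recovery enabled). A hedged sketch, with a placeholder table ARN and bucket name:

```python
import boto3

client = boto3.client("dynamodb")

response = client.export_table_to_point_in_time(
    TableArn="arn:aws:dynamodb:us-east-1:123456789012:table/my-table",  # placeholder
    S3Bucket="my-export-bucket",  # placeholder
    ExportFormat="DYNAMODB_JSON",
)
# The export runs asynchronously; poll describe_export to see when it completes.
print(response["ExportDescription"]["ExportStatus"])
```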
answered 10 months ago
@zhunter, thanks for your comment. This use case came up because DynamoDB Streams was enabled well after the table was created, so some data was already in the table before the stream existed. We now have past records in the table that we want to process through the stream. This is a one-off manual effort, but we don't want to modify the data just to trigger stream events for these old records. Is there a way to achieve this?
Nope, there's no way to ask the table to "restream" its contents. You could probably have some fun writing a Scan that synthesizes its own stream of events and invokes your Lambda function with them. It's probably easier to have a separate code path that works directly against the full table data for bootstrapping.
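For completeness, a rough sketch of that Scan-and-invoke idea. Everything here is illustrative: the function name, table name, key attribute, and the simplified record shape are assumptions, not the full DynamoDB Streams event format, and your handler may need tweaks to accept it (watch the 256 KB async payload limit as well):

```python
import json

import boto3

dynamodb = boto3.client("dynamodb")
lambda_client = boto3.client("lambda")

paginator = dynamodb.get_paginator("scan")
for page in paginator.paginate(TableName="my-table"):  # placeholder table name
    # Build records that roughly mimic the shape of DynamoDB Streams events.
    records = [
        {
            "eventSource": "aws:dynamodb",
            "eventName": "INSERT",
            "dynamodb": {"Keys": {"pk": item["pk"]}, "NewImage": item},
        }
        for item in page["Items"]
    ]
    if records:
        lambda_client.invoke(
            FunctionName="my-stream-processor",  # placeholder function name
            InvocationType="Event",  # fire-and-forget async invocation
            Payload=json.dumps({"Records": records}),
        )
```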