Is there a way to batch-check the existence of specific object versions in AWS S3?


I am writing an R package which needs to check the existence of a specific version of each AWS S3 object in its data store. The version of a given object is the version ID recorded in the local metadata, and the recorded version may or may not be the most current version in the bucket. Currently, the package accomplishes this by sending a HEAD request for each relevant object-version pair.

Is there a more efficient/batched way to do this for each version/object pair? list_object_versions() returns every version of every object of interest, which is way too many versions to download efficiently, and neither list_objects() nor list_objects_v2() return any version IDs at all. It would be great to have something like delete_objects(), but instead of deleting the objects, accept the supplied key-version pairs and return the ETag and custom metadata of each one that exists.
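Since S3 offers no batch HEAD API, one common workaround is to keep the per-object HEAD requests but issue them concurrently. A minimal sketch in Python (the package itself is in R, but the pattern translates; the `head_fn` callback stands in for a real call such as boto3's `s3.head_object(Bucket=..., Key=key, VersionId=version_id)`, which is an assumption here, not part of the question):

```python
from concurrent.futures import ThreadPoolExecutor

def check_versions(pairs, head_fn, max_workers=16):
    """Check many (key, version_id) pairs concurrently.

    head_fn(key, version_id) should return metadata (e.g. the ETag)
    when that version exists and raise an exception otherwise --
    in practice a thin wrapper around an S3 HEAD request.
    """
    def probe(pair):
        key, version_id = pair
        try:
            return pair, head_fn(key, version_id)
        except Exception:
            return pair, None  # version not found

    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(probe, pairs))
```

This doesn't reduce the request count, but it hides most of the per-request latency behind the thread pool.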

asked a year ago · 470 views
1 Answer

Hello, you may have already tried the prefix parameter of the list_object_versions call, which can be used to filter the results down when the keys share a known or common prefix.

CLI example:

$ aws s3api list-object-versions --bucket EXAMPLE-BUCKET --prefix EXAMPLE-PREFIX
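When many objects share a prefix, one round of (paginated) listing can replace many individual HEAD calls: collect the (Key, VersionId) pairs the bucket reports and test the locally recorded pairs against that set. A sketch, assuming the page structure returned by boto3's list_object_versions paginator (the bucket and prefix names elsewhere in this answer are placeholders):

```python
def existing_version_set(pages):
    """Collect (Key, VersionId) pairs from list_object_versions pages.

    `pages` is an iterable of response dicts, e.g. from
    s3.get_paginator("list_object_versions").paginate(
        Bucket="EXAMPLE-BUCKET", Prefix="EXAMPLE-PREFIX").
    """
    found = set()
    for page in pages:
        for v in page.get("Versions", []):
            found.add((v["Key"], v["VersionId"]))
    return found

def missing_pairs(recorded, pages):
    """Return the recorded (key, version_id) pairs absent from the bucket."""
    found = existing_version_set(pages)
    return [p for p in recorded if p not in found]
```

Whether this beats per-object HEADs depends on how many historical versions the prefix holds, since the listing returns every version, not just the ones you care about.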

If you know the object key and version ID, you can directly call get-object-attributes with the object-attributes parameter. This will return the values specified in the object-attributes parameter, along with the LastModified and VersionId values.

CLI example:

$ aws s3api get-object-attributes --bucket EXAMPLE-BUCKET --key EXAMPLE-PREFIX/OBJECT.html --version-id EXAMPLE-VERSION-ID --object-attributes "ETag"

Hope this somewhat helped.

AWS
answered a year ago
