Using S3 Batch to perform an expedited restore


Hi, I have very little experience with S3 Batch Operations and am hoping someone can guide me on a prospective redevelopment approach.

I'm looking to use S3 Batch to replace an existing restore/copy process that is not proving scalable. I intended to use S3 Batch with an S3InitiateRestoreObject operation to restore from Glacier Flexible Retrieval, but then found out that, according to this documentation, it supports the STANDARD and BULK retrieval tiers but not the EXPEDITED retrieval tier.

It's very important that I support the EXPEDITED retrieval tier as well. I'm wondering whether I could use a LambdaInvoke operation instead to perform an EXPEDITED restore. I appreciate that might be quite pricey, since I'd be paying for the Lambda invocations and the expedited restores, but the first question is: would it work? And in any case, why is the EXPEDITED retrieval tier not supported by the S3InitiateRestoreObject operation?

Asked 9 months ago · 333 views
2 Answers

Yes, that's right, S3 Batch Operations doesn't support the Expedited retrieval tier.

For your use case, I'd prefer a custom CLI script. Please refer to this re:Post Knowledge Center article if you haven't already gone through it.

The article above explains how to retrieve S3 objects using a retrieval tier that S3 Batch doesn't support.
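For illustration, here's a rough sketch of what such a script could look like in Python with boto3 (the bucket name and key list are placeholders; the same request can also be made from the AWS CLI with aws s3api restore-object):

import boto3

s3 = boto3.client('s3')

# Placeholder bucket and keys -- replace with your own list or manifest
bucket = 'my-archive-bucket'
keys = ['path/object-1.dat', 'path/object-2.dat']

for key in keys:
    # Initiates an Expedited restore, keeping the restored copy available for 1 day
    s3.restore_object(
        Bucket=bucket,
        Key=key,
        RestoreRequest={
            'Days': 1,
            'GlacierJobParameters': {'Tier': 'Expedited'}
        }
    )
    print(f'Expedited restore requested for {key}')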

Additional Reference:

s3 batch Create Job CLI

Hope you find this useful.

Abhishek

AWS
Expert
Answered 9 months ago
Accepted Answer

AWS S3 Batch Operations is a powerful service that can perform large-scale operations on S3 objects, such as copying, restoring, or applying tags. However, as you've noted, the S3InitiateRestoreObject operation currently only supports STANDARD and BULK retrieval tiers, not the EXPEDITED retrieval tier.

The reason for this limitation is not explicitly stated in the AWS documentation, but it may be because expedited retrievals are designed for urgent situations where you need your data back quickly (typically within 1-5 minutes). They are a more costly operation and may not be suitable for large-scale batch jobs, where cost-effectiveness is often a priority.

As for your question about using AWS Lambda to perform an EXPEDITED restore, yes, it is technically possible. You can write a Lambda function that uses the AWS SDK to call the restore_object method with the Expedited retrieval option. This Lambda function can then be invoked as part of your S3 Batch Operations job.

Here's a basic example of what the Lambda function could look like in Python. Note that S3 Batch Operations invokes the function with a list of tasks and expects a structured response, rather than the S3 event notification format:

import boto3
from urllib.parse import unquote_plus

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # S3 Batch Operations passes a list of tasks rather than the
    # S3 event-notification 'Records' format.
    results = []
    for task in event['tasks']:
        bucket = task['s3BucketArn'].split(':::')[-1]
        key = unquote_plus(task['s3Key'])  # manifest keys are URL-encoded
        try:
            s3.restore_object(
                Bucket=bucket,
                Key=key,
                RestoreRequest={
                    'Days': 1,
                    'GlacierJobParameters': {'Tier': 'Expedited'}
                }
            )
            result_code, result_string = 'Succeeded', 'Expedited restore initiated'
        except Exception as e:
            result_code, result_string = 'PermanentFailure', str(e)
        results.append({
            'taskId': task['taskId'],
            'resultCode': result_code,
            'resultString': result_string
        })

    # The response shape S3 Batch Operations expects for the job report
    return {
        'invocationSchemaVersion': '1.0',
        'treatMissingKeysAs': 'PermanentFailure',
        'invocationId': event['invocationId'],
        'results': results
    }

This function would be triggered for each object in your S3 Batch Operations job, and it would initiate an expedited restore for that object.
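For completeness, here's a rough sketch of creating the batch job itself with the S3 Control API and pointing it at a Lambda function like the one above. The account ID, ARNs, manifest ETag, and prefix below are all placeholders:

import uuid
import boto3

s3control = boto3.client('s3control')

# All account IDs, ARNs, and the manifest ETag here are placeholders
response = s3control.create_job(
    AccountId='111122223333',
    ConfirmationRequired=False,
    Operation={
        'LambdaInvoke': {
            'FunctionArn': 'arn:aws:lambda:us-east-1:111122223333:function:expedited-restore'
        }
    },
    Manifest={
        'Spec': {
            'Format': 'S3BatchOperations_CSV_20180820',
            'Fields': ['Bucket', 'Key']
        },
        'Location': {
            'ObjectArn': 'arn:aws:s3:::my-manifest-bucket/manifest.csv',
            'ETag': 'example-manifest-etag'
        }
    },
    Report={
        'Bucket': 'arn:aws:s3:::my-report-bucket',
        'Format': 'Report_CSV_20180820',
        'Enabled': True,
        'Prefix': 'batch-restore-reports',
        'ReportScope': 'FailedTasksOnly'
    },
    Priority=10,
    RoleArn='arn:aws:iam::111122223333:role/batch-operations-role',
    Description='Expedited restore via LambdaInvoke',
    ClientRequestToken=str(uuid.uuid4())
)
print(response['JobId'])

The role in RoleArn needs permission to invoke the function, and the function's execution role needs s3:RestoreObject on the target bucket.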

Keep in mind that using Lambda in this way will indeed incur additional costs, both for the Lambda invocations and for the expedited retrievals. Also, expedited retrievals are not always available if AWS is experiencing peak demand, so you should have a fallback plan in place.
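For example, a fallback could catch the error S3 returns when expedited capacity is unavailable and retry with the Standard tier. A rough sketch, assuming the error code is GlacierExpeditedRetrievalNotAvailable (worth confirming in your own testing):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

def restore_with_fallback(bucket, key):
    # Try Expedited first and fall back to Standard if expedited
    # capacity is unavailable (error code assumed below).
    for tier in ('Expedited', 'Standard'):
        try:
            s3.restore_object(
                Bucket=bucket,
                Key=key,
                RestoreRequest={
                    'Days': 1,
                    'GlacierJobParameters': {'Tier': tier}
                }
            )
            return tier
        except ClientError as e:
            if e.response['Error']['Code'] != 'GlacierExpeditedRetrievalNotAvailable':
                raise
    raise RuntimeError(f'Restore request failed for {key}')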

Answered 9 months ago
