Upload large files to S3 via CLI


We are trying to use the AWS CLI on Linux to upload a file larger than 110 GB to Amazon S3; however, we are getting the following error:

An error occurred (EntityTooLarge) when calling the PutObject operation: Your proposed upload exceeds the maximum allowed size

The command we are running is:

aws s3api put-object \
        --bucket $bucket \
        --key "$key" \
        --body "$body" \
        --sse-customer-algorithm AES256 \
        --sse-customer-key "$customer_key" \
        --sse-customer-key-md5 "$customer_key_md5"

An additional constraint is that we are encrypting the file with SSE-C (server-side encryption with customer-provided keys).

How can we upload this?

Asked 2 months ago · 334 views
3 Answers

To upload a file larger than 110 GB to Amazon S3 using the AWS CLI, you need the multipart upload feature. There are a few size limits to be aware of:

  1. The maximum object size that can be uploaded in a single PutObject request is 5 GB, and the maximum object size in Amazon S3 overall is 5 TB. For objects larger than 100 MB, AWS recommends using multipart upload.

  2. In a multipart upload, each part can be between 5 MB and 5 GB (only the last part may be smaller), and a single upload can contain at most 10,000 parts. A file larger than 110 GB is therefore uploaded as many parts, but the high-level CLI can split it for you.

  3. To perform the upload with the AWS CLI, the high-level aws s3 cp command is enough; it switches to multipart upload automatically once the file exceeds the multipart threshold (8 MB by default):

    aws s3 mb s3://your-bucket-name                         # only if the bucket does not exist yet
    aws configure set default.s3.multipart_chunksize 64MB   # optional: size of each part
    aws s3 cp /path/to/large/file.zip s3://your-bucket-name/file.zip

    • No extra flags are needed to enable multipart upload; the CLI splits the file, uploads the parts and reassembles the object for you.
    • multipart_chunksize controls the part size. Because a multipart upload allows at most 10,000 parts, each part for a 110 GB file must be at least about 12 MB (and no larger than 5 GB).
  4. aws s3 cp prints its own transfer progress. To check for multipart uploads that are still in progress (or were left incomplete), and to verify the finished object, you can use:

    aws s3api list-multipart-uploads --bucket your-bucket-name
    aws s3 ls s3://your-bucket-name/file.zip --human-readable --summarize
    
  5. The high-level aws s3 cp command cannot resume an interrupted transfer; re-running it starts the upload from the beginning. If you need to resume, use the low-level aws s3api multipart commands instead: list the parts that were already uploaded, send only the missing ones, and then complete the upload (see the SSE-C sketch after this list):

    aws s3api list-parts --bucket your-bucket-name --key file.zip --upload-id "$upload_id"
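Because the object is encrypted with SSE-C, the same customer key has to be sent with the initiation request and with every part. Below is a minimal sketch of that low-level flow; the bucket name, object key, file path and the $customer_key / $customer_key_md5 variables from the question are placeholders to replace with your own values:

    # Split the 110 GB file into parts no larger than 5 GB (creates part_aa, part_ab, ...)
    split -b 5G /path/to/large/file.zip part_

    # 1. Start the multipart upload, declaring the SSE-C key
    upload_id=$(aws s3api create-multipart-upload \
        --bucket your-bucket-name \
        --key file.zip \
        --sse-customer-algorithm AES256 \
        --sse-customer-key "$customer_key" \
        --sse-customer-key-md5 "$customer_key_md5" \
        --query UploadId --output text)

    # 2. Upload every part with the same SSE-C key and collect the returned ETags
    #    (the ETag values printed with --output text already include their surrounding quotes)
    part_number=1
    parts=""
    for part in part_*; do
        etag=$(aws s3api upload-part \
            --bucket your-bucket-name \
            --key file.zip \
            --part-number "$part_number" \
            --body "$part" \
            --upload-id "$upload_id" \
            --sse-customer-algorithm AES256 \
            --sse-customer-key "$customer_key" \
            --sse-customer-key-md5 "$customer_key_md5" \
            --query ETag --output text)
        parts="$parts{\"PartNumber\":$part_number,\"ETag\":$etag},"
        part_number=$((part_number + 1))
    done

    # 3. Complete the upload with the collected part numbers and ETags
    printf '{"Parts":[%s]}' "${parts%,}" > parts.json
    aws s3api complete-multipart-upload \
        --bucket your-bucket-name \
        --key file.zip \
        --upload-id "$upload_id" \
        --multipart-upload file://parts.json

Note that incomplete multipart uploads keep their parts (and are billed for that storage) until you either complete them or remove them with aws s3api abort-multipart-upload.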
    

For more detailed information on uploading large files to Amazon S3, refer to the Amazon S3 Developer Guide and the re:Post article Upload large files to S3 | AWS re:Post.

AWS TAM (AWS)
answered 2 months ago
Expert reviewed 2 months ago
Steve_M (Expert) reviewed 2 months ago

And how do I encrypt with SSE-C?

answered 2 months ago

You need to use multipart upload, as a single upload (PutObject) is limited to 5 GB.

You can use the high-level aws s3 cp command instead of the low-level aws s3api put-object command to have the multipart upload handled automatically.

Please refer to this link for more information on the exact usage.
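For example, here is a minimal sketch of that approach combined with SSE-C (the bucket name, object key and file paths are placeholders; aws s3 cp exposes SSE-C through the --sse-c and --sse-c-key options), which also covers the follow-up question about encryption:

    # Generate a 256-bit customer-provided key once and keep it safe; S3 does not store it
    openssl rand -out sse-c.key 32

    # Upload: for a 110 GB file, aws s3 cp performs the multipart upload automatically
    aws s3 cp /path/to/large/file.zip s3://your-bucket-name/file.zip \
        --sse-c AES256 \
        --sse-c-key fileb://sse-c.key

    # The same key must be supplied again to download the object later
    aws s3 cp s3://your-bucket-name/file.zip /path/to/restore/file.zip \
        --sse-c AES256 \
        --sse-c-key fileb://sse-c.key

Keep sse-c.key safe: with SSE-C, Amazon S3 does not store your key, so the object cannot be decrypted or downloaded without it.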

AWS (Expert)
answered 2 months ago
