How can I get the size of an Amazon S3 bucket?

1

s3cmd seems to access every file individually and isn't very scalable. Is there a scalable way to get the answer?

Edit: Ideally via the command line.

jedberg
Asked 3 years ago · 2,029 views
4 Answers
5

In the CLI you can list recursively and summarize in a human-readable format.

aws s3 ls s3://mybucket --recursive --human-readable --summarize
... [snipped output] ...
Total Objects: 46
   Total Size: 29.5 MiB
bwhaley (Expert)
Answered 3 years ago
Reviewed by an AWS Expert 2 years ago
  • Will that work if your bucket has, say, 50 million objects in it?

  • It should work, though it will take a long time. I just listed a bucket with 6,363,094 objects, 24.9 GiB in size. Took 54 minutes.

4

Using the BucketSizeBytes CloudWatch metric (ref: https://docs.aws.amazon.com/AmazonS3/latest/userguide/metrics-dimensions.html) is probably your best bet, and since it's CloudWatch, you can also alarm on size, use triggers, etc., as you would with any other metric.

Be careful though: this metric might be a few hours behind.
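
If it helps, here is a minimal sketch of pulling that metric from the CLI. The bucket name (mybucket) and the StandardStorage storage type are assumptions; adjust both for your bucket, and note the GNU date syntax below differs on macOS/BSD:

# S3 storage metrics are published roughly once per day, hence the
# 86400-second period and the two-day lookback window.
aws cloudwatch get-metric-statistics \
  --namespace AWS/S3 \
  --metric-name BucketSizeBytes \
  --dimensions Name=BucketName,Value=mybucket Name=StorageType,Value=StandardStorage \
  --start-time "$(date -u -d '2 days ago' +%Y-%m-%dT%H:%M:%S)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%S)" \
  --period 86400 \
  --statistics Average

The returned Average is in bytes.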

AWS
Answered 3 years ago
Reviewed by an AWS Expert 2 years ago
1

In the S3 console you can navigate to your bucket and select all objects in it. Once all objects are selected, go to "Actions" > "Calculate total size". Hope that helps.

Mike_C (AWS)
Answered 3 years ago
  • Is there a command line version of that?

  • Not that I am aware of. You could try this command, though. FYI, the output is in bytes:

    aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"

0

There are a few alternative ways:

  • CloudWatch metrics.
  • The Cost Explorer API, UI, or Billing.
  • Generate an S3 Inventory report and use Athena to run a query that sums the object sizes (see the sketch after this list). It might be overkill, but you can filter however you wish if you need more than just the usage data.
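
A minimal sketch of the Athena route, assuming an inventory table (hypothetically named my_bucket_inventory) has already been created over the inventory files and includes the size column:

# Sum object sizes from an S3 Inventory table; the table name and the
# results bucket below are hypothetical placeholders.
aws athena start-query-execution \
  --query-string "SELECT SUM(size) AS total_bytes FROM my_bucket_inventory" \
  --result-configuration OutputLocation=s3://my-athena-results/
# Fetch the result with aws athena get-query-results once the query
# finishes, using the QueryExecutionId returned above.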
Answered 2 years ago
