AWS S3 Keeps Calculating Folder Size and Freezes


Hey AWS Community, this is Viktor. Recently we have been trying to calculate a specific folder's size under a bucket so we can work out its storage cost. The first approach was the AWS console UI: select the folder, go to Actions, and click "Calculate total size", but it froze after reaching around 35 TB of data. The second approach was the AWS CLI, recursively calculating the total storage under the folder, but the session timed out after about an hour of running.

So, is there any way to calculate a large folder's size under a bucket?

Viktor
Asked 1 year ago · 1,822 views
3 answers
Accepted Answer

I understand you are not able to calculate the total size of the folder; I have run into this quite a few times myself when the folder size was more than 100 TB.

I'd encourage you to use the S3 Inventory feature, which would work great in your situation; the only thing is that you may have to wait 24-48 hours for the first inventory file to become available.

Also, please take a look at this Storage Blog, which lists plenty of options along with S3 Inventory. The S3 Inventory creation process is quite self-explanatory, and it gives you the bucket policy that you need to apply (under Bucket -> Bucket policy) on the destination bucket where the inventory files will be saved. Usually it doesn't take that long.
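
If you'd rather set the inventory up from the CLI than from the console, a minimal sketch could look like the following; the bucket names, configuration ID, and prefixes are placeholders, and it assumes the destination bucket policy from the blog is already in place:

  aws s3api put-bucket-inventory-configuration \
    --bucket source-bucket-name \
    --id folder-size-report \
    --inventory-configuration '{
      "Id": "folder-size-report",
      "IsEnabled": true,
      "IncludedObjectVersions": "Current",
      "Filter": {"Prefix": "dirname/"},
      "Schedule": {"Frequency": "Daily"},
      "OptionalFields": ["Size"],
      "Destination": {
        "S3BucketDestination": {
          "Bucket": "arn:aws:s3:::inventory-destination-bucket",
          "Format": "CSV",
          "Prefix": "inventory-reports"
        }
      }
    }'

Once the first report is delivered, summing the Size column of the CSV gives you the folder total without listing every object yourself.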

If you don't want to do this through S3 Inventory, then I'd suggest you check your CLI configuration: what are the timeout settings, and are they still the defaults? If so, you can adjust them by following this AWS CLI Configuration Guide and then run the command listed in that guide:

  aws s3 ls --summarize --human-readable --recursive s3://<bucket-name>/
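
If you want to try loosening the timeouts first, a rough sketch is below; the values are just examples, and setting the read timeout to 0 makes the CLI wait indefinitely instead of giving up:

  aws configure set cli_connect_timeout 60
  aws configure set cli_read_timeout 0

Then re-run the recursive listing and see whether it completes.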

Let me know how it goes.

AWS Expert · Answered 1 year ago
Expert · Reviewed 1 year ago

Am I correct in assuming that the following command is what causes the timeout?

  aws s3 ls s3://bucketname/dirname/ --summarize --recursive --human-readable

I have never experienced this myself, but the listing may stall if the number of files or the total size is very large.
Therefore, it may be effective to create a shell script that calls "aws s3api list-objects" page by page and keeps adding up the sizes for as long as the response is truncated (i.e. "NextMarker" or a continuation token is returned); see the sketch after the link below.
https://awscli.amazonaws.com/v2/documentation/api/latest/reference/s3api/list-objects.html
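
As a rough sketch of that idea (the bucket and prefix are placeholders, and it assumes the AWS CLI v2 plus jq are installed; note that when you cap results with --max-items, the CLI reports the continuation state as a NextToken field rather than the raw NextMarker):

  #!/usr/bin/env bash
  # Sum object sizes under a prefix one page at a time instead of
  # holding a single long-running recursive listing open.
  set -euo pipefail

  BUCKET="bucket-name"   # placeholder
  PREFIX="dirname/"      # placeholder
  TOTAL=0
  TOKEN=""

  while true; do
    if [ -z "$TOKEN" ]; then
      RESP=$(aws s3api list-objects --bucket "$BUCKET" --prefix "$PREFIX" \
             --max-items 1000 --output json)
    else
      RESP=$(aws s3api list-objects --bucket "$BUCKET" --prefix "$PREFIX" \
             --max-items 1000 --starting-token "$TOKEN" --output json)
    fi

    # Add this page's object sizes to the running total.
    PAGE=$(echo "$RESP" | jq '[.Contents[]?.Size] | add // 0')
    TOTAL=$((TOTAL + PAGE))

    # NextToken is present only while more pages remain.
    TOKEN=$(echo "$RESP" | jq -r '.NextToken // empty')
    if [ -z "$TOKEN" ]; then
      break
    fi
  done

  echo "Total bytes under s3://${BUCKET}/${PREFIX}: ${TOTAL}"

Because each invocation is short, a per-call timeout no longer kills the whole run, and you can print the running total as a crude progress indicator.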

Expert · Answered 1 year ago
Expert · Reviewed 1 year ago

Have you taken a look at Storage Lens?

AWS Expert · kentrad · Answered 1 year ago
