Network error when uploading a large (158 GB) file to an S3 bucket


Hello, I've been trying to upload a large 158 GB file to an S3 bucket, but I keep getting a "Network Error" and the connection breaks. Is there a reliable way to do this? And if so, is there documentation for it? Thanks in advance. Rajnesh

Asked 2 years ago · 5730 views
3 answers

Hi Rajnesh,

Unless you have already tried them, there are a couple of approaches:

Hope it helps ;)

Expert
Answered 2 years ago

You didn't say how you are uploading the file. I wrote a Python application that uploads backups from my backup software, which are large too; I'm uploading a 90 GB file right now. If that's how you are doing this, I can give you more pointers.

There's another pointer that might help regardless of which method you use: increasing the number of retries, which you do by changing your config file. Here's the documentation for that: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html. Let me know if you need more information on the Python method.
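Per that guide, the retry settings live in `~/.aws/config`; a sketch (the mode and attempt count here are example values, not a recommendation from the original answer):

```ini
[default]
# "standard" or "adaptive"; "legacy" is the old default behavior
retry_mode = standard
# total attempts, including the initial call
max_attempts = 10
```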

Answered 2 years ago

Also, since you are uploading large files and they are failing, you may have failed multipart uploads that AWS charges for. I have two articles on how to create a lifecycle rule that deletes these failed uploads. I would set the time period to one day at first, so you don't have to wait long for them to be deleted. You can't see the leftover parts in your bucket listing, but you can see them using S3 Storage Lens. You may want to open a support ticket with AWS if you've been paying for these files; be prepared to tell them which buckets and how far to go back in time.

https://aws.amazon.com/blogs/aws-cloud-financial-management/discovering-and-deleting-incomplete-multipart-uploads-to-lower-amazon-s3-costs/

https://aws.amazon.com/blogs/aws/s3-lifecycle-management-update-support-for-multipart-uploads-and-delete-markers/
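The rule described in those articles boils down to a single `AbortIncompleteMultipartUpload` action. A minimal sketch of the configuration, built as a plain dict and applied with the AWS CLI (the bucket name and rule ID are placeholders):

```python
import json

# Lifecycle rule that aborts multipart uploads left incomplete for 1 day,
# which removes the hidden parts you would otherwise keep paying for.
lifecycle_config = {
    "Rules": [
        {
            "ID": "abort-incomplete-multipart-uploads",
            "Status": "Enabled",
            "Filter": {},  # empty filter = apply to the whole bucket
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 1},
        }
    ]
}

if __name__ == "__main__":
    # Save this JSON and apply it with the AWS CLI, e.g.:
    #   aws s3api put-bucket-lifecycle-configuration \
    #       --bucket my-backup-bucket \
    #       --lifecycle-configuration file://lifecycle.json
    print(json.dumps(lifecycle_config, indent=2))
```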

Answered 2 years ago
