Network error when uploading a large (158 GB) file to an S3 bucket


Hello, I've been trying to upload a large 158 GB file to an S3 bucket, but I keep getting a "Network Error" and the connection breaks. Is there a reliable way to do this? And if so, is there documentation about it? Thanks in advance. Rajnesh

Asked 2 years ago · Viewed 5743 times
3 Answers

Hi Rajnesh,

Unless you have already tried these options, there are a couple of approaches:

Hope it helps ;)

Expert
Answered 2 years ago

You didn't say how you are uploading the file. I wrote a Python application that uploads backups from my backup software, which are large too; I'm uploading a 90 GB file right now. If that's how you are doing this, I can give you more pointers.
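This isn't the answerer's actual script, but a minimal sketch (assuming boto3 is installed and credentials are configured; the bucket and key names are placeholders) of a large upload using boto3's managed transfer API, which splits the file into parts and retries failed parts individually:

```python
import math

GiB = 1024 ** 3
FILE_SIZE = 158 * GiB            # the file size from the question
PART_SIZE = 100 * 1024 ** 2      # 100 MiB per part (illustrative choice)

# S3 caps a multipart upload at 10,000 parts, so the part size must be at
# least FILE_SIZE / 10,000 (about 16.2 MiB for a 158 GiB file).
min_part = math.ceil(FILE_SIZE / 10_000)
assert PART_SIZE >= min_part

def upload_large_file(path: str, bucket: str, key: str) -> None:
    """Upload `path` to s3://bucket/key with automatic multipart handling."""
    import boto3  # imported here so the sizing math above runs without boto3
    from boto3.s3.transfer import TransferConfig

    config = TransferConfig(
        multipart_threshold=64 * 1024 ** 2,  # use multipart above 64 MiB
        multipart_chunksize=PART_SIZE,
        max_concurrency=8,                   # parallel part uploads
        use_threads=True,
    )
    boto3.client("s3").upload_file(path, bucket, key, Config=config)

# Example call (placeholder names):
# upload_large_file("backup.tar", "my-bucket", "backups/backup.tar")
```

Because each part is uploaded and retried on its own, a transient network error no longer forces the whole 158 GB transfer to start over.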

There's another pointer that might help too, regardless of which method you use: increasing the number of retries, which you do by changing your config file. Here's the documentation for that: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/retries.html. Let me know if you need more information on the Python method.
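The settings that guide describes live in the shared AWS config file; a sketch with illustrative values (not the answerer's actual configuration):

```ini
# ~/.aws/config — read by both boto3 and the AWS CLI
[default]
# "standard" and "adaptive" modes retry more error types than the old "legacy" mode
retry_mode = adaptive
# total attempts per request, including the initial one
max_attempts = 10
```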

Answered 2 years ago

Also, since you are uploading large files and they are failing, you may have incomplete multipart uploads that AWS charges for. I have two articles on how to create a lifecycle rule that deletes these failed uploads. I would set the time period to one day at first, so you don't have to wait long for them to be deleted. You can't see them in your bucket, but you can see them using S3 Storage Lens. You may want to open a support ticket with AWS if you've been paying for these parts; be prepared to tell them which buckets are affected and how far to go back in time.

https://aws.amazon.com/blogs/aws-cloud-financial-management/discovering-and-deleting-incomplete-multipart-uploads-to-lower-amazon-s3-costs/

https://aws.amazon.com/blogs/aws/s3-lifecycle-management-update-support-for-multipart-uploads-and-delete-markers/
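In case the links go stale: the rule those articles walk through boils down to a lifecycle configuration like this sketch (the one-day window follows the suggestion above; the rule ID is arbitrary), which can be applied with `aws s3api put-bucket-lifecycle-configuration --bucket <bucket> --lifecycle-configuration file://lifecycle.json`:

```json
{
  "Rules": [
    {
      "ID": "abort-incomplete-multipart-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 1 }
    }
  ]
}
```

The empty prefix makes the rule apply to every object key in the bucket.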

Answered 2 years ago
