Access a Large File from a Docker Container

I want to build a Docker container for a machine learning product. However, the model is ~2 GB, so instead of baking it into the Docker image I want to store it in AWS and load it at runtime. I do not know which storage service is the right one; my first choice was S3, but I do not know whether an S3 bucket can be mounted as a volume.

Which storage service should I use to store my model, and how can I mount it?

2 Answers

Hi,

s3fs is a very easy way to access files stored in S3 from Python code: it presents S3 storage as a filesystem with directories and files.

See https://s3fs.readthedocs.io/en/latest/ for the documentation.
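For example, a minimal sketch (the bucket name my-ml-models and the key models/model.pt are placeholders, and it assumes AWS credentials are available in the container's environment):

    import s3fs

    # Credentials are picked up from the environment (instance/task role,
    # environment variables, or ~/.aws/credentials).
    fs = s3fs.S3FileSystem(anon=False)

    # "my-ml-models/models/model.pt" is a placeholder for your own bucket and key.
    with fs.open("my-ml-models/models/model.pt", "rb") as f:
        model_bytes = f.read()  # or hand the file object to your ML framework

Since s3fs streams the object over the network, for a 2 GB model you would typically read it once at startup rather than on every access.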

Best,

Didier

AWS
Expert
answered 5 months ago

You can store the Docker image, model included, in a private ECR repository in your AWS account [1]. The service quotas are listed in [2]; you should not have issues with 2 GB.
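For reference, the push workflow from [1] looks roughly like this (the account ID 123456789012, region us-east-1, and repository name my-ml-image are placeholders):

    aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com
    docker tag my-ml-image:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-ml-image:latest
    docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-ml-image:latest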

If you want to store the model in S3, you certainly can: upload it to an S3 bucket, and then, rather than mounting the bucket, copy the file to local disk when the instance is starting. For example:

    aws s3 cp s3://bucket/key local_file_path

Remember to attach an instance profile role to your instance (or configure access keys with aws configure) so that it has permission to read the bucket.
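The same download can also be done in code at container startup; here is a minimal boto3 sketch (the bucket, key, and local path are placeholders):

    import boto3

    # Credentials come from the instance profile role or the environment.
    s3 = boto3.client("s3")

    # "my-ml-models", "models/model.pt", and "/tmp/model.pt" are placeholders.
    s3.download_file("my-ml-models", "models/model.pt", "/tmp/model.pt")

Downloading once at startup keeps inference reads on local disk, which is usually faster than reading from S3 on every access.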

[1] https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html

[2] https://docs.aws.amazon.com/AmazonECR/latest/userguide/service-quotas.html

AWS
answered 5 months ago
