Access a Large File from a Docker Container

0

I want to build a Docker container for a Machine Learning product. However, the Machine Learning model is ~2 GB. Instead of including it inside the Docker image, I want to store it in AWS and mount it. I do not know which storage service is the right one; my first choice was S3, but I do not know whether an S3 bucket can be mounted as a volume.

Which storage service should I use to store my model, and how can I mount it?

2 Answers
0

Hi,

s3fs is a very easy way to access files stored in S3 from Python code: it presents S3 storage as a disk with directories and files.

See https://s3fs.readthedocs.io/en/latest/ for the documentation
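As a minimal sketch of this approach (the bucket and key names below are hypothetical, and it assumes `s3fs` is installed and AWS credentials are available via the environment or an instance role):

```python
def open_model(bucket: str, key: str):
    """Return a readable file-like object for an S3 object via s3fs."""
    import s3fs  # imported lazily; requires `pip install s3fs`

    fs = s3fs.S3FileSystem()  # picks up credentials from the environment
    return fs.open(f"{bucket}/{key}", "rb")


if __name__ == "__main__":
    # Stream the model straight into your framework's loader -- anything
    # that accepts a file-like object (e.g. torch.load) works here.
    with open_model("my-ml-bucket", "models/model.bin") as f:
        header = f.read(1024)  # read the first KB as a sanity check
```

Because s3fs exposes a file-like interface, the model is streamed from S3 on demand rather than copied into the image, which keeps the Docker image small.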

Best,

Didier

AWS Expert
answered 5 months ago
0

You can store the image in a private ECR repository in your AWS account [1]. Here are the service quotas; you should not have issues with 2 GB [2].

If you want to store the model in S3, you perfectly can: upload it to an S3 bucket. To retrieve it, you don't need to mount the bucket; instead, copy the file to the local disk when the instance starts, for example:

aws s3 cp s3://bucket/key local_file_path

Remember to attach an instance profile role to your instance (or configure access keys with `aws configure`) so it has permission to read the bucket.

[1]https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html

[2]https://docs.aws.amazon.com/AmazonECR/latest/userguide/service-quotas.html

AWS
answered 5 months ago
