Access a Large File from a Docker Container


I want to build a Docker container for a Machine Learning product. However, the Machine Learning model is ~2 GB. Instead of including it inside the Docker image, I want to store it in AWS and mount it. I do not know which storage service is the right one; my first choice was S3, but I do not know whether I can mount an S3 bucket as a volume.

Which storage service should I use to store my model, and how can I mount it?

2 Answers

Hi,

s3fs is a very easy way to access files stored in S3 from Python code: it presents S3 storage as a filesystem of directories and files.

See https://s3fs.readthedocs.io/en/latest/ for the documentation
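For example, a minimal sketch (the bucket name, object key, and local path below are placeholders, and it assumes the container already has AWS credentials via the standard credential chain, e.g. an instance or task role):

```python
# Minimal sketch: "my-bucket", "models/model.bin", and "/tmp/model.bin"
# are placeholders. Credentials are assumed to come from the standard
# AWS credential chain (environment variables, instance/task role, etc.).
import s3fs

fs = s3fs.S3FileSystem()

# Read the model object as if it were a local file
with fs.open("my-bucket/models/model.bin", "rb") as f:
    model_bytes = f.read()

# Or copy it to local disk once at startup and load it from there
fs.get("my-bucket/models/model.bin", "/tmp/model.bin")
```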

Best,

Didier

AWS
EXPERT
answered 5 months ago

You can store the image in a private ECR repository in your AWS account [1]; see the service quotas for the limits [2], you should not have issues with 2 GB.

If you want to store the model in S3, you certainly can: upload it to an S3 bucket, and to retrieve it you don't need to mount the bucket; instead, copy the file to local disk when the instance is starting. For example:

aws s3 cp s3://bucket/key local_file_path

Remember to assign an instance profile role to your instance (or configure access keys with aws configure) so it has permission to read the bucket.
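As a sketch, the same copy-at-startup step can be done in a container entrypoint with boto3 (the bucket, key, and local path here are hypothetical; boto3 resolves credentials from the role mentioned above):

```python
# Startup sketch using boto3; bucket/key/path names are hypothetical.
# Credentials are resolved from the instance profile or task role.
import os
import boto3

MODEL_BUCKET = "my-bucket"        # placeholder bucket name
MODEL_KEY = "models/model.bin"    # placeholder object key
MODEL_PATH = "/opt/ml/model.bin"  # placeholder local path in the container

def fetch_model() -> None:
    # Download the ~2 GB model once, before the application starts serving
    if not os.path.exists(MODEL_PATH):
        os.makedirs(os.path.dirname(MODEL_PATH), exist_ok=True)
        boto3.client("s3").download_file(MODEL_BUCKET, MODEL_KEY, MODEL_PATH)

if __name__ == "__main__":
    fetch_model()
```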

[1] https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html

[2] https://docs.aws.amazon.com/AmazonECR/latest/userguide/service-quotas.html

AWS
answered 5 months ago
