
Easiest Method For Batch Uploading and Replacing Files / Folders into S3 Buckets? NewB


Hi there - thanks in advance for your advice.

I am new to S3, and I used Google Gemini to walk me through setting up my first bucket. I am used to using FTP to upload to host-provided servers, so for ease of use I set up FTP using AWS Transfer Family, but I have had $250+ bills because I didn't realise this FTP feature is charged by the hour at $0.30 an hour.

So can you help me with the below, as AI led me astray? I have read a number of help articles, but there is such an overwhelming number of services, and for more complex queries like this none seem to offer a straightforward answer that doesn't require the command line.

GOAL: To be able to upload a series of 50+ local PC folders (with subfolders) or smaller individual uploads into S3, replacing the cloud versions and filling in missing files/folders. Previously I would use FTP and an FTP client, which gives prompts to replace/overwrite/ignore. Clearly FTP is a wasted expense if I do this once a month at most. I suppose I could enable and then disable FTP each time, but it doesn't seem ideal.

QUESTION: What is the easiest way to do this task? (I'm on Windows 11)

My ultimate aim is to have my entire image / media catalogue backed up in my S3 Bucket.

I have avoided command line as I am not a coder.

Can you suggest what the easiest and most cost effective path would be.

asked a year ago · 1,449 views
4 Answers

Hello,

Alternatively, you can upload to S3 via the AWS CLI using the sync command:

aws s3 sync local_folder s3://bucket-name

You can use this method to batch-upload files to S3 very quickly.

Refer to this document for a step-by-step walkthrough: https://aws.amazon.com/getting-started/hands-on/backup-to-s3-cli/
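The sync command is incremental, so rerunning it only uploads new or changed files - close to the replace/fill-in behaviour your FTP client prompted you for. A minimal sketch, assuming a local folder C:\Media and a bucket named my-media-backup (both placeholder names - substitute your own):

```shell
# Placeholder names: replace "C:\Media" and "my-media-backup" with your own.

# Preview what would be uploaded, without changing anything:
aws s3 sync "C:\Media" s3://my-media-backup/media/ --dryrun

# Perform the actual upload; only new or changed files are transferred:
aws s3 sync "C:\Media" s3://my-media-backup/media/
```

Adding --delete would also remove objects in the bucket that no longer exist locally; leave it off if you only want to add and update files.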

EXPERT
answered a year ago

Hi,

AWS DataSync is what you want to use for such a data migration: see https://aws.amazon.com/blogs/storage/migrating-google-cloud-storage-to-amazon-s3-using-aws-datasync/

This tutorial is close to your use case: https://docs.aws.amazon.com/datasync/latest/userguide/s3-cross-account-transfer.html

Official doc is here: https://aws.amazon.com/datasync/

Best,

Didier

EXPERT
answered a year ago
  • Thanks Didier. I have looked at DataSync and it is totally overwhelming - I think if this is truly the best option then I have chosen the wrong service and need something more user-friendly and less technical.

    Ultimately I need to host a large number of image files at static URLs. Historically I used my website hosting package, but it's not well suited and is priced for a different purpose. I thought S3 made sense given its popularity and cost, but where I am familiar with FTP and have coded and built websites, this is just a whole new level of confusion; all the terminology is so different. I mean, words like Hypervisor?!

    I don't think I can manage with DataSync - is there something more passive that can be run manually? Like FTP, without virtual servers etc.?


The easiest is to use the S3 API. It supports multipart upload and is available through the CLI, SDKs, and the REST API.
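In practice you don't need to call the multipart APIs yourself: the high-level aws s3 commands (cp, sync) switch to multipart upload automatically for large files. A hedged sketch, with placeholder file and bucket names:

```shell
# Placeholder names; the CLI splits large files into multipart uploads
# automatically, so no extra flags are needed.
aws s3 cp "C:\Media\bigvideo.mp4" s3://my-media-backup/videos/bigvideo.mp4
```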

AWS
answered a year ago

As NARRAVULA pointed out, if you feel DataSync is overwhelming, you can use the AWS CLI and run a sync command:

https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html

You can first do a dry run of the command to see what it would do, then run it again without the --dryrun switch.

aws s3 sync <local directory> s3://<destination bucket>/prefix/  --dryrun 
answered a year ago
