Hello,
Alternatively, you can upload to S3 with the AWS CLI using the sync command:
aws s3 sync local_folder s3://bucket-name
This method lets you batch-upload files to S3 quickly.
Refer to this document for a step-by-step walkthrough: https://aws.amazon.com/getting-started/hands-on/backup-to-s3-cli/
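For intuition, sync only copies files that are missing or changed at the destination, rather than re-uploading everything. Here is a simplified local illustration of that compare-and-copy idea (plain shell, no AWS account needed; real `aws s3 sync` also compares size and timestamp, which this sketch skips):

```shell
# Illustration only: mimic sync's "copy what the destination lacks" behavior locally.
src=$(mktemp -d); dst=$(mktemp -d)
echo one > "$src/a.txt"
echo two > "$src/b.txt"
cp "$src/a.txt" "$dst/a.txt"        # pretend a.txt was uploaded on an earlier run
for f in "$src"/*; do
  base=$(basename "$f")
  # copy only files the destination does not have yet
  [ -e "$dst/$base" ] || cp "$f" "$dst/$base"
done
ls "$dst"                           # both files now present; only b.txt was copied
```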
Hi,
AWS DataSync is what you want to use for this kind of data migration: see https://aws.amazon.com/blogs/storage/migrating-google-cloud-storage-to-amazon-s3-using-aws-datasync/
This tutorial is close to your use case: https://docs.aws.amazon.com/datasync/latest/userguide/s3-cross-account-transfer.html
Official doc is here: https://aws.amazon.com/datasync/
Best,
Didier
The easiest option is to use the S3 API. It supports multipart upload and is available through the CLI, the SDKs, and the REST API.
As NARRAVULA pointed out, if you feel DataSync is overwhelming, you can use the AWS CLI and run the sync command:
https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
You can do a dry run first to confirm what the command is going to do, then run it again without the --dryrun switch.
aws s3 sync <local directory> s3://<destination bucket>/prefix/ --dryrun
Thanks Didier. I have looked at DataSync and it is totally overwhelming - I think if this is truly the best option then I have chosen the wrong service and need something more user-friendly and less technical.
Ultimately I need to host a large number of image files at static URLs. Historically I used my website hosting package, but it's not well suited and is priced for a different purpose. I thought S3 made sense given its popularity and cost, but whereas I am familiar with FTP and did code and build websites, this is a whole new level of confusion; all the terminology is so different. I mean, words like Hypervisor?!
I don't think I can manage with DataSync - is there something more passive that can be run manually, like FTP, without virtual servers etc.?
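One reassuring detail for the static-URL goal above: once an object is in a publicly readable bucket, its URL follows a predictable pattern, so no server or hypervisor is involved in serving it. A sketch of the pattern (the bucket name, region, and key below are made-up examples, not values from this thread):

```shell
BUCKET=my-bucket            # hypothetical bucket name
REGION=us-east-1            # hypothetical AWS region
KEY=images/photo.jpg        # object key, i.e. the "path" inside the bucket
URL="https://${BUCKET}.s3.${REGION}.amazonaws.com/${KEY}"
echo "$URL"                 # prints https://my-bucket.s3.us-east-1.amazonaws.com/images/photo.jpg
```

Uploading a file with `aws s3 cp photo.jpg s3://my-bucket/images/photo.jpg` and then linking to that URL is conceptually very close to the FTP-plus-web-host workflow.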