Replica of a 20 TB volume

I need to automate the following scenario: copy a snapshot of an instance from another account to my account, create a volume from that snapshot using KMS keys, and then perform a few activities on the volume. The solution must also support volumes up to 20 TB.

Could you please suggest the best solution for automating the above scenario?

Asked 1 month ago · 118 views
1 Answer
Accepted Answer

I recommend starting by copying the snapshot from the other account to yours. Next, create a volume from the copied snapshot using your preferred KMS key. Finally, you can carry out various operations on the volume, such as attaching it to an EC2 instance, formatting it, and mounting it. To support a 20 TB volume, use the Provisioned IOPS SSD io2 (Block Express) volume type, which handles sizes up to 64 TiB; note that io1 tops out at 16 TiB, short of 20 TB.

Here's a streamlined overview of the process:

# Copy Snapshot from Other Account
# (run against the destination region; the source account must have shared the snapshot with you first -- see the note below)
aws ec2 copy-snapshot --region <destination-region> --source-region <source-region> --source-snapshot-id <source-snapshot-id> --kms-key-id <kms-key-id> --encrypted

# Create Volume from Copied Snapshot with KMS Key
# (io2 is required here: io1 supports at most 16 TiB; --size is in GiB, so 20480 GiB = 20 TiB)
aws ec2 create-volume --snapshot-id <copied-snapshot-id> --volume-type io2 --iops 3000 --size 20480 --encrypted --kms-key-id <kms-key-id> --availability-zone <availability-zone>

# Attach Volume to EC2 Instance
# (on Nitro-based instances the device will surface in the OS as /dev/nvme1n1 or similar)
aws ec2 attach-volume --volume-id <volume-id> --instance-id <instance-id> --device /dev/sdf

# Format and Mount Volume (run these on the instance itself)
# Skip mkfs if the snapshot already contains a filesystem -- formatting would erase the restored data
sudo mkfs -t ext4 /dev/sdf
sudo mount /dev/sdf /mnt
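
Because copy-snapshot and create-volume return immediately while the work continues in the background, an automation script should wait on each resource before the next step. A minimal sketch using the CLI's built-in waiters (same placeholders as above):

# Wait for the snapshot copy to finish before creating the volume
aws ec2 wait snapshot-completed --snapshot-ids <copied-snapshot-id>

# Wait for the new volume to become available before attaching it
aws ec2 wait volume-available --volume-ids <volume-id>

# Wait for the attachment to complete before formatting and mounting
aws ec2 wait volume-in-use --volume-ids <volume-id>

Note that these waiters give up after roughly 10 minutes by default; copying a snapshot of this size can take considerably longer, so you may need to re-run the waiter in a loop or poll describe-snapshots yourself.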

ℹ️ You would need to replace <source-region>, <source-snapshot-id>, <kms-key-id>, <destination-region>, <copied-snapshot-id>, <availability-zone>, <volume-id>, and <instance-id> with your specific values.


⚠️ Ensure the source account has shared the snapshot with your account and allowed your account to use the KMS key that encrypts it, and that your own IAM role has the matching ec2 and kms permissions.
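
For reference, here is a sketch of what the source account would run to do that sharing (the account ID and key ID are placeholders; the KMS grant applies when the snapshot is encrypted with a customer managed key):

# In the source account: share the snapshot with the destination account
aws ec2 modify-snapshot-attribute --snapshot-id <source-snapshot-id> --attribute createVolumePermission --operation-type add --user-ids <destination-account-id>

# In the source account: allow the destination account to use the KMS key
aws kms create-grant --key-id <source-kms-key-id> --grantee-principal arn:aws:iam::<destination-account-id>:root --operations Decrypt DescribeKey CreateGrant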

💡 Additionally, you can consider using AWS Lambda to orchestrate the entire process, which would provide more flexibility and maintainability in the long run.
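
One practical detail if you go that route: Lambda can call the EC2 APIs directly, but the formatting and mounting steps have to run on the instance itself. A sketch of dispatching that step via SSM Run Command (assumes the instance runs the SSM agent with an instance profile that permits it; the device name matches the attach step above):

# Dispatch the format/mount step to the instance via SSM Run Command
aws ssm send-command --document-name "AWS-RunShellScript" --instance-ids <instance-id> --parameters 'commands=["sudo mkfs -t ext4 /dev/sdf","sudo mount /dev/sdf /mnt"]'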

Answered 1 month ago
  • Thanks for the response. I have already implemented a Step Functions workflow with Lambda, but Lambda has a 900-second execution time limit. I am now splitting the task across multiple Lambdas and can handle at most 80 GB. Could you please suggest how my solution can support up to 20 TB?

  • Alright, you're correct about the Lambda execution time limit. In this case, I'd suggest using AWS Data Pipeline. With Data Pipeline, you can define workflow activities similar to those I described earlier, such as copying snapshots, creating volumes, attaching volumes, and formatting and mounting them; a rough CLI sketch follows below.

    ℹ️ Data Pipeline is not limited by the Lambda execution time and allows you to efficiently move up to 20TB of data. Additionally, with Data Pipeline, you only pay for the services you use, making it a cost-effective solution for your needs.
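
    As an illustrative sketch only (the definition skeleton below is untested; the roles, names, and command string are placeholders to adapt): a ShellCommandActivity running on a temporary Ec2Resource can execute the same CLI steps without the 900-second cap.

    # Create, define, and activate the pipeline
    aws datapipeline create-pipeline --name volume-replica --unique-id volume-replica-1
    aws datapipeline put-pipeline-definition --pipeline-id <pipeline-id> --pipeline-definition file://definition.json
    aws datapipeline activate-pipeline --pipeline-id <pipeline-id>

    # definition.json -- illustrative skeleton, not a tested definition:
    # {
    #   "objects": [
    #     { "id": "Default", "name": "Default", "scheduleType": "ondemand",
    #       "role": "DataPipelineDefaultRole", "resourceRole": "DataPipelineDefaultResourceRole" },
    #     { "id": "Runner", "name": "Runner", "type": "Ec2Resource", "terminateAfter": "4 Hours" },
    #     { "id": "ReplicateVolume", "name": "ReplicateVolume", "type": "ShellCommandActivity",
    #       "runsOn": { "ref": "Runner" },
    #       "command": "aws ec2 copy-snapshot ... && aws ec2 wait snapshot-completed ... && aws ec2 create-volume ..." }
    #   ]
    # }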
