Do S3 Bucket Keys work for cross-account Athena access of S3 data?


Hi,

I have two AWS accounts and have enabled cross-account access for the S3 and Glue artifacts.

Account A has the data in S3 buckets. Account B reads the data using Athena.

I am using SSE with KMS (SSE-KMS).

I am incurring heavy KMS costs in Account B.

  1. What's the best way to reduce the KMS costs?
  2. I am trying to use S3 Bucket Keys, but they don't seem to work for cross-account access: https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-key.html. Has anyone tried this? Am I missing something?

Thanks in advance. Ravi

1 Answer
Accepted Answer

Hello Ravi,

S3 Bucket Keys are not the same thing as S3-managed keys (SSE-S3); they are a feature of SSE-KMS that lets S3 use a short-lived, bucket-level key so it makes far fewer requests to AWS KMS, which is what drives the per-request charges you are seeing. They can reduce KMS costs for cross-account reads as well, but two caveats often make them appear not to work: a Bucket Key only applies to objects written after it is enabled (existing objects must be re-encrypted, for example by copying them in place), and enabling it changes the KMS encryption context from the object ARN to the bucket ARN, which can break key policies, grants, or IAM conditions that reference the object ARN.
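
If it helps, here is a minimal boto3 sketch of both steps, run by the bucket owner (Account A); the bucket name, object key, and KMS key ARN below are placeholders:

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "acc-a-data-bucket"  # placeholder: the data bucket in Account A
KEY_ARN = "arn:aws:kms:us-east-1:111111111111:key/EXAMPLE-KEY-ID"  # placeholder KMS key

# Enable SSE-KMS default encryption with an S3 Bucket Key.
# This only affects objects written after the setting is applied.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": KEY_ARN,
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)

# Objects written before the change keep their old settings, so every read of them
# still costs one KMS Decrypt call. Re-encrypt them by copying each object onto itself.
# (copy_object handles objects up to 5 GB; larger objects need a multipart copy.)
s3.copy_object(
    Bucket=BUCKET,
    Key="some/prefix/part-00000.parquet",  # placeholder object key
    CopySource={"Bucket": BUCKET, "Key": "some/prefix/part-00000.parquet"},
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId=KEY_ARN,
    BucketKeyEnabled=True,
)
```

After the copy, reads of that object go through the Bucket Key, so S3 makes far fewer KMS calls on behalf of whichever account is reading.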

Beyond that, to reduce KMS costs when accessing data stored in S3 from another AWS account, you can consider the following options:

  1. Cross-Account Access with Your Own KMS Key (see the first sketch after this list):

    • Instead of relying only on the owner account's KMS key, set up your own KMS key in your AWS account (Account B).
    • Ask the owner of the S3 bucket to grant your account kms:Decrypt and kms:DescribeKey on the key that encrypts the data, via its key policy or a grant; Athena in your account needs this to read the objects at all.
    • Configure your Athena workgroup to encrypt query results with your own key, so the result side of every query uses a key you own and control.
  2. Use AWS Glue DataBrew:

    • AWS Glue DataBrew can help with data preparation and ETL tasks. It can read data from S3 and write it back to S3 in your own account, allowing you to use your KMS key for encryption.
  3. Copy the Data into Your Own Account (S3 Object Ownership):

    • If your use case allows it, copy the objects into a bucket in your account, with S3 Object Ownership set to bucket owner enforced so your account owns everything written to it. Once you own the copies, you can encrypt them with your own KMS key, much like the re-encryption sketch above but targeting your own bucket and key, and query them without touching the source account's key on every read.
  4. Data Lake House Architecture:

    • Consider adopting a data lake house architecture using services like AWS Lake Formation, which can simplify data sharing and governance across AWS accounts.
  5. Reduce Data Transfer Costs:

    • Keep the S3 data and the Athena queries in the same AWS Region where possible; cross-Region reads add data transfer charges on top of the KMS charges. (S3 Transfer Acceleration speeds up long-distance transfers but adds its own cost, so it is not a savings lever here.)
  6. KMS Key Usage Optimization:

    • Review which principals and workloads are actually calling KMS against the data key; CloudTrail logs every Decrypt call (see the CloudTrail sketch after this list). Scoping key policies to the necessary services and actions, and eliminating unnecessary or repeated reads, cuts down the per-request charges.
  7. Reduce the Number of KMS Requests:

    • KMS bills symmetric-key API calls per request, and each read of an SSE-KMS object without a Bucket Key triggers one. If KMS costs remain significant with predictable workloads, the biggest levers are fewer requests: enable Bucket Keys, compact many small objects into fewer large ones so Athena reads fewer objects per query, and partition the data so queries scan less. A CloudHSM-backed custom key store does not reduce request charges and adds CloudHSM costs of its own.
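
For option 1, here is a rough sketch of the two pieces involved, with placeholder account IDs (111111111111 for Account A, 222222222222 for Account B), key ARNs, bucket names, and workgroup name; adapt them to your setup:

```python
import json
import boto3

# --- Account A (data owner): let Account B decrypt objects encrypted under the data key. ---
kms_a = boto3.client("kms")
DATA_KEY_ID = "arn:aws:kms:us-east-1:111111111111:key/EXAMPLE-KEY-A"  # placeholder

key_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Keep full administrative access for Account A; put_key_policy replaces
            # the whole policy, so every statement you still need must be included.
            "Sid": "EnableAccountARootPermissions",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
            "Action": "kms:*",
            "Resource": "*",
        },
        {
            # New statement: allow Account B to decrypt data encrypted under this key.
            "Sid": "AllowAccountBDecrypt",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222222222222:root"},
            "Action": ["kms:Decrypt", "kms:DescribeKey"],
            "Resource": "*",
        },
    ],
}
kms_a.put_key_policy(KeyId=DATA_KEY_ID, PolicyName="default", Policy=json.dumps(key_policy))

# --- Account B (query account): encrypt Athena query results with your own key. ---
athena_b = boto3.client("athena")
RESULT_KEY_ARN = "arn:aws:kms:us-east-1:222222222222:key/EXAMPLE-KEY-B"  # placeholder

athena_b.update_work_group(
    WorkGroup="cross-account-analytics",  # placeholder workgroup name
    ConfigurationUpdates={
        "ResultConfigurationUpdates": {
            "OutputLocation": "s3://acc-b-athena-results/",  # placeholder results bucket
            "EncryptionConfiguration": {
                "EncryptionOption": "SSE_KMS",
                "KmsKey": RESULT_KEY_ARN,
            },
        }
    },
)
```

Cross-account KMS access needs both sides: the key policy statement above in Account A, plus IAM policies in Account B that allow the Athena-calling role to use kms:Decrypt on Account A's key ARN.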
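
For option 6, a small sketch that tallies recent kms:Decrypt calls from CloudTrail so you can see which role is driving the request volume. The Region and one-day window are assumptions, and depending on where the key lives the events may show up in Account A's trail, Account B's, or both:

```python
import json
from collections import Counter
from datetime import datetime, timedelta, timezone

import boto3

cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")  # assumed Region

end = datetime.now(timezone.utc)
start = end - timedelta(days=1)  # assumed one-day window

callers = Counter()
paginator = cloudtrail.get_paginator("lookup_events")
for page in paginator.paginate(
    LookupAttributes=[{"AttributeKey": "EventName", "AttributeValue": "Decrypt"}],
    StartTime=start,
    EndTime=end,
):
    for event in page["Events"]:
        detail = json.loads(event["CloudTrailEvent"])
        # Group by the IAM identity that triggered the KMS call (e.g. the Athena role).
        arn = detail.get("userIdentity", {}).get("arn", "unknown")
        callers[arn] += 1

for arn, count in callers.most_common(10):
    print(f"{count:8d}  {arn}")
```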

Remember to assess the specific requirements and constraints of your use case to determine the best approach. Depending on your needs, a combination of these strategies might be the most cost-effective solution.

Please give a thumbs up if my suggestion helps

answered 8 months ago
AWS EXPERT reviewed 8 months ago
  • Thanks, Gabriel, for the quick response. The first option definitely sounds interesting and most suitable for my needs. I will definitely try it. Thank you, I appreciate your time.
