Questions tagged with AWS Cost and Usage Report
Content language: English
Sort by most recent
In SageMaker, I have 3 domains and need to set up a budget alert for each domain
Hi AWS and AWS users,
I am creating a CUR (Cost and Usage Report) with the following configuration:
- Default content
- Refresh automatically
- Report data time granularity: Hourly
- Report versioning: Create new report version
With the settings above, I was expecting a single report containing hourly line items to be created each time AWS refreshes the CUR (up to 3 times a day).
However, I am receiving multiple CUR files in my bucket, such as:
- 20230312T081141Z
  - MyCostReport-00001.csv.gz (55.6 MB)
  - MyCostReport-00002.csv.gz (3.6 MB)
  - MyCostReport-00003.csv.gz (1.9 KB)
- 20230314T152624Z
  - MyCostReport-00001.csv.gz (58.3 MB)
  - MyCostReport-00002.csv.gz (58.2 MB)
  - MyCostReport-00003.csv.gz (4.4 MB)
  - MyCostReport-00004.csv.gz (1.9 KB)
To sum up, I'd like to ask why:
- multiple CUR files are created (with such different sizes)
- the number of CUR files created is inconsistent (sometimes 3, sometimes 4)
I'd appreciate any help possible with the issue.
Thank you.
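For context on the file layout described above: when a report exceeds a size limit, AWS splits it into multiple chunk files, and a `Manifest.json` written alongside them lists which chunks together form one complete report for that assembly. A minimal sketch of reading that list (the field names `assemblyId` and `reportKeys` are from the CUR manifest format; the example values are made up):

```python
import json

def report_chunks(manifest_text):
    """Return the list of report file keys from a CUR Manifest.json body."""
    manifest = json.loads(manifest_text)
    # "reportKeys" lists every chunk belonging to this assembly;
    # all chunks together form one complete report.
    return manifest["reportKeys"]

# Illustrative manifest shaped like the one AWS writes next to the chunks
# (values here are made up for the example).
example = json.dumps({
    "assemblyId": "20230314T152624Z",
    "reportKeys": [
        "MyCostReport/20230301-20230401/20230314T152624Z/MyCostReport-00001.csv.gz",
        "MyCostReport/20230301-20230401/20230314T152624Z/MyCostReport-00002.csv.gz",
    ],
})
print(report_chunks(example))
```

The chunk count can vary between refreshes simply because the amount of billing data in the month grows, so the number of splits is not fixed.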
I'm attempting to check what our daily data transfer out usage is, but Cost Explorer only shows the daily Lightsail costs. I understand we don't have any costs yet, but I want to see what we're using to make sure this is the right product/service for us.
Athena queries data directly from Amazon S3. There are no additional storage charges for querying your data with Athena. You are charged standard S3 rates for storage, requests, and data transfer. By default, query results are stored in an S3 bucket of your choice and are also billed at standard S3 rates.
Please clarify whether my understanding is right: does this mean that if the data in the S3 bucket is under 5 GB (within the S3 Free Tier), then Athena should also be free?
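One point worth separating here: the S3 Free Tier covers storage, but Athena bills per data scanned by each query, independently of storage charges. A back-of-envelope sketch under assumed pricing (roughly $5 per TB scanned in many regions with a 10 MB minimum per query; check current pricing for your region):

```python
# Rough Athena cost estimate. Assumptions (not authoritative): ~$5/TB
# scanned and a 10 MB minimum billed per query. S3 storage and request
# charges are billed separately from this.
PRICE_PER_TB = 5.00              # assumed USD per TB scanned
MIN_SCAN_BYTES = 10 * 1024**2    # 10 MB minimum billed per query

def athena_query_cost(bytes_scanned):
    billed = max(bytes_scanned, MIN_SCAN_BYTES)
    return billed / 1024**4 * PRICE_PER_TB

# Scanning a full 5 GB dataset once:
print(f"{athena_query_cost(5 * 1024**3):.6f} USD")
```

So querying a small bucket is cheap but not free: even a query that scans almost nothing is billed at the per-query minimum.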
Hi All,
I am getting a permission error while running the ALTER TABLE (load partitions) query on a partitioned CUR report table in Amazon Athena.
[ErrorCategory:USER_ERROR, ErrorCode:PERMISSION_ERROR], Detail:Amazon Athena experienced a permission error. Please provide proper permission and submitting the query again.
Can anyone suggest a solution for this? Thanks.
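One common cause of this error is the querying principal lacking S3 read permissions on the CUR bucket that backs the table. A hedged example of the kind of policy statement that is typically needed (`my-cur-bucket` is a placeholder; your setup may also require Glue catalog or Lake Formation permissions):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::my-cur-bucket",
        "arn:aws:s3:::my-cur-bucket/*"
      ]
    }
  ]
}
```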
We have a bucket of data / objects that are no longer needed / viable. Before we delete we'd like to know how much it will cost us to do so. The bucket can remain and may get re-used in the future, but all the data inside of it needs to be wiped out.
The formulas seem a bit convoluted, as the data/objects are spread across S3 Standard and S3 Glacier. Is there an easy formula we can plug our values into?
There are approximately 21M objects in the bucket, and I don't see a way to correlate objects to storage class (Standard/Glacier).
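A back-of-envelope sketch of the deletion cost, under stated assumptions (not authoritative; verify against your region's pricing): DELETE requests themselves are free in S3, so the main direct charge is the Glacier early-deletion fee for objects removed before the minimum storage duration (90 days for Glacier Flexible Retrieval), billed pro rata for the remaining days. The price constant below is an assumed example.

```python
# Assumptions (check current pricing): DELETE requests are free; the only
# direct deletion charge is Glacier's early-deletion fee, pro-rated over
# the days remaining in the 90-day minimum storage duration.
GLACIER_PRICE_PER_GB_MONTH = 0.0036  # assumed USD/GB-month, example value
MIN_STORAGE_DAYS = 90

def early_deletion_fee(gb_stored, days_stored):
    """Pro-rated fee for deleting Glacier data before the minimum duration."""
    remaining_days = max(MIN_STORAGE_DAYS - days_stored, 0)
    return gb_stored * GLACIER_PRICE_PER_GB_MONTH * (remaining_days / 30)

# e.g. 1000 GB in Glacier deleted after 30 days -> 60 days still billable
print(round(early_deletion_fee(1000, 30), 2))
```

For correlating the 21M objects to their storage class, an S3 Inventory report (which includes a storage-class column per object) is usually more practical than listing the bucket object by object.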
Hello,
We are using a number of services which access data in Amazon S3 via IAM roles and credentials. We are looking to get a breakdown of the data transfer costs by IAM role. Is there a way to get a breakdown of costs by IAM role? Alternatively, what would be the best way to find the API calls that AWS bills for data transfer?
Thanks,
Hello,
I have been tasked with managing AWS accounts where the EC2 instances are not tagged. However, the requirement now is a cost report for each individual EC2 instance on a daily/monthly basis. I enabled the Cost and Usage Report (CUR) from the Billing section. However, in the S3 bucket I can see that multiple CUR zip files are generated, and I cannot tell in which file I should look for a specific EC2 instance.
I need help getting a cost report for individual EC2 instances, either in Cost Explorer or through the CUR. Thanks.
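For what it's worth, no single CUR file maps to one instance: every file holds line items for many resources, so per-instance cost means aggregating across all files by resource ID. A minimal sketch of that aggregation (the column names `lineItem/ResourceId` and `lineItem/UnblendedCost` are standard CUR column names; resource IDs only appear if "Include resource IDs" was enabled when the report was created, and the real files need `gzip` decompression first):

```python
import csv
import io
from collections import defaultdict

def cost_per_instance(csv_text):
    """Sum unblended cost per EC2 instance from CUR CSV content."""
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(csv_text)):
        rid = row.get("lineItem/ResourceId", "")
        if rid.startswith("i-"):  # keep only EC2 instance line items
            totals[rid] += float(row["lineItem/UnblendedCost"])
    return dict(totals)

# Tiny synthetic CUR excerpt for illustration (values made up).
sample = """lineItem/ResourceId,lineItem/UnblendedCost
i-0abc,0.10
i-0abc,0.05
i-0def,0.20
arn:aws:s3:::some-bucket,0.01
"""
print(cost_per_instance(sample))
```

At real CUR sizes, running the same grouping as an Athena query over the CUR table is more practical than parsing the files by hand.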
Hello,
I am getting started with Amazon Pinpoint to send emails via campaigns, and I have noticed that my biggest cost is adding endpoints (by comparison, the cost of sending is practically zero).

I currently load the endpoints via an API call (updateUserEndpoint) every time I send a campaign.
Is it possible to reduce and curb these costs by combining Amazon Pinpoint with another service such as Amazon S3 (or others)?
Can endpoints be saved in some Amazon service to reduce MTA costs?
hello,
If we already have a Compute Savings Plan purchased in our organization and we then purchase an EC2 Instance Savings Plan, what happens?
Does only the Compute plan apply, does only the new plan apply to the purchased instance type, or do both plans apply?
How does it work when we have plans of both types?
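As I understand the documented behavior (worth verifying in the Savings Plans user guide), both plans can apply at once, to different slices of usage: the more specific EC2 Instance Savings Plan is applied first, then the Compute Savings Plan covers remaining eligible usage, and anything left is billed on-demand. A toy illustration of that ordering (not AWS's billing engine; the commitment amounts are made-up hourly figures):

```python
# Toy sketch of Savings Plans application order, under the assumption
# that the EC2 Instance plan's commitment is consumed before the
# Compute plan's. All numbers are illustrative hourly amounts.
def apply_plans(usage_per_hour, ec2_instance_commit, compute_commit):
    covered_by_ec2 = min(usage_per_hour, ec2_instance_commit)
    remaining = usage_per_hour - covered_by_ec2
    covered_by_compute = min(remaining, compute_commit)
    on_demand = remaining - covered_by_compute
    return covered_by_ec2, covered_by_compute, on_demand

print(apply_plans(10.0, 4.0, 3.0))  # -> (4.0, 3.0, 3.0)
```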
Hi ,
I'm trying to write to a bucket in Account A from the Cost and Usage Report service in Account B.
I've built an IAM role for cross-account access between the two accounts, but somehow I still can't see the bucket name appearing in the dropdown box when I choose where to send the reports in the service.
So I want to make sure: can this be done at all?
I recently started using AWS services, and I periodically check the usage of the various resources I have allocated.
My AWS configuration consists of AppSync with a Lambda resolver which interacts with an RDS MySQL DB through an RDS Proxy. While the Lambda authenticates to the RDS Proxy through an IAM role, the authentication between the proxy and the MySQL database uses a password stored as a secret in AWS Secrets Manager. I am sure that my database has been queried fewer than 400 times; however, on the billing page I see that more than 60,000 API requests have been made to Secrets Manager.
Why so many API requests? Is there a way to monitor the number of requests made to Secrets Manager?
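On the monitoring part of the question: CloudTrail records Secrets Manager API calls such as GetSecretValue along with the caller identity, which is usually the quickest way to see who is generating the volume. If the calls turn out to come from your own code fetching the secret on every invocation, client-side caching is the standard mitigation (AWS also publishes a Secrets Manager caching client). A minimal sketch of the pattern, with a plain callable standing in for a real `get_secret_value` call so no AWS access is assumed:

```python
import time

class CachedSecret:
    """Fetch a secret once and reuse it until a TTL expires."""

    def __init__(self, fetch, ttl_seconds=300, clock=time.monotonic):
        self._fetch = fetch          # callable returning the secret value
        self._ttl = ttl_seconds
        self._clock = clock
        self._value = None
        self._expires = float("-inf")

    def get(self):
        now = self._clock()
        if now >= self._expires:     # cache miss or expired: refetch
            self._value = self._fetch()
            self._expires = now + self._ttl
        return self._value

# Demonstration with a stub fetcher: 1000 lookups, but only one "API call".
calls = 0
def fake_fetch():
    global calls
    calls += 1
    return "s3cret"

secret = CachedSecret(fake_fetch, ttl_seconds=300)
for _ in range(1000):
    secret.get()
print(calls)  # -> 1
```

In a Lambda, holding such a cache outside the handler lets warm invocations reuse the fetched secret instead of calling Secrets Manager each time.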