Hi,
I would strongly suggest that you read the "Identity and Access Management" section of https://aws.amazon.com/blogs/machine-learning/getting-started-with-cross-region-inference-in-amazon-bedrock/
It fully details the IAM policy that you need to set up for cross-region inference with Bedrock. Based on their example, a sample policy is:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["bedrock:InvokeModel*"],
            "Resource": [
                "arn:aws:bedrock:us-east-1:<account_id>:inference-profile/*",
                "arn:aws:bedrock:us-east-1::foundation-model/*",
                "arn:aws:bedrock:us-west-2::foundation-model/*"
            ]
        }
    ]
}
In such policies, the part "arn:aws:bedrock:us-east-1:<account_id>:inference-profile/*"
is new for cross-region inference and may well be what's missing in your use case.
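If it helps, here is a rough sketch of the corresponding call with boto3 once the policy is in place. The Region, inference profile ID, and request body below are assumptions for illustration; substitute the profile you were actually granted:

import json
import boto3

# Assumption: us-east-1 as the source Region and the US Anthropic Claude 3.5
# Sonnet inference profile; replace modelId with the profile you actually use.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# For cross-region inference, the inference profile ID (or its full ARN) is
# passed as modelId instead of a plain foundation-model ID.
response = bedrock_runtime.invoke_model(
    modelId="us.anthropic.claude-3-5-sonnet-20240620-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "Hello"}],
    }),
)
print(json.loads(response["body"].read()))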
Best,
Didier
To test your agent using cross-region inference, you need to ensure that your IAM role has the correct permissions for cross-region inference API actions. The "access denied" error you're encountering suggests that your current IAM role lacks the necessary permissions for cross-region inference.
Here are the steps you should take:
- Review your IAM role: Make sure it has permissions for the cross-region inference API actions.
- Update the IAM policy: Add permissions for both the inference profile and the foundation models in the regions you want to use. For example, if you're using us-east-1 and us-west-2, your policy should include permissions for both regions.
- Example policy: An IAM policy that allows cross-region inference might look like this:
  {
      "Version": "2012-10-17",
      "Statement": [
          {
              "Effect": "Allow",
              "Action": ["bedrock:InvokeModel*"],
              "Resource": [
                  "arn:aws:bedrock:us-east-1:<account_id>:inference-profile/*",
                  "arn:aws:bedrock:us-east-1::foundation-model/*",
                  "arn:aws:bedrock:us-west-2::foundation-model/*"
              ]
          }
      ]
  }
  Replace <account_id> with your AWS account ID and adjust the regions as needed.
- Apply the updated policy: After updating the IAM policy, apply it to the role used by your Bedrock agent (a sketch follows this list).
- Retry the test: Once the permissions are correctly set, try testing your agent again with cross-region inference (also shown in the sketch below).
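For the last two steps, here is a rough sketch with boto3, assuming the policy is attached inline to the agent's execution role; the role name, agent ID, and alias ID are placeholders, so replace them with your own values:

import json
import uuid
import boto3

# Attach the cross-region inference policy as an inline policy on the agent's
# execution role. The role name is a placeholder; use your agent's actual role.
iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="AmazonBedrockExecutionRoleForAgents_example",  # placeholder
    PolicyName="CrossRegionInferenceAccess",
    PolicyDocument=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["bedrock:InvokeModel*"],
            "Resource": [
                # Replace <account_id> with your AWS account ID.
                "arn:aws:bedrock:us-east-1:<account_id>:inference-profile/*",
                "arn:aws:bedrock:us-east-1::foundation-model/*",
                "arn:aws:bedrock:us-west-2::foundation-model/*",
            ],
        }],
    }),
)

# Retry the agent test once the updated policy is in place.
agents = boto3.client("bedrock-agent-runtime", region_name="us-east-1")
response = agents.invoke_agent(
    agentId="YOUR_AGENT_ID",             # placeholder
    agentAliasId="YOUR_AGENT_ALIAS_ID",  # placeholder
    sessionId=str(uuid.uuid4()),
    inputText="Hello",
)
for event in response["completion"]:
    if "chunk" in event:
        print(event["chunk"]["bytes"].decode("utf-8"), end="")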
Regarding the "request rate too high" error you encountered with on-demand inference, this is related to the service quotas. Cross-region inference can actually help with this issue, as it allows for higher throughput - up to double the allocated quotas in some cases.
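As a side note, one way to soften throttling while you sort out quotas is to let the SDK retry with backoff; a minimal sketch, assuming boto3 and purely illustrative retry numbers:

import boto3
from botocore.config import Config

# "adaptive" retry mode backs off automatically on throttling errors;
# max_attempts here is illustrative, not a recommended value.
retry_config = Config(retries={"max_attempts": 10, "mode": "adaptive"})
bedrock_runtime = boto3.client(
    "bedrock-runtime", region_name="us-east-1", config=retry_config
)
# ...then call invoke_model on this client as shown above.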
If you continue to face issues after updating your IAM permissions, you may need to contact AWS support for further assistance.
Sources
Getting started with cross-region inference in Amazon Bedrock | AWS Machine Learning Blog
Improve resilience with cross-region inference - Amazon Bedrock
I do have the AmazonBedrockFullAccess policy attached, and the invocation still returns:
Error 403: You don't have access to the model with the specified model ID.
I am totally confused. I have been granted access to the Anthropic models in the Model Catalog. The experience with AWS Bedrock has been terrible so far.
Please advise.
Roberto