AWS Bedrock Setup
LlamaCloud supports AWS Bedrock as part of its multimodal AI capabilities. This page guides you through the configuration changes and permission settings needed to enable the Bedrock integration in your deployment.
Recommended Setup
- A valid AWS account
- Access and quota for the supported models:
  - For LlamaParse advanced parsing features:
    - Anthropic Claude Sonnet 3.7, specifically anthropic.claude-3-7-sonnet-20250219-v1:0. If this version is not available, an alternative versionName can be configured through the environment variable BEDROCK_ANTHROPIC_SONNET_3_7_VERSION_NAME.
      - Note: This model is required if you want to use the parse_page_with_lvm, parse_page_with_agent, or parse_page_document_with_agent parsing modes.
    - Anthropic Claude Sonnet 3.5, specifically anthropic.claude-3-5-sonnet-20240620-v1:0. If this version is not available, an alternative versionName can be configured through the environment variable BEDROCK_ANTHROPIC_SONNET_3_5_VERSION_NAME.
      - Note: This model is required if you want to use the parse_page_with_lvm or parse_page_with_agent parsing modes.
  - For LlamaCloud Playground:
    - Anthropic Claude Sonnet 3.5 V2
    - Cohere Rerank 3.5 (Optional)
      - Note: Please check if this model is available in your region.
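If a pinned model version is unavailable in your region, the environment variables named above let you substitute another version. A rough sketch of that fallback behavior (illustrative only, not LlamaParse's actual code):

```python
import os

# Pinned defaults from the docs above; the env vars allow overriding them.
DEFAULT_MODEL_IDS = {
    "BEDROCK_ANTHROPIC_SONNET_3_5_VERSION_NAME": "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "BEDROCK_ANTHROPIC_SONNET_3_7_VERSION_NAME": "anthropic.claude-3-7-sonnet-20250219-v1:0",
}

def resolve_model_id(env_var: str) -> str:
    """Use the override from the environment if set, otherwise the pinned default."""
    return os.environ.get(env_var) or DEFAULT_MODEL_IDS[env_var]
```

Whatever model ID this resolves to must be enabled (with quota) in your AWS account.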
Connecting to Bedrock
Below are two ways to configure a connection to AWS Bedrock:
(Recommended) IAM Role for Service Accounts
We recommend creating a new IAM Role and Policy for LlamaCloud. You can then attach the role ARN as a service account annotation.
Below is a deliberately permissive example policy to serve as a reference. Please tighten the policy as needed for your environment.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:CreateModelInvocationJob"
      ],
      "Resource": [
        "arn:aws:bedrock:*:*:foundation-model/*",
        "arn:aws:bedrock:*:*:inference-profile/*"
      ]
    }
  ]
}
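If you tighten the example policy, it still needs to grant the Bedrock invoke actions. A quick local sanity check of a policy document (a sketch; the policy string here is the example from above):

```python
import json

# The example policy from above, as a JSON string.
policy = json.loads("""{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": [
      "bedrock:InvokeModel",
      "bedrock:InvokeModelWithResponseStream",
      "bedrock:CreateModelInvocationJob"
    ],
    "Resource": [
      "arn:aws:bedrock:*:*:foundation-model/*",
      "arn:aws:bedrock:*:*:inference-profile/*"
    ]
  }]
}""")

# Collect every action allowed by any statement.
allowed = {
    action
    for stmt in policy["Statement"]
    if stmt["Effect"] == "Allow"
    for action in stmt["Action"]
}

# Actions the example policy grants; flag any that a stricter edit dropped.
missing = {"bedrock:InvokeModel", "bedrock:InvokeModelWithResponseStream"} - allowed
```

An empty `missing` set means the invoke actions survived your edits; this check does not replace testing the policy against a real model invocation.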
After creating a policy similar to the above, update the LlamaParse service accounts with the EKS annotation:
# Example for the backend service account. Repeat for each of the services listed above.
llamaParse:
  config:
    awsBedrock:
      enabled: true
  serviceAccount:
    annotations:
      eks.amazonaws.com/role-arn: arn:aws:iam::<account-id>:role/<role-name>
For more information, refer to the official AWS documentation on IAM roles for service accounts (IRSA).
IAM User with Static AWS Credentials
Create a user with a policy attached that permits Bedrock model invocation. Then configure the platform to use that user's AWS credentials by setting the following values in your values.yaml file:
llamaparse:
  config:
    awsBedrock:
      enabled: true
      region: "<your-aws-region>"
      accessKeyId: "<your-access-key-id>"
      secretAccessKey: "<your-secret-access-key>"
      # Optional: If the default version is not available, you can specify an alternative
      # versionName; LlamaParse will use it to invoke the model.
      # 3.5 and 3.7 are used by different LlamaParse parsing modes.
      sonnet3_5ModelVersionName: "anthropic.claude-3-5-sonnet-20240620-v1:0"
      sonnet3_7ModelVersionName: "anthropic.claude-3-7-sonnet-20250219-v1:0"
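With static credentials, region, accessKeyId, and secretAccessKey must all be filled in whenever the integration is enabled. A quick illustrative check for the awsBedrock fragment before deploying (a hypothetical helper, not part of the chart; the chart performs its own validation):

```python
def check_bedrock_values(aws_bedrock: dict) -> list[str]:
    """Return required keys that are missing or still left as <placeholders>."""
    if not aws_bedrock.get("enabled"):
        return []  # nothing to validate when the integration is off
    required = ["region", "accessKeyId", "secretAccessKey"]
    return [
        key for key in required
        if not aws_bedrock.get(key) or aws_bedrock[key].startswith("<")
    ]
```

For example, leaving region as "&lt;your-aws-region&gt;" would be reported as unfilled.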