ChatBedrockAnthropic
```python
ChatBedrockAnthropic(
    model=None,
    max_tokens=4096,
    aws_secret_key=None,
    aws_access_key=None,
    aws_region=None,
    aws_profile=None,
    aws_session_token=None,
    base_url=None,
    system_prompt=None,
    turns=None,
    kwargs=None,
)
```
Chat with an AWS Bedrock model.
AWS Bedrock provides a number of chat-based models, including Anthropic's Claude.
Prerequisites
Consider using the approach outlined in this guide to manage your AWS credentials: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/credentials.html
ChatBedrockAnthropic requires the anthropic package with the bedrock extras (e.g., `pip install anthropic[bedrock]`).
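If your credentials live in a named profile (per the guide linked above), you can point ChatBedrockAnthropic at that profile instead of passing keys in code. A minimal sketch, assuming a hypothetical profile named "my-bedrock-profile" exists in your AWS config:

```python
from chatlas import ChatBedrockAnthropic

# Assumes a hypothetical profile named "my-bedrock-profile" is already configured
# (e.g., in ~/.aws/credentials), so no secret keys appear in the code itself.
chat = ChatBedrockAnthropic(
    aws_profile="my-bedrock-profile",
    aws_region="us-east-1",
)
chat.chat("Hello!")
```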
Examples
```python
from chatlas import ChatBedrockAnthropic

chat = ChatBedrockAnthropic(
    aws_profile="...",
    aws_region="us-east",
    aws_secret_key="...",
    aws_access_key="...",
    aws_session_token="...",
)
chat.chat("What is the capital of France?")
```
Parameters
Name | Type | Description | Default |
---|---|---|---|
model | Optional[str] | The model to use for the chat. | None |
max_tokens | int | Maximum number of tokens to generate before stopping. | 4096 |
aws_secret_key | Optional[str] | The AWS secret key to use for authentication. | None |
aws_access_key | Optional[str] | The AWS access key to use for authentication. | None |
aws_region | Optional[str] | The AWS region to use. Defaults to the AWS_REGION environment variable. If that is not set, defaults to 'us-east-1'. | None |
aws_profile | Optional[str] | The AWS profile to use. | None |
aws_session_token | Optional[str] | The AWS session token to use. | None |
base_url | Optional[str] | The base URL to use. Defaults to the ANTHROPIC_BEDROCK_BASE_URL environment variable. If that is not set, defaults to f"https://bedrock-runtime.{aws_region}.amazonaws.com". | None |
system_prompt | Optional[str] | A system prompt to set the behavior of the assistant. | None |
turns | Optional[list[Turn]] | A list of turns to start the chat with (i.e., continuing a previous conversation). If not provided, the conversation begins from scratch. Do not provide non-None values for both turns and system_prompt. Each message in the list should be a dictionary with at least role (usually system, user, or assistant, but tool is also possible). Normally there is also a content field, which is a string. | None |
kwargs | Optional['ChatBedrockClientArgs'] | Additional arguments to pass to the anthropic.AnthropicBedrock() client constructor. | None |
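As a quick illustration of the `model` and `system_prompt` parameters, here is a minimal sketch; the model ID is the placeholder used in the troubleshooting notes below, and the prompt is an arbitrary example, with credentials assumed to come from your environment or AWS config:

```python
from chatlas import ChatBedrockAnthropic

# Placeholder model ID (see Troubleshooting below); credentials are assumed to be
# picked up from the environment or your AWS config rather than passed explicitly.
chat = ChatBedrockAnthropic(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_region="us-east-1",
    system_prompt="You are a terse assistant. Answer in one sentence.",
)
chat.chat("What is the capital of France?")
```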
Troubleshooting
If you encounter 400 or 403 errors when trying to use the model, keep the following in mind:
If the model name is completely incorrect, you'll see an error like `Error code: 400 - {'message': 'The provided model identifier is invalid.'}`. Make sure the model name is correct and active in the specified region.
If you encounter errors similar to `Error code: 403 - {'message': "You don't have access to the model with the specified model ID."}`, make sure your model is active in the relevant aws_region.
Keep in mind that if aws_region is not specified and AWS_REGION is not set, the region defaults to us-east-1, which may not match your AWS config's default region.
In some cases, even if you have the right model and the right region, you may still encounter an error like `Error code: 400 - {'message': "Invocation of model ID anthropic.claude-3-5-sonnet-20240620-v1:0 with on-demand throughput isn't supported. Retry your request with the ID or ARN of an inference profile that contains this model."}`.
In this case, you'll need to look up the 'cross region inference ID' for your model. This might require opening your AWS console and navigating to the 'Amazon Bedrock' service page. From there, go to the 'cross region inference' tab and copy the relevant ID.
For example, if the desired model ID is anthropic.claude-3-5-sonnet-20240620-v1:0, the cross region ID might look something like us.anthropic.claude-3-5-sonnet-20240620-v1:0.
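Once you have the cross region inference ID, pass it as the `model` argument. A minimal sketch, reusing the example ID above (verify the exact ID for your account and region in the console):

```python
from chatlas import ChatBedrockAnthropic

# "us.anthropic.claude-3-5-sonnet-20240620-v1:0" is the example cross region
# inference ID from above; confirm the correct ID for your account and region.
chat = ChatBedrockAnthropic(
    model="us.anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_region="us-east-1",
)
chat.chat("What is the capital of France?")
```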
Returns
Name | Type | Description |
---|---|---|
 | Chat | A Chat object. |