AWS Bedrock model provider
If you have most of your infrastructure hosted on AWS and want to keep your data in the AWS ecosystem, AWS Bedrock is a great option. It offers a wide range of open and closed models and configurations to choose from.
To get started with AWS Bedrock, you’ll need to set up an IAM role with permissions to access AWS Bedrock. Then you’ll need to request access to the models you’d like to use. See the AWS Bedrock getting started guide for more information.
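The IAM setup above can be sketched with the AWS CLI. This is a minimal, hypothetical example: the user name, policy name, and broad `Resource: "*"` scope are placeholders you should tighten for your environment.

```shell
# Create an IAM user for the AI Assistant (name is a placeholder)
aws iam create-user --user-name ai-assistant-bedrock

# Attach an inline policy allowing Bedrock model invocation
aws iam put-user-policy \
  --user-name ai-assistant-bedrock \
  --policy-name bedrock-invoke \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel", "bedrock:InvokeModelWithResponseStream"],
      "Resource": "*"
    }]
  }'

# Generate the access key pair used in the next step
aws iam create-access-key --user-name ai-assistant-bedrock
```

The last command returns the `AccessKeyId` and `SecretAccessKey` values referenced below.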
Next, use the access key ID and secret access key you’ve created to set the `BEDROCK_ACCESS_KEY_ID` and `BEDROCK_SECRET_ACCESS_KEY` environment variables in your Docker Compose file, or on the command line when using Docker directly. See the configuration options guide for more information about environment variables:
```yaml
services:
  ai-assistant:
    environment:
      - BEDROCK_ACCESS_KEY_ID=your-bedrock-access-key-id
      - BEDROCK_SECRET_ACCESS_KEY=your-bedrock-secret-access-key
    ...
```
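When using Docker directly rather than Docker Compose, the same variables can be passed with `-e` flags. A minimal sketch, assuming the image is named `ai-assistant`:

```shell
# Pass the Bedrock credentials as environment variables at run time
docker run \
  -e BEDROCK_ACCESS_KEY_ID=your-bedrock-access-key-id \
  -e BEDROCK_SECRET_ACCESS_KEY=your-bedrock-secret-access-key \
  ai-assistant
```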
Service configuration file
To specify the models used by AI Assistant, you’ll need to create a service configuration file, as explained in the model-provider configuration guide.
Currently, we suggest using the Llama 3.1 70B (`meta.llama3-1-70b-instruct-v1:0`) model. If this model isn’t available in your region of operation, you may also use the Mistral AI Mistral Large (`mistral.mistral-large-2402-v1:0`) model. Using smaller models such as Llama 3.1 8B can provide a more cost-effective solution, although these are expected to produce degraded results. Reach out to customer support if this is your use case.
For an embedding model, we suggest using the Amazon Titan Embeddings G1 - Text model (`amazon.titan-embed-text-v1`) or the Amazon Titan Text Embeddings V2 model (`amazon.titan-embed-text-v2:0`). Both provide good results, but the choice primarily depends on the availability in your region:
```yaml
version: '1'
aiServices:
  chat:
    provider:
      name: 'bedrock'
      region: 'us-west-2'
      accessKeyId: 'your-iam-access-key-id' # Optional
      secretAccessKey: 'your-iam-secret-access-key' # Optional
    model: 'meta.llama3-1-70b-instruct-v1:0'
  textEmbeddings:
    provider:
      name: 'bedrock'
      region: 'us-west-2'
      accessKeyId: 'your-iam-access-key-id' # Optional
      secretAccessKey: 'your-iam-secret-access-key' # Optional
    model: 'amazon.titan-embed-text-v1'
```
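Since the model choice depends on regional availability, you can check which of the suggested model IDs are offered in your region with the AWS CLI before filling in the configuration. The region and JMESPath filter below are illustrative:

```shell
# List the Llama 3.1 and Titan embedding model IDs available in us-west-2
aws bedrock list-foundation-models \
  --region us-west-2 \
  --query "modelSummaries[?contains(modelId, 'llama3-1') || contains(modelId, 'titan-embed')].modelId"
```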
- `provider`:
  - `name`: The name of the provider. Set this to `bedrock`.
  - `region`: The AWS region where the model is hosted. For example, `us-west-2`.
  - `accessKeyId`: The access key ID for the AWS Bedrock service. Optionally set here, or via the `BEDROCK_ACCESS_KEY_ID` environment variable. You can retrieve your access key ID when you’ve created an IAM user. See the AWS Bedrock getting started guide for more information.
  - `secretAccessKey`: The secret access key for the AWS Bedrock service. Optionally set here, or via the `BEDROCK_SECRET_ACCESS_KEY` environment variable. You can retrieve your secret access key when you’ve created an IAM user. See the AWS Bedrock getting started guide for more information.
- `model`: The name of the model you want to use. For example, `meta.llama3-1-70b-instruct-v1:0` for the chat service, or `amazon.titan-embed-text-v1` for the embedding service.