AWS Bedrock

You can deploy Mistral AI's open and commercial models on your AWS account using the AWS Bedrock service. This page provides a straightforward guide to getting started with Mistral Large as an AWS Bedrock foundation model.

Prerequisites

To query the model, you will need:

  • Access to an AWS account within a region that supports the AWS Bedrock service and offers access to Mistral Large: see the AWS documentation for model availability per region.
  • An AWS IAM principal (user, role) with sufficient permissions, see the AWS documentation for more details.
  • Access to the Mistral AI models enabled from the AWS Bedrock home page, see the AWS documentation for more details.
  • A local code environment set up with the relevant AWS SDK components, namely the boto3 library for Python (see the install command just after this list).
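
For example, if you plan to query the model from Python as in the snippet further below, installing boto3 (the official AWS SDK for Python) is enough:

pip install boto3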

Querying the model

Before starting, make sure to properly configure the authentication credentials for your development environment. The AWS documentation provides an in-depth explanation of the required steps.
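
If you prefer not to rely on a default region, a minimal sketch is to pin it when creating the client; here "us-west-2" is only an illustrative choice, and the credentials themselves are still resolved from your environment or shared configuration:

import boto3

# Illustrative region only; pick one where Bedrock offers Mistral Large.
# Credentials are picked up from AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
# environment variables or from ~/.aws/credentials.
client = boto3.client("bedrock-runtime", region_name="us-west-2")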

import boto3
import json

# Create a client for the Bedrock runtime API; region and credentials are
# resolved from your environment or shared AWS configuration.
client = boto3.client("bedrock-runtime")
model_id = "mistral.mistral-large-2402-v1:0"

# Mistral models expect prompts wrapped in the [INST] instruction format.
prompt = "<s>[INST] What is the best French cheese? [/INST]"
body = json.dumps({
    "prompt": prompt,
    "max_tokens": 512,
    "top_p": 0.8,
    "temperature": 0.5,
})

resp = client.invoke_model(
    body=body,
    modelId=model_id,
    accept="application/json",
    contentType="application/json",
)

# The response body is a stream; read it once and decode the JSON payload.
print(json.loads(resp.get("body").read()))
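
The raw payload contains more than the completion itself. As a sketch, assuming the response schema that Mistral models use on Bedrock (a top-level "outputs" list whose entries carry the generated text), you could replace the final print above with:

# Note: the body stream can only be read once, so this replaces the
# final print in the snippet above rather than following it.
# The "outputs"/"text" keys are our reading of the Mistral response
# schema on Bedrock; double-check against the AWS documentation.
result = json.loads(resp.get("body").read())
print(result["outputs"][0]["text"])

If you need tokens as they arrive, the same request body also works with client.invoke_model_with_response_stream, which returns the completion in chunks.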

Going further

For more usage examples and details, you can check the following links: