Client code

We provide client libraries in both Python and JavaScript.

Installation

Follow the installation instructions in the repository for our Python Client or JavaScript Client.

Chat Completion

The chat completion API allows you to chat with a model fine-tuned to follow instructions.

No streaming

import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-large-latest"

client = MistralClient(api_key=api_key)

messages = [
    ChatMessage(role="user", content="What is the best French cheese?")
]

# No streaming
chat_response = client.chat(
    model=model,
    messages=messages,
)

print(chat_response.choices[0].message.content)

With streaming

import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-large-latest"

client = MistralClient(api_key=api_key)

messages = [
    ChatMessage(role="user", content="What is the best French cheese?")
]

# With streaming
stream_response = client.chat_stream(model=model, messages=messages)

for chunk in stream_response:
    print(chunk.choices[0].delta.content)
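Each streamed chunk carries only an incremental delta of the reply, so the full text is recovered by concatenating the deltas. A minimal stdlib-only sketch with mock chunks (the chunk/choice/delta attribute names mirror the client's response objects; the mock classes and sample text are illustrative, not part of the library):

```python
from dataclasses import dataclass
from typing import Optional


# Illustrative stand-ins for the client's streamed chunk objects.
@dataclass
class Delta:
    content: Optional[str]


@dataclass
class Choice:
    delta: Delta


@dataclass
class Chunk:
    choices: list


# Pretend these arrived from chat_stream(...).
mock_stream = [
    Chunk([Choice(Delta("Camembert "))]),
    Chunk([Choice(Delta("is a classic choice."))]),
    Chunk([Choice(Delta(None))]),  # final chunks may carry no content
]

# Accumulate deltas, skipping empty ones, to rebuild the full reply.
full_reply = "".join(
    chunk.choices[0].delta.content
    for chunk in mock_stream
    if chunk.choices[0].delta.content is not None
)
print(full_reply)  # Camembert is a classic choice.
```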

With async

import asyncio
import os

from mistralai.async_client import MistralAsyncClient
from mistralai.models.chat_completion import ChatMessage

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-large-latest"

client = MistralAsyncClient(api_key=api_key)

messages = [
    ChatMessage(role="user", content="What is the best French cheese?")
]

# With async: chat_stream returns an async generator,
# so it must be consumed inside a coroutine.
async def main():
    async_response = client.chat_stream(model=model, messages=messages)

    async for chunk in async_response:
        print(chunk.choices[0].delta.content)

asyncio.run(main())

We allow users to provide a custom system prompt (see API reference). We also provide a convenient safe_prompt flag that forces the chat completion to be moderated against sensitive content (see Guardrailing).
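Conceptually, safe_prompt works by prepending a guardrail system prompt ahead of your messages before the request is sent. A stdlib-only sketch of that behavior, using plain dicts in place of ChatMessage objects (the guardrail text below is a placeholder, not Mistral's actual prompt):

```python
# Placeholder guardrail text -- the real safe_prompt wording is defined
# by Mistral, not reproduced here.
GUARDRAIL_PROMPT = "Always assist with care, respect, and truth."


def with_safe_prompt(messages):
    """Prepend a system-level guardrail message, mimicking safe_prompt=True."""
    return [{"role": "system", "content": GUARDRAIL_PROMPT}] + messages


messages = [{"role": "user", "content": "What is the best French cheese?"}]
guarded = with_safe_prompt(messages)
print([m["role"] for m in guarded])  # ['system', 'user']
```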

JSON mode

Users have the option to set response_format to {"type": "json_object"} to enable JSON mode. It is important to explicitly ask the model to generate JSON output in your message.

import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-large-latest"

client = MistralClient(api_key=api_key)

messages = [
    ChatMessage(role="user", content="What is the best French cheese? Return the product and produce location in JSON format")
]

chat_response = client.chat(
    model=model,
    response_format={"type": "json_object"},
    messages=messages,
)

print(chat_response.choices[0].message.content)
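Because JSON mode constrains the output to valid JSON, the returned content can be parsed directly with the standard library. A sketch using a sample response string (the keys shown are illustrative; the actual keys depend on what you ask the model to produce):

```python
import json

# A sample string in the shape a JSON-mode response might take.
sample_content = '{"product": "Roquefort", "produce_location": "Aveyron, France"}'

data = json.loads(sample_content)
print(data["product"])           # Roquefort
print(data["produce_location"])  # Aveyron, France
```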

Embeddings

The embeddings API allows you to embed sentences as numerical vectors.

import os

from mistralai.client import MistralClient

api_key = os.environ["MISTRAL_API_KEY"]
client = MistralClient(api_key=api_key)

embeddings_batch_response = client.embeddings(
    model="mistral-embed",
    input=["Embed this sentence.", "As well as this one."],
)
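Embeddings are typically compared with cosine similarity (the vectors themselves can be read from the response's data entries). A stdlib-only sketch with toy 3-dimensional vectors standing in for real embeddings:

```python
import math


def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)


# Toy vectors standing in for embeddings of the two input sentences.
vec_a = [0.1, 0.3, 0.5]
vec_b = [0.2, 0.1, 0.4]

print(cosine_similarity(vec_a, vec_b))
```

Values close to 1.0 indicate the two texts are semantically similar; real mistral-embed vectors are much higher-dimensional, but the comparison is the same.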

Third-Party Clients

Here are some clients built by the community for various other languages:

CLI

icebaker/nano-bots

Go

Gage-Technologies

Ruby

gbaptista/mistral-ai