
Quickstart

Tip: Looking for La Plateforme? Head to console.mistral.ai

Account setup

  • To get started, create a Mistral account or sign in at console.mistral.ai.
  • Then, navigate to "Workspace" and "Billing" to add your payment information and activate payments on your account.
  • After that, go to the "API keys" page and create a new API key by clicking "Create new key". Copy the key, store it somewhere safe, and do not share it with anyone (a quick way to check that the key is available in your environment is sketched below).
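
The code examples in this guide read the key from the MISTRAL_API_KEY environment variable. As a minimal sketch, assuming you exported the key under that name, you can verify it is set before making your first call:

import os

# Minimal check: the examples below assume the key was exported as MISTRAL_API_KEY.
api_key = os.environ.get("MISTRAL_API_KEY")
if not api_key:
    raise RuntimeError(
        "MISTRAL_API_KEY is not set. Copy the key from the API keys page "
        "and export it in your shell before running the examples."
    )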

Getting started with Mistral AI API


The Mistral AI API provides a seamless way for developers to integrate Mistral's state-of-the-art models into their applications and production workflows with just a few lines of code. The API is currently available through La Plateforme. You need to activate payments on your account to enable your API keys; a few moments later, you will be able to use our chat endpoint:

import os
from mistralai import Mistral

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-large-latest"

client = Mistral(api_key=api_key)

chat_response = client.chat.complete(
    model=model,
    messages=[
        {
            "role": "user",
            "content": "What is the best French cheese?",
        },
    ],
)
print(chat_response.choices[0].message.content)
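
The messages list follows the standard chat format, so a single call can carry a system prompt and several prior turns. The sketch below reuses the same chat endpoint; the prompt contents themselves are illustrative, not part of the original example:

import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

# Sketch: a system prompt plus an earlier exchange passed in one request.
# The message contents here are illustrative assumptions.
response = client.chat.complete(
    model="mistral-large-latest",
    messages=[
        {"role": "system", "content": "You answer in one short sentence."},
        {"role": "user", "content": "What is the best French cheese?"},
        {"role": "assistant", "content": "Many would say Comté."},
        {"role": "user", "content": "What wine pairs well with it?"},
    ],
)
print(response.choices[0].message.content)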

To generate text embeddings using Mistral AI's embeddings API, we can make a request to the API endpoint, specify the embedding model mistral-embed, and provide a list of input texts. The API then returns the corresponding embeddings as numerical vectors, which can be used for further analysis or processing in NLP applications.

import os
from mistralai import Mistral

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-embed"

client = Mistral(api_key=api_key)

embeddings_response = client.embeddings.create(
    model=model,
    inputs=["Embed this sentence.", "As well as this one."],
)

print(embeddings_response)
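
To put those vectors to use, you can compare the two embeddings directly. A minimal sketch, assuming each item in embeddings_response.data exposes its vector as the embedding attribute (the cosine-similarity helper is written out here for illustration):

import math

# Assumption: embeddings_response.data[i].embedding is a list of floats.
vec_a = embeddings_response.data[0].embedding
vec_b = embeddings_response.data[1].embedding

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(vec_a, vec_b))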

For a full description of the models offered through the API, head over to the model documentation.