
Welcome to Mistral AI Documentation

Mistral AI is a research lab building the best open source models in the world. La Plateforme enables developers and enterprises to build new products and applications, powered by Mistral’s open source and commercial LLMs.

Mistral AI Large Language Models (LLMs)

We release both open source and commercial models, driving innovation and convenience for our developer community. Our models are state-of-the-art in multilingual understanding, code generation, math, and advanced reasoning.

Open Source

  • Mistral 7B, our first dense model, released in September 2023
  • Mixtral 8x7B, our first sparse mixture-of-experts model, released in December 2023
  • Mixtral 8x22B, our best open source model to date, released in April 2024

Commercial

  • Mistral Small, our cost-efficient reasoning model for low-latency workloads
  • Mistral Medium, useful for intermediate tasks that require moderate reasoning; please note that this model will be deprecated in the coming months
  • Mistral Large, our top-tier reasoning model for high-complexity tasks
  • Mistral Embeddings, our state-of-the-art semantic model for extracting representations of text

We continuously improve our commercial models and deploy new versions iteratively. Keep up to date with our model versioning here.

Explore the Mistral AI APIs

The Mistral AI APIs empower LLM applications via the following capabilities, each illustrated with a short sketch after this list:

  • Text generation, which supports streaming so that partial model results can be displayed in real time
  • Embeddings, useful for RAG, where the meaning of text is represented as a list of numbers
  • Function calling, which enables Mistral models to connect to external tools
  • JSON mode, which lets developers set the response format to json_object
  • Guardrailing, which lets developers enforce policies at the system level of Mistral models
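
A minimal sketch of streaming text generation, assuming the requests library and a MISTRAL_API_KEY environment variable: the chat completions endpoint streams server-sent events, so partial results can be printed as they arrive.

```python
import json
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

payload = {
    "model": "mistral-small-latest",  # any chat-capable model id
    "messages": [{"role": "user", "content": "Write a haiku about open source."}],
    "stream": True,  # ask for partial results as they are generated
}

with requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    stream=True,
) as resp:
    resp.raise_for_status()
    # The stream is a series of "data: {...}" lines ending with "data: [DONE]".
    for line in resp.iter_lines():
        if not line or not line.startswith(b"data: "):
            continue
        chunk = line[len(b"data: "):]
        if chunk == b"[DONE]":
            break
        delta = json.loads(chunk)["choices"][0]["delta"].get("content", "")
        print(delta, end="", flush=True)
```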
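
A similar sketch for embeddings, using the mistral-embed model id: each input string is mapped to a vector of floats that captures its meaning.

```python
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

resp = requests.post(
    "https://api.mistral.ai/v1/embeddings",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistral-embed",
        "input": [
            "Mistral 7B is a dense model.",
            "Mixtral 8x7B is a sparse mixture-of-experts model.",
        ],
    },
)
resp.raise_for_status()
vectors = [item["embedding"] for item in resp.json()["data"]]
print(len(vectors), "embeddings of dimension", len(vectors[0]))
```

In a RAG pipeline, these vectors would typically be stored in a vector index and compared by similarity at query time to retrieve relevant context.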
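
For function calling, a sketch that declares a hypothetical get_weather tool (the tool name and schema are illustrative, defined by the application rather than the API) and lets the model decide whether to call it.

```python
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool defined by the application
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistral-large-latest",
        "messages": [{"role": "user", "content": "What is the weather in Paris?"}],
        "tools": tools,
        "tool_choice": "auto",  # the model decides whether a tool call is needed
    },
)
resp.raise_for_status()
message = resp.json()["choices"][0]["message"]
# If the model chose to call the tool, its name and JSON-encoded arguments are returned.
for call in message.get("tool_calls") or []:
    print(call["function"]["name"], call["function"]["arguments"])
```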
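
Finally, JSON mode and guardrailing are both request parameters of the chat completions endpoint; the sketch below sets response_format to json_object and enables the system-level safety prompt via safe_prompt.

```python
import os

import requests

API_KEY = os.environ["MISTRAL_API_KEY"]

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "mistral-large-latest",
        "messages": [
            {
                "role": "user",
                "content": "List three French cities as JSON under the key 'cities'.",
            }
        ],
        "response_format": {"type": "json_object"},  # JSON mode
        "safe_prompt": True,  # guardrailing: prepend the system-level safety prompt
    },
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```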