Snowflake Cortex

Introduction

Mistral AI's open and commercial models can be used on the Snowflake Cortex platform as fully managed endpoints. Mistral models on Snowflake Cortex are serverless, so you don't have to manage any infrastructure.

As of today, the following models are available:

  • Mistral Large
  • Mistral 7B

For more details, visit the models page.

Getting started

The following sections outline the steps to query the latest version of Mistral Large on the Snowflake Cortex platform.

Getting access to the model

The following items are required:

  • The associated Snowflake account must be in a compatible region (see the region list in the Snowflake documentation).
  • The principal calling the model must have the CORTEX_USER database role; a grant example follows this list.
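
As a sketch, an administrator can grant that role to the account role used for querying (the target role name below is illustrative):

GRANT DATABASE ROLE SNOWFLAKE.CORTEX_USER TO ROLE cortex_user_role;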

Querying the model (chat completion)

The model can be called either directly in SQL or in Python using Snowpark ML. It is exposed via the COMPLETE LLM function.

SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large2', 'Who is the best French painter? Answer in one short sentence.');
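
The model can also be queried from Python with Snowpark. The following is a minimal sketch, assuming the snowflake-ml-python package (which provides snowflake.cortex.Complete) is installed and that the placeholder connection parameters are replaced with your own account details.

from snowflake.snowpark import Session
from snowflake.cortex import Complete

# Placeholder connection parameters; replace them with your account details.
connection_parameters = {
    "account": "<your_account>",
    "user": "<your_user>",
    "password": "<your_password>",
    "role": "<role_with_CORTEX_USER>",
    "warehouse": "<your_warehouse>",
}

session = Session.builder.configs(connection_parameters).create()

# Call the latest Mistral Large model through the COMPLETE LLM function.
response = Complete(
    "mistral-large2",
    "Who is the best French painter? Answer in one short sentence.",
    session=session,
)
print(response)

If a Snowpark session is already active, the session argument can typically be omitted and the active session is inferred.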

Going further

For more information and examples, you can check the Snowflake documentation on Cortex LLM functions.