
Snowflake Cortex

Mistral AI's open and commercial models are available on the Snowflake Cortex platform as fully managed endpoints. Mistral models on Snowflake Cortex are serverless, so you don't have to manage any infrastructure.

The following models are currently available:

  • Mistral Large
  • Mistral 7B

For more details, visit the models page.

Getting Started

The following sections outline the steps to query the latest version of Mistral Large on the Snowflake Cortex platform.

Getting Access to the Model

The following items are required:

  • The associated Snowflake account must be in a compatible region (see the region list in the Snowflake documentation).
  • The principal calling the model must have the CORTEX_USER database role (an example grant follows this list).
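
If your role does not have it yet, an administrator can grant it. A minimal sketch using Snowflake's GRANT DATABASE ROLE statement; the role name analyst is a placeholder for your own role:

-- Run with sufficient privileges (for example, as ACCOUNTADMIN).
-- "analyst" is a placeholder role name; substitute your own.
GRANT DATABASE ROLE SNOWFLAKE.CORTEX_USER TO ROLE analyst;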

Querying the Model (Chat Completion)

The model can be called either directly in SQL or in Python using Snowpark ML. It is exposed via the COMPLETE LLM function.
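
In SQL, a single chat completion is a one-line call to that function. A minimal sketch, with an illustrative prompt:

SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large2',
    'Who is the best French painter? Answer in one short sentence.'
);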

Execute this code either from a hosted Snowflake notebook or from your local machine.

For local execution, you need to:

  • Create a virtual environment with the following package:
    • snowflake-ml-python (tested with version 1.6.1)
  • Ensure that you have a configuration file with the proper credentials on your system. The example below assumes that you have a named connection called mistral that is configured appropriately (a sketch follows this list).
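
As a sketch of the second point, assuming SnowflakeLoginOptions reads a SnowSQL-style configuration file (by default ~/.snowflake/config), a named connection called mistral could look like this, with placeholder values:

[connections.mistral]
accountname = <your_account_identifier>
username = <your_user_name>
password = <your_password>

With the connection configured, the full example is: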
from snowflake.snowpark import Session
from snowflake.ml.utils import connection_params
from snowflake.cortex import Complete

# Start session (local execution only)
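# SnowflakeLoginOptions reads the named connection from your local Snowflake config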
params = connection_params.SnowflakeLoginOptions(connection_name="mistral")
session = Session.builder.configs(params).create()

# Query the model
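# "mistral-large2" is the Cortex identifier for the latest Mistral Large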
prompt = "Who is the best French painter? Answer in one short sentence."
completion = Complete(model="mistral-large2", prompt=prompt)
print(completion)
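
The COMPLETE function also accepts a chat-style message array and an options object (for example, temperature and max_tokens), in which case it returns a JSON response, as described in the Snowflake documentation. A minimal SQL sketch:

SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large2',
    [{'role': 'user', 'content': 'Who is the best French painter? Answer in one short sentence.'}],
    {'temperature': 0.2, 'max_tokens': 64}
);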

Going Further

For more information and examples, see the Snowflake documentation for: