
December 11, 2023 · Blog · Technical Report
v0.1

Mixtral 8x7B

Our first open mixture-of-experts (MoE) model, delivering state-of-the-art performance competitive with models up to 2-3x larger.
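As background on the MoE design, below is a minimal, illustrative sketch of top-2 expert routing, the scheme described in the Mixtral technical report (each token is processed by 2 of 8 experts per layer). The module sizes, router, and per-expert loops here are simplified assumptions for clarity, not the production implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Top2MoE(nn.Module):
    """Toy sparse MoE layer: route each token to its top-2 of 8 experts."""

    def __init__(self, dim: int, num_experts: int = 8):
        super().__init__()
        self.router = nn.Linear(dim, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Score all experts, keep the 2 best per token.
        logits = self.router(x)                  # (tokens, num_experts)
        top2, idx = logits.topk(2, dim=-1)       # top-2 scores and expert ids
        gates = F.softmax(top2, dim=-1)          # normalize over the 2 selected
        out = torch.zeros_like(x)
        for slot in range(2):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e         # tokens routed to expert e
                if mask.any():
                    out[mask] += gates[mask, slot : slot + 1] * expert(x[mask])
        return out

moe = Top2MoE(dim=64)
print(moe(torch.randn(10, 64)).shape)  # torch.Size([10, 64])
```

Because only 2 of 8 experts run per token, each token touches a fraction of the total weights; this is why the "Active (B)" parameter count in the table below is much smaller than the total.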

Modalities: Text
Context: 32k tokens
Retirement date: 3/30/2025

Weights

| Weights | License | Parameters (B) | Active (B) | ≈ GPU RAM at bf16 - fp4 (GB) | Context size (tokens) |
|---|---|---|---|---|---|
| Mixtral-8x7B-v0.1 | Apache 2.0 | 47 | 13 | 94 - 13 | 32k |
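The RAM column can be sanity-checked with a back-of-envelope calculation: weight memory ≈ parameter count × bytes per parameter, so 47B parameters at bf16 (2 bytes each) is ≈ 94 GB. The sketch below covers weights only; activations, the KV cache, and quantization metadata add more, and published quantized figures depend on the exact scheme, so treat these as rough lower bounds.

```python
# Back-of-envelope estimate of GPU RAM for model weights alone.
# Real usage is higher (activations, KV cache), and quantized formats
# carry scheme-dependent overheads.
def weights_ram_gb(params_billion: float, bits_per_param: float) -> float:
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

print(weights_ram_gb(47, 16))  # bf16, all 47B parameters -> 94.0 GB
print(weights_ram_gb(13, 16))  # bf16, only the 13B active parameters -> 26.0 GB
print(weights_ram_gb(47, 4))   # naive 4-bit packing -> 23.5 GB
```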