December 11, 2023 · v0.1
Mixtral 8x7B
Our first open mixture-of-experts (MoE) model, with state-of-the-art performance against models up to 2 to 3x larger.
Modalities: Text
Retirement date: 3/30/2025
Replacement: Mistral Small 3.2
Weights
| Weights | License | Parameters (B) | Active (B) | ≈ Min GPU RAM (GB) | Context Size (tokens) |
|---|---|---|---|---|---|
| | Apache 2.0 | 47 | 13 | 94 | 32k |
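The gap between total parameters (47B) and active parameters (13B) comes from sparse expert routing: each token is processed by only a few of the experts. A minimal sketch of top-k routing, illustrative only and not Mixtral's actual implementation:

```python
import numpy as np

def moe_forward(x, experts_w, gate_w, top_k=2):
    """Route one token through only its top-k experts (sparse MoE sketch).

    x: (d,) token hidden state
    experts_w: (n_experts, d, d) per-expert weight matrices
    gate_w: (d, n_experts) router weights

    Illustrative toy code -- names and shapes are assumptions,
    not Mixtral's real architecture.
    """
    logits = x @ gate_w                   # router score for every expert
    top = np.argsort(logits)[-top_k:]     # indices of the k best experts
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()                          # softmax over the selected experts
    # Only top_k of n_experts weight matrices are used for this token,
    # which is why active parameters are far fewer than total parameters.
    return sum(wi * (x @ experts_w[i]) for wi, i in zip(w, top))

rng = np.random.default_rng(0)
d, n_experts = 16, 8
x = rng.normal(size=d)
out = moe_forward(x,
                  rng.normal(size=(n_experts, d, d)),
                  rng.normal(size=(d, n_experts)))
```

Per token, only `top_k / n_experts` of the expert weights participate in the forward pass, though all experts must still reside in GPU memory, which is why the minimum RAM requirement tracks the total parameter count.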
Other Models