
Mistral's sparse mixture-of-experts frontier model with 675B total parameters, of which 41B are active per token, released under Apache 2.0. It is among the first open frontier models to combine multimodal and multilingual capabilities in a 256K-token context window. Trained on 3,000 NVIDIA H200 GPUs and designed for reliability and long-context comprehension.
Released
December 2, 2025
Parameters
675B (MoE, 41B active)
Context
256K tokens
Pricing
Open Source
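The gap between total and active parameters comes from sparse routing: every token passes through the model's shared weights, but only a few of its expert blocks. A minimal sketch of that accounting, using hypothetical expert counts and sizes (the actual breakdown of this model's experts is not published here):

```python
def moe_param_counts(num_experts, active_experts, params_per_expert, shared_params):
    # Every token flows through the shared (dense) parameters, e.g.
    # embeddings, attention, and the router. The router then activates
    # only `active_experts` of the `num_experts` feed-forward expert
    # blocks per token, so most expert weights stay idle for any token.
    total = shared_params + num_experts * params_per_expert
    active = shared_params + active_experts * params_per_expert
    return total, active

# Hypothetical split: 8 experts of 80B each plus 35B of shared weights
# reproduces the 675B total; with 2 experts active per token, 195B
# parameters fire. Real configurations with more, smaller experts push
# the active count far lower (the card above quotes 41B active).
total, active = moe_param_counts(num_experts=8, active_experts=2,
                                 params_per_expert=80e9, shared_params=35e9)
print(f"total={total / 1e9:.0f}B active={active / 1e9:.0f}B")
# → total=675B active=195B
```

Only the routed expert weights need to be read per token, which is why a 675B-parameter model can have the per-token compute cost of a much smaller dense model.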
Last updated: March 15, 2026
Benchmark scores may vary based on evaluation methodology and conditions.