Mistral releases Mistral Small 4, a 119B parameter MoE model (6B active) under the Apache 2.0 license that unifies reasoning (Magistral), multimodal (Pixtral), and agentic coding (Devstral) capabilities in a single model. It supports configurable reasoning effort levels and is available via API and on Hugging Face (242GB download). Also released: Leanstral, an open-weight model fine-tuned for generating formally verifiable Lean 4 code.
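A minimal sketch of what selecting a reasoning effort level might look like from client code. This is illustrative only: the request shape follows the common chat-completions convention, and the model identifier (`mistral-small-4`), the `reasoning_effort` field, and its allowed values are assumptions, not confirmed names from Mistral's API reference.

```python
import json

# Hypothetical sketch: building a chat-completions request that selects a
# reasoning effort level for Mistral Small 4. The model name, the
# "reasoning_effort" parameter, and its values are assumed, not confirmed.

def build_request(prompt: str, effort: str = "medium") -> str:
    """Serialize a JSON request body with a reasoning effort setting."""
    assert effort in {"low", "medium", "high"}  # assumed allowed levels
    payload = {
        "model": "mistral-small-4",        # assumed model identifier
        "reasoning_effort": effort,        # assumed parameter name
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(payload)

body = build_request("Summarize the MoE routing trade-offs.", "high")
```

In practice the serialized body would be POSTed to the provider's chat-completions endpoint with an API key; the sketch stops at payload construction since the exact parameter names are unverified.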
Models
Introducing Mistral Small 4
Mistral's 119B MoE model (6B active) consolidates reasoning, multimodal, and agentic capabilities under Apache 2.0 with configurable reasoning effort, plus Leanstral for formally verifiable code generation.
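To make "formally verifiable Lean 4 code" concrete, here is a generic example of the kind of machine-checkable output such a model targets. The theorem and proof are a standard library example, not Leanstral output.

```lean
-- Illustrative only: a Lean 4 theorem whose proof the compiler verifies.
-- A model like Leanstral aims to emit code of this form, where correctness
-- is checked mechanically rather than by inspection.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

The appeal of this target format is that generated proofs either type-check or fail outright, so model output can be validated automatically.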
Thursday, March 19, 2026, 12:00 PM UTC · 2 min read · Source: Simon Willison
Tags
models