Introducing: Devstral 2 and Mistral Vibe CLI.



Today, we’re releasing Devstral 2—our next-generation coding model family, available in two sizes: Devstral 2 (123B) and Devstral Small 2 (24B). Devstral 2 ships under a modified MIT license, while Devstral Small 2 uses Apache 2.0. Both are open and permissively licensed to accelerate distributed intelligence.

Devstral 2 is currently free to use via our API.

We’re also introducing Mistral Vibe, a native CLI built for Devstral that enables end-to-end code automation.

Highlights.

  1. Devstral 2: SOTA open model for code agents, achieving 72.2% on SWE-bench Verified with a fraction of the parameters of its competitors.

  2. Up to 7x more cost-efficient than Claude Sonnet on real-world tasks.

  3. Mistral Vibe CLI: A native, open-source agent in your terminal that solves software engineering tasks autonomously.

  4. Devstral Small 2: 24B parameter model available via API or deployable locally on consumer hardware.

  5. Compatible with on-prem deployment and custom fine-tuning.

Devstral: the next generation of SOTA coding.

Devstral 2 is a 123B-parameter dense transformer supporting a 256K context window. It reaches 72.2% on SWE-bench Verified, placing it at the top of open-weight models while remaining highly cost-efficient. Released under a modified MIT license, Devstral sets the open state of the art for code agents.

Devstral Small 2 scores 68.0% on SWE-bench Verified, placing it firmly among models up to five times its size while remaining able to run locally on consumer hardware.

Figure: SWE-bench Verified scores, open-weight vs. proprietary models.

Devstral 2 (123B) and Devstral Small 2 (24B) are 5x and 28x smaller than DeepSeek V3.2, and 8x and 41x smaller than Kimi K2, proving that compact models can match or exceed the performance of much larger competitors. Their reduced size makes deployment practical on limited hardware, lowering barriers for developers, small businesses, and hobbyists.

Figure: SWE-bench Verified performance vs. model size.

Built for production-grade workflows.

Devstral 2 supports exploring codebases and orchestrating changes across multiple files while maintaining architecture-level context. It tracks framework dependencies, detects failures, and retries with corrections—solving challenges like bug fixing and modernizing legacy systems.

The model can be fine-tuned to prioritize specific languages or to optimize for large enterprise codebases.

We evaluated Devstral 2 against DeepSeek V3.2 and Claude Sonnet 4.5 using human evaluations conducted by an independent annotation provider, with tasks scaffolded through Cline. Devstral 2 shows a clear advantage over DeepSeek V3.2, with a 42.8% win rate versus a 28.6% loss rate. However, Claude Sonnet 4.5 remains significantly preferred, indicating that a gap with closed-source models persists.

Figure: Model performance comparison (human evaluation).

“Devstral 2 is on the frontier of open-source coding models. In Cline, it delivers a tool-calling success rate on par with the best closed models; it is a remarkably smooth driver. This is a massive contribution to the open-source ecosystem.” — Cline.

“Devstral 2 was one of our most successful stealth launches yet, surpassing 17B tokens in the first 24 hours. Mistral AI is moving at Kilo Speed with a cost-efficient model that really works at scale.” — Kilo Code.

Devstral Small 2, a 24B-parameter model with the same 256K context window and released under Apache 2.0, brings these capabilities to a compact, locally deployable form. Its size enables fast inference, tight feedback loops, and straightforward customization, with a fully private, on-device runtime. It also supports image inputs and can power multimodal agents.

Mistral Vibe CLI.

Mistral Vibe CLI is an open-source command-line coding assistant powered by Devstral. It explores, modifies, and executes changes across your codebase using natural language, either in your terminal or integrated into your preferred IDE via the Agent Communication Protocol. It’s released under the Apache 2.0 license.

Vibe CLI provides an interactive chat interface with tools for file manipulation, code searching, version control, and command execution. Key features:

  • Project-aware context: Automatically scans your file structure and Git status to provide relevant context

  • Smart references: Reference files with @ autocomplete, execute shell commands with !, and use slash commands for configuration changes

  • Multi-file orchestration: Understands your entire codebase, not just the file you are editing, enabling architecture-level reasoning that can halve your PR cycle time

  • Persistent history, autocompletion, and customizable themes.

You can run Vibe CLI programmatically for scripting, toggle auto-approval for tool execution, configure local models and providers through a simple config.toml, and control tool permissions to match your workflow.
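As an illustration only, here is a minimal Python scripting sketch. The `vibe` binary name, the prompt-as-argument convention, and every config.toml key below are assumptions, not the documented interface; consult the Vibe CLI documentation for the real schema.

```python
# Hypothetical scripting sketch: write a config and run one task non-interactively.
# Binary name, argument convention, and config keys are assumptions.
import pathlib
import subprocess

config = """\
# Hypothetical config.toml for Vibe CLI (keys are illustrative assumptions).
[model]
provider = "mistral"      # assumed key: which backend serves completions
name = "devstral-2"       # assumed key: model identifier

[tools]
auto_approve = false      # assumed key: require confirmation before tool execution
"""
pathlib.Path("config.toml").write_text(config)

# Run a single task and capture its transcript so it can be used in a script.
result = subprocess.run(
    ["vibe", "Fix the failing unit tests in ./tests"],
    capture_output=True,
    text=True,
)
print(result.stdout)
```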

Get started.

Devstral 2 is currently available for free via our API. After the free period, API pricing will be $0.40/$2.00 per million tokens (input/output) for Devstral 2 and $0.10/$0.30 for Devstral Small 2.
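As a rough sketch of what an API call could look like, assuming the standard Mistral chat completions endpoint and a hypothetical model identifier `devstral-2` (check your account's model list for the exact name):

```python
# Hedged sketch: calling Devstral 2 through the Mistral chat completions API.
import os
import requests

response = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "devstral-2",   # assumed identifier, verify against the model list
        "temperature": 0.2,      # recommended sampling temperature (see deployment notes below)
        "messages": [
            {"role": "user", "content": "Refactor this function to remove the global state: ..."}
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```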

We’ve partnered with leading open agent tools Kilo Code and Cline to bring Devstral 2 to where you already build.

Mistral Vibe CLI is available as an extension in Zed, so you can use it directly inside your IDE.

Recommended deployment for Devstral.

Devstral 2 is optimized for data center GPUs and requires a minimum of 4 H100-class GPUs for deployment. You can try it today on build.nvidia.com. Devstral Small 2 is built for single-GPU operation and runs across a broad range of NVIDIA systems, including DGX Spark and GeForce RTX. NVIDIA NIM support will be available soon.

Devstral Small 2 runs on consumer-grade GPUs as well as in CPU-only configurations, with no dedicated GPU required.

For optimal performance, we recommend a temperature of 0.2 and following the best practices defined for Mistral Vibe CLI.
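For local use of Devstral Small 2, a minimal sketch using vLLM's offline Python API with the recommended temperature is shown below. The Hugging Face repo id is a placeholder assumption, not a confirmed checkpoint name; substitute the actual weights once released.

```python
# Hedged sketch: running Devstral Small 2 locally on a single GPU with vLLM.
from vllm import LLM, SamplingParams

llm = LLM(model="mistralai/Devstral-Small-2")               # assumed repo id (placeholder)
params = SamplingParams(temperature=0.2, max_tokens=1024)   # recommended temperature

prompts = ["Write a unit test for a function that parses ISO-8601 dates."]
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```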

Contact us.

We’re excited to see what you’ll build with Devstral 2, Devstral Small 2, and Vibe CLI!

Share your projects, questions, or discoveries with us on X/Twitter, Discord, or GitHub.

We’re hiring!

If you’re interested in shaping open-source research and building world-class interfaces that bring truly open, frontier AI to users, we welcome you to apply to join our team.


