Zac Zuo

Mistral Small 3 - High performance in a 24b open-source model

Mistral Small 3 is Mistral's most efficient and versatile model yet. It ships in both pre-trained and instruction-tuned versions under the Apache 2.0 license: 24B parameters, 81% on MMLU, and 150 tokens/s. It was trained without synthetic data, making it a great base for anything involving reasoning.

Replies

Luis Pereira
Finally, an AI that is small enough for me to add to my mobile app. Thanks for your work on this!
Ali Mustufa Shaikh
For someone like me who enjoys SLMs, this is fantastic news! Looking forward to trying this.