Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model that matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B while activating only 2.7 billion parameters.