Qwen 1.5 MoE
Highly efficient mixture-of-experts (MoE) model from Alibaba
5.0 • 2 reviews • 55 followers
LLMs • AI Infrastructure Tools
Qwen1.5-MoE-A2.7B is a small mixture-of-experts (MoE) model with only 2.7 billion activated parameters, yet it matches the performance of state-of-the-art 7B models such as Mistral 7B and Qwen1.5-7B.
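The efficiency claim rests on the MoE idea: the model holds many expert sub-networks, but a router activates only a few per token, so the "activated" parameter count is a small fraction of the total. A minimal toy sketch of top-k expert routing (illustrative only; the expert count, hidden size, and routing details here are made-up toy values, not Qwen's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # toy value, not the model's real expert count
TOP_K = 2         # experts activated per token
DIM = 16          # toy hidden size

# Each expert is a simple linear map; a real MoE uses small MLPs.
experts = [rng.standard_normal((DIM, DIM)) / np.sqrt(DIM) for _ in range(NUM_EXPERTS)]
router = rng.standard_normal((DIM, NUM_EXPERTS)) / np.sqrt(DIM)

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route one token vector through only its top-k experts."""
    logits = x @ router                 # one routing score per expert
    topk = np.argsort(logits)[-TOP_K:]  # indices of the k highest-scoring experts
    # Softmax over the selected experts only.
    weights = np.exp(logits[topk] - logits[topk].max())
    weights /= weights.sum()
    # Weighted sum of the chosen experts' outputs; the remaining experts'
    # parameters are never touched for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, topk))

token = rng.standard_normal(DIM)
out = moe_forward(token)
print(out.shape)  # (16,)
```

Because only TOP_K of NUM_EXPERTS experts run per token, compute scales with the activated parameters (here 2/8 of the expert weights) rather than the full parameter count, which is how a 14B-total-parameter MoE can have a 2.7B activated footprint.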
Free
Launch tags: Open Source • Artificial Intelligence • GitHub
Chris Messina (Raycast), Hunter 📌
Don't sleep on the work of Alibaba's impressive AGI team Qwen! Performance, efficiency, and cost-effectiveness — in a nice open source wrapper!
2yr ago
levene (Insou AI)
Congratulations on the launch, Qwen 1.5 MoE!
2yr ago
Fei C. (Intellectia.AI)
Congrats on the launch!
2yr ago
Reviews
5.0, based on 2 reviews
Salman Paracha used Qwen 1.5 MoE to build Arch • 6 reviews
Highly performant base models that can be used for task-specific training, such as the function-calling experience built into Arch.
1yr ago