
Locally AI
Run AI models locally on your iPhone, iPad, and Mac
155 followers
Run AI models like Llama, Gemma, Qwen, DeepSeek, and more locally on your iPhone, iPad, and Mac. Chat through a clean and native UI, completely offline, fully private, no login required, with models optimized for Apple Silicon.
This is the 3rd launch from Locally AI.
Locally AI + Qwen
Launched this week
Run Qwen's latest models locally on your iPhone and iPad. Powerful models with advanced vision understanding and hybrid reasoning.
Free
Locally AI
@adrgrondin Seeing a vision + reasoning toggle for the Qwen 3.5 small lineup (0.8B, 2B, 4B, 9B) makes on-device model choice feel less like a gamble. Do you show per-device RAM, disk, and battery estimates before download? A default picker and a timeout for reasoning mode would save a ton of trial and error.
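On the RAM question above, a rough back-of-envelope is: parameter count times bytes per weight at the chosen quantization, plus some runtime overhead for the KV cache and activations. The 4-bit assumption and the 1.2x overhead factor below are illustrative guesses, not figures published by Locally AI.

```python
def estimate_ram_gb(params_billion: float, bits_per_weight: int = 4,
                    overhead_factor: float = 1.2) -> float:
    """Estimate resident memory in GB for a quantized model.

    params_billion: parameter count in billions (e.g. 4 for a 4B model).
    bits_per_weight: quantization level (4-bit is a common on-device choice).
    overhead_factor: rough multiplier for KV cache and runtime overhead.
    """
    bytes_per_param = bits_per_weight / 8
    raw_gb = params_billion * 1e9 * bytes_per_param / 1e9
    return raw_gb * overhead_factor

# Estimates for the small lineup mentioned above, at 4-bit:
for size in (0.8, 2, 4, 9):
    print(f"{size}B @ 4-bit ~ {estimate_ram_gb(size):.1f} GB")
```

By this rule of thumb a 4B model at 4-bit lands around 2.4 GB resident, which is why the smaller sizes matter so much on phones with 6-8 GB of total RAM.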
@adrgrondin The 0.8B option is a smart inclusion. Most apps in this space only ship the biggest model they can fit and then wonder why people bounce after waiting 30 seconds for a response. Having that range lets people actually find what works for their device instead of guessing.
Great launch!
Running powerful models locally on iPhone and iPad is exactly where things are heading: privacy-first, fully offline, no logins, no cloud dependency.
Excited to see Qwen integrated here. Strong reasoning + vision capabilities, and having that fully on-device is a big step forward in user control and data ownership.
Curious about:
- inference speed across different devices
- memory usage and optimization
- how the model download and UX flow are handled
If performance holds up, this could be a serious alternative to cloud-based AI apps. Congrats on the launch!
Locally AI
@mx_mt I would recommend the latest iPhones, but even older models work well. There are models for all iPhones; you choose which size you want to run. The models are not bundled in the app; you choose which one to download once the app is installed.
Hope this makes things clearer!
Offline + private + no login = the holy trinity that 90% of AI apps ignore because cloud is easier to monetize. Locally AI is betting on the right side of the privacy conversation.
As someone building Fillix, a Chrome extension that makes job hunting embarrassingly easy, the 'no login required' UX decision hits close to home. The best tools get out of your way instantly. Apple Silicon optimization is the cherry on top. Congrats on shipping!
Running Qwen 3.5 on-device with vision + reasoning toggle is impressive β how's battery drain on the 4B and 9B models during extended sessions? Are you seeing any thermal throttling on older iPhones, or do you recommend a minimum spec?
Qwen 3.5 2B vision on an iPhone 16 Pro is astonishing. This is absolutely the future of device AI. I can't wait for OSes that use AI as the kernel for everything.
When is Qwen 3.5 coming to the Mac App?!?
It's killing me to not have it yet - I want to use the 9B model on my Mac!
Flowtica Scribe
Literally the best app to experience the latest @Qwen3 local AI models on your phone!
Locally AI
@zaczuo Thanks a lot! Many improvements are planned to make the experience even better!