Zac Zuo

Ollama Desktop App - The easiest way to chat with local AI

Ollama's new official desktop app for macOS and Windows makes it easy to run open-source models locally. Chat with LLMs, use multimodal models with images, or reason about files, all from a simple, private interface.
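For readers who want to script against it: the desktop app runs the same local Ollama server (default `http://localhost:11434`) that the CLI uses, so the chat features are also reachable over its REST API. A minimal sketch of building a request for the `/api/chat` endpoint; the model name is an example, and sending the request assumes a running Ollama server.

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,  # example model name; any locally pulled model works
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response instead of chunks
    }

payload = build_chat_request("gemma3", "Summarize this file in one line.")
print(json.dumps(payload))
# To actually send it (requires a running local Ollama server):
#   requests.post("http://localhost:11434/api/chat", json=payload)
```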

Aries Xing

Great product. This is huge for non-technical users! Any plans for model discovery/browsing within the app?

Vasanth

I really love this app! It would be great if it included web search functionality.

Britestak

Awesome launch! Very useful, thank you so much for this ❤️

Ash Grover

Anything that allows users to run LLMs locally is just awesome in my book. I think everyone should have access to LLMs without the cost of the cloud; most people have basic day-to-day needs, not the requirements of a machine learning researcher. I do think local LLMs integrated into operating systems will make us more productive than cloud-based ones in the future.

Good to see the progress on this!

Alexandre Droual
Finally!! Congrats to the whole team on this huge achievement. Can’t wait for the next iteration, maybe a vibe coding extension?
Shane Mhlanga
Absolutely brilliant update! I first started with Ollama and Open WebUI, then found other native apps, but Ollama has remained the core. It was annoying to take extra steps just to run a local model quickly; now this is it! Well done!
vivek sharma

Ollama v0.7 quietly rewrites the rules for running multimodal AI on local machines. Llama 4 and Gemma 3 in vision mode? Huge. Improved memory management, reliability, and accuracy make this more than a version bump: it’s a fresh foundation for the next wave of local-first LLMs.

Mu Joe

Love that you can just drag and drop images and chat with vision models now. No more command-line headaches; this is super smart, tbh. The makers really nailed it!
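The drag-and-drop image chat described above has an API equivalent: vision models accept base64-encoded images in the `images` field of a chat message on the local `/api/chat` endpoint. A hedged sketch; the model name and image bytes are placeholders, and sending assumes a running Ollama server with a vision-capable model pulled.

```python
import base64

def build_vision_request(model: str, prompt: str, image_bytes: bytes) -> dict:
    """Build a /api/chat body that attaches one image to the user message."""
    return {
        "model": model,  # placeholder; use any locally available vision model
        "messages": [{
            "role": "user",
            "content": prompt,
            # Ollama expects images as base64-encoded strings
            "images": [base64.b64encode(image_bytes).decode("ascii")],
        }],
        "stream": False,
    }

# Placeholder bytes stand in for a real image file read with open(..., "rb")
req = build_vision_request("llava", "What is in this picture?", b"fake-image-bytes")
```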

장연주
Pretty design
Ajay Sahoo

New tech tools keep making everyday tasks more convenient, for both personal and professional work. Now, whenever I have a question, whether for myself, for someone else, or about something I'm using, I turn to Ollama instead of the task-based chatbots I used before. Wonderful usability, and a great way to have all LLMs in one place.