Arnav Gupta

Vox - The first local AI that actually does things on your Mac

Everyone's building AI chatbots. Jan, LM Studio, and Ollama run models locally, but all you get is a chat window. Cloud tools like ChatGPT and Claude are smarter, but they cost $20-100/mo and can't touch your files. Vox is different: a local AI that acts. Voice commands, an iMessage gateway, a screen overlay, file ops, background agents. All offline, all free. Mac-first today; Windows and Linux are next, and it's open source, so you can help bring it there.


Replies

Arnav Gupta
Maker
Hey everyone! I built Vox because I was tired of the false choice: smart-but-cloud or local-but-useless. Jan, LM Studio, and Ollama are great for chatting with a model. But when I want AI to read my iMessages, control my screen, or run a task while I'm away, nothing exists. Claude Cowork comes close, but it's $100/mo, cloud-only, and still only operates on folders.

So I built what I actually wanted: a Mac-native AI assistant that runs Qwen3 locally via llama.cpp and can actually do things: voice input, iMessage replies, screen capture, file search, background agents.

Is it as smart as GPT-4? No. A 4B model won't out-think a trillion-parameter cloud model. But it can do things no cloud AI can: reply to your texts, run while you sleep, work without WiFi, and never send a byte anywhere.

Why Mac first? macOS gave us the tightest integration path: AppleScript for iMessage and Mail, native Electron overlays, and unified memory for running models. But the core (llama.cpp, the agent loop, the voice pipeline, the overlay) is all cross-platform. Windows and Linux support is the #1 item on our roadmap, and the repo has open issues for it right now.

The future of AI isn't just smarter models; it's AI that runs on your hardware and integrates with your life. This is v1. Open source. Come build with us.
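For the curious, the pieces named above (a local model behind llama.cpp, an agent loop, AppleScript for iMessage) can be sketched roughly like this. This is a minimal illustration, not Vox's actual code: the JSON tool-call format, the `build_imessage_script` helper, the `SYSTEM` prompt, and the assumption that llama.cpp's `llama-server` is listening on localhost:8080 are all mine.

```python
import json
import subprocess
import urllib.request

# Hypothetical convention: the model replies either with plain text or with
# a JSON object describing a tool call. Vox's real protocol may differ.
SYSTEM = (
    "You are a local assistant. To send a text, reply with JSON only: "
    '{"tool": "send_imessage", "args": {"recipient": "...", "text": "..."}}'
)

def build_imessage_script(recipient: str, text: str) -> str:
    """Build an AppleScript snippet that sends `text` to `recipient`.

    Uses one common Messages.app scripting pattern; Vox's real script
    may differ.
    """
    esc = text.replace("\\", "\\\\").replace('"', '\\"')
    return (
        'tell application "Messages"\n'
        f'    send "{esc}" to buddy "{recipient}" of '
        "(service 1 whose service type is iMessage)\n"
        "end tell"
    )

def send_imessage(recipient: str, text: str) -> None:
    # Only works on macOS, with Automation permission granted to the caller.
    subprocess.run(
        ["osascript", "-e", build_imessage_script(recipient, text)],
        check=True,
    )

def chat(messages, url="http://localhost:8080/v1/chat/completions"):
    # llama.cpp's `llama-server` exposes an OpenAI-compatible chat endpoint;
    # 8080 is its default port, adjust to match how you started the server.
    req = urllib.request.Request(
        url,
        data=json.dumps({"messages": messages}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

def parse_tool_call(reply: str):
    """Return (tool, args) if `reply` is a JSON tool call, else None."""
    try:
        obj = json.loads(reply)
    except ValueError:
        return None
    if isinstance(obj, dict) and "tool" in obj:
        return obj["tool"], obj.get("args", {})
    return None

def agent_step(user_text: str) -> str:
    """One turn of the loop: ask the local model, dispatch a tool call if any."""
    reply = chat([
        {"role": "system", "content": SYSTEM},
        {"role": "user", "content": user_text},
    ])
    call = parse_tool_call(reply)
    if call and call[0] == "send_imessage":
        send_imessage(**call[1])
        return "(message sent)"
    return reply
```

The point of the shape: the model itself never touches the OS; every side effect goes through a small, auditable tool function, which is what makes a local agent loop like this safe to leave running in the background.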
Adit Rustagi
I expected Vox to be another flashy desktop demo that feels impressive for ten minutes and then becomes useless. It turned out to be more practical than I expected. The fact that it can work with files, search indexed folders, and keep track of longer tasks gives it more substance than the usual chat interface. It is still early and there are rough edges, but I came away thinking there is something real here.
Ahaan Singh

I tried Vox AI after getting frustrated with traditional chatbots and was surprised by how different it feels. Once I installed the DMG and granted the necessary permissions, Vox immediately began indexing my local files and desktop, not by sending anything to a server, but by running entirely on my Mac.

This local‑first design means there’s a short delay when tasks are heavy, but it also gives Vox a remarkable “set it and forget it” feel. I can tell it to research a topic, summarise my inbox or draft a reply, and it quietly works in the background while I carry on with my day.

One feature I recently started using a lot is Mac control: I can leave my laptop at home and get things done just by texting it. Vox emails me files, edits them, and does research, all while I'm out exercising or eating.

Because Vox runs entirely on your Mac, there are no usage limits or hidden costs. It also means your data is private by default: there is no login system, no analytics, and you can even disconnect from the internet and keep using it.

Arnav Gupta

Vox's first major update is live

You can now connect Vox to apps like WhatsApp and Telegram to automate messaging.

Vox also understands your Mac shortcuts — playing music on Spotify, setting reminders, and everyday actions are now effortless.