wangzhang wu

ora - Your personal simultaneous interpreter, on your Mac

Simultaneous interpreters used to be reserved for heads of state. Ora puts one on your Mac. Speak any language, see live translations stream into a floating caption card — entirely on Apple Silicon. No cloud. No account. Free forever.


Replies

wangzhang wu
👋 Hey Product Hunt — I'm the maker of Ora. Simultaneous interpreters — the ones you see behind glass booths at the UN — are some of the most elite professionals in the world. They translate in real time as the speaker is still talking, often juggling three or four languages in a session. It's one of the highest cognitive-load jobs on earth, and one of the most expensive services you can hire.

Ora puts one on your Mac. Free. Hit ⌘⇧T, start speaking, and translations stream into a floating caption card as you talk — usually before you've finished the sentence. It's not "transcribe, then translate" — it's a rolling, live interpreter that keeps pace with you.

Everything runs on your Mac's Metal GPU via MLX. No cloud. No account. No subscription. After the first model download, the whole thing works on an airplane.

Chinese ↔ English ↔ Japanese ↔ Korean ↔ Spanish ↔ French ↔ German and more. Swap source and target from the menu bar. Signed + notarized .dmg, macOS 15+ on Apple Silicon. There's also a Python reference implementation in the repo if you want to see the pipeline.

I'll be in the comments all day. Really curious which conversations you'd bring Ora to — travel, meetings, family calls, conference talks. 🙏
Lakshay Gupta

Not relying on cloud tools or APIs is a smart move. What trade-offs did you make between latency and accuracy when running everything locally on Apple Silicon?

wangzhang wu

@lak7 Thanks — for us it’s really a tradeoff between waiting for more context and staying live enough to be useful.

The pipeline is built around VAD endpointing plus rolling partial updates: Ora starts showing translation while you’re still speaking, keeps revising as the utterance grows, and only commits the final version after a short pause. That gets the experience much closer to simultaneous interpretation than “transcribe first, translate later.”
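The rolling-update loop above can be sketched in a few lines. This is an illustrative toy, not Ora's actual code: `is_speech` stands in for a real VAD, `translate_partial` stands in for the local MLX translation model, and `PAUSE_FRAMES` is a made-up endpointing threshold.

```python
# Sketch of "rolling partial updates + VAD endpointing": show a live,
# revisable partial translation while speech continues, and commit the
# final version only after a short pause. All names here are stand-ins.
from dataclasses import dataclass, field

PAUSE_FRAMES = 3  # consecutive silent frames that end an utterance (hypothetical)

def is_speech(frame: str) -> bool:
    """Stand-in VAD: treats empty frames as silence."""
    return bool(frame.strip())

def translate_partial(words: list[str]) -> str:
    """Stand-in translator: the real pipeline would run a local model."""
    return " | ".join(words)  # placeholder "translation"

@dataclass
class RollingInterpreter:
    buffer: list[str] = field(default_factory=list)
    silence: int = 0
    committed: list[str] = field(default_factory=list)
    partial: str = ""  # the live caption, revised as the utterance grows

    def feed(self, frame: str) -> None:
        if is_speech(frame):
            self.silence = 0
            self.buffer.append(frame)
            # Revise the on-screen caption while the speaker continues.
            self.partial = translate_partial(self.buffer)
        else:
            self.silence += 1
            if self.silence >= PAUSE_FRAMES and self.buffer:
                # Short pause detected: commit the final version.
                self.committed.append(translate_partial(self.buffer))
                self.buffer.clear()
                self.partial = ""

interp = RollingInterpreter()
for frame in ["hello", "world", "", "", "", "next", "utterance", "", "", ""]:
    interp.feed(frame)
print(interp.committed)  # two committed utterances
```

The key design point is that `partial` is overwritten on every speech frame, so the caption stays live and self-correcting, while `committed` only grows after endpointing — which is what makes this feel like interpretation rather than batch translation.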

The second knob is the quality tier: bigger local models improve nuance and terminology, but they're slower and heavier. So we expose that choice instead of hard-coding one point on the curve.
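A minimal sketch of what "exposing the knob" can look like. The tier names, model identifiers, and latency budgets below are hypothetical placeholders, not Ora's actual configuration:

```python
# Illustrative quality-tier config: let the user pick a point on the
# latency/accuracy curve instead of hard-coding one. All identifiers
# here are made up for the example.
from dataclasses import dataclass

@dataclass(frozen=True)
class QualityTier:
    name: str
    model: str               # hypothetical local model identifier
    target_latency_ms: int   # rough budget for partial caption updates

TIERS = {
    "fast": QualityTier("fast", "translate-small", 300),
    "balanced": QualityTier("balanced", "translate-medium", 700),
    "accurate": QualityTier("accurate", "translate-large", 1500),
}

def pick_tier(preference: str) -> QualityTier:
    # Fall back to the balanced point on the curve if unset or unknown.
    return TIERS.get(preference, TIERS["balanced"])

print(pick_tier("fast").model)    # translate-small
print(pick_tier("unknown").name)  # balanced
```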

For real conversations, we’ve found users usually prefer something that lands on time and gets refined in place, rather than something more polished that arrives too late.

Scott Sp

Nice, but what is the business model if it's really "Free"? What am I sharing?

wangzhang wu

@scott_rs Thanks — very fair question.

Ora is free because the heavy part runs locally on your Mac, on Apple Silicon. We are not paying cloud inference costs for every minute you speak, so the personal version can stay free.

You are not sharing your conversations with us. No account is required, and the audio/transcription/translation pipeline is designed to stay on-device. We do not sell user data, use recordings for training, or run an ad-based model.

The business model is simple: keep the core personal Mac interpreter free, and monetize optional advanced features later — things like team/enterprise deployment, admin controls, custom integrations, or priority support.

So the short answer: you are not the product. The local app is free; future paid features will be for heavier professional/team use cases.