
0xCal
Type + capture your meals to log cals w/ Apple Health Sync
230 followers
Most calorie apps feel like a chore – cluttered interfaces, endless database searches, designs stuck in 2015. 0xCal is different. Built for people who care about the apps they use. Describe what you ate naturally, or snap a photo. AI handles the rest. Dark mode first. Minimal UI. Native iOS feel. No ads, no clutter – just a beautiful tool that helps you reach your goals without compromising your home screen. The calorie tracker that finally belongs on your phone.

0xCal
@namedix Great launch!
Only one question - how do you know the nutrition values are accurate?
0xCal
@walarowska Thanks!
Honestly — it's not about being 100% perfect. It's about being accurate enough to know if you're on track or not.
The accuracy scales with how much detail you give:
→ "Had eggs for breakfast" = rough ballpark
→ "2 scrambled eggs with butter and toast" = solid estimate
→ "100g eggs, 10g butter, 50g bread" = very accurate
For known stuff like a Big Mac or Starbucks latte — it pulls exact public nutrition data.
Most days you don't need to know you ate exactly 487 kcal. You just need to know "I'm good" or "maybe go lighter for dinner."
The goal was removing friction, not chasing perfection. If you need precision, you can go detailed. If you just want a quick log, rough estimate works too.
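In pseudo-code terms, the idea is roughly this (an illustrative sketch only, not the app's actual code — the detail labels and margins are made up to show the principle):

```python
# Sketch: estimate precision widens or narrows with how much detail you give.
# "vague"     - e.g. "had eggs for breakfast"
# "described" - e.g. "2 scrambled eggs with butter and toast"
# "weighed"   - e.g. "100g eggs, 10g butter, 50g bread"

def estimate_range(base_kcal: float, detail_level: str) -> tuple[int, int]:
    """Return a (low, high) kcal band around a base estimate."""
    margins = {"vague": 0.40, "described": 0.20, "weighed": 0.05}
    m = margins[detail_level]
    return (round(base_kcal * (1 - m)), round(base_kcal * (1 + m)))

low, high = estimate_range(350, "described")
print(f"~{low}-{high} kcal")  # ~280-420 kcal
```

Same base number, very different confidence bands — which is why "I'm good" vs "go lighter at dinner" works even with rough input.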
0xCal
@curiouskitty This is the core UX challenge — and honestly what I spent the most time on.
For portion size: Gram always shows its assumption. Instead of just saying "scrambled eggs – 350 kcal", it says "scrambled eggs (~150g) – 350 kcal". So you immediately see what it's guessing and can call it out if it's wrong.
For invisible ingredients (oil, butter, sauces): AI makes educated guesses based on how dishes are typically made. Scrambled eggs? Assumes some butter. Stir fry? Assumes oil. It's not perfect, but it's a realistic starting point rather than pretending those calories don't exist.
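If it helps to picture it, the assumption-surfacing amounts to something like this (purely illustrative — the table and function names here are hypothetical, not 0xCal's code):

```python
# Sketch: render an estimate with its portion assumption visible, plus
# typical "invisible" ingredients based on how the dish is usually made.

HIDDEN_INGREDIENTS = {                      # typical-prep assumptions (grams)
    "scrambled eggs": [("butter", 10)],
    "stir fry": [("cooking oil", 15)],
}

def render_estimate(dish: str, grams: int, kcal: int) -> str:
    label = f"{dish} (~{grams}g) – {kcal} kcal"
    extras = HIDDEN_INGREDIENTS.get(dish, [])
    if extras:
        assumed = ", ".join(f"{name} ~{g}g" for name, g in extras)
        label += f" (assumes {assumed})"
    return label

print(render_estimate("scrambled eggs", 150, 350))
# scrambled eggs (~150g) – 350 kcal (assumes butter ~10g)
```

Everything the model guessed is on the surface, so there's something concrete to push back on.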
The balance between speed and control is where the chat comes in. Instead of tapping into fields and dropdowns to make corrections, you just... talk:
"Actually it was 3 eggs, no butter"
"Add cheese"
"Portion was smaller, maybe 100g"
The AI adjusts and you confirm. Feels like a quick back-and-forth, not data entry.
The goal: 90% of the time, you accept the first estimate and move on in 5 seconds. The other 10%, you have a fast way to correct without it feeling like homework.
Still not perfect — but way less friction than traditional tracking.
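Under the hood, a correction like "portion was smaller, maybe 100g" is basically a scale-and-confirm step. A minimal sketch (hypothetical names, not the shipped code):

```python
# Sketch: fold a chat correction into an existing entry by rescaling it.

from dataclasses import dataclass

@dataclass
class MealEntry:
    name: str
    grams: float
    kcal: float

def apply_correction(entry: MealEntry, scale: float) -> MealEntry:
    """Rescale an entry, e.g. the user says 'portion was smaller, maybe 100g'."""
    return MealEntry(entry.name, entry.grams * scale, round(entry.kcal * scale))

eggs = MealEntry("scrambled eggs", grams=150, kcal=350)
smaller = apply_correction(eggs, scale=100 / 150)  # user's 100g vs assumed 150g
print(smaller)  # ~100g, ~233 kcal
```

The chat just turns that one-line math into a sentence you can type instead of a form you have to fill.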
This is starting to become my favorite app. I love the simplicity and I hope it stays that way.
I do have a few points of feedback:
I tried changing my goals to lose weight instead of maintaining it, but I couldn't find that option.
I'm ending up entering some foods in my own language as those foods don't really have direct name translations. Those get picked up really well by the AI, but I feel like the energy values aren't pulled super accurately. Is there a way to expand the database?
Is there a case to be made for tracking fiber intake?
Any lifetime licenses for early adopters & testers?
That's about it. Great app and I enjoy using it due to its simplicity.
0xCal
@neven_jevtic1 This means so much — thank you! Simplicity is the core promise, and I'm committed to keeping it that way. 🙏
To your feedback:
Changing goals: You're right, it's not possible yet — but I'm literally working on this right now! Goal editing + weekly summaries dropping this week. 🚀
Local language foods: Regional dishes can be tricky since the AI relies on what it "knows" from training data rather than a fixed database. More detailed input helps (like "Polish żurek with sausage, 300ml bowl"). I'm exploring ways to improve this, maybe letting you correct/save custom foods.
Fiber tracking: Great call. Adding it to the feature requests — you can actually vote on upcoming features inside the app now (Profile → Feature Requests).
Lifetime license: DM me your email on X and I'll sort something out for early supporters like you. 🤝
Thanks for the thoughtful feedback — users like you make building this worth it!
Interesting. Can I set a goal and have the app give me advice? For example, the goal is to lose 10 kg. I enter my parameters into the app, take a photo of my food, and it advises me: to reach your goal, reduce the portion by 20% next time. I think this is something that could significantly improve the UX.
0xCal
@mykyta_semenov_ Actually the app does give some feedback already — you get visual cards showing if you're above your limit (red) or if you earned bonus calories from a workout (green). So you always know where you stand.
What it doesn't do yet is the contextual portion advice you mentioned — like "reduce this portion by 20% next time" or "swap the fries for salad to stay on track."
That's a great next step though. Right now it tells you "you're over" — but telling you exactly what to adjust would be way more actionable.
Adding to the roadmap! Thanks for the idea.
0xCal
@mykyta_semenov_ Interesting idea! Right now 0xCal focuses on being a really good calorie tracker — visual feedback with green/red cards showing if you're on track, plus how workouts and steps balance things out.
Proactive diet advice is a bit outside the current scope — I want to keep it simple and do one thing well rather than become another bloated health app. But I hear you, and maybe down the road as the core gets more solid.
For now, keeping it minimal is the priority. Thanks for the suggestion though! 🙏
The "apps as punishment" analogy resonates! Clean UI matters when you're using something daily. How does 0xCal handle restaurant meals where you don't have exact nutrition info? Does the AI learn your preferences over time?
0xCal
@nora_studiohedera Thanks! Yeah, using an app you hate every day is a recipe for quitting by February 😅
For restaurant meals – just describe what you ordered. "Chicken Caesar salad from Sweetgreen" or "pad thai from the place down the street." The AI estimates based on typical restaurant portions.
For chains like McDonald's, Chipotle, Starbucks – it pulls exact nutrition since that data is public.
And accuracy scales with how much detail you give:
→ "Had pasta for lunch" = rough ballpark
→ "Spaghetti carbonara, medium portion" = better estimate
→ "About 200g pasta, creamy sauce, bacon bits" = pretty accurate
You don't always need precision – most days you just want to know "am I roughly on track?" But when it matters, you can go detailed.
As for learning preferences – not yet, but it's on the roadmap! Would love to have it remember "my usual breakfast" or your go-to orders. Coming soon!
I track meals regularly for fitness, and 0xCal immediately caught my attention — I downloaded it right away to try it out. The UI and typography are genuinely fun, and it feels refreshingly minimal compared to most calorie apps.
Quick question: AI photo recognition in most apps isn’t very accurate (especially for mixed or homemade meals). Curious how 0xCal improves accuracy and makes corrections easy.
0xCal
@sylvia_weng99
Thank you – really glad the design landed for you! That was the whole mission.
You're totally right that photo AI isn't magic, especially for homemade meals where it can't know your exact recipe. Here's how 0xCal handles it:
The photo is a starting point, not the final answer. After you snap, you land in a chat where you can refine it naturally:
→ Gram: "Looks like scrambled eggs with vegetables and toast, ~400g total – ~450 kcal"
→ You: "It was 3 eggs and add some cheese"
→ Gram: "Updated – ~520 kcal"
So instead of hunting through fields and dropdowns to make corrections, you just... talk to it. Tell it what's wrong and it adjusts.
For complex homemade meals, I usually just describe ingredients: "Made a stir fry – chicken, broccoli, bell peppers, soy sauce, bit of oil." Works better than hoping the camera figures it out.
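The whole refine-in-chat loop from that example boils down to: photo guess first, then each chat message patches the entry with the model's re-estimate. Rough sketch (illustrative structure only, not the real pipeline):

```python
# Sketch: start from the photo-recognition guess, then apply each chat
# correction as a re-estimated patch over the entry.

guess = {"dish": "scrambled eggs with vegetables and toast",
         "grams": 400, "kcal": 450}

# "It was 3 eggs and add some cheese" -> the model re-estimates the calories
corrections = [{"kcal": 520}]

for update in corrections:
    guess.update(update)

print(f"Updated – ~{guess['kcal']} kcal")  # Updated – ~520 kcal
```

The photo never has to be right — it just has to be a good enough first draft that one message fixes it.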
@namedix Thanks for the explanation! This really resonates with me — a lot of calorie apps have inaccurate AI recognition, and I eventually ended up doing a "take a food photo + chat with Gemini to log it" workaround...which weirdly worked pretty well 😄
What you're building feels like a much more polished and intentional version of that workflow, and the interaction design makes a lot of sense. I'm excited to use it more seriously!
0xCal
@sylvia_weng99 That’s awesome to hear — and honestly your “photo + chat with Gemini” workaround is exactly the workflow I kept doing too 😄
0xCal is basically my attempt to turn that hack into something polished and fast: snap/describe → get assumptions (incl. portion) → correct in chat if needed → done.
If you end up using it seriously, I’d love to hear what feels great and what feels annoying in real life use. Thanks again for trying it!
0xCal
@zahran_dabbagh Thanks! Good news — that feature already exists! 😄
You can snap a photo of your food (or a nutrition label) and the AI analyzes it instantly. Works from the camera or photo library.
It estimates the dish, portion size, and nutrition — then you land in the chat where you can correct anything that looks off. So if it guesses "300g pasta" but it was actually a small portion, you just tell it and it adjusts.
For packaged food, snapping the nutrition label gives you the most accurate data since it reads the actual values.
Give it a try and let me know how it works for you!