Launched this week

Caloris: AI Calorie Tracker
Snap a photo. Know exactly what you ate. Track it all.
12 followers
Most calorie apps fail because logging takes too long. Caloris fixes that. Point your camera at any meal and get an instant calorie and macro breakdown. No manual searching, no guessing. Snap, confirm, done.

Built for real people, not just gym obsessives. Whether you track macros or just want to eat better, Caloris makes it effortless.

- AI food scanner — any meal, any dish
- Macro tracker for protein, carbs, fats
- Personalized calorie targets
- Barcode scanner for packaged foods

@noumanbinsabir Congrats on the launch! I really like the idea of removing friction from calorie logging. Using the camera as the main interaction feels very natural for this kind of product.
From a UX perspective I'm also curious about the transparency of the AI estimation. Many people still log meals manually because they want to trust the numbers.
It could be interesting to show how the estimate was calculated, for example ingredient breakdown or portion assumptions, so users feel more confident about the result.
Also curious how the correction flow works if the AI estimate is slightly off, for example the dish or the portion size. Being able to quickly adjust the result seems really important for trust.
@annatimopheeva Thank you so much, Anna, this is exactly the kind of thoughtful feedback that makes a launch day worthwhile.
Great news! Caloris already does this. When you scan a meal, you get a complete breakdown showing detected ingredients, portion assumptions, calories, macros (protein, carbs, fats), and even micronutrients and vitamins. The reasoning is fully visible, not just the final number. The app also shows nutrition per 100g of the food, so users can see exactly how the estimate was calculated and feel confident in the result.
On the correction flow, users can give feedback to the AI for instant fixes to the result, and there is also an option to edit the result manually.
Would love for you to try it and see the breakdown in action, curious whether it hits the transparency bar you had in mind. Your perspective on the UX would be really valuable. 🙏
@noumanbinsabir Thanks for the detailed explanation! The ingredient and portion breakdown sounds like a great way to build trust in the AI estimate.
I’ll give it a try and see how the correction flow feels in practice. The transparency around how the numbers are calculated is definitely an interesting UX challenge for AI products.
Solo dev shipping a complete AI food scanner is impressive. The camera-first approach is right: logging meals manually is exactly why most people quit tracking within a week. Curious how it handles Indian food, where one plate has 5 different items mixed together. Congrats on the launch, Nouman.
@siddhant_khurana Thank you so much, Siddhant. For Indian food, or any dish that contains multiple items, the app analyzes the meal and breaks it down into all the identified components. Users can view each item detected within the dish, while the app currently provides a combined estimate for the entire portion. In the future, I’m planning to add an option to display estimates for each identified item separately, making the insights even more detailed and useful.