Ash draws strong praise for its compassionate tone, late‑night availability, and the trust it builds, with many users saying it helped them open up when human support felt out of reach. Reviews highlight a calming, well-crafted UI and intuitive flow that makes guidance feel immediate and supportive. Critiques note slow or uninterruptible responses and occasional overly long replies, which can frustrate more technical users. Overall, sentiment leans highly positive: users credit Ash with meaningful emotional support, thoughtful design, and a useful bridge to healthier habits.
How will this change the way we interact with therapists? Do you think there's a world in which there are AI clones of your therapist that you talk to?
Hi Team @Ash , congratulations on the launch. As a UX designer reviewing the application, I noticed a disconnect in the onboarding flow. After being asked what’s on my mind and choosing “keyboard” to type my response, I expected the chatbot to continue the conversation based on my input. Instead, it jumped to a blank chat screen with a generic greeting: “Hey, great to meet you. Ready to get started?”
This created a sense of mismatch between my input and the system’s response. As a user, I expected continuity—some acknowledgment or integration of what I had just shared. I’d love to see this addressed in a future iteration, as closing this feedback loop would create a more cohesive and engaging user experience, especially for first-time users encountering the product.
Hey @neil_parikh5, do you plan to provide services in languages other than English? For example, Turkish?
An AI designed specifically for therapy is both bold and much needed. The space between mental health support and technology has long felt like a gap, and Ash seems to be bridging it with real care. The interface feels calming and intentional, and it's refreshing to see a tool that doesn’t just talk mental wellness but is actually built for it.
How does Ash ensure emotional safety and handle edge cases like crisis scenarios? Would love to hear how you're balancing AI support with real human needs.
Congrats on this powerful launch
This looks lovely! I’m curious, though—with all the discussions around sharing personal feelings with an AI, unlike a therapist or doctor, there’s no confidentiality agreement in place. How are our conversations stored and protected in this case?
"It won't always agree with you." I think this is one of the big pluses. Even when coding, it frustrates me that Claude or OpenAI will just agree with whatever you say.
Great idea! And this is something we have been dreaming of as AI evolves: a caring friend who will always help.
It seems we are getting closer and closer to the ‘Her’ movie 😄