14 days to build. Voice input, AI extraction, relationship intelligence layer, the whole thing.
The idea: walk out of any important conversation, vent for 60 seconds, and let AI pull out everything that matters. Prices quoted. Commitments made. The personal detail they dropped in passing. All of it structured and surfaced before the next conversation.
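The extraction step described above could be sketched like this. This is a minimal illustration, not the app's actual code: the field names, the JSON shape, and the idea of prompting the model to return exactly these keys are all my assumptions.

```python
import json
from dataclasses import dataclass, field


@dataclass
class ConversationNotes:
    """Structured record pulled from one 60-second voice memo (hypothetical schema)."""
    prices_quoted: list = field(default_factory=list)
    commitments: list = field(default_factory=list)
    personal_details: list = field(default_factory=list)


def parse_extraction(raw_json: str) -> ConversationNotes:
    """Parse the model's JSON reply into a structured record.

    Assumes the model was prompted to return exactly these three keys;
    missing keys fall back to empty lists so one bad reply doesn't crash the app.
    """
    data = json.loads(raw_json)
    return ConversationNotes(
        prices_quoted=data.get("prices_quoted", []),
        commitments=data.get("commitments", []),
        personal_details=data.get("personal_details", []),
    )


# Hypothetical model reply after transcribing a vented voice memo:
reply = (
    '{"prices_quoted": ["$4,500 for the annual plan"],'
    ' "commitments": ["send the proposal by Friday"],'
    ' "personal_details": ["daughter starts college in the fall"]}'
)
notes = parse_extraction(reply)
```

The point of forcing a fixed schema is that "surfaced before the next conversation" becomes a simple query over structured fields instead of re-reading raw transcripts.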
Here's what surprised me building this:
Voice input is a completely different product than text input
As an SAP ABAP Developer, I had app ideas sitting in my head for years. Before AI, the learning curve for mobile development felt impossibly steep. Now? I shipped my iOS app in weeks.
But here's my honest question:
How many of us vibe coders are actually building sustainable products?
With improved generative models now widely available, we're reaching a point where a single prompt can produce full front-end code and simple functioning apps. What factors determine whether development roles can be replaced by models? What's our added value as humans?
Hi friends, I recently started vibe coding as a way to explore a product that I've been wanting to build for myself and others. I started in the world of V0 and Lovable, then moved into Cursor, but moved away when I felt like I was wasting more credits on debugging and re-providing context. I feel like I've finally hit my stride with Claude Code. I would appreciate any tips that you would have given yourself at this point in my journey. Thanks all!
Elon Musk openly admitted that Claude Opus 4.5 is outstanding, then added that Tesla engineers still prefer Grok. The most predictable plot twist ever: Guy who owns Grok says his own AI is better.
Obviously there is a ton of hype around OpenClaw, with everyone rushing to get Mac Minis and set it up. But what happens now that Anthropic has created a seemingly more secure and easier-to-set-up version of OpenClaw with Channels? Do you think OpenAI will adapt quickly enough and evolve OpenClaw to stay at the forefront of the localized agent space? Or will Anthropic run away with it? Would love to hear what you all think. Btw, our current operation at Honestly is using Claude Code Channels; check out our socials (linked from our #4 PH listing) to follow along with how we've been using them!
I've been spending a lot of time thinking about how people actually work with prompts while building a tool in this space, and I realized I have way more questions than answers.
When my wife Noa and I heard that MTV was officially shutting down, it felt like the end of an era. As 90s kids, we missed that specific "linear" experience: the joy of just turning on the TV and being surprised by a music video without an algorithm getting in the way.
As a data engineer with little experience in full-stack software development, I've been experimenting with vibe coding tools to move fast in the early stages.
My flow looked like this:
Prototyped the main UI in V0 (after 300+ iterations/conversations back and forth)
Been noticing how quickly vibe coding has become a real workflow lately.
A few months ago most of us were still writing everything manually or only using AI for small snippets. Now it feels like the process has shifted to describing what you want and iterating with the AI until the product behaves the way you imagine.