Tessa Kriesel

User interviews are lying to you

Not because developers lie to you.

Because they can't accurately report on what they actually do. Memory distorts. Social desirability bias kicks in. Nobody says "I was completely lost for the first 10 minutes"—they say "the onboarding was a little confusing at first."

That's not dishonest. It's just how humans work.

I've run hundreds of developer-focused interviews across my career. They're genuinely useful for some things: understanding motivation, surfacing feature ideas, learning vocabulary, building relationships.

They're terrible for one specific thing: understanding where and why developers drop off when they first try your product.

The problem: that one specific thing is exactly what most early-stage dev tool founders are trying to figure out when they schedule those interviews.

The developers who dropped off aren't responding to your interview request. The ones who show up are the ones who stayed—which means you're doing exit interviews with people who didn't exit. Your "user research" is a biased sample and it's steering your roadmap.

What actually tells you the truth: watching a developer who matches your target profile try your product for the first time, without you in the room, with no prompting, captured on video with full behavioral data.

That's an uncomfortable shift. It means your 20-interview backlog might be pointing in the wrong direction.

Curious whether this resonates—or if you've found a way to make interviews actually work for early activation research.

What's your honest experience?
