What's something you measured that completely changed how you build product?
For months, we were building features based on what users said they wanted. Feature requests.
Sales calls. "It would be great if you added X."
We built X. Nobody used it.
So we stopped trusting what people said and started tracking what they actually did.
The dataset
We pulled 12 months of usage data from Rankfender. 1,247 active users. 27 distinct features across 8 modules. We looked at three metrics for each feature:
Adoption rate — % of users who tried it at least once
Retention rate — % who used it again after 30 days
Request volume — how many users asked for it
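For anyone who wants to reproduce this on their own data, here's a minimal sketch of how the three metrics can be computed from a raw usage log. The schema (user_id, feature, timestamp) and the 30-day retention proxy are illustrative assumptions, not our actual pipeline.

```python
import pandas as pd

def feature_metrics(events: pd.DataFrame, requests: pd.DataFrame,
                    active_users: int) -> pd.DataFrame:
    # `events` and `requests` are assumed to have user_id, feature,
    # timestamp columns (hypothetical names).
    per_pair = events.groupby(["user_id", "feature"])["timestamp"]
    first_use, last_use = per_pair.min(), per_pair.max()
    # Adoption: share of active users who touched the feature at least once.
    adoption = first_use.groupby("feature").size() / active_users
    # Retention (30d), as a proxy: share of adopters still using the
    # feature 30+ days after they first touched it.
    came_back = (last_use - first_use) >= pd.Timedelta(days=30)
    retention = came_back.groupby("feature").mean()
    # Request volume: distinct users who asked for the feature.
    request_volume = requests.groupby("feature")["user_id"].nunique()
    return pd.DataFrame({
        "adoption": adoption,
        "retention_30d": retention,
        "requests": request_volume,
    }).fillna(0)
```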
The gap between what people asked for and what they actually used was massive.
What the data said
| Feature | Adoption | Retention (30d) | User requests | What we decided |
|---|---|---|---|---|
| Workflow Engine (automation) | 78% | 64% | 3 | Prioritize now |
| RAIVE (AI visibility tracking) | 71% | 58% | 7 | Prioritize now |
| RCGE (content generation) | 68% | 52% | 11 | Prioritize now |
| ROSE (on-site optimization) | 45% | 31% | 9 | Keep, enhance later |
| Social/Community Engine | 14% | 4% | 19 | Delay, needs more validation |
| RASE (app store tracking) | 11% | 3% | 22 | Delay, low retention signal |
| Ranklink (backlink discovery) | 9% | 2% | 24 | Delay, not sticky yet |
The features people asked for most (RASE, Ranklink, Social Engine) were used by less than 15% of users. The features they barely mentioned (Workflow Engine, RAIVE, RCGE) had adoption rates above 68%.
What we learned
People are terrible at predicting what they'll actually use. They ask for things that sound good. They use things that solve a real pain.
The Workflow Engine had 3 requests. Not 30. Not 300. Three. But 78% of users touched it. And 64% came back to it. That's not a feature request. That's a product.
RASE had 22 requests. We built an MVP. 11% of users tried it. 3% came back. We spent 4 months building something people asked for but didn't actually stick with.
The signals we track now
Before building anything, we now look at three things:
1. Usage patterns — What do power users do every day? We tracked the top 10% of accounts by engagement. Their most-used feature was the Workflow Engine. Not because they requested it. Because it saved them time. They built automation for content publishing, citation alerts, and competitor tracking. That told us more than any sales call.
2. Support ticket clusters — What problems keep coming up? We used RAISA to cluster 1,200 support tickets. The largest cluster (34%) was "automation rules are confusing." Not "add app store tracking." Not "add backlink discovery." People wanted us to fix what was already there, not add new stuff. So we paused new features for 6 weeks and just fixed automation.
3. Churn signals — What do people stop using before they leave? We tracked feature usage in the 30 days before cancellation. Users who stopped using RAIVE were 4.2x more likely to churn. Users who never touched the Workflow Engine were 3.7x more likely to churn. Users who used RASE had no correlation with retention. That told us where to invest.
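If you want to reproduce these signals on your own data, here are rough sketches. First, the power-user cut from (1). Engagement here is just raw event count per account, which is a simplification; column names are illustrative.

```python
import pandas as pd

def power_user_features(events: pd.DataFrame, top_frac: float = 0.10) -> pd.Series:
    # Engagement proxy: total event count per account.
    engagement = events.groupby("user_id").size()
    power_users = engagement[engagement >= engagement.quantile(1 - top_frac)].index
    # Rank features by how many power users touch them at all.
    top = events[events["user_id"].isin(power_users)]
    return top.groupby("feature")["user_id"].nunique().sort_values(ascending=False)
```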
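For (2), we ran the clustering through RAISA, but the idea is generic. A rough equivalent with off-the-shelf tools looks like this; TF-IDF plus k-means is a stand-in for illustration, not what RAISA actually does.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

def cluster_tickets(tickets: list[str], k: int = 12):
    # Vectorize raw ticket text, then group tickets into k themes.
    X = TfidfVectorizer(stop_words="english", max_features=5000).fit_transform(tickets)
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Cluster sizes as a share of all tickets, largest first -- this is
    # where a "34% of tickets are about automation" number comes from.
    shares = np.bincount(labels, minlength=k) / len(tickets)
    return labels, sorted(enumerate(shares), key=lambda t: -t[1])
```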
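And the churn numbers in (3) (4.2x, 3.7x) are simple relative risk: churn rate among users with a given usage flag, divided by churn rate among users without it. A sketch, assuming one row per account with hypothetical boolean columns:

```python
import pandas as pd

def churn_lift(users: pd.DataFrame, flag: str) -> float:
    # `users` has one row per account, a boolean `churned` column, and a
    # boolean flag such as `stopped_using_raive` (hypothetical names).
    exposed = users.loc[users[flag], "churned"].mean()
    baseline = users.loc[~users[flag], "churned"].mean()
    return exposed / baseline  # e.g. ~4.2 for the RAIVE drop-off flag
```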
What we decided
Building now:
Workflow Engine v2 — Conditional logic, scheduled runs, webhook triggers. 78% adoption, 64% retention. This is what keeps people around.
RAIVE enhancements — Real-time citation alerts, competitor share of voice, platform-specific tracking (ChatGPT vs Perplexity vs Gemini). 71% adoption, 58% retention.
RCGE improvements — Proofreader v2, information gain scoring, bulk content generation. 68% adoption, 52% retention.
Delayed (not killed, just not now):
RASE (app store tracking) — 11% adoption, 3% retention. The signal isn't there yet. We'll revisit when app store AI visibility becomes a bigger pain point.
Ranklink (backlink discovery) — 9% adoption, 2% retention. People asked for it, but they didn't stick with it. We need to understand why before investing more.
Social/Community Engine — 14% adoption, 4% retention. The idea is good. The execution isn't there yet. We're collecting more data on what people actually need from social listening.
What we fixed instead of building new:
Automation UI — Simplified the workflow builder. Support tickets on automation dropped 47%.
RAIVE data freshness — Increased citation update frequency from daily to twice daily. User satisfaction scores improved 22%.
RCGE proofreader — Added "information gain" scoring. Content quality scores improved 34%.
The results
After 6 months of building based on usage, not requests:
| Metric | Before | After | Change |
|---|---|---|---|
| Monthly churn | 8.2% | 5.7% | -30% |
| Support tickets/month | 247 | 173 | -30% |
| Time to value (days) | 14 | 6 | -57% |
| NPS | 42 | 61 | +19 pts |
| Feature adoption (avg) | 34% | 51% | +50% |
We didn't add more features. We fixed the ones people actually used. That was the lesson.
What this means for you
If you're building something, stop asking what people want. Look at what they do.
Pull usage data. What features do your power users touch every day? That's your roadmap.
Cluster support tickets. What problems keep coming up? Fix those before adding new things.
Track churn signals. What do people stop doing before they leave? That's where you're losing them.
You don't need more data. You need to look at the data you already have.
What I'm curious about
What's something you measured that completely changed how you build? Not what people said. What the data showed.
Imed Radhouani
Founder & CTO – Rankfender
Evidence-based product development



Replies
YES! "I would use that feature 24/7" translates to "I will use it once or twice before never touching that feature ever again". One of the most useful data is the churn signal, what is the last feature used before leaving. Solving the pain points are almost always better than implementing new features.
Rankfender
@syaman You nailed it. The "I would use that 24/7" line is almost always a lie. Not because people are dishonest. Because they genuinely believe it. They just have no idea what they'll actually do.
The churn signal is the one that saved us. We were chasing new features because they sounded exciting. The data showed that people who left weren't missing new features. They were frustrated with the ones we already had. Slow loading. Buggy automation. Confusing UI.
Fixing the boring stuff reduced churn more than any new feature ever did.
What's the most surprising churn signal you've found?
@imed_radhouani Actually the most surprising one was onboarding. We kept it very minimal at first to reduce friction, but failed to communicate our value proposition to users. We had terrible activation rates on free trials. We added more meaningful steps to onboarding, clearly setting expectations for what they'd get from our tool, and activation rates jumped.
Rankfender
@syaman That's counterintuitive. Most people think "less friction" is always better. But if they don't understand the value, they won't stick around anyway.
You added steps and got better results. That's the kind of data that goes against every "best practice" post on LinkedIn.
What did you add that made the biggest difference?
We added this small tool called insighto, which lets users upvote the features they actually need. It gave us quite a few insights: the feature we thought would be a game changer wasn't that popular or requested by users. But it definitely helped a lot!