
A while ago, our team was asked to redesign a website. It wasn’t the first time we’d had such a request. But this time, we took a different approach:
"Let’s first analyze real user behavior. Let’s find out what’s not working — before we change the design."
We explained that collecting data, identifying pain points, and measuring key UX metrics could help us redesign with confidence, not assumptions.
What We Could Have Done
If the client had accepted our suggestion, here’s how things might have gone:
Using tools like Microsoft Clarity, Google Analytics, or Mixpanel, we could have:
Tracked user flows and drop-off points
Watched real session replays
Studied heatmaps to see attention areas
Measured bounce rates, time on task, and error rates
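For illustration, here is a minimal sketch of what that instrumentation might have looked like using Mixpanel's browser SDK (Clarity and Google Analytics would capture session replays, heatmaps, and page-level metrics alongside it). The task names, event properties, and helper functions below are hypothetical examples, not the client's actual setup:

```typescript
// Minimal instrumentation sketch, assuming the mixpanel-browser package.
// Event names and properties are hypothetical, for illustration only.
import mixpanel from "mixpanel-browser";

mixpanel.init("YOUR_PROJECT_TOKEN");

// Start timing a task; Mixpanel attaches the elapsed duration
// when an event with the same name is tracked later.
export function startTask(taskName: string): void {
  mixpanel.time_event(taskName);
}

// Record a successful completion (time on task is added automatically).
export function completeTask(taskName: string): void {
  mixpanel.track(taskName, { outcome: "success" });
}

// Record an abandoned task or a user-facing error.
export function failTask(taskName: string, reason: string): void {
  mixpanel.track(taskName, { outcome: "failure", reason });
}
```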
We could have prepared a report that answered:
Which parts of the site confuse users?
Where do users quit?
Are CTAs being noticed and clicked?
Instead of redesigning based on opinion or trends, we would redesign based on evidence — fixing what actually matters.
Post-launch, we could re-measure the same UX metrics to see what improved:
Did Task Success Rate go up?
Did Error Rate drop?
Did Time on Task improve?
Was user satisfaction higher?
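To make that before/after comparison concrete, here is a rough sketch of how the core metrics could be recomputed from exported task attempts. The TaskAttempt shape and the variable names are assumptions about what an analytics export might contain, not a real schema:

```typescript
// Sketch: recomputing UX metrics from exported task attempts.
// The TaskAttempt shape is an assumed export format, not a real schema.
interface TaskAttempt {
  succeeded: boolean;      // did the user finish the task?
  errors: number;          // user-facing errors during the attempt
  durationSeconds: number; // time on task
}

interface UxMetrics {
  taskSuccessRate: number; // successful attempts / all attempts
  errorRate: number;       // average errors per attempt
  avgTimeOnTask: number;   // mean duration, in seconds
}

export function computeMetrics(attempts: TaskAttempt[]): UxMetrics {
  const n = attempts.length || 1; // avoid division by zero
  return {
    taskSuccessRate: attempts.filter(a => a.succeeded).length / n,
    errorRate: attempts.reduce((sum, a) => sum + a.errors, 0) / n,
    avgTimeOnTask: attempts.reduce((sum, a) => sum + a.durationSeconds, 0) / n,
  };
}

// Usage sketch: `beforeAttempts` and `afterAttempts` would come from the
// analytics export for each period (hypothetical variable names).
// const before = computeMetrics(beforeAttempts);
// const after = computeMetrics(afterAttempts);
```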
What UX Metrics Tell Us (And Why They Matter)
Here are a few important metrics we wanted to measure:
Time on Task: How long does it take users to complete something? If it takes too long, something may be unclear.
Task Success Rate: Can users complete tasks without help or errors? Higher success means better usability.
Error Rate: How many mistakes do users make while interacting? Errors show friction in the design.
Heatmaps and session replays: Where do users click? Where do they hesitate? These tools reveal real interaction patterns.
Bounce Rate: Do users leave right away? A high bounce rate often means a poor first impression or confusing content.
Sources: Nielsen Norman Group, MeasuringU, Microsoft Clarity, Hotjar
Why This Approach Matters
What we proposed wasn’t just "extra work".
It was standard UX practice based on leading design principles.
According to the Nielsen Norman Group, measuring behavior before and after a design change is essential for understanding if your redesign was truly effective.
Without that data, redesigns are just educated guesses.
Final Thoughts
In the end, the client got their new interface.
But we were left wondering — did it really solve their users’ problems?
If we had been able to use data-driven insights, we could have delivered:
A smoother user flow
Measurable improvements
Fewer errors and drop-offs
Higher user satisfaction
Sometimes, the hardest part of UX isn’t the design.
It’s helping people see the value of research and measurement.
Next time someone says “just make it look better,” we’ll keep asking:
“What exactly needs to be better?”
Because without data, design is just a guess — and users deserve better.