
What Happened Before the Redesign

"We suggested a UX-based redesign strategy — but the client said no."

A while ago, our team was asked to redesign a website. It wasn’t the first time we’d had such a request. But this time, we took a different approach:

"Let’s first analyze real user behavior. Let’s find out what’s not working — before we change the design."

We explained that collecting data, identifying pain points, and measuring key UX metrics could help us redesign with confidence, not assumptions.

Unfortunately… the client declined. They wanted the new interface, and they wanted it fast.


What We Could Have Done

If the client had accepted our suggestion, here’s how things might have gone:

1. Collect User Behavior Data

Using tools like Microsoft Clarity, Google Analytics, or Mixpanel, we could have (see the tracking sketch after this list):

  • Tracked user flows and drop-off points

  • Watched real session replays

  • Studied heatmaps to see attention areas

  • Measured bounce rates, time on task, and error rates

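To make that concrete, here is a minimal tracking sketch, assuming a site that already loads the standard Google Analytics 4 gtag.js snippet. The event names and parameters (checkout_step, checkout_error, and the form-handler usage) are hypothetical examples, not the client's actual funnel.

```typescript
// Assumes the standard GA4 gtag.js snippet is already loaded on the page,
// so the global `gtag` function exists at runtime.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, unknown>
): void;

// Hypothetical checkout-funnel instrumentation: every step and every error
// becomes an event, so drop-off points show up directly in the reports.
export function trackCheckoutStep(step: number, stepName: string): void {
  gtag("event", "checkout_step", {
    step,                   // e.g. 1 = cart, 2 = address, 3 = payment
    step_name: stepName,
  });
}

export function trackFormError(fieldName: string, message: string): void {
  gtag("event", "checkout_error", {
    field: fieldName,       // which input failed validation
    error_message: message, // what the user saw
  });
}

// Example usage inside a (hypothetical) payment form handler:
// trackCheckoutStep(3, "payment");
// trackFormError("card_number", "Card number is invalid");
```

With events like these flowing in, the drop-off and error questions in the audit below stop being guesses.
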
2. Create a UX Audit Report

We could have prepared a report that answered:

  • Which parts of the site confuse users?

  • Where do users quit?

  • Are CTAs being noticed and clicked?

3. Redesign Based on Evidence

Instead of redesigning based on opinion or trends, we would redesign based on evidence — fixing what actually matters.

4. Compare Before vs. After

Post-launch, we could re-measure the same UX metrics to see what improved (a small comparison sketch follows this list):

  • Did Task Success Rate go up?

  • Did Error Rate drop?

  • Did Time on Task improve?

  • Was user satisfaction higher?

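Assuming we had a baseline snapshot from before the launch and a fresh one afterwards, the comparison itself is simple arithmetic. The figures below are invented purely for illustration.

```typescript
// Hypothetical before/after snapshot of the same UX metrics.
// All numbers are made up for the example.
interface UxSnapshot {
  taskSuccessRate: number;      // fraction of tasks completed, 0..1
  errorRate: number;            // average errors per task attempt
  medianTimeOnTaskSec: number;  // seconds
  bounceRate: number;           // fraction of single-page sessions, 0..1
}

const before: UxSnapshot = { taskSuccessRate: 0.68, errorRate: 0.42, medianTimeOnTaskSec: 95, bounceRate: 0.57 };
const after: UxSnapshot  = { taskSuccessRate: 0.81, errorRate: 0.23, medianTimeOnTaskSec: 61, bounceRate: 0.44 };

// Percentage change per metric. A positive change is good for taskSuccessRate;
// a negative change is good for errorRate, time on task, and bounceRate.
for (const key of Object.keys(before) as (keyof UxSnapshot)[]) {
  const change = ((after[key] - before[key]) / before[key]) * 100;
  console.log(`${key}: ${change.toFixed(1)}% change`);
}
```
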

What UX Metrics Tell Us (And Why They Matter)

Here are a few important metrics we wanted to measure:

Time on Task

How long does it take for users to complete something?
If it’s too long, something may be unclear.

Task Success Rate

Can users complete tasks without help or errors?
Higher success means better usability.

Error Rate

How many mistakes do users make while interacting?
Errors show friction in the design.

Heatmaps & Session Replays

Where do users click? Where do they hesitate?
These tools reveal real interaction patterns.

Bounce Rate

Do users leave right away?
High bounce often means poor first impressions or confusing content.

Sources: Nielsen Norman Group, MeasuringU, Microsoft Clarity, Hotjar

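To show how these numbers come out of raw data, here is a small sketch that computes Task Success Rate, Error Rate, and median Time on Task from a set of recorded task attempts. The record shape and sample values are hypothetical; real ones would come from whatever analytics tool is in place.

```typescript
// One record per observed task attempt (e.g. "complete a purchase").
// The shape is hypothetical; a real export from Clarity, GA, or Mixpanel differs.
interface TaskAttempt {
  completed: boolean;   // did the user finish without help?
  errorCount: number;   // validation failures, dead clicks, etc.
  durationSec: number;  // time from task start to completion or abandonment
}

function median(values: number[]): number {
  const sorted = [...values].sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 ? sorted[mid] : (sorted[mid - 1] + sorted[mid]) / 2;
}

function summarize(attempts: TaskAttempt[]) {
  const taskSuccessRate = attempts.filter(a => a.completed).length / attempts.length;
  const errorRate = attempts.reduce((sum, a) => sum + a.errorCount, 0) / attempts.length;
  const medianTimeOnTaskSec = median(attempts.map(a => a.durationSec));
  return { taskSuccessRate, errorRate, medianTimeOnTaskSec };
}

// Invented sample data, just to show the shape of the output.
console.log(summarize([
  { completed: true,  errorCount: 0, durationSec: 48 },
  { completed: true,  errorCount: 2, durationSec: 95 },
  { completed: false, errorCount: 3, durationSec: 120 },
]));
```
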

Why This Approach Matters

What we proposed wasn’t just "extra work".
It was standard UX practice based on leading design principles.

According to the Nielsen Norman Group, measuring behavior before and after a design change is essential for understanding if your redesign was truly effective.

Without that data, redesigns are just educated guesses.


Final Thoughts

In the end, the client got their new interface.
But we were left wondering — did it really solve their users’ problems?

If we had been able to use data-driven insights, we could have delivered:

  • A smoother user flow

  • Measurable improvements

  • Fewer errors and drop-offs

  • Higher user satisfaction

Sometimes, the hardest part of UX isn’t the design.
It’s helping people see the value of research and measurement.

Next time someone says “just make it look better,” we’ll keep asking:
“What exactly needs to be better?”

Because without data, design is just a guess — and users deserve better.
