User-Centred Design

  • Status: Complete
  • Category: Design
  • Default enforcement: Soft
  • Author: PushBackLog team


Tags

  • Topic: design, ux, research
  • Skillset: design
  • Technology: generic
  • Stage: discovery, refinement

Summary

User-Centred Design (UCD) is a design philosophy and iterative process in which the needs, goals, behaviours, and constraints of end users are the primary drivers of product decisions. Rather than building what is technically convenient or internally assumed to be useful, UCD grounds every decision in validated user understanding.


Rationale

Software that engineers find intuitive is frequently software that only engineers find intuitive. The assumptions embedded in a product by people who understand it deeply are often invisible to people encountering it for the first time. User-Centred Design makes those assumptions explicit and tests them against reality before they are built into production code.

The cost of correcting a design error found in user research is negligible. The cost of correcting it after it has been engineered, QA’d, and shipped is significant. UCD is fundamentally a cost management practice, not merely an empathy exercise.


Guidance

The UCD process

UCD is iterative rather than linear. The core loop:

  1. Understand — Research who the users are, what they need, and what their current experience looks like
  2. Define — Frame the problem to be solved in user terms, not feature terms
  3. Ideate — Generate design solutions against the defined problem
  4. Prototype — Build the minimum fidelity model needed to test the most important assumption
  5. Test — Put the prototype in front of users; observe, listen, and measure
  6. Iterate — Incorporate findings and repeat

The loop does not end at release. User research in production (analytics, session observation, support analysis) feeds back into the next cycle.
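As a concrete illustration of production analytics feeding the next cycle, a step-by-step funnel makes drop-off visible. A minimal Python sketch (the step names and counts are invented for illustration):

```python
# Invented checkout funnel: (step name, number of users who reached it).
funnel = [
    ("viewed_cart", 10_000),
    ("started_checkout", 6_200),
    ("entered_payment", 4_100),
    ("completed", 3_500),
]

def drop_offs(funnel):
    """Return (step, fraction of users lost since the previous step)."""
    return [
        (curr_name, 1 - curr_n / prev_n)
        for (prev_name, prev_n), (curr_name, curr_n) in zip(funnel, funnel[1:])
    ]

losses = drop_offs(funnel)
```

In this invented data the largest loss is at started_checkout (38% of cart viewers never start), which is where the next Understand phase should focus.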

User research methods

Method                   Fidelity      Best for
Stakeholder interviews   Low           Establishing initial direction, organisational constraints
User interviews          Low-medium    Understanding mental models, workflows, pain points
Contextual inquiry       Medium        Observing actual use in real context
Surveys                  Low           Quantifying attitudes and behaviours at scale
Usability testing        Medium-high   Validating specific interface decisions
A/B testing              High          Measuring behavioural impact of two design variants
Analytics review         High          Understanding aggregate behaviour patterns in production
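Of these methods, A/B testing is the most quantitative: deciding whether variant B genuinely outperforms variant A is a statistics question. One common approach is a two-proportion z-test, sketched below in plain Python (the conversion counts are invented; a real analysis would typically use a stats library such as SciPy or statsmodels):

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B comparison.

    conv_* are conversion counts, n_* are sample sizes.
    Returns (z, two_sided_p).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)            # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p

# Invented example: 120/2400 conversions on A vs 156/2400 on B.
z, p = two_proportion_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
```

A small p-value (conventionally below 0.05) is evidence the variants genuinely differ; with small samples the test will, correctly, refuse to call a winner.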

Personas and jobs-to-be-done

User research output should be synthesised into artefacts the product team can reason from:

User personas describe representative user archetypes — their goals, behaviours, pain points, and context. They prevent “the user” from meaning “someone like us.”

Jobs-to-be-done reframe features as user outcomes: “When [situation], I want to [motivation], so I can [expected outcome].” This framing keeps the team focused on what the user is trying to accomplish rather than the feature mechanism.
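The template is mechanical enough to capture in code, which helps keep stories in a consistent shape. A trivial Python sketch (the helper name and example story are invented):

```python
def jtbd(situation: str, motivation: str, outcome: str) -> str:
    """Render a job story in the standard when/want/so-that template."""
    return f"When {situation}, I want to {motivation}, so I can {outcome}."

story = jtbd(
    situation="I'm reviewing last week's orders",
    motivation="see only the ones that failed",
    outcome="follow up before the customer notices",
)
```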

Information architecture

Before any visual design begins, the structure of the product should be resolved:

  • What are the primary tasks the user needs to accomplish?
  • How do they navigate between them?
  • How is information grouped and labelled?
  • What is the hierarchy of content on each screen?

Card sorting and tree testing are lightweight methods for validating information architecture before it is built.
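Tree-test results are simple enough to score directly. A minimal sketch, assuming a hypothetical result format in which each participant's attempt records the node they chose and whether they backtracked along the way:

```python
from collections import defaultdict

# Hypothetical tree-test data for a single task.
results = [
    {"task": "find-invoices", "chosen": "Billing/Invoices", "backtracked": False},
    {"task": "find-invoices", "chosen": "Account/History",  "backtracked": True},
    {"task": "find-invoices", "chosen": "Billing/Invoices", "backtracked": True},
]
correct = {"find-invoices": "Billing/Invoices"}

def score(results, correct):
    """Per-task success rate (right destination) and directness (no backtracking)."""
    by_task = defaultdict(list)
    for r in results:
        by_task[r["task"]].append(r)
    return {
        task: {
            "success": sum(r["chosen"] == correct[task] for r in rs) / len(rs),
            "directness": sum(not r["backtracked"] for r in rs) / len(rs),
        }
        for task, rs in by_task.items()
    }

scores = score(results, correct)
```

Low success suggests a label or grouping is wrong; high success with low directness suggests users get there, but not confidently.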

Prototype fidelity

Match prototype fidelity to the question being asked:

  • Paper / whiteboard — layout and flow validation; fast, cheap
  • Wireframe — structural and navigational validation
  • Interactive prototype — interaction and task completion validation
  • Functional prototype — performance and integration validation

Never build high-fidelity prototypes for questions that a whiteboard can answer.


Common failure modes

Failure                                Description
Assumed users                          Product decisions made for a hypothetical user nobody has actually spoken to
Research theatre                       User research conducted but findings not incorporated into decisions
Single-pass research                   Research done at the start of a project and never revisited
Designing for power users              Interface optimised for people who use it all day; unusable for new users
Conflating usability with aesthetics   Visual polish addressed while fundamental usability problems remain

Examples

User interview script skeleton

Interview guide — Checkout flow research
Participant: [role, context]
Duration: 45 minutes

1. Warm-up (5 min)
   • Tell me about your role and how you use [product] day-to-day.
   • How often do you [task under study]?

2. Current experience (15 min)
   • Walk me through the last time you [task]. What were you trying to accomplish?
   • Where did you start? What did you do next?
   • Were there any moments where you felt stuck or uncertain?
   • What would have made that easier?

3. Concept exploration (15 min)
   [Show prototype or screenshot]
   • Before you click anything — what do you notice first?
   • What would you expect to happen if you clicked [element]?
   • Try to [task]. Think aloud as you go.

4. Wrap-up (10 min)
   • Is there anything else about this experience that we should know?
   • What would make this 10x better for you?

Key interview principles: ask about behaviour, not opinions. “What did you do?” yields more signal than “What would you do?”. Ask “Why?” at least once per answer.

Jobs-to-be-done framing example

Feature framing (to avoid):

“As a user, I want to filter results by date so that I can find recent items.”

JTBD framing (preferred):

“When I’m investigating a suspected data issue, I want to narrow the dataset to the time window where the problem occurred, so I can quickly isolate which records are affected without wading through unrelated data.”

The JTBD framing surfaces the context, the motivation, and the expected outcome. A filter UI is one solution — a pre-built “recent issues” view might be better. The feature-first framing forecloses that conversation.

Prototype fidelity decision matrix

Question to answer                             Prototype needed         Why
Does this navigation structure make sense?     Paper / whiteboard       No visual polish needed; structure is the question
Can users complete this 3-step flow?           Clickable wireframe      Interactions matter; visual design does not yet
Does this microcopy reduce support requests?   High-fidelity HTML       Exact wording and context required
Does this feature increase conversion?         A/B test in production   Behavioural data at scale needed


Part of the PushBackLog Best Practices Library.