User Interviews
User Interviews are a qualitative research method that systematically collects user needs, behavioral motivations, and experience feedback through one-on-one or small group conversations. They help teams deeply understand users' genuine thoughts, providing direct evidence for product decisions and avoiding design biases based on assumptions.
What It Is
User Interviews are a structured conversational technique where researchers guide users through open-ended questions to share their real experiences, feelings, and expectations when using a product, completing tasks, or navigating scenarios. Unlike surveys, interviews allow follow-up and clarification, uncovering deep motivations behind surface behaviors. The core value lies in obtaining the 'why' beyond just the 'what', providing contextual insights for design decisions.
Origins and Key Figures
The practice of user interviews traces back to early 20th-century sociological and anthropological fieldwork, such as Malinowski's participant observation in the Trobriand Islands. In business and design, its systematic application began with the user-centered design movement in the 1980s. Key figures include Donald Norman, who emphasized understanding users' mental models in *The Design of Everyday Things*; and IDEO, which integrated deep interviews into the design thinking process, making it a standard preliminary step for innovation projects.
How to Use
- Define interview goals and scope: Before starting, determine the core problem to address, e.g., "What obstacles do new users face during their first product interaction?". Criteria: Goals should be specific and verifiable, avoiding broad statements like "understand user needs."
- Recruit representative participants: Screen 5-8 typical users based on goals, ensuring coverage of key user segments. Criteria: Participants must have real experience with relevant products or be in target scenarios, not just interested parties.
- Design a semi-structured guide: Prepare a backbone of open-ended questions, such as "Describe the last time you completed XX task," leaving room for flexible follow-ups. Criteria: Questions should be neutral and non-leading, ordered from general to specific.
- Conduct interviews and listen deeply: Hold sessions in a quiet setting, note non-verbal cues, and probe with follow-ups like "Can you give an example?" to elicit details. Criteria: Interviewers should spend roughly 80% of the time listening, resisting the urge to voice their own views.
- Analyze data and extract insights: Transcribe recordings, categorize feedback using affinity diagrams or coding, and identify recurring patterns and outliers. Criteria: Insights should be based on direct quotes, not subjective interpretations, distinguishing facts from opinions.
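The analysis step above can be sketched as a simple coding pass: tag each quote with one or more themes, then count which themes recur across participants while keeping direct quotes attached as evidence. This is a minimal illustration with hypothetical data; real affinity mapping or qualitative coding is richer and usually collaborative.

```python
from collections import Counter

# Hypothetical coded interview quotes: (participant, quote, assigned themes)
coded_quotes = [
    ("P1", "I double-check the account number every time.", ["security anxiety"]),
    ("P2", "The confirmation screen feels cluttered.", ["interface complexity"]),
    ("P3", "I wait for the SMS code before I feel safe.", ["security anxiety"]),
    ("P4", "I worry a typo will send money to a stranger.", ["security anxiety", "fear of loss"]),
]

# Count how often each theme recurs across the sessions
theme_counts = Counter(theme for _, _, themes in coded_quotes for theme in themes)

# Surface recurring patterns, keeping a direct quote per theme as evidence
for theme, count in theme_counts.most_common():
    quotes = [q for _, q, themes in coded_quotes if theme in themes]
    print(f'{theme}: {count} mention(s); e.g. "{quotes[0]}"')
```

Keeping the quote alongside the count enforces the criterion above: insights stay grounded in what users actually said, not in the analyst's summary of it.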
Case Study
A fintech team aimed to optimize the mobile money transfer flow, but existing data only showed "15% of users abandon the operation" without clear reasons. The team decided to conduct user interviews for diagnosis.
Background and constraints: The product was in mid-iteration, requiring actionable improvement suggestions within two weeks; resource limits allowed interviews with 6 high-frequency users.
Problem diagnosis: Initial analysis suspected interface complexity, but interviews revealed a deeper issue: users' anxiety about security verification steps far exceeded expectations, especially fearing errors leading to financial loss.
Phased actions: The first round of interviews covered the end-to-end transfer experience and found that at the key step of confirming payee information, users repeatedly double-checked the details, causing long pauses; a second round of follow-ups on security concerns showed that most users relied on the SMS verification code as a psychological safety anchor.
Outcome comparison: Based on the interviews, the team front-loaded the security prompts and simplified the confirmation steps. One month after launch, observable metrics improved: the transfer abandonment rate dropped from 15% to 8%, and average operation time shortened by 20 seconds.
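The improvement reported in the case can be sanity-checked with simple arithmetic, which is also a useful way to frame results for stakeholders (rates taken from the figures above):

```python
# Abandonment rates before and after the redesign (from the case study)
before, after = 0.15, 0.08

# Relative reduction: how much of the original abandonment was eliminated
relative_reduction = (before - after) / before

print(f"Abandonment fell by {relative_reduction:.0%} in relative terms")
```

A drop from 15% to 8% is 7 percentage points in absolute terms, but nearly half of all abandonments in relative terms; quoting both framings avoids over- or under-selling the result.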
Retrospective and transferable learnings: The key takeaway was that surface-level efficiency issues are often rooted in gaps of emotional trust. The transferable lesson: for financial products, interviews must specifically probe risk perception, not just interface flow.
Strengths and Limitations
Applicability boundaries: User interviews are best suited to exploratory scenarios, such as early needs discovery or complex problem diagnosis; when statistically significant data is needed (e.g., estimating market share), combine them with quantitative methods.
Potential risks: Results may be influenced by researcher bias, e.g., question phrasing hinting at "correct" answers; or participants hiding negative feedback due to social desirability bias.
Mitigation strategies: Use collaborative analysis with multiple team members to cross-verify notes; emphasize "no right or wrong answers" at the interview start and employ neutral wording.
Trade-off suggestions: Under tight timelines with clear problems, shorten interview duration but increase sample diversity; if resources are extremely limited, prioritize interviewing extreme users (e.g., novices and experts) over average users.
Common Questions
Q: How many users are enough for interviews?
A: The criterion is problem complexity, not statistical representativeness. Typically, 5-8 users surface the major patterns; stop recruiting once three consecutive interviews yield no new insights.
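The stopping rule above can be expressed as a small saturation check: track the insights each session contributes and stop once a run of sessions adds nothing new. This is a sketch; the insight labels and session data are hypothetical.

```python
def reached_saturation(insights_per_user, window=3):
    """Return True once `window` consecutive sessions add no insight
    not already seen in earlier sessions."""
    seen = set()
    streak = 0
    for insights in insights_per_user:
        new = set(insights) - seen
        if new:
            seen |= new
            streak = 0  # a novel insight resets the no-news streak
        else:
            streak += 1
            if streak >= window:
                return True
    return False

# Hypothetical sessions: users 4-6 only repeat earlier themes,
# so recruiting can stop after the sixth interview
sessions = [
    {"security anxiety"},
    {"interface complexity"},
    {"security anxiety", "fear of loss"},
    {"security anxiety"},
    {"interface complexity"},
    {"fear of loss"},
]
print(reached_saturation(sessions))  # → True
```

In practice "no new insights" is a judgment call made during debriefs, not a set comparison, but making the rule explicit helps teams agree in advance on when to stop.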
Q: How to avoid users giving false praise?
A: Ask about specific behaviors rather than general feelings, e.g., "When did you last use this feature?" instead of "Do you like this feature?", and check whether what users say matches what they actually do.
Q: How to share interview data with the team?
A: Create a "voice of the user" board with key quotes and photos, and hold regular workshops to discuss insights, rather than relying on long reports alone.
Recommended Resources
- Book: *Observing the User Experience* (by Elizabeth Goodman et al.), offering practical interview script examples.
- Article: Nielsen Norman Group's "User Interview Guide," covering common pitfalls and responses.
- Tool: Otter.ai for audio transcription, Miro for collaborative analysis.
Related Methods
Core Quote
"Don't ask users what they want; observe what they do and understand why they do it."
If you find this helpful, consider buying me a coffee ☕