User interviews guide: questions & frameworks
The ultimate guide to conducting effective user interviews

Unlock actionable insights for user research, market research, and every stage of the development process with user interviews.
How to use this guide
This is a reference guide, not a tutorial. Think of it as your question bank and framework library for user interviews.
What this guide includes:
- 200+ proven interview questions organized by research goal
- 8 interview frameworks for different situations
- Question templates you can customize
- Bad vs. good question examples
- Follow-up question strategies
- Guidance on asking the right questions to unlock actionable insights
What this guide does NOT cover:
- How to plan and conduct interviews (see our step-by-step guide)
- How to recruit participants
- How to analyze interview data (covered in other guides)
How to use it:
- Identify your research goal (discovery, validation, etc.)
- Jump to the relevant section
- Copy questions that fit your needs
- Customize them for your product/context
- Add follow-up questions as needed
- During interviews, focus on actively listening—ask follow-up questions, allow for silence, and engage thoughtfully to gain better insights.
Let’s get started.
This guide is grounded in qualitative research methods, helping you collect rich qualitative data through open-ended interviews. By combining qualitative data from interviews with quantitative data from analytics or surveys, you gain a more complete and nuanced understanding of user behavior, motivations, and pain points.
The golden rules of interview questions
Before diving into specific questions, remember these principles:
A good interview is a conversation aimed at understanding your participants’ behaviors, motivations, and needs, not just a Q&A session. That mindset leads to deeper insights and more authentic feedback.
Rule 1: Ask open-ended questions
Open-ended questions encourage participants to share their thoughts in their own words, leading to richer and more nuanced insights. Avoid yes/no questions unless you’re clarifying something specific.
Rule 2: Focus on the participant’s experience
Frame questions around the participant’s actual experiences, not hypotheticals or assumptions. This helps you uncover real behaviors and pain points.
Rule 3: Never ask leading questions
❌ Bad: “Don’t you think this design is easier to use?”
✅ Good: “How would you compare this to what you use today?”
Why: Leading questions get you the answers you want to hear, not the truth, and can bias participants’ answers enough to invalidate your findings. Instead of asking, “How easy was it to use our product?” try, “Can you tell me about your experience using our product?”
Leading questions also invite social desirability bias, a form of response bias in which participants give answers they think are expected or socially acceptable rather than their true opinions. Be mindful of cultural differences in how questions are interpreted as well, since these shape how candidly participants respond.
Rule 4: Ask about past behavior, not future intentions
❌ Bad: “Would you use a feature that does X?”
✅ Good: “Tell me about the last time you tried to do X. What happened?”
Why: People are terrible at predicting their future behavior but great at recalling what they actually did. Recent experiences also yield more accurate, detailed answers because the memories are fresher and more specific.
Rule 5: Get specifics, not generalities
❌ Bad: “How often do you experience this problem?”
✅ Good: “Walk me through the last three times this happened. When was each one?”
Why: “Often” means different things to different people; specific instances reveal the truth. Use the funnel technique: start with broad questions, then narrow to specific ones to gather richer, context-grounded insights into the user experience.
Rule 4: Use "why" to dig deeper
Ask "why" 3-5 times to get to the real reason:
- User: "I stopped using the app."
- You: "Why did you stop?"
- User: "It was too complicated."
- You: "What specifically was complicated?"
- User: "I couldn't find the export button."
- You: "Why was exporting important?"
- User: "I need to send reports to my boss weekly."
Now you know the real problem: The app didn't fit their workflow (weekly reporting), not that it was "complicated."
Rule 7: Embrace awkward silences
When a user pauses, don't fill the silence. This principle is particularly important in qualitative research methods, where allowing participants time to reflect often leads to deeper insights.
Count to 5 in your head. They're thinking. Let them.
The best insights often come after a long pause.
Discovery questions
Use these when: Exploring unknown problems, early product research, understanding user needs
Goal: Uncover problems you didn’t know existed
Discovery questions are open-ended questions that let users share their experiences freely, in their own words. This approach helps you gather qualitative data, uncover user behaviors and pain points, and gain deeper insight into their motivations and challenges.
General discovery
Starting broad:
- “Tell me about your role. What does a typical day look like?”
- “What are the biggest challenges you face in your work?”
- “Walk me through your workflow for [task]. What happens step by step?”
- “What frustrates you most about [current process]?”
- “If you could wave a magic wand and fix one thing about [area], what would it be?”
- “Can you describe your daily routines as they relate to [product/task]?”
Getting specific:
- “Tell me about the last time you experienced [problem]. What happened?”
- “Can you show me how you currently do [task]?” (screen share or in-person)
- “What workarounds have you created to solve [problem]?”
- “What tools do you use for [task]? Why those specifically?”
- “Where do you waste the most time in your day?”
- “Walk me through your user journey from first hearing about [product] to using it regularly.”
Understanding impact:
- “How does [problem] affect your work?”
- “What would change if [problem] was solved?”
- “Has [problem] ever caused you to miss a deadline or lose an opportunity?”
- “On a scale of 1-10, how painful is [problem]? Why that number?”
- “What have you tried to solve this? Why didn’t it work?”
Pain point discovery
Finding hidden frustrations:
- "What part of your workflow do you dread most? Why?"
- "Tell me about a time when [tool/process] let you down."
- "What makes you want to quit and start over?"
- "What do you do when [tool] doesn't work as expected?"
- "What questions do you ask yourself most often during [task]?"
Emotional triggers:
- "When was the last time you felt really frustrated at work? What caused it?"
- "What makes you anxious about [process]?"
- "What keeps you up at night related to [area]?"
- "Tell me about a time you felt proud of solving [problem]. How did you do it?"
- "What would your teammates say is your biggest challenge?"
Opportunity discovery
Finding unmet needs:
- “What do you wish you could do but can’t right now?”
- “If you had an extra hour in your day, what would you work on?”
- “What capabilities would make you twice as effective?”
- “What information do you wish you had access to?”
- “What would the ideal [tool/process] look like for you?”
- “If we were to add a new feature to [product], what would be most helpful to you?”
- “Are there any new features you wish existed that would make your workflow easier?”
- “Would you be interested in participating in concept testing for upcoming features?”
Problem validation questions
Use these when: You’ve identified a potential problem and need to validate it’s real and widespread
Goal: Confirm the problem exists and is worth solving. Tailor your validation questions to your research focus and project goals so you gather the most meaningful insights.
Problem existence
- "Do you currently experience [specific problem]?"
- "How often does [problem] occur? Daily? Weekly? Monthly?"
- "When was the most recent time this happened? Walk me through it."
- "Is this getting better, worse, or staying the same over time?"
- "Do your colleagues/teammates experience this too?"
Problem severity
- "On a scale of 1-10, how severe is this problem? Why that number?"
- "What's the worst consequence you've experienced from this problem?"
- "How much time does this problem cost you per week?"
- "Has this ever caused you to lose money/customers/opportunities?"
- "If this problem disappeared tomorrow, what would change?"
Current solutions
- "What are you doing today to deal with this problem?"
- "Have you tried any tools or services to solve this?"
- "What worked? What didn't? Why?"
- "What prevents you from solving this problem yourself?"
- "How much are you currently spending (time/money) to manage this?"
Willingness to pay
- "How much would you pay to solve this problem completely?"
- "What's this problem costing you now (time, money, lost opportunities)?"
- "What's your budget for tools/services in this category?"
- "Have you purchased anything similar? What did it cost?"
- "Would you pay for a solution even if you had free alternatives? Why?"
Solution validation questions
Use these when: You have a proposed solution and need to test if it actually solves the problem. Solution validation often involves usability testing to observe real user interactions, surface usability issues, and confirm your solution meets user needs.
Goal: Validate your solution approach before building
Initial reaction
- "In your own words, what do you think this is?"
- "What problems does this solve for you?"
- "How is this different from what you use today?"
- "What's your first impression?"
- "Does this make sense? What's confusing?"
Value assessment
- "Would this solve the problem we discussed earlier? How?"
- "What would change in your workflow if you had this?"
- "On a scale of 1-10, how valuable would this be? Why that number?"
- "Is this a must-have, nice-to-have, or not needed?"
- "Would you switch from your current solution to this? Why or why not?"
Feature importance
- "Which of these features matters most to you? Rank them."
- "What's missing that would make this a must-have?"
- "Which features could we remove and you wouldn't care?"
- "If you could only have 3 features, which would they be?"
- "What would make this 10x better than what you use today?"
Usage intent
- "Walk me through how you'd use this in your workflow."
- "How often would you use this? Daily? Weekly?"
- "Who else would need to use this for it to work?"
- "What would stop you from using this?"
- "What needs to happen for you to start using this?"
Pricing & willingness to buy
- "What would you expect this to cost?"
- "At what price would this be a no-brainer to buy?"
- "At what price would this be too expensive?"
- "What would you compare this to when evaluating price?"
- "Would you recommend this to a colleague? Why or why not?"
Workflow & context questions
Use these when: Understanding how users actually work, identifying integration points, mapping processes
Goal: See the full context of how your product fits into their world. Workflow questions are especially valuable for product teams because they reveal how users interact with your product in real-world scenarios and help you map the user journey.
Daily workflow
- "Walk me through a typical day. What do you do first?"
- "What tools do you open when you start work?"
- "How do you decide what to work on first?"
- "What interrupts your workflow most often?"
- "How do you know when you're done with a task?"
Process mapping
- "Show me how you do [specific task] from start to finish."
- "What happens before this step? What happens after?"
- "Who else is involved in this process?"
- "What information do you need to complete this?"
- "Where do bottlenecks occur?"
Tool usage
- "What tools do you use for [task]?"
- "How do these tools connect to each other?"
- "What do you wish these tools did that they don't?"
- "Which tool could you not live without? Why?"
- "What tools have you tried and abandoned? Why?"
Context & environment
- "Where are you when you need to do [task]?"
- "What device do you use? Why that one?"
- "Are you ever offline when you need to do this?"
- "Who else can see what you're working on?"
- "What time of day do you typically do this?"
Behavioral & motivational questions
Use these when: Understanding why users make decisions, uncovering motivations, identifying triggers
Goal: Understand the psychology behind user behavior
Behavioral and motivational questions are designed to dig into the reasons behind user actions. They often include opinion questions that invite participants to talk openly and share their thoughts and attitudes freely. Pay attention to how participants answer, not just what they say; analyzing their responses closely uncovers deeper motivations and richer insights.
Decision-making
- "Walk me through the last time you chose a [product/service]. What was your process?"
- "What criteria did you use to evaluate options?"
- "Who else was involved in the decision?"
- "What almost made you choose something different?"
- "How long did the decision take? What held you up?"
Motivations
- "Why is this important to you?"
- "What are you trying to achieve ultimately?"
- "What would success look like?"
- "What drives you to do [activity]?"
- "What would you do if you couldn't do this anymore?"
Triggers & habits
- "What prompts you to use [tool/do task]?"
- "What would cause you to stop using [product]?"
- "When do you think about this problem most?"
- "What's your routine around [activity]?"
- "What happens right before you need to do [task]?"
Goals & outcomes
- "What are you trying to accomplish?"
- "How do you measure success?"
- "What would good look like?"
- "What's your ultimate goal with [activity]?"
- "How will you know when you're done?"
Feature feedback questions
Use these when: Testing specific features, prioritizing development, iterating on existing product
Goal: Get actionable feedback on features. Keep that feedback relevant by tailoring your questions to the specific needs, language, and context of your target audience.
Feature usability
- "Can you show me how you'd use [feature]?"
- "What would you expect to happen when you click this?"
- "Was that what you expected? Why or why not?"
- "What would you do if [feature] didn't work?"
- "How would you describe this feature to a colleague?"
Feature value
- "How useful is this feature to you on a scale of 1-10?"
- "How often would you use this?"
- "Would you pay extra for this feature?"
- "What problem does this solve for you?"
- "Could you accomplish your goals without this feature?"
Feature comparison
- "How does this compare to [competitor feature]?"
- "What does [competitor] do better?"
- "What do we do better?"
- "What's missing here that competitors have?"
- "What do we have that competitors don't?"
Messaging & positioning questions
Use these when: Testing value propositions, headlines, marketing copy, positioning
Goal: Ensure your messaging resonates
Messaging questions often reveal a big gap between how users perceive your product and how they perceive competitors. These insights help you refine your positioning and highlight what truly sets you apart.
Message comprehension
- "In your own words, what does this product do?"
- "Who is this for?"
- "What problem does this solve?"
- "How is this different from alternatives?"
- "What would you expect to happen if you signed up?"
Message appeal
- "What stands out to you about this?"
- "What would make you want to learn more?"
- "What makes this credible/believable?"
- "What concerns do you have after reading this?"
- "Would you click on this? Why or why not?"
Language & clarity
- "Is there any terminology that's confusing?"
- "What words would you use to describe this?"
- "Is anything unclear?"
- "What questions do you have after reading this?"
- "How would you explain this to a colleague?"
Competitive analysis questions
Use these when: Understanding the competitive landscape, identifying differentiation opportunities
Goal: Learn what users think of alternatives
Competitive analysis interviews are a valuable opportunity to openly discuss what users like and dislike about alternative solutions. This open dialogue helps uncover honest feedback, diverse perspectives, and potential challenges users may face with competitors.
Current solutions
- "What are you using today to solve this?"
- "How did you choose that solution?"
- "What do you like about it?"
- "What frustrates you about it?"
- "Have you considered switching? What held you back?"
Competitor comparison
- "What other options did you evaluate?"
- "How do those compare to what you chose?"
- "What would make you switch to an alternative?"
- "What would your ideal solution look like?"
- "If [competitor] didn't exist, what would you use?"
Switching considerations
- "What would it take for you to switch?"
- "What's preventing you from switching now?"
- "How painful would it be to migrate?"
- "What data would you need to keep?"
- "Who would need to approve a switch?"
Follow-up question strategies
Great interviews aren’t just about the prepared questions; they’re about how you follow up. Effective interviewing means listening for verbal cues and keeping the conversation natural rather than working through a script.
The "tell me more" technique
When you get a surface-level answer:
- "Tell me more about that."
- "Can you give me an example?"
- "What do you mean by [their word]?"
- "Help me understand that better."
The "why" ladder
Dig 3-5 levels deep:
User: "I don't like the interface."
You: "Why don't you like it?"
User: "It's confusing."
You: "What specifically is confusing?"
User: "I can't find the export button."
You: "Why is that important?"
User: "I need to export reports weekly."
You: "What happens if you can't export?"
User: "My boss doesn't get the data, and I get in trouble."
Now you understand the real problem: Weekly reporting workflow, not "confusing interface."
The specificity probe
When answers are vague:
Vague: "I use it sometimes."
Probe: "When was the last time? What were you doing?"
Vague: "It costs too much."
Probe: "What would be a reasonable price? What's your budget?"
Vague: "My team needs this."
Probe: "Who specifically? Can you name them and their roles?"
The comparison question
When evaluating reactions:
- "How does this compare to what you use now?"
- "Is this better, worse, or just different?"
- "If you had to choose between [A] and [B], which would you pick? Why?"
The hypothetical scenario
Test willingness to act:
- "If this was available tomorrow, what would you do?"
- "If this cost $50/month, would you sign up today?"
- "If I could give you early access right now, what's your email?"
Note: These reveal intent but aren't perfect predictors. Watch for hesitation.
Interview frameworks by situation
Framework 1: The problem discovery interview (30-45 min)
When to use: Early product research, discovering unknown problems
Structure:
Part 1: Context (5 min)
- Warm-up and rapport building
- Role and responsibilities
- Typical day overview
Part 2: Problem exploration (20 min)
- Current challenges
- Specific problem instances
- Impact and severity
- Current workarounds
Part 3: Solutions & priorities (10 min)
- What they've tried
- What would help most
- Willingness to pay
Part 4: Wrap-up (5 min)
- Anything we missed?
- Who else should we talk to?
Sample question flow:
- "Tell me about your role..."
- "What are your biggest challenges?"
- "Tell me about the last time [problem] happened..."
- "How did that impact you?"
- "What have you tried to solve it?"
- "If I could solve this for you, what would that be worth?"
Framework 2: The solution validation interview (45-60 min)
When to use: Testing if your proposed solution solves the problem
Structure:
Part 1: Problem validation (10 min)
- Confirm they have the problem
- Understand current solution
- Establish pain level
Part 2: Solution presentation (5 min)
- Show concept/demo
- Let them absorb silently
- Ask them to explain it back
Part 3: Solution testing (25 min)
- Walk through use cases
- Test understanding
- Identify confusion
- Assess value
Part 4: Pricing & decision (10 min)
- Willingness to pay
- Purchase process
- Decision makers
Part 5: Wrap-up (5 min)
- What's missing?
- Next steps
Sample question flow:
- "Do you currently experience [problem]?"
- "Let me show you something. [present concept]"
- "In your own words, what is this?"
- "Would this solve your problem? How?"
- "Walk me through how you'd use this..."
- "What would you pay for this?"
Framework 3: The usability testing interview (60 min)
When to use: Testing prototypes or existing products
Structure:
Part 1: Pre-test (5 min)
- Expectations
- Current solution
- Think-aloud explanation
Part 2: Tasks (40 min)
- Give 5-7 specific tasks
- Observe without helping
- Note struggles
- Ask follow-ups
Part 3: Post-test (10 min)
- Overall impressions
- Comparative questions
- Missing features
- Satisfaction rating
Part 4: Wrap-up (5 min)
- Final thoughts
- Thank you
Sample question flow:
- "I'm going to give you some tasks. Think aloud as you work."
- [Task 1]: "Sign up for a trial account."
- [Observe] "What are you looking for?"
- "Was that easier or harder than expected?"
- "How does this compare to [current solution]?"
- "On a scale of 1-10, how easy was that?"
Framework 4: The jobs-to-be-done interview (60-90 min)
When to use: Understanding the "job" users are hiring your product to do
Structure:
Part 1: The job (15 min)
- What were you trying to accomplish?
- What prompted you to look for a solution?
- What would success look like?
Part 2: The timeline (30 min)
- First thought
- Passive looking
- Active looking
- Decision
- First use
- Ongoing use
Part 3: The forces (20 min)
- Push forces (problems)
- Pull forces (attractions)
- Anxieties (concerns)
- Habits (inertia)
Part 4: The outcome (10 min)
- Did it work?
- What changed?
- What didn't change?
Sample question flow:
- "When did you first realize you needed [product]?"
- "What were you doing when you thought 'I need to solve this'?"
- "What did you try first?"
- "What made you choose [our product] over alternatives?"
- "What almost stopped you from buying?"
- "How has your life changed since using it?"
Framework 5: The feature prioritization interview (45 min)
When to use: Deciding what to build next
Structure:
Part 1: Current usage (10 min)
- How they use product today
- Favorite features
- Frustrations
Part 2: Feature reactions (25 min)
- Show 5-10 potential features
- Get reactions to each
- Force ranking
- Identify must-haves
Part 3: Trade-offs (10 min)
- If you could only have 3...
- What would you sacrifice?
- What's missing?
Sample question flow:
- "How do you use [product] today?"
- "Here are 8 features we're considering. Which excites you most?"
- "Rank these by importance to you."
- "If you could only have 3, which would they be?"
- "What's the one thing that would make you use this 10x more?"
Framework 6: The competitive analysis interview (45 min)
When to use: Understanding competitive landscape
Structure:
Part 1: Current solution (15 min)
- What they use now
- Why they chose it
- What they like
- What frustrates them
Part 2: Alternatives (15 min)
- What else they considered
- Why they rejected alternatives
- What would make them switch
- Deal breakers
Part 3: Ideal solution (15 min)
- Build dream product
- Must-have features
- Pricing expectations
- Who should build it
Sample question flow:
- "What are you using today?"
- "How did you choose that?"
- "What alternatives did you consider?"
- "What would make you switch?"
- "Describe your ideal solution..."
Framework 7: The onboarding interview (30 min)
When to use: Testing first-time user experience
Structure:
Part 1: Expectations (5 min)
- What do you expect?
- What's your goal?
Part 2: First use (15 min)
- Walk through signup
- Complete first task
- Observe struggles
Part 3: Impressions (10 min)
- Did it match expectations?
- What was confusing?
- Would you continue?
Sample question flow:
- "What do you expect to happen when you sign up?"
- "Go ahead and create an account. Think aloud."
- "Now try to [complete core task]."
- "Was that easier or harder than expected?"
- "Would you come back tomorrow?"
Framework 8: The churn prevention interview (30 min)
When to use: Understanding why users leave or might leave
Structure:
Part 1: Usage history (5 min)
- When did you start?
- How often did you use it?
- What changed?
Part 2: The breaking point (15 min)
- What triggered consideration to leave?
- What frustrations built up?
- What was the final straw?
Part 3: Alternatives (10 min)
- What are you using instead?
- Why is it better?
- What would bring you back?
Sample question flow:
- "When did you start using [product]?"
- "When did you first think about leaving?"
- "What happened that made you decide to stop?"
- "What are you using now instead?"
- "What would it take for you to come back?"
Bad vs. good question examples
Category: Problem discovery
❌ Bad: "Do you have problems with email?"
✅ Good: "Tell me about the last time you felt overwhelmed by email. What happened?"
❌ Bad: "Is this a big problem for you?"
✅ Good: "How much time do you spend dealing with this per week?"
❌ Bad: "Would you pay to solve this?"
✅ Good: "What are you currently spending (time/money) to manage this?"
Category: Solution validation
❌ Bad: "Don't you think this is easier than what you use now?"
✅ Good: "How does this compare to what you use today?"
❌ Bad: "Would you use this?"
✅ Good: "Walk me through how you'd use this in your workflow."
❌ Bad: "Do you like this feature?"
✅ Good: "On a scale of 1-10, how useful is this feature? Why that number?"
Category: Workflow understanding
❌ Bad: "How do you do your job?"
✅ Good: "Walk me through what you did this morning from when you sat down."
❌ Bad: "Do you use a lot of tools?"
✅ Good: "Show me all the tabs you have open right now. What is each one for?"
❌ Bad: "Is your workflow efficient?"
✅ Good: "Where do you waste the most time in your day?"
Category: Motivations
❌ Bad: "Why did you buy this?"
✅ Good: "Walk me through the day you decided to purchase this. What happened?"
❌ Bad: "What's your goal?"
✅ Good: "Imagine it's 6 months from now and this is working perfectly. What's different?"
❌ Bad: "Are you happy with your current solution?"
✅ Good: "What would it take for you to switch from what you use today?"
Customizable question templates
Use these templates and fill in the blanks for your context. They help you structure your sessions and give you practical, ready-to-ask questions, so you gather specific, actionable insights aligned with your research goals.
Template 1: The specific instance
"Tell me about the last time you [tried to accomplish X (for example, by using market research)]. What happened?"
Examples:
- "Tell me about the last time you tried to export data from your CRM. What happened?"
- "Tell me about the last time you onboarded a new team member. What happened?"
Template 2: The workflow walkthrough
"Walk me through your process for [task] from start to finish."
Examples:
- "Walk me through your process for closing a deal from start to finish."
- "Walk me through your process for creating a weekly report from start to finish."
Template 3: The comparison
"How does [A] compare to [B]?"
Examples:
- "How does this compare to what you use today?"
- "How does the mobile app compare to the desktop version?"
Template 4: The priority test
"If you could only have [number] of these features, which would they be and why?"
Examples:
- "If you could only have 3 of these features, which would they be and why?"
- "If you had to remove half of these, which would you cut?"
Template 5: The value assessment
"On a scale of 1-10, how [adjective] is [thing]? Why that number?"
Examples:
- "On a scale of 1-10, how painful is this problem? Why that number?"
- "On a scale of 1-10, how likely are you to recommend this? Why that number?"
Quick reference: question types by interview length
15-minute interview (quick screening)
Use 5-8 questions:
- Problem validation question
- Severity/frequency question
- Current solution question
- One specific instance question
- Willingness to pay indicator
30-minute interview (focused)
Use 10-15 questions:
- Warm-up (role/context)
- 3-4 problem discovery questions
- 2-3 specific instance questions
- 2-3 solution/priority questions
- 1-2 competitive questions
- Wrap-up
45-60 minute interview (comprehensive)
Use 20-30 questions:
- Warm-up (5 min)
- Problem discovery (10 min)
- Solution validation or usability testing (20 min)
- Competitive/pricing questions (10 min)
- Wrap-up (5 min)
Conclusion: Your question arsenal
You now have 200+ proven questions organized by research goal.
Remember:
- These are starting points—customize for your context
- Follow-up questions matter more than prepared questions
- Ask about past behavior, not future intentions
- Get specific, not general
- Use “why” to dig deeper
Pro tip: Don’t try to ask all questions in one interview. Pick 10-15 that match your research goal, then follow up based on what you hear.
Next steps:
- Learn how to conduct user interviews effectively to gain a better understanding of your users.
- Identify your research goal
- Select 10-15 questions from relevant sections
- Customize them for your product/context
- Add 2-3 follow-up questions per main question
- Test with one interview
- Refine based on what works
Good questions lead to great insights. Great insights lead to better products.
Ready to conduct better user interviews?
CleverX makes user interviews easy with participant recruitment, session recording, AI transcription, and analysis tools—all in one platform.
👉 Start Your Free Trial | Book a Demo | Download Interview Templates