Customer problem validation template
This customer problem validation template is a comprehensive Notion framework containing interview planning tools, 50+ discovery questions organized by stage, problem intensity scorecards, validation checklists, and synthesis frameworks to systematically validate whether customer problems are worth solving.
Whether you're a product manager validating features, a founder testing startup ideas, or a UX researcher conducting discovery, this template gives you everything you need to uncover genuine customer pain points and measure problem severity, not just collect polite feedback.
What makes it different from basic interview scripts:
Unlike generic interview question lists you find online, this template follows Mom Test principles to avoid false positives (asks about past behavior, not future intent), includes a three-dimensional problem intensity scoring system (frequency + impact + alternative quality), segments customers by problem variation (not just demographics), and provides validation completion criteria so you know exactly when you have enough evidence to proceed or pivot.
The template works for B2B and B2C products, new market exploration and existing product improvement, and includes frameworks for both generative research (discovering problems) and evaluative research (validating specific hypotheses).
Built for real-world use:
Product managers: Validate feature ideas with evidence before adding to roadmap. Stop building based on HiPPO (highest paid person's opinion) and start building based on customer pain intensity scores.
Founders & entrepreneurs: De-risk your startup by proving people will pay for your solution before building an MVP. Avoid the #1 cause of failure—building products nobody wants.
UX researchers: Conduct discovery research that reveals genuine user pain points. Ground product strategy in qualitative evidence with systematic documentation.
Product designers: Ensure designs solve actual problems customers have, not problems you think they should have. Design based on evidence, not assumptions.
Innovation teams: Test new product concepts before significant investment. Kill bad ideas fast with data, double down on validated opportunities.
What's inside the template:
Interview planning framework
- Research objectives worksheet (define what you need to learn and what decisions research will inform)
- Participant recruitment criteria (identify who to talk to based on problem exposure)
- Screening question templates (pre-qualify participants efficiently)
- Sample size calculator (determine how many interviews needed based on market diversity)
Interview question bank (50+ questions organized by discovery area)
- Current state mapping: "Walk me through the last time you encountered [problem]"
- Pain point probing: "What makes this frustrating/difficult/time-consuming?"
- Workaround discovery: "How do you handle this today? What have you tried?"
- Cost quantification: "How much time/money does this problem cost you?"
- Willingness indicators: "Have you looked for solutions? What would solving this be worth?"
- Competitive landscape: "What alternatives have you considered or used?"
Real-time documentation tools
- Interview note-taking template (capture quotes, emotions, buying signals during conversations)
- Audio/video recording checklist (proper consent and storage practices)
- Follow-up tracker (questions to ask in future interviews)
Analysis & synthesis frameworks
- Interview summary template (standardized format for each conversation)
- Insight tagging system (categorize feedback by theme, segment, urgency)
- Pattern analysis framework (aggregate insights across multiple interviews)
- Affinity mapping guide (group similar problems to identify themes)
Problem intensity scorecard
Rate each discovered problem on three dimensions:
- Frequency score (1-5): How often customers encounter it (daily = 5, yearly = 1)
- Impact score (1-5): Cost when problem occurs (business-critical = 5, minor = 1)
- Alternative quality score (0-5): How well current solutions cover the problem, reverse-scored (no solution = 5, perfect solution = 0)
- Total score (max 15): 12-15 = strong validation, 8-11 = moderate, <8 = weak
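The scorecard arithmetic above can be expressed as a small helper. This is a minimal sketch, not part of the template itself; the function name and return shape are illustrative, but the dimension ranges and validation bands follow the scorecard (three scores summed, max 15):

```python
def intensity_score(frequency: int, impact: int, alternative_quality: int):
    """Sum the three scorecard dimensions and map the total to a validation band.

    frequency: 1-5 (daily = 5, yearly = 1)
    impact: 1-5 (business-critical = 5, minor = 1)
    alternative_quality: 0-5, reverse-scored (no solution = 5, perfect = 0)
    """
    assert 1 <= frequency <= 5 and 1 <= impact <= 5 and 0 <= alternative_quality <= 5
    total = frequency + impact + alternative_quality
    if total >= 12:
        band = "strong validation"
    elif total >= 8:
        band = "moderate"
    else:
        band = "weak"
    return total, band

# Example: daily problem, business-critical impact, mediocre alternatives
print(intensity_score(5, 5, 3))  # (13, 'strong validation')
```

Scoring every discovered problem this way makes the later "problem intensity rankings" in the stakeholder presentation a simple sort by total.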
Validation completion checklist
Know when you have enough evidence:
- Interviewed 15+ potential customers from target segment
- Identified 3-5 recurring problems (mentioned by 50%+ of participants)
- Documented specific past examples, not hypotheticals
- Understood current workarounds and why they fail
- Validated willingness to pay with 30%+ of participants
- Reached saturation (last 3 interviews revealed no new information)
- Can articulate problem clearly without mentioning your solution
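The checklist thresholds above are concrete enough to encode. A minimal sketch, assuming a simple record of your evidence (the field names are hypothetical, the thresholds come straight from the checklist):

```python
from dataclasses import dataclass

@dataclass
class ValidationEvidence:
    interviews: int                        # potential customers from target segment
    recurring_problems: int                # problems mentioned by 50%+ of participants
    documented_past_examples: bool         # specific past examples, not hypotheticals
    workarounds_understood: bool           # current workarounds and why they fail
    willingness_to_pay_rate: float         # fraction of participants (0.0-1.0)
    saturation_reached: bool               # last 3 interviews revealed nothing new
    problem_stated_without_solution: bool  # articulated without mentioning your product

def validation_complete(e: ValidationEvidence) -> bool:
    """True when every checklist criterion is satisfied."""
    return (
        e.interviews >= 15
        and e.recurring_problems >= 3  # checklist expects 3-5; more than 5 suggests unfocused scope
        and e.documented_past_examples
        and e.workarounds_understood
        and e.willingness_to_pay_rate >= 0.30
        and e.saturation_reached
        and e.problem_stated_without_solution
    )
```

Running this after each batch of interviews gives a clear proceed/keep-interviewing signal instead of a gut call.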
Problem statement canvas
Document validated problems with evidence:
- Who experiences this? (specific customer segment)
- What is the problem? (clear description)
- When does it occur? (context and triggers)
- Why does it matter? (business impact)
- How do they solve it today? (current workarounds)
- Evidence supporting this (interview count, quotes, intensity scores)
Stakeholder presentation template
Present findings to leadership:
- Executive summary (top problems, confidence level, recommendation)
- Customer segments identified
- Problem intensity rankings
- Evidence dashboard (quotes, frequency data)
- Recommended next steps (build, validate more, pivot, or stop)
Common use cases:
Startup idea validation: Freelance expense management startup interviewed 22 freelancers, discovered "manual categorization across multiple income streams" scored 13/15 intensity. Built MVP focused on that problem—validated before writing code.
Feature prioritization: B2B SaaS with 500 customers interviewed 18 users, discovered enterprise customers needed integrations (100% mentioned), mid-market needed simpler reporting (72%), SMB needed UI simplification (89%). Created segment-specific roadmaps instead of one-size-fits-all.
Pivot decision: Team exploring project management tool discovered PM problem scored only 6/15 (customers satisfied with Asana/Notion), but async communication problem scored 13/15. Pivoted to build async tool instead—much stronger traction.
Best practices included:
Do's:
- Ask about past behavior, never future intent ("Tell me about the last time..." not "Would you use...")
- Dig three levels deep ("Why is that a problem?" asked 3 times to find root cause)
- Listen for emotion and specificity (real problems come with stories and frustration)
- Probe workarounds (if they have one, problem is real)
- Document exact quotes (powerful for stakeholder presentations)
Don'ts:
- Don't pitch your solution during discovery (stay neutral to avoid confirmation bias)
- Don't ask leading questions ("Don't you hate when..." suggests the answer)
- Don't accept hypotheticals (redirect: "Tell me about the last time this happened")
- Don't interview friends/family (too polite, need brutal honesty)
- Don't stop at 3-5 interviews (pattern across 10-15 is real signal)
