Compare qualitative and quantitative research: when to use each, pros, methods, sample sizes, timelines, and how to combine them for product decisions

Qualitative user research for product teams: interviews, usability tests, ethnography and diary studies reveal behaviors and product opportunities.
Qualitative research uncovers user needs, behaviors, and experiences through methods that generate rich narrative data to guide product decisions. It employs techniques like interviews, observations, and open-ended surveys to gather in-depth insights explaining the 'why' behind user actions. Unlike quantitative research, which focuses on measuring what and how many, qualitative research reveals motivations through direct observation, conversations, and context.
Product teams apply qualitative research across development stages, from discovery to post-launch, to deeply understand users and build effective solutions. Quantitative research complements these approaches with structured, data-driven insights, while methods such as content and document analysis extract insights from textual and visual data, providing valuable historical and cultural context.
Leading tech companies rely on qualitative research to avoid costly errors, uncover opportunities, and gain competitive advantages. This guide highlights key qualitative methods including user interviews, usability testing, ethnographic studies, diary studies, and card sorting, emphasizing the importance of combining qualitative and quantitative approaches for comprehensive insights. A qualitative researcher ensures studies yield rich, actionable findings by using diverse research methods to capture the full spectrum of user needs and experiences.
Qualitative research is a powerful methodology designed to achieve an in-depth understanding of human behavior and the motivations behind it. Unlike quantitative research methods, which focus on collecting numerical data and performing statistical analysis, qualitative research methods emphasize the exploration of subjective experiences, attitudes, and perceptions. By using techniques such as interviews, observations, case studies, and focus groups, researchers can gather qualitative data that provides rich insights into complex phenomena and user experiences.
This approach allows product teams and market researchers to gather in-depth insights into consumer behavior, customer feedback, and the factors influencing decision-making. Qualitative research techniques are especially valuable for uncovering the “why” behind user actions, helping organizations develop marketing strategies and products that truly resonate with their target audience. By interpreting the meanings individuals assign to their experiences, qualitative research adds depth and context to the product development process, complementing the numerical data and statistical analysis provided by quantitative research.
There are several types of qualitative research, each offering unique methodologies tailored to different research goals and objectives. User interviews, for example, provide detailed personal accounts and allow researchers to gather participant insights directly from individuals. Focus groups bring together a diverse group of participants for group discussions, enabling the exploration of shared beliefs, differences, and consumer preferences in a collaborative setting.
Ethnographic research involves researchers immersing themselves in the user’s natural environment, observing and interacting with participants to gain authentic insights into user experiences and cultural nuances. This method is particularly effective for identifying pain points and understanding how products fit into users’ daily lives. Case studies offer a comprehensive and in-depth analysis of real-world scenarios, allowing researchers to identify patterns and themes within the data and draw meaningful conclusions.
By combining these qualitative research methods, organizations can achieve comprehensive insights into customer behavior, preferences, and pain points, supporting informed decision-making and effective product development.
Company: Airbnb
Method: In-depth interviews (also known as customer interviews)
Stage: Product discovery and feature prioritization
Sample size: 30 hosts across different property types and experience levels
Research objective
Airbnb product teams needed to understand host pain points in managing listings, communicating with guests, and coordinating logistics, identifying opportunities for platform improvements and new features that support host success.
Research approach
Researchers conducted 60-minute in-depth interviews, a qualitative research method that lets companies delve deep into customer thoughts, feelings, and experiences, with hosts ranging from single-property owners to professional property managers, exploring the complete hosting journey from listing creation through guest checkout. Interviews followed a semi-structured format, allowing hosts to describe experiences in their own words while ensuring key topics were covered.
Questions included “Walk me through preparing your space for a new guest,” “Describe your biggest challenges managing bookings,” “Tell me about your last difficult guest situation and how you handled it,” and “What takes longer than it should in your hosting workflow?”
Key insights discovered
Interviews are a common qualitative research method, using open-ended conversations to gather in-depth insights. Here, the research revealed that calendar management created significant friction, especially for hosts managing multiple properties or listing on multiple platforms. Hosts described manually syncing calendars, accidentally double-booking, and losing bookings to calendar errors. One host explained: “I spend an hour every Monday checking all my calendars making sure everything matches. One mistake costs me hundreds of dollars and guest trust.”
Pricing uncertainty emerged as a major anxiety driver, particularly for new hosts. Hosts described constantly wondering whether they had priced correctly, obsessively comparing similar listings, and feeling stressed about leaving money on the table or pricing themselves out of the market.
Guest communication proved time-consuming, with hosts handling repetitive questions about check-in procedures, WiFi passwords, parking, and local recommendations through scattered messages, creating inefficiency and the potential for missed information.
Product impact
Insights from the interviews directly informed multiple features, including a unified calendar that prevents double-bookings across platforms, Smart Pricing that uses machine learning to suggest optimal prices based on local demand and comparable listings, automated messaging that enables template responses to common guest questions, and a guidebook feature that lets hosts create digital guides answering frequent questions proactively.
These features addressed core pain points discovered through qualitative interviews, demonstrating how a qualitative understanding of host experiences led to specific product solutions that improved host satisfaction and retention.
Company: Slack
Method: Moderated usability testing
Stage: Design validation and iteration
Sample size: 15 participants including new users and experienced team leads
Research objective
Slack design teams needed to evaluate proposed changes to channel organization and notification settings, ensuring new features improved rather than complicated the user experience before committing engineering investment.
Research approach
Researchers recruited participants matching target user profiles, including individual contributors new to Slack, experienced power users, and team administrators. Each participant completed realistic tasks using interactive prototypes while thinking aloud, describing their thought processes, expectations, and confusion points.
Tasks included "Create a new channel for your project team and invite relevant people," "Adjust notification settings so you only get alerts for mentions," "Find a message someone sent you last week about the budget," and "Archive old channels you no longer need."
Researchers observed where participants hesitated, clicked the wrong areas, expressed confusion, or required hints, noting specific usability issues and mismatches between users' mental models and the interface behavior.
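Observation notes like these are easier to prioritize once they are logged in a consistent structure. The sketch below is a minimal, hypothetical example of tallying logged friction points by task; the participant IDs, task names, and issue types are invented for illustration and do not reflect Slack's actual research tooling.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical observation log: each note captures one moment of friction
# seen during a moderated session (participants and tasks are invented).
@dataclass
class Observation:
    participant: str
    task: str
    issue_type: str   # e.g. "hesitation", "wrong_click", "confusion", "needed_hint"
    note: str

observations = [
    Observation("P01", "create_channel", "confusion", "Unsure whether to pick private or public"),
    Observation("P02", "create_channel", "wrong_click", "Opened integrations menu while looking for invite"),
    Observation("P03", "adjust_notifications", "hesitation", "Paused over 'mentions and keywords' option"),
    Observation("P04", "create_channel", "confusion", "Asked what happens if settings are skipped"),
]

# Tally friction by task to see which flows need redesign first.
friction_by_task = Counter(obs.task for obs in observations)
for task, count in friction_by_task.most_common():
    print(f"{task}: {count} observed issues")
```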
Key insights discovered
Testing revealed that the channel creation workflow confused users by presenting too many options upfront. Participants expected to create a channel simply and customize it afterward, but instead faced decisions about privacy, notifications, and integrations before the channel existed, leading to abandonment and frustration.
Notification settings proved overwhelming, with participants unable to predict the consequences of different configurations. One participant said: "I don't know what 'all messages' means versus 'mentions and keywords.' Will I get notifications constantly? I'm afraid to change anything."
Channel discovery challenges emerged, with participants struggling to find relevant channels to join. The channel browser listed channels alphabetically without indicating purpose, activity level, or relevance, making discovery feel like searching through an unorganized filing cabinet.
Product impact
Usability findings led to immediate design changes, including simplified channel creation with progressive disclosure that shows advanced options after basic creation, clearer notification-setting descriptions with examples of the notifications users would receive, and an improved channel browser with descriptions, member counts, and activity indicators to help users discover relevant channels.
The changes tested well in subsequent rounds, validating the improvements before launch and demonstrating how usability testing iteratively improved the experience through observation and rapid refinement.
Company: Figma
Method: Ethnographic observation and contextual inquiry
Stage: Product strategy and roadmap planning
Sample size: 12 design teams observed over 2 weeks each
Research objective
Figma researchers needed a deep understanding of design team workflows, collaboration patterns, and tool usage in natural contexts, informing long-term product strategy and identifying opportunities beyond specific feature requests.
Research approach
Researchers embedded with design teams at technology companies, spending days observing actual work including design reviews, handoff meetings, stakeholder presentations, and individual design tasks. This immersive approach revealed unstated needs, workarounds, and contextual factors not visible through interviews.
Researchers watched designers work, took photos of workspaces and whiteboards, documented tool usage and context switching, and held informal conversations, asking "Why did you just do that?" or "What are you trying to accomplish?" to understand decisions in context.
Follow-up interviews explored the observed patterns, asking designers to explain workflows, workarounds they had developed, and frustrations they had normalized and would not have thought to mention in traditional interviews.
Key insights discovered
Research revealed that design system maintenance created a significant hidden burden. Designers spent hours each week updating components across files, communicating changes to teams, and fixing broken instances when components changed. This maintenance tax grew with design system maturity, creating scaling challenges.
Version control emerged as a persistent problem despite not appearing in feature requests. Teams developed elaborate file naming conventions, used timestamps, created "final final" versions, and struggled to reconstruct decision history when questions arose weeks later.
Design-to-development handoff required extensive custom processes using combinations of Figma, Slack, Jira, Google Docs, and Loom videos creating fragmented context and repeated explanations. Designers described handoff as "translating designs into developer language through multiple channels."
Product impact
Ethnographic insights shaped major product directions, including component libraries with better update propagation and instance management, version history with branching that enables experimentation without breaking production files, and a dev mode creating a dedicated developer experience with inspect tools, code snippets, and change tracking.
These strategic features emerged from understanding workflows holistically through observation rather than from interview-driven feature requests, demonstrating ethnography's power to reveal needs users don't articulate.
Company: Notion
Method: Longitudinal diary studies
Stage: Onboarding optimization
Sample size: 25 new users tracked over first 30 days
Research objective
Notion product teams needed to understand new user experiences during the critical first month, identifying when and why users got stuck, succeeded, or abandoned the product, informing onboarding improvements and educational content.
Research approach
Researchers recruited new Notion users and had them document their experiences through daily or weekly entries over the first month. This approach collects a variety of qualitative data, including written diary entries, screenshots, and video recordings, to capture the full range of user experiences. Participants submitted written diary entries, screenshots of their workspaces, video recordings of struggles, and responses to prompts like “What did you try to accomplish today?”, “What confused you?”, “What felt successful?”, and “What almost made you quit?” Like open-ended surveys, these diary prompts let participants give detailed, qualitative responses, revealing deeper insight into their thoughts and feelings during onboarding.
Researchers conducted brief check-in interviews at days 3, 7, 14, and 30, discussing diary entries, exploring patterns, and understanding evolving perceptions as users progressed from novice to competent.
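One way to make sense of a month of entries is to bucket them by onboarding phase before coding them. The snippet below is a hypothetical sketch of that step; the participant IDs, entry text, and phase boundaries are illustrative, based only on the learning phases this case study describes, not on Notion's actual analysis pipeline.

```python
from collections import defaultdict

# Hypothetical diary entries keyed by study day.
entries = [
    {"participant": "N03", "day": 1, "text": "Opened Notion, stared at a blank page, closed it."},
    {"participant": "N03", "day": 5, "text": "Copied a habit-tracker template but couldn't change the columns."},
    {"participant": "N07", "day": 12, "text": "Finally understood linked databases for my reading list."},
]

def phase(day: int) -> str:
    """Map a study day to an onboarding phase (boundaries are illustrative)."""
    if day <= 2:
        return "first 48 hours"
    if day <= 7:
        return "template discovery (days 3-7)"
    if day <= 14:
        return "persistence test (days 8-14)"
    return "commitment (days 15-30)"

# Group entries by phase so struggles at each stage can be compared.
by_phase = defaultdict(list)
for entry in entries:
    by_phase[phase(entry["day"])].append(entry)

for name, group in by_phase.items():
    print(f"{name}: {len(group)} entries")
```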
Key insights discovered
Research revealed distinct learning phases, each with different challenges. The first 48 hours involved blank-page paralysis, with users uncertain where to start or what Notion was for. One user wrote: “Day 1 - Opened Notion, stared at blank page, closed Notion. Why is this better than Google Docs?”
Days 3-7 involved template discovery and experimentation. Users found templates that demonstrated Notion's capabilities but struggled to customize them for personal needs, leading them either to copy templates exactly or to abandon them entirely.
Days 7-14 represented a critical persistence test in which users either achieved the aha moment of understanding databases and relations or reverted to simpler tools. The difference hinged on discovering a specific use case where Notion's advantages became obvious.
Days 14-30 showed deepening commitment, with users either evangelizing Notion to colleagues or quietly abandoning the product; those who achieved competency rarely left afterward.
Product impact
Insights informed a comprehensive onboarding redesign, including immediate templates and examples on first load to eliminate blank-page anxiety, progressive education that reveals features gradually rather than all at once, guided workspace setup that walks users through a first meaningful project, and aha-moment acceleration that identifies and emphasizes use cases creating early value.
Retention improved significantly after the changes, with the longitudinal view revealing intervention timing and opportunity windows impossible to see through one-time interviews.
Company: Linear
Method: Card sorting and affinity mapping
Stage: Feature organization and information architecture
Sample size: 20 engineering managers and individual contributors
Research objective
Linear teams needed to organize dozens of keyboard shortcuts into logical categories accessible to users with different experience levels and work styles, ensuring discoverability without overwhelming new users.
Research approach
Researchers conducted card sorting sessions in which participants organized keyboard shortcut cards into groups that made sense to them. Sessions used open card sorting, allowing participants to create their own category names rather than sorting into predetermined buckets.
Participants sorted physical cards or digital equivalents, grouping related shortcuts and naming the categories. They thought aloud as they worked, revealing mental models about how features relate and which shortcuts belong together.
Researchers analyzed the results, identifying common groupings, the category names participants used, and disagreements about organization that revealed different user mental models requiring accommodation.
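A common first step in analyzing open card sorts is a card-by-card agreement (co-occurrence) matrix. The sketch below shows that idea in minimal form, assuming invented card names and only two participants for brevity; it illustrates the general technique rather than Linear's actual analysis.

```python
from itertools import combinations
from collections import Counter

# Hypothetical open card sort results: each participant's groupings of
# keyboard-shortcut cards (card and category names are invented).
sorts = [
    {"triage workflow": ["assign issue", "set priority", "snooze"],
     "writing issues": ["new issue", "add label"]},
    {"everyday shortcuts": ["new issue", "assign issue", "set priority"],
     "power user": ["snooze", "add label"]},
]

# Count how often each pair of cards lands in the same group; dividing by the
# number of participants gives a simple agreement score that highlights
# groupings most participants share.
pair_counts = Counter()
for participant in sorts:
    for group in participant.values():
        for a, b in combinations(sorted(group), 2):
            pair_counts[(a, b)] += 1

for (a, b), count in pair_counts.most_common(5):
    agreement = count / len(sorts)
    print(f"{a} + {b}: grouped together by {agreement:.0%} of participants")
```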
Key insights discovered
Research revealed that users organized shortcuts by workflow context rather than technical function. Instead of grouping by "navigation," "editing," and "viewing," users created categories like "triage workflow," "writing issues," and "reviewing work" that reflected actual task sequences.
Frequency-based mental models also emerged, with experienced users distinguishing "everyday shortcuts" from "power user shortcuts," suggesting a tiered organization that reveals features progressively.
Role differences appeared as well: individual contributors organized by personal workflow, while managers organized by team coordination and reporting needs, suggesting different documentation for different user types.
Product impact
Card sorting insights informed keyboard shortcut documentation organized by workflow context with beginner and advanced tiers, in-app shortcut hints that appear contextually during relevant workflows, customizable shortcut reference sheets users can generate for their personal workflow, and role-specific onboarding highlighting relevant shortcuts for different user types.
The organization reflected user mental models rather than engineering architecture, improving discoverability and supporting learning aligned with how users actually work.
Company: Calendly
Method: Competitive analysis through user interviews
Stage: Product differentiation and positioning
Sample size: 30 users of competing scheduling tools
Research objective
Calendly needed to understand why users chose competitors, what kept them from switching, and where Calendly could differentiate beyond feature parity, identifying positioning opportunities and must-have capabilities. By interviewing users of competing scheduling tools, Calendly aimed to better understand the needs and preferences of potential customers.
Research approach
Researchers interviewed users actively using competing scheduling products including Acuity Scheduling, HubSpot Meetings, Microsoft Bookings, and Google Calendar appointment scheduling. Interviews explored their complete scheduling workflow, evaluation criteria, satisfaction and frustrations, awareness of alternatives, and switching barriers.
Questions focused on comparative experiences: “How did you choose your current tool?”, “What would it take for you to switch?”, “Compare your tool’s strengths and weaknesses to others you’ve tried,” and “Describe your ideal scheduling experience.”
Key insights discovered
Research revealed that scheduling wasn't the core problem; coordination was. Users didn't want better calendars; they wanted to eliminate back-and-forth coordination completely. Those using simpler tools accepted limited features because eliminating coordination was the primary value.
Professionalism and branding emerged as unexpected decision factors. Users chose tools based on how scheduling pages looked to recipients, with concerns about appearing professional versus casual, established versus startup, or organized versus haphazard.
Integration depth trumped feature breadth, with users prioritizing deep integration with their primary tools (CRM, video conferencing, calendar) over standalone scheduling features.
Researchers extracted insights from these interviews to identify key themes and opportunities for differentiation.
Product impact
Insights shifted positioning from “scheduling tool” to “coordination eliminator,” emphasizing time saved over features offered. Product priorities refocused on the recipient experience to ensure a professional appearance, deeper integration with popular business tools, and brand customization enabling professional presentation.
These qualitative insights informed product development and shaped Calendly's marketing strategies to better address the needs of potential customers.
Competitive understanding through the user's perspective revealed differentiation opportunities that feature comparison matrices alone miss, demonstrating the value of qualitative competitive research.
Company: Superhuman
Method: Qualitative observation through live sessions and screen recordings
Stage: Feature validation and workflow optimization
Sample size: 40 email power users
Research objective
Superhuman needed to understand how email power users achieved inbox zero, identifying patterns, shortcuts, and workflows in order to build a product optimized for speed and efficiency that matched expert user strategies.
Research approach
Researchers observed email power users processing their morning inboxes through screen recordings and live observation sessions. Qualitative observation involves directly observing people in their natural environment to understand behavior and interactions, providing valuable context for product development. Participants processed real email while explaining aloud what they were doing and why.
Researchers timed specific actions, counted keystrokes, documented the shortcuts used, noted hesitation points, and identified repetitive patterns revealing optimization opportunities.
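Timing data like this becomes useful once it is aggregated per action and input method. The following sketch is a hypothetical example of that aggregation; the event log, action names, and durations are invented and stand in for whatever instrumentation or manual coding a team actually uses.

```python
from collections import defaultdict

# Hypothetical event log from one observed inbox session: each event records
# the action taken, the input method used, and how long it took in seconds.
events = [
    {"action": "archive", "input": "keyboard", "seconds": 0.8},
    {"action": "archive", "input": "mouse", "seconds": 2.4},
    {"action": "reply", "input": "keyboard", "seconds": 14.0},
    {"action": "snooze", "input": "mouse", "seconds": 3.1},
]

# Average duration per (action, input) pair, to surface where mouse use
# signals friction relative to keyboard shortcuts.
totals = defaultdict(lambda: [0.0, 0])  # (action, input) -> [total seconds, count]
for e in events:
    key = (e["action"], e["input"])
    totals[key][0] += e["seconds"]
    totals[key][1] += 1

for (action, input_method), (total, count) in sorted(totals.items()):
    print(f"{action} via {input_method}: {total / count:.1f}s average over {count} event(s)")
```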
Follow-up questions explored decision-making: “How did you decide to archive versus delete?”, “What makes you star versus just leaving in inbox?”, “When do you defer responses?”, and “How do you maintain inbox zero?”
Key insights discovered
Research revealed triage as the critical first-pass workflow. Users quickly scanned subject lines, sorting messages into “respond now,” “respond later,” “archive,” or “delete,” separating the urgent from the important from the irrelevant rather than processing chronologically.
Keyboard shortcuts proved essential for speed, with mouse usage signaling friction points. Users who maintained inbox zero relied heavily on keyboard shortcuts, while those who struggled used the mouse extensively.
Email as task management emerged as a common pattern, with users treating the inbox as a to-do list and using flags, stars, and folders for makeshift task prioritization despite the poor fit.
Product impact
Insights informed core product decisions, including keyboard-first interaction design with shortcuts for every action, a triage workflow built into the product experience with swipe gestures and quick actions, reminder and snooze features acknowledging email's role as a task system, and speed as a measurable design principle, with time tracking showing users their efficiency gains.
The product's success validated that observing expert users reveals patterns a product can systematize for a broader audience, demonstrating observation's power to capture behavior beyond stated preferences.
Focus groups are a valuable tool for gathering qualitative data and fostering collaborative product ideation. By assembling a diverse group of participants, focus groups create a dynamic environment where group discussions can reveal rich insights into consumer behavior, preferences, and unmet needs. Guided by a skilled moderator, these sessions encourage open dialogue, ensuring that every participant has the opportunity to share their thoughts and opinions.
Focus groups are particularly effective for testing ideas, gathering feedback on new concepts, and identifying emerging trends in the market. The group dynamic often sparks new ideas and highlights differences in perception, providing valuable insights that might not surface in one-on-one interviews. By analyzing the qualitative data collected from these sessions, researchers can draw conclusions about customer satisfaction, brand loyalty, and potential directions for product development. For example, a focus group might be used to explore collaborative product ideation, where participants brainstorm and evaluate new features, offering feedback that directly informs the product development process.
Understanding consumer behavior is essential for successful product development, as it enables companies to design solutions that truly meet customer needs and preferences. Qualitative research methods, such as user interviews and ethnographic research, are instrumental in providing valuable insights and a deeper understanding of consumer behavior. Through these approaches, researchers can gather nuanced insights into customer motivations, preferences, and pain points that quantitative data alone cannot reveal.
By immersing themselves in the user’s environment, qualitative researchers can observe firsthand how consumers interact with products, uncovering opportunities for improvement and innovation. For instance, ethnographic research allows companies to see how products fit into users’ daily routines, while user interviews provide direct feedback from potential users about their experiences and expectations. These qualitative research methods help organizations identify areas for enhancement, gather feedback from potential users, and ultimately inform product development with a more comprehensive understanding of consumer behavior.
Qualitative research plays a crucial role in building brand loyalty by uncovering valuable insights into customer behavior, preferences, and needs. By gathering in-depth insights into customer experiences and opinions, companies can identify specific areas for improvement in their products and services, leading to increased customer satisfaction and long-term loyalty.
For example, qualitative research can be used to collect feedback from customers about their experiences, helping organizations pinpoint opportunities to enhance their offerings and better meet customer needs. Additionally, by identifying emerging trends and patterns through qualitative studies, companies can stay ahead of the competition and adapt their marketing strategies and product development efforts to align with evolving customer expectations. Leveraging qualitative research to inform decision-making not only strengthens brand loyalty but also fosters lasting relationships with customers, ensuring continued success in a competitive marketplace.
These examples reveal common patterns of effective qualitative research in product development that apply across companies, methods, and contexts.
Start with clear objectives
Every successful example began with specific research questions, not vague “learn about users” goals. Clear objectives focus research effort and ensure insights answer actual product decisions.
Match methods to questions
Different methods suit different questions. Interviews explore experiences and motivations, usability testing evaluates specific interactions, ethnography reveals contextual behaviors, diary studies track changes over time, and card sorting uncovers mental models.
Go beyond stated preferences
Most valuable insights come from observing behaviors, understanding contexts, and exploring unexpressed needs rather than asking users what features they want.
Involve cross-functional teams
Examples show product managers, designers, engineers, and other stakeholders participating in research, creating shared understanding and better adoption of insights.
Connect insights to action
Strong research explicitly connects findings to product decisions—whether features to build, experiences to improve, or strategies to pursue—leading to informed decisions and innovative solutions.
Validate iteratively
Examples often involved multiple research phases testing assumptions, gathering feedback, and refining understanding rather than single comprehensive studies.
Qualitative research can also inform policies, shape design interventions, or improve services by uncovering the subjective meanings behind participants' experiences.
How much does qualitative research cost?
Costs vary widely by method and scale. User interviews typically cost $3,000-$10,000 for 10-15 participants, including recruitment, incentives, and analysis. Usability testing ranges from $5,000 to $15,000 for 8-12 participants. Ethnographic studies cost $15,000-$50,000+ for extended observation.
How long does qualitative research take?
Timeline depends on method and scope. User interviews take 2-3 weeks from planning through insights. Usability testing requires 3-4 weeks, including prototype preparation and iteration. Ethnographic studies span 4-8 weeks or longer. Diary studies run for the study duration plus 2-3 weeks of analysis.
How many participants do I need?
Typical ranges: 5-15 participants per segment for user interviews, 8-12 for usability testing, 10-20 for card sorting, 6-12 teams for ethnography, and 20-30 for diary studies. Exact numbers depend on how diverse your users are and on when new sessions stop surfacing new insights (saturation).
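Saturation can be checked informally as you go by tracking how many new codes (recurring themes) each additional session contributes. The sketch below is a hypothetical illustration with invented codes, not a prescribed threshold.

```python
# Hypothetical saturation check: themes tagged in each successive interview.
# When additional interviews stop adding unseen codes, the sample is
# probably large enough for the question at hand.
coded_interviews = [
    {"calendar sync", "pricing anxiety"},
    {"pricing anxiety", "guest messaging"},
    {"calendar sync", "guest messaging"},
    {"guest messaging"},
]

seen = set()
for i, codes in enumerate(coded_interviews, start=1):
    new_codes = codes - seen
    seen |= codes
    print(f"Interview {i}: {len(new_codes)} new code(s), {len(seen)} total")
    # A run of interviews contributing zero new codes suggests saturation.
```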
Can small companies do qualitative research?
Yes, qualitative research scales to any budget. Start with 5 interviews, conduct guerrilla usability testing, or use remote tools to reduce costs. For qualitative research, quality matters more than scale.
Should I hire researchers or do it myself?
Product teams can conduct basic qualitative research with training. Consider hiring researchers for complex studies, high-stakes decisions, or when objectivity matters. Many companies use a hybrid approach, with internal researchers for ongoing work and external experts for specialized studies.
How do I convince stakeholders to invest in qualitative research?
Share examples of decisions improved by user understanding, calculate the cost of building the wrong features, demonstrate quick turnaround from research to insights, involve stakeholders in research observations, and start small, proving value before requesting larger investment.