User research for EdTech products: a product manager's guide
Foundational EdTech UX research guide for PMs. K-12 vs higher-ed vs corporate L&D, multi-stakeholder research, COPPA/FERPA compliance, and the realistic stack.
User research for EdTech products is structurally different from research in other product categories. EdTech almost always has a buyer who is not the user; regulatory frameworks (COPPA for K-12, FERPA for student records, ADA for accessibility) constrain research design; and split-stakeholder dynamics mean the same product has to satisfy administrators (who buy), teachers (who gatekeep), students (who use), and parents (who consent). Product managers at EdTech companies have to design research that captures the full stakeholder set, navigate compliance differences across K-12, higher-ed, corporate L&D, and consumer/lifelong learning, and account for an adoption gap that is specific to EdTech: a product can be purchased and never adopted, or adopted and never used effectively. The methods that fit best are multi-stakeholder qualitative interviews, classroom or learner observation, longitudinal usage studies, and accessibility-centered usability testing.
This guide is for product managers at EdTech companies: K-12 EdTech (LMS, curriculum, classroom tools), higher-ed EdTech (LMS, student services, admissions), corporate L&D (training platforms, microlearning, certification), and consumer/lifelong learning (Duolingo-style, MOOCs, professional development). It covers what makes EdTech research different, the segment split, multi-stakeholder dynamics, compliance overlay, and the realistic stack.
TL;DR: user research for EdTech products
- Buyer is rarely the user. Administrators buy K-12 EdTech; teachers gatekeep; students use. Higher-ed has a similar split. Corporate L&D buyers (HR, L&D leaders) differ from learners. Research must cover all stakeholders.
- Four EdTech segments are different practices. K-12, higher-ed, corporate L&D, and consumer/lifelong learning have different compliance, recruitment, and methods.
- Compliance overlay matters. COPPA for under-13 users, FERPA for student records, ADA Title III + Section 504 + state-level accessibility, GDPR for EU students.
- Adoption gap is EdTech-specific. Purchased + adopted + used effectively are three different outcomes. Research must address each, not assume linearity.
- Teacher gatekeeping is real. In K-12, the teacher decides whether students actually use the product. Skipping teacher research means missing the activation gate.
What’s different about EdTech UX research
Six structural factors:
| Factor | Why it matters |
|---|---|
| Buyer ≠ user (almost always) | Administrators buy, teachers gatekeep, students use, parents consent. Research must cover the full chain. |
| Compliance overlay | COPPA, FERPA, ADA, Section 504, state student-data privacy laws; the stack varies by segment and geography. |
| Adoption gap | Bought, adopted, used effectively are three outcomes. Research has to validate each, not assume the chain. |
| Multi-stakeholder dynamics | Stakeholders have conflicting goals: admins want measurement; teachers want flexibility; students want engagement. |
| Accessibility is regulatory | Section 504 + ADA Title III + state EdTech accessibility laws; accessibility is required, not optional. |
| Cyclical usage | School year, semester, course-cycle patterns. Diary studies and longitudinal research surface what point-in-time studies miss. |
PMs who treat EdTech as B2B SaaS with educational branding miss teacher gatekeeping and adoption-gap dynamics. PMs who design research around the full stakeholder chain ship products that get adopted and used.
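The adoption gap above can be made operational with usage analytics. The sketch below is illustrative, not a prescribed schema: the `Account` fields and both thresholds are assumptions you would tune against your own baseline data, but the three-way split (purchased / adopted / used effectively) follows the framing in this guide.

```python
from dataclasses import dataclass

@dataclass
class Account:
    """One purchasing account (district or institution). Fields are illustrative."""
    name: str
    licensed_teachers: int          # seats purchased
    active_teachers: int            # teachers with any usage this term
    weekly_student_sessions: float  # avg student sessions per active class per week

def adoption_stage(acct: Account,
                   adopt_threshold: float = 0.3,
                   use_threshold: float = 2.0) -> str:
    """Classify an account into the three adoption-gap outcomes.
    Thresholds are assumed placeholders, not industry standards."""
    if acct.licensed_teachers == 0 or acct.active_teachers == 0:
        return "purchased, not adopted"
    if acct.active_teachers / acct.licensed_teachers < adopt_threshold:
        return "purchased, not adopted"
    if acct.weekly_student_sessions < use_threshold:
        return "adopted, not used effectively"
    return "used effectively"

accounts = [
    Account("District A", 120, 8, 0.5),   # bought, teachers never activated
    Account("District B", 40, 25, 3.1),   # healthy usage
    Account("District C", 60, 30, 0.9),   # adopted, shallow student use
]
for a in accounts:
    print(a.name, "->", adoption_stage(a))
```

A segmentation like this tells you which qualitative study to run next: "purchased, not adopted" accounts call for teacher-gatekeeping interviews, while "adopted, not used effectively" accounts call for classroom observation.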
Four EdTech segments: different practices
The four common EdTech segments require different research approaches:
| Segment | Examples | Primary stakeholders | Compliance overlay |
|---|---|---|---|
| K-12 EdTech | Khan Academy, ClassDojo, IXL, Schoology | Admin + teacher + student + parent | COPPA + FERPA + ADA + state |
| Higher-ed EdTech | Canvas, Blackboard, Coursera (university), Course Hero | Admin + faculty + student | FERPA + ADA + state |
| Corporate L&D | LinkedIn Learning, Cornerstone, Degreed, Workday Learning | HR/L&D buyer + manager + learner | ADA + employment law (varies) |
| Consumer / lifelong | Duolingo, MasterClass, Coursera (consumer), Skillshare | Individual learner | GDPR (EU), CCPA, ADA |
The research practice differs by segment more than by product type. K-12 PMs face teacher gatekeeping; higher-ed PMs face faculty + admin tension; corporate L&D PMs face engagement vs measurement tension; consumer EdTech PMs research more like consumer subscription apps.
The multi-stakeholder framework for K-12
K-12 EdTech is the most stakeholder-complex segment. The full chain:
| Stakeholder | Role | Research focus |
|---|---|---|
| District admin | Buys / approves purchases | Procurement criteria, ROI evidence, compliance, integration |
| School admin (principal) | Influences adoption + reports up | Implementation feedback, teacher reports, student outcomes |
| Teacher | Decides daily use, customizes for class | Daily workflow, customization needs, student engagement |
| Student | End user | Engagement, comprehension, learning outcomes |
| Parent | Consents (especially under COPPA), reinforces use at home | Trust, safety, privacy, learning visibility |
Skipping any stakeholder produces blind spots. Skipping admin = product that teachers love but doesn’t renew. Skipping teacher = product that admins buy but never gets used. Skipping student = product that adults choose but kids reject. Skipping parent = product that gets blocked at signup or reported.
For recruiting K-12 educators specifically, see the K-12 methodology guide.
The multi-stakeholder framework for higher-ed
Higher-ed has a different stakeholder mix:
| Stakeholder | Role | Research focus |
|---|---|---|
| Administration / IT | Procurement, integration, compliance | Vendor evaluation, technical fit, FERPA |
| Faculty | Adopt or reject | Pedagogical fit, customization, gradebook integration |
| Student | Daily user | Mobile experience, integration with student life, accessibility |
Higher-ed research often misses the faculty-administration tension. Faculty resist top-down mandated tools; administration measures vendor success by license utilization. Research that captures both perspectives surfaces the friction that drives renewal vs churn.
For higher-ed specifics, see the dedicated guide.
Common research questions in EdTech
| Question | Best method | Common mistake |
|---|---|---|
| Are students learning effectively? | Outcome studies + comprehension testing + observation | Asking students “is it working?” |
| Are teachers actually using the product? | Usage analytics + teacher diary studies | Self-reported usage in surveys |
| Why isn’t adoption happening? | Multi-stakeholder qualitative + barrier analysis | Surveying admins only |
| Is the product accessible? | WCAG audit + assistive-tech testing + Section 504 review | WCAG audit alone |
| Does parental consent flow work? | Concept testing with parents + COPPA flow audit | Skipping parent research |
| Will faculty adopt this? | Faculty interviews + pilot research | Studying only students |
| Is corporate L&D engaging learners? | Engagement analytics + learner journey research | Completion rate alone |
| Does the gradebook integration work? | Workflow observation with teachers + integration testing | Skipping integration UX |
Methods that fit EdTech well
1. Multi-stakeholder qualitative interviews
K-12 and higher-ed especially benefit from per-account stakeholder studies (admin + teacher/faculty + student per district/institution). Reveals dynamics single-perspective research misses.
2. Classroom and learner observation
For K-12 and higher-ed, observation in actual learning contexts (with appropriate consent and IRB if required) reveals adoption gaps that lab usability misses.
3. Longitudinal usage research
EdTech usage is cyclical (school year, semester, course cycle). Longitudinal studies (4-12 weeks) surface patterns that single-session research misses.
4. Accessibility-centered usability
Section 504 + ADA Title III + state laws make accessibility regulatory, not optional. WCAG 2.2 AA audits + assistive-technology testing (screen readers, switch control, voice control) are baseline.
5. Outcome research vs engagement research
Engagement metrics (time-on-app, completions) don’t equal learning outcomes. Outcome research (pre/post tests, retention measurement, transfer-of-learning studies) is harder but more meaningful.
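One widely used way to summarize pre/post outcome data is the normalized learning gain (Hake-style gain): the fraction of the possible improvement a learner actually achieved. A minimal sketch, assuming percentage scores on matched pre- and post-tests:

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake-style normalized gain: (post - pre) / (100 - pre).
    Scores are percentages (0-100); a perfect pre-score has no room to gain."""
    if pre >= 100:
        return 0.0
    return (post - pre) / (100 - pre)

def cohort_gain(pre_scores: list[float], post_scores: list[float]) -> float:
    """Average normalized gain across paired pre/post scores."""
    gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Two learners with the same raw improvement (+20 points) but different gains:
print(normalized_gain(40, 60))  # 20 of 60 possible points
print(normalized_gain(75, 95))  # 20 of 25 possible points
```

The point of normalizing is visible in the example: a raw +20 means much more for a learner who started at 75 than one who started at 40, which raw-score deltas (and engagement metrics) hide.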
6. COPPA / FERPA-compliant recruitment
For K-12 students under 13, COPPA requires parental consent for research participation. For any student records research, FERPA requires institutional consent. Build these into the recruitment workflow from the start.
For COPPA-compliant research, see the dedicated guide.
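The two consent gates described above can be encoded directly into a recruitment workflow so no session gets scheduled with outstanding requirements. This is an illustrative sketch of the gating logic only (the `Participant` fields are hypothetical, and none of this is legal advice):

```python
from dataclasses import dataclass

@dataclass
class Participant:
    """Illustrative recruitment record; not a prescribed schema."""
    age: int
    parental_consent: bool = False
    institutional_consent: bool = False  # school/district sign-off
    uses_student_records: bool = False   # study touches education records

def recruitment_blockers(p: Participant) -> list[str]:
    """Return outstanding consent requirements before a student participant
    can be scheduled: parental consent for under-13 participants (COPPA) and
    institutional consent for any research using student records (FERPA)."""
    blockers = []
    if p.age < 13 and not p.parental_consent:
        blockers.append("parental consent required (COPPA)")
    if p.uses_student_records and not p.institutional_consent:
        blockers.append("institutional consent required (FERPA)")
    return blockers

print(recruitment_blockers(Participant(age=11)))
print(recruitment_blockers(Participant(age=16, uses_student_records=True)))
```

A check like this belongs at the top of the recruitment funnel, not at session time, which is the "build it in from the start" point made above.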
7. Pilot programs as research vehicles
For K-12 and higher-ed, pilot programs (1-3 month deployments at a small number of schools/districts) provide longitudinal usage data, multi-stakeholder feedback, and outcome measurement in one structure. Most B2B EdTech research happens through pilots.
Personas you’ll research in EdTech
K-12 personas
| Persona | Recruit difficulty |
|---|---|
| District / school administrator | Mid-hard: busy, gatekept |
| Teacher (general classroom) | Mid: accessible via panels (CleverX, User Interviews, education-specific) |
| Teacher (specialized: special ed, ELL, gifted) | Hard: smaller populations |
| Student (under 13) | Hard: COPPA-restricted, parental consent required |
| Student (13-18) | Mid: easier than under-13 but still requires school/parent gatekeeping |
| Parent | Easy via consumer panels |
| School IT / tech coordinator | Mid: verified IT B2B |
Higher-ed personas
| Persona | Recruit difficulty |
|---|---|
| Higher-ed administration | Mid-hard: verified senior B2B |
| Faculty (teaching) | Mid: campus communities, faculty-specific panels |
| Faculty (research) | Hard: niche specialties |
| Student (undergraduate) | Easy via consumer panels with student demographics |
| Graduate student | Mid: smaller pool |
| Adjunct faculty | Mid: distinct from full-time faculty |
Corporate L&D personas
| Persona | Recruit difficulty |
|---|---|
| HR / L&D buyer | Mid: verified senior B2B |
| Manager (deploys learning to team) | Mid: verified B2B |
| Individual learner / employee | Easy via consumer panels |
| Learning admin (LMS admin) | Mid: verified B2B |
The compliance overlay
COPPA (Children’s Online Privacy Protection Act)
For K-12 EdTech serving under-13 students. Affects:
- Data collection from under-13 users requires verifiable parental consent.
- Research participation by under-13 students requires parental consent.
- Recordings, transcripts, and other PII from under-13 students require COPPA-compliant handling.
FERPA (Family Educational Rights and Privacy Act)
For any product handling student education records. Affects:
- Student data access controls.
- Research using student records requires institutional consent.
- Vendor agreements (DPAs) with educational institutions.
ADA Title III + Section 504
Educational institutions are public accommodations + recipients of federal funding. EdTech apps must be accessible. WCAG 2.2 AA is industry baseline. Section 504 adds additional procedural requirements.
State student-data privacy laws
Vary by state. California’s SOPIPA, New York’s Education Law 2-d, Connecticut’s student data privacy law all add state-level requirements beyond FERPA.
GDPR (EU)
For EdTech serving EU students, GDPR applies in addition to local laws. Right to deletion, lawful basis, data minimization all matter.
The EdTech research stack
For EdTech PMs, the realistic stack:
| Layer | K-12 / higher-ed tools | Corporate L&D tools | Consumer EdTech tools |
|---|---|---|---|
| Recruitment | CleverX (educators + admins), User Interviews, education-specific panels | CleverX (HR/L&D + managers), User Interviews | User Interviews, Prolific, consumer panels |
| In-product feedback | Pendo, Sprig | Sprig, Pendo | Sprig, Hotjar |
| Customer interviews | CleverX (B2B), User Interviews | CleverX, User Interviews | Outset, Wondering |
| Behavioral analytics | Amplitude, Mixpanel | Amplitude, Mixpanel | Amplitude, Mixpanel |
| Accessibility | axe DevTools, Fable, manual audits | axe DevTools, Fable | axe DevTools |
| Synthesis | Dovetail, native AI | Dovetail | Dovetail |
Most EdTech PMs run a 4-5 tool minimum. K-12 specifically often adds parent-research capabilities and pilot management overhead.
Common mistakes EdTech PMs make
1. Single-stakeholder research. Talking only to admins, only to teachers, or only to students. Each stakeholder has a different perspective; missing any creates blind spots.
2. Generic accessibility audits. WCAG audits catch baseline issues. Assistive-technology testing (especially for K-12 with diverse learner needs) catches what audits miss.
3. Confusing engagement with learning. High time-on-app doesn’t mean learning. Outcome research is harder but reveals actual product effectiveness.
4. Skipping teacher gatekeeping research in K-12. In K-12, the teacher decides whether students actually use the product. PMs who skip teacher research miss the activation gate.
5. Treating COPPA as a checkbox. COPPA isn’t just parental consent UI; it’s a structural compliance reality affecting research design, data handling, and product architecture.
6. Single-segment generalization. K-12 research findings don’t generalize to higher-ed. Higher-ed findings don’t generalize to corporate L&D. Don’t bundle.
7. Pilot research without structured measurement. Pilots are powerful research vehicles when structured (pre/post measurement, multi-stakeholder feedback, outcome tracking). Pilots without structure produce anecdotes, not insights.
8. Ignoring cyclical patterns. EdTech usage spikes and dips with school calendars. Single-window research misses the cyclical reality.
Frequently asked questions
What’s different about UX research for EdTech vs other industries?
EdTech has buyer ≠ user dynamics, compliance overlay (COPPA, FERPA, ADA), four distinct segments (K-12, higher-ed, corporate L&D, consumer EdTech), multi-stakeholder dependencies (admin + teacher + student + parent), and an adoption gap unique to the industry. Generic UX methods miss most of this.
Do I need parental consent for K-12 EdTech research?
Yes for any research touching under-13 students (COPPA). Best practice: parental consent for all student-participant research regardless of age. Schools also typically require institutional consent.
How is research for K-12 vs higher-ed vs corporate L&D different?
K-12 = admin + teacher + student + parent (most stakeholders); higher-ed = admin + faculty + student (faculty-admin tension important); corporate L&D = HR buyer + manager + learner (engagement-vs-measurement tension important); consumer EdTech = individual learner (more like consumer subscription research).
What accessibility standards apply to EdTech?
WCAG 2.2 AA is industry baseline. ADA Title III applies to public accommodations. Section 504 applies to federal-fund-receiving institutions (most schools and universities). State student-data privacy laws may add accessibility requirements. Some district/state contracts require WCAG 2.2 AAA for critical flows.
How do I research learning outcomes vs engagement?
Engagement (time, completion, return) is easier to measure but doesn’t equal learning. Outcome research (pre/post tests, retention measurement, transfer-of-learning studies) is harder but reveals actual product effectiveness. Use both; weight outcome research more heavily for high-stakes decisions.
Should I run pilots as research?
For B2B EdTech (K-12, higher-ed, corporate L&D), structured pilots are among the highest-leverage research methods. They provide longitudinal usage, multi-stakeholder feedback, and outcome data. Structure them: pre/post measurement, defined success criteria, weekly check-ins, end-of-pilot synthesis.
How long does EdTech research take?
Quick consumer EdTech research: 1-2 weeks. K-12 / higher-ed multi-stakeholder studies: 4-8 weeks. Pilot programs: 1-3 months. School-year-cycle longitudinal studies: 4-9 months. EdTech research has more compliance and recruitment overhead than general consumer research.
What’s the biggest mistake EdTech PMs make?
Treating EdTech like consumer SaaS or B2B SaaS without accounting for buyer ≠ user dynamics, multi-stakeholder gatekeeping, and the adoption gap. PMs who design research around the full stakeholder chain ship products that get adopted; PMs who don't ship products that get bought but never used.
The takeaway
User research for EdTech products is multi-stakeholder, compliance-overlaid, segment-specific, and adoption-gap-aware. K-12, higher-ed, corporate L&D, and consumer EdTech are different practices. The PMs who run EdTech research best treat the buyer + user + gatekeeper chain as the unit of analysis, design accessibility into research from the start, and measure outcomes alongside engagement.
The realistic stack varies by segment. K-12 and higher-ed PMs need verified educator panels (CleverX, User Interviews, education-specific) plus pilot management plus parent-research capabilities (for K-12). Corporate L&D PMs need verified HR + manager + learner recruit plus engagement analytics. Consumer EdTech PMs operate more like consumer subscription PMs.
The single biggest EdTech research mistake is treating it as B2B SaaS with educational branding. The buyer-user split, multi-stakeholder dependencies, and adoption gap are EdTech-specific realities that generic research methods miss.