K-12 education user research methods: a complete guide for edtech product teams

How to conduct user research for K-12 education technology. Covers COPPA compliance, teacher gatekeeper navigation, classroom observation, age-appropriate testing, multi-stakeholder research design, and recruiting students, teachers, and administrators.

K-12 edtech products have three users who never agree on what matters. Students want the product to be engaging and not feel like homework. Teachers want it to integrate into their existing workflow without adding preparation time. Administrators want it to produce measurable learning outcomes and justify the procurement budget. Building for all three requires researching all three, separately, with methods adapted for each audience’s constraints.

The complication: accessing these users requires navigating gatekeepers at every level. Reaching students requires parental consent (and COPPA compliance for children under 13). Reaching teachers requires school and district approval. Reaching administrators requires procurement cycle alignment. And all of this happens within a school calendar that makes half the year unavailable for research.

This guide covers how edtech product teams conduct effective research across all three user groups while navigating the consent, compliance, and access challenges unique to K-12 education.

For research consent best practices applicable across all contexts, see our consent guide. For trauma-informed methods relevant to research with children, see our trauma-informed research guide.

Key takeaways

  • K-12 edtech research requires separate research tracks for students, teachers, and administrators. Never mix these audiences in a single study because their needs, constraints, and evaluation criteria are fundamentally different
  • COPPA compliance is mandatory when researching children under 13. Verifiable parental consent must be obtained before any data collection, and schools can act as parental agents for educational-purpose research only
  • Teachers are the adoption gatekeepers. If a teacher does not use the product, students never see it. Teacher workflow research is the highest-leverage edtech research investment
  • Classroom observation reveals what no other method can: how your product survives contact with 30 students, a 45-minute class period, varying device availability, and a teacher managing behavior simultaneously
  • The school calendar constrains everything. Research must happen during the school year (September-May), avoid testing weeks, and work within teacher contract hours. Plan 8-12 weeks for district approval before your first session

COPPA compliance for edtech research

When COPPA applies

COPPA (Children’s Online Privacy Protection Act) applies when your research collects personal information from children under 13 through an online service. In edtech research, COPPA applies when:

  • You test a product with students under 13 and collect any identifying data (name, screen recordings with face, voice recordings, device identifiers)
  • You observe students using an online product and capture identifiable data
  • You survey students under 13 online

COPPA compliance requirements for research

| Requirement | What it means for UX research |
| --- | --- |
| Verifiable parental consent | Before collecting any personal information from a child under 13, obtain consent from a parent or legal guardian using an FTC-approved method: signed consent form (physical or electronic with verification), credit card transaction, video call verification, or government ID check |
| School consent exception | Schools can consent on behalf of parents when the data collection is for educational purposes only (not commercial testing). This exception is narrowly interpreted: the research must directly serve the educational function of the product, and the school must have a policy in place |
| Data minimization | Collect only the minimum data needed for the research purpose. If you do not need the child’s name, do not collect it. If you do not need video, do not record it |
| Data deletion | Delete children’s personal information when it is no longer needed for the research purpose. Do not retain it indefinitely |
| No behavioral advertising data | COPPA prohibits using children’s data for behavioral advertising. If your research product collects usage data, ensure it is not used for ad targeting |
| Privacy policy | Your research consent materials must clearly describe what data you collect, how you use it, and how parents can review or delete their child’s data |
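The data-minimization and de-identification requirements can be enforced in the research data pipeline itself rather than left to manual discipline. A minimal sketch (field names and the salt are hypothetical, not from any specific tool) that keeps only the fields the analysis needs and replaces the student's name with a stable one-way participant code:

```python
import hashlib

# Only the fields the research purpose actually requires; everything else is dropped.
KEEP_FIELDS = {"grade", "task_times", "completion"}

def pseudonymize(record: dict, salt: str) -> dict:
    """Strip identifying fields and add a stable participant code so
    sessions can be linked without storing the child's name."""
    # One-way hash: the name cannot be recovered from the code.
    code = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:8]
    clean = {k: v for k, v in record.items() if k in KEEP_FIELDS}
    clean["participant"] = f"P-{code}"
    return clean

raw = {"name": "Jamie R.", "grade": 4, "task_times": [42, 31], "completion": True}
print(pseudonymize(raw, salt="study-2024"))
```

The salt should be study-specific and deleted with the rest of the data at the end of the retention period, which also satisfies the deletion requirement for the derived codes.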

The school-as-agent pathway

For classroom-based research, schools can act as parental agents under COPPA, consenting on behalf of parents for educational-purpose data collection. This simplifies logistics but has strict boundaries:

What the school can consent to:

  • Observation of students using the educational product during class
  • Collection of usage data for the purpose of improving the educational product
  • Screen recordings during educational use (if the school’s policy permits)

What the school cannot consent to:

  • Research that serves primarily commercial purposes (marketing, competitive analysis)
  • Data collection beyond what is needed for the educational function
  • Sharing children’s data with third parties for non-educational purposes

Best practice: Even when using the school-as-agent pathway, send a parent notification letter (not a consent form, but an information letter) explaining the research and offering parents the right to opt their child out. This goes beyond COPPA requirements but builds trust with families.

COPPA-safe research tools

| Research activity | COPPA-safe approach | What to avoid |
| --- | --- | --- |
| Session recording | Record the screen only, not the child’s face. Or use observation notes without recording | Face-on video recording without verified parental consent |
| Data collection | Anonymous or de-identified data. Use participant numbers, not names | Collecting names, email addresses, or device identifiers |
| Survey tools | COPPA-compliant survey platforms with parental consent features | Standard survey tools (Google Forms, Typeform) that collect email for login |
| Communication | All communication through the school/teacher. Researcher never contacts children directly | Direct email, messaging, or phone contact with children |
| Testing platform | Browser-based testing that does not require account creation or personal data entry | Platforms requiring the child to create an account or log in with personal credentials |

Researching with age-appropriate differentiation

Age-group research adaptations

| Age group | Grade range | Cognitive considerations | Research adaptations |
| --- | --- | --- | --- |
| Early elementary (5-7) | K-2 | Limited reading, short attention span (10-15 min), concrete thinking, difficulty articulating abstract concepts | Observation-only (no interviews). Tasks must be visual/gestural. Sessions under 15 minutes. Use a familiar adult (teacher) as co-facilitator |
| Upper elementary (8-10) | 3-5 | Developing reading, 15-20 min attention, beginning abstract thought, can express preferences but not always reasons | Simple preference tasks (“Which do you like better?”). Short interviews (10 min) with concrete questions. Avoid “why” questions (they struggle to explain reasoning) |
| Middle school (11-13) | 6-8 | Strong reading, 20-30 min attention, abstract thinking developing, social awareness affects behavior (they perform for peers) | Modified usability testing with think-aloud. One-on-one only (peer presence changes behavior). COPPA applies to students under 13 in this group |
| High school (14-18) | 9-12 | Adult-level cognitive capacity, 30-45 min attention, strong opinions but may defer to perceived authority | Standard usability testing works with minor adaptations. Ensure they know their opinions matter and there are no “right answers.” COPPA does not apply |

The “UX vs. learning” distinction

The most important methodological challenge in K-12 research: distinguishing between UX friction (the product is confusing) and learning friction (the content is challenging). A student who pauses on a math problem for 30 seconds may be struggling with the interface or struggling with the math. Research must separate these:

  • UX friction indicators: Clicking wrong buttons, not finding features, misunderstanding navigation, expressing confusion about how the product works
  • Learning friction indicators: Taking time to think, re-reading content, making content errors (wrong answers), expressing confusion about the subject matter

Ask students: “Was that hard because of the app or because of the math?” They can usually tell the difference, especially from grade 3 onward.
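One way to operationalize this distinction during analysis is to code every observed pause, error, or help request with a friction type, then report the split per session. A sketch with invented event codes (the coding scheme itself is an assumption, not a standard):

```python
from collections import Counter

# Hypothetical observation codes from one session:
# "ux" = interface confusion, "learn" = content difficulty, "other" = off-task/technical.
events = ["ux", "learn", "learn", "ux", "ux", "other", "learn"]

def friction_split(codes):
    """Return each friction type's share of all coded events in a session."""
    counts = Counter(codes)
    total = sum(counts.values())
    return {kind: counts[kind] / total for kind in counts}

for kind, share in sorted(friction_split(events).items()):
    print(f"{kind}: {share:.0%}")
```

A session dominated by "ux" codes points at design changes; one dominated by "learn" codes points at content difficulty, not the interface.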

How to research teacher workflows

Why teacher research is the highest-leverage investment

Teachers are the adoption gatekeepers for K-12 edtech. If a teacher finds the product burdensome, they stop assigning it. If it does not fit their existing workflow (lesson planning, grading, progress monitoring, parent communication), they replace it with a simpler alternative. In practice, teacher adoption, not student engagement, is the primary predictor of edtech product success.

Teacher research methods

| Method | Best for | Scheduling adaptation |
| --- | --- | --- |
| Interviews | Understanding planning workflows, grading workflows, tool evaluation criteria | Schedule after school (3:30-5pm), during planning periods, or on professional development days |
| Classroom observation | Watching how the teacher uses the product with students in real time | Requires principal approval. Observe during a lesson where the product is being used. Sit in the back |
| Usability testing | Testing assignment creation, grading workflows, progress dashboard, parent reporting | After-school or planning period sessions. 30-45 minutes maximum |
| Diary studies | Tracking daily product usage, integration with other tools, frustrations over time | 1-2 week diary during a unit where the product is actively used. Brief entries (2 min max) |
| Surveys | Measuring satisfaction, feature priorities, and pedagogical alignment at scale | Distribute through district channels or teacher communities. Under 5 minutes |

Critical teacher workflows to research

| Workflow | What to test | Key question |
| --- | --- | --- |
| Lesson planning | How the product integrates into existing lesson plans. Can the teacher customize content to match their curriculum? | “Show me how you would plan a lesson using this product” |
| Assignment creation | How long it takes to create, customize, and assign an activity to students | “Create an assignment for your class that covers [topic]. How long did that take?” |
| In-class facilitation | How the teacher monitors student progress during a lesson. Can they identify struggling students in real time? | “Your class is using the product right now. How do you know who needs help?” |
| Grading and assessment | How the teacher reviews student work, grades assignments, and tracks progress | “Show me how you would grade this assignment and report progress to parents” |
| Differentiation | Can the teacher customize the product for different student ability levels? | “You have students at 3 different reading levels. How would you set up the product for each group?” |
| Tech troubleshooting | What happens when students have tech issues during class? | “A student says the product is not working. What do you do?” |

The “10-minute prep test”

The most revealing teacher research question: “How long does it take to prepare a lesson that uses this product?” If the answer exceeds 10 minutes of additional prep time beyond what the teacher would normally spend, adoption is at risk. Teachers already spend 7+ hours planning per week (TNTP). Any product that adds preparation time competes with time for sleep, not time for other tools.
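In a study debrief, the 10-minute threshold can be applied directly to timed prep data to flag adoption risk. A minimal sketch (teacher IDs and times are illustrative):

```python
PREP_LIMIT_MIN = 10  # added prep time beyond normal planning, per lesson

# Hypothetical timed results: minutes of additional prep observed per teacher.
added_prep = {"T1": 6, "T2": 14, "T3": 9, "T4": 22}

# Teachers for whom the product fails the 10-minute prep test.
at_risk = {t: m for t, m in added_prep.items() if m > PREP_LIMIT_MIN}

print(f"{len(at_risk)}/{len(added_prep)} teachers exceed the limit: {sorted(at_risk)}")
```

Reporting the count of teachers over the threshold, rather than the average, avoids a few fast power users masking a broad adoption risk.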

How to conduct classroom observation

Getting access

Step 1: District approval. Contact the district’s curriculum, technology, or research department. Submit a research brief explaining: what you are studying, how many classrooms, how long, and what data you collect. Some districts have formal research application processes.

Step 2: Principal approval. After district approval, the school principal must agree. Frame the research as benefiting the school: “We want to understand how students use [product] so we can make it better for classrooms like yours.”

Step 3: Teacher consent. The teacher must consent to observation. Never pressure teachers. Some will decline, and that is fine.

Step 4: Parent notification. Send a parent notification letter (or consent form if collecting identifiable student data) through the school’s normal communication channel. Allow opt-out for any parent who objects.

Timeline: 6-12 weeks from first district contact to first classroom visit. Start early.

During observation (one class period, 45-90 minutes)

What to observe:

| Observation focus | What to watch for |
| --- | --- |
| Product introduction | How does the teacher introduce the activity? Do they demonstrate the product or let students explore? |
| Student onboarding | Can students get started independently, or does the teacher need to help each one? |
| Engagement trajectory | Are students engaged at the start but fading by minute 20? Do they finish early and disengage? |
| Help-seeking patterns | Who asks for help? Do they ask the teacher, a peer, or try to figure it out themselves? What questions do they ask? |
| Off-task behavior | When students go off-task, what triggered it? Confusion, boredom, technical issue, or social distraction? |
| Teacher multitasking | How does the teacher balance monitoring the product’s dashboard with managing the classroom? |
| Device issues | Does the product work reliably across all student devices? Which devices have problems? |
| Differentiation in action | Are different students doing different things? Does the product adapt, or is everyone on the same path? |
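Engagement trajectory is usually captured with time-sampled coding: at a fixed interval, record whether each observed student is on-task, then compute the engagement rate per student. A sketch (sample data and the 5-minute interval are invented for illustration):

```python
# One row per student: on-task codes sampled every 5 minutes
# across a 45-minute period (True = engaged at that sample point).
samples = {
    "S1": [True, True, True, True, False, False, True, True, True],
    "S2": [True, True, False, False, False, False, False, True, True],
}

def engagement_rate(codes):
    """Fraction of sample points at which the student was on-task."""
    return sum(codes) / len(codes)

for student, codes in samples.items():
    rate = engagement_rate(codes)
    flag = "ok" if rate >= 0.75 else "below target"  # 75% target, per age group
    print(f"{student}: {rate:.0%} engaged ({flag})")
```

Plotting rates by sample index across students also shows where in the period engagement drops, which is more actionable than the session average alone.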

Do not:

  • Interact with students unless the teacher invites you to
  • Disrupt the class in any way
  • Photograph or record students without explicit consent (see COPPA section)
  • Offer feedback to the teacher during the class (save it for the debrief)

Post-observation teacher debrief (15 minutes)

  • “How typical was today’s class compared to a normal day using [product]?”
  • “I noticed [specific observation]. Can you explain what was happening?”
  • “What do your students like most about the product? What frustrates them?”
  • “If you could change one thing about how this product works in your classroom, what would it be?”

How to research administrator and district decision-makers

The procurement research challenge

K-12 administrators (curriculum directors, technology directors, superintendents) make purchasing decisions based on different criteria than teachers or students:

| Administrator criteria | Research method | Key question |
| --- | --- | --- |
| Learning outcome evidence | Interview + document review | “What evidence do you need to justify this purchase to your school board?” |
| Standards alignment | Content review + teacher interview | “Does this product align with your state standards and curriculum?” |
| Integration with existing systems | Technical evaluation | “Does this integrate with your LMS, SIS, and SSO?” |
| Total cost of ownership | Interview | “Beyond the license cost, what resources does implementation require?” |
| Professional development burden | Teacher interview + observation | “How much training do teachers need? Who provides it?” |
| Data and privacy compliance | Policy review | “Does this meet your district’s student data privacy policy?” |
| Equity and accessibility | Accessibility testing + interview | “Will this work for all students, including those with disabilities and limited device access?” |

Administrator interview protocol (30 minutes)

Administrators are busy and evaluate dozens of products. Your research session must be efficient and speak their language (outcomes, evidence, compliance), not UX language (usability, experience, friction).

  • “Walk me through your process for evaluating and purchasing edtech products”
  • “What was the last product you decided not to purchase? What was the reason?”
  • “What evidence of effectiveness would convince your school board to approve this purchase?”
  • “How do you handle the gap between what teachers want and what the budget allows?”

How to recruit K-12 edtech research participants

Recruiting teachers

| Channel | Approach | Yield |
| --- | --- | --- |
| District partnership | Work with the curriculum or technology department to identify willing teachers | Highest quality: teachers are pre-approved and context is understood |
| Teacher communities | Teachers Pay Teachers forums, educator Facebook groups, r/Teachers, EdSurge community | Broad reach, but requires careful framing: “Help improve a classroom tool” not “participate in a study” |
| Education conferences | ISTE, state-level edtech conferences, regional teacher meetups | Engaged, tech-forward teachers. Recruit at or after events |
| CleverX verified panels | Pre-screened educators filtered by grade level, subject, district type, and technology usage | Fast recruitment with role verification |
| Your own user base | In-product prompts for teachers currently using your product | Highest relevance, lowest recruitment friction |

Incentive benchmarks for teachers:

| Study type | Rate | Notes |
| --- | --- | --- |
| 30-min after-school interview | $75-125 | Schedule 3:30-5pm. Respect their time after a full teaching day |
| 45-min usability test (planning period) | $100-150 | Must fit within the planning period window |
| Classroom observation + debrief | $100-200 for the teacher | The observation itself does not require extra teacher effort, but the debrief does |
| 1-week diary study | $100-200 total | Brief entries only. Teachers are overwhelmed |
| Professional development credit | Equivalent to $50-100 | Some districts accept research participation as PD hours. Check with the district |
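When budgeting a mixed-method teacher study, these benchmarks roll up straightforwardly per research plan. A minimal sketch using midpoint rates from the ranges above (participant counts are illustrative):

```python
# (study type, number of teachers, per-teacher rate in USD).
# Rates are midpoints of the benchmark ranges; counts are hypothetical.
plan = [
    ("30-min interview", 8, 100),
    ("45-min usability test", 5, 125),
    ("observation + debrief", 3, 150),
]

total = sum(n * rate for _, n, rate in plan)
for name, n, rate in plan:
    print(f"{name}: {n} x ${rate} = ${n * rate}")
print(f"Total incentive budget: ${total}")
```

Budgeting per plan rather than per session makes it easier to compare cash incentives against the alternative incentives below (supplies, premium access) at equivalent value.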

Alternative incentives teachers value: Classroom supplies ($50-100 gift card to Amazon or educational supply store), premium product access (free premium account for their classroom), and technology for their classroom (device, software license).

Recruiting students

Never recruit students directly. All student recruitment flows through the school:

  1. District approves the research
  2. Principal approves the classroom
  3. Teacher agrees to participate
  4. Parents receive notification/consent form through the school’s normal communication channel
  5. Students who have parental consent (and their own assent for ages 7+) participate

Student assent: For children aged 7 and older, obtain the child’s agreement to participate in addition to parental consent. Assent is simpler than consent: “We are trying to make [product] better. Can we watch you use it and ask you some questions? You can stop anytime. Is that okay?”

Student incentives: Do not pay individual students. Instead: provide classroom supplies, a class party, or a donation to the classroom fund. Individual cash or gift card incentives to minors create ethical complications and often violate school policies.

Recruiting administrators

  • Through district partnerships already established for teacher/student research
  • Education leadership networks: AASA (superintendents), CoSN (technology leaders), ASCD (curriculum leaders)
  • LinkedIn targeting: “Director of Curriculum” + school district, “Chief Technology Officer” + K-12
  • Incentive: $150-300 for a 30-min interview. Administrators’ time is extremely scarce

Scheduling around the school calendar

| Period | Availability | Research suitability |
| --- | --- | --- |
| September-October | Teachers settling in, new products being adopted | Good for onboarding and first-use research |
| November | Pre-holiday, testing season beginning | Limited. Avoid standardized testing periods |
| December | Holiday preparations, semester ending | Poor. End-of-semester grading consumes all available time |
| January-February | New semester, routines established | Best window. Teachers are in rhythm, products are in active use |
| March-April | State testing season | Very poor. All focus is on test preparation |
| May-June | End of year, grading, checkout | Poor. Teachers are wrapping up |
| July-August | Summer break | Teachers available for interviews/surveys, but no classroom observation possible. Good for formative and planning research |

The research sweet spot: January-February and September-October. Plan your research calendar around these windows.
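The calendar above can be encoded as a simple scheduling check so a research plan fails fast when a proposed session lands in a bad window. A sketch (month ratings are condensed from the table; adjust for your districts' actual calendars and testing dates):

```python
from datetime import date

# Condensed from the calendar above: month -> classroom-research suitability.
WINDOWS = {
    1: "best", 2: "best",                      # January-February
    9: "good", 10: "good",                     # September-October
    11: "limited",
    3: "poor", 4: "poor",                      # state testing season
    5: "poor", 6: "poor", 12: "poor",
    7: "no classroom access", 8: "no classroom access",  # summer break
}

def check_session(d: date) -> str:
    """Rate a proposed classroom research date by school-calendar window."""
    return WINDOWS[d.month]

print(check_session(date(2025, 1, 15)))  # "best"
print(check_session(date(2025, 4, 3)))   # "poor"
```

A real implementation would also exclude district-specific testing weeks and holidays, which vary by state and year.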

K-12 edtech research metrics

| Metric | What it measures | How to capture | Target |
| --- | --- | --- | --- |
| Teacher prep time | Additional time the product adds to lesson planning | Observation + interview | <10 minutes additional prep per lesson |
| Student onboarding time | How quickly students can start using the product independently | Classroom observation | <5 minutes for returning users, <10 minutes for first use |
| Student engagement duration | How long students remain actively engaged before going off-task | Classroom observation (time-sampled engagement coding) | >75% of the class period for the target age group |
| Teacher dashboard comprehension | Can the teacher understand student progress from the dashboard? | Usability test: “Which students need help right now?” | >80% correct identification |
| Help request frequency | How often students need teacher or peer help with the product (not the content) | Classroom observation | Decreasing over sessions |
| Assignment creation time | How long to create, customize, and assign an activity | Timed usability task | <5 minutes for a standard assignment |
| Device compatibility rate | Does the product work reliably across student devices? | Classroom observation across device types | >95% of devices work without issues |
| Teacher NPS | Would the teacher recommend this product to a colleague? | Post-study survey | +40 or higher |
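Teacher NPS is computed the standard way: percent promoters (scores of 9-10 on a 0-10 "would you recommend" scale) minus percent detractors (0-6). A sketch with invented survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

responses = [10, 9, 9, 8, 7, 7, 6, 10, 5, 9]  # hypothetical teacher ratings
score = nps(responses)
print(f"Teacher NPS: {score} ({'meets' if score >= 40 else 'below'} the +40 target)")
```

With small teacher samples (under ~30), report the raw promoter/detractor counts alongside the score, since a single response can move NPS by several points.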

Frequently asked questions

Does COPPA apply to classroom observation where you do not collect personal data?

If you observe students using a product but do not collect personally identifiable information (no names, no photos, no video of faces, no device identifiers), COPPA may not apply to the observation itself. However, if you record the session, photograph screens, or collect any data that could identify a child, COPPA applies for children under 13. When in doubt, obtain parental consent through the school. The compliance cost is minimal compared to the risk.

How do you research edtech products during summer when school is not in session?

Teacher interviews, surveys, and usability testing of planning/administrative features work well during summer. Teachers are often more available and willing to participate. Classroom observation and student research must wait for the school year. Use summer for formative research (testing prototypes with teachers) and the school year for validation research (observing real classroom use).

How do you handle the teacher who loves the product but whose students struggle with it?

This is the most common edtech research finding. Research both perspectives separately and present findings side by side: “Teachers report high satisfaction with assignment creation. Student observation reveals 40% of class time is spent on navigation confusion rather than learning tasks.” The juxtaposition makes the case for design changes without invalidating the teacher’s positive experience.

Should you test with students or teachers first?

Teachers first. Teacher adoption determines whether students ever see the product. Test teacher workflows (lesson planning, assignment creation, progress monitoring) before testing student interactions. If teachers cannot integrate the product into their workflow, student research is premature.

How do you recruit from under-resourced schools?

Under-resourced schools (Title I, rural, high-poverty) are underrepresented in edtech research because they have fewer devices, less technology infrastructure, and more competing demands on teacher time. Recruit by: offering tangible classroom benefits (devices, supplies, premium product access), reducing research burden (shorter sessions, more flexible scheduling), partnering with organizations that serve these communities (Digital Promise, ISTE equity initiatives), and providing stipends that reflect the school’s resource constraints. Your product must work for these schools. Research that excludes them produces products that do not.

What is the difference between edtech UX research and education research?

Education research (conducted by academic researchers) tests whether a product improves learning outcomes using controlled studies, randomized trials, and statistical analysis. UX research tests whether teachers and students can use the product effectively and whether it fits into classroom workflows. Education research answers “Does it work?” UX research answers “Can they use it?” Both are necessary. UX research without education research produces usable products that may not teach effectively. Education research without UX research produces evidence-based products that teachers abandon because they are unusable.