Smart home user research guide: diary studies and household dynamics
How to conduct user research for smart home products using diary studies and household dynamics research. Covers diary study design for multi-device ecosystems, household role mapping, control conflict research, and longitudinal engagement tracking.
Smart home research that talks to one person in the household misses the point. A smart thermostat serves the person who installed it, the partner who cannot figure out why the house is cold, the teenager who overrides it from their phone, and the grandparent visiting who does not understand why the lights turn on by themselves. The product is the same. The experience is completely different for each person.
Most smart home research recruits the “tech person,” the household member who bought the devices, set them up, and manages the ecosystem. This person reports that the smart home works great. Meanwhile, everyone else in the household has a fundamentally different experience: less control, less understanding, and often less comfort with the technology that has been imposed on their living space.
This guide covers two research approaches that reveal the full smart home experience: diary studies that capture how smart home interactions unfold over days and weeks across all household members, and household dynamics research that maps the relationships between people and their shared technology.
For broader IoT research methods (setup testing, voice interfaces, privacy research, methodology comparison table), see our IoT UX research methods guide. For wearable-specific research, see our wearable device research guide.
Key takeaways
- Smart home diary studies must include all household members, not just the person who set up the devices. The “tech person” has a different product than everyone else in the house
- Two weeks is the minimum diary study duration for smart homes because it takes 5-7 days for novelty to fade and real usage patterns to emerge
- Household dynamics research reveals control conflicts, permission hierarchies, and “tech support” burdens that single-user research never surfaces
- The most common smart home research finding: the person who set up the ecosystem loves it, and everyone else tolerates it. That tolerance gap is where your product opportunity lives
- Diary study prompts must be lightweight (under 2 minutes per entry) or participation collapses within the first week. Design prompts for the least tech-savvy household member, not the most
How to design smart home diary studies
Why diary studies work for smart homes
Single-session research (lab testing, interviews) captures what users can do with smart home products. Diary studies capture what users actually do, over days and weeks, in their real homes, with their real families, on their real WiFi networks. The difference is significant:
| What diary studies reveal | Why lab/interview misses it |
|---|---|
| Automation adoption trajectory (which automations stick, which get disabled after day 3) | Lab cannot test multi-day adoption patterns |
| Seasonal and routine-driven usage (weekday vs weekend, morning vs night) | A 45-minute session captures one time of day |
| Household member interactions (who controls what, when conflicts occur) | Lab typically includes only one household member |
| Abandonment triggers (the specific moment a user stops using a feature or device) | Interviews capture retrospective recall, not real-time abandonment |
| Feature discovery over time (when users find features they did not know existed) | Lab tests features you show them, not features they find organically |
| Failure accumulation (small frustrations that compound over days into abandonment) | A single failure in a lab session is recoverable. Ten failures over a week are not |
Study design
Duration: 14 days minimum. Days 1-3 capture novelty behavior (everything is exciting). Days 4-7 capture the transition to habit (usage normalizes). Days 8-14 capture real patterns (what sticks, what gets abandoned, what causes friction). For products with weekly patterns (routines that differ on weekdays vs. weekends), 14 days ensures you capture at least two full cycles.
Participants: 8-12 households, not individuals. Each household should have at least 2 members participating (the “tech person” and at least one other member). Recruit across household types: couples, families with children, roommates, multigenerational homes, and single-person households (as a control).
Entry frequency: Once daily for the primary user. 2-3 times per week for secondary household members. More frequent prompts produce higher-resolution data but increase dropout risk. Daily entries should take under 2 minutes.
Tools: dscout (best for photo/video diary entries), Indeemo, Google Forms (simple but works), or dedicated diary study apps. For households with non-tech-savvy members, paper journals with simple prompts are more accessible than apps.
Diary prompt design
The most important design decision: prompts must be simple enough for the least tech-savvy household member. If a prompt confuses the person who struggles with the smart home, you have lost your most valuable participant.
Daily prompts for primary user (the “tech person”):
| Day of study | Prompt | What it captures |
|---|---|---|
| Every day | “Which smart devices did you use today? List them briefly.” | Daily usage patterns, device frequency |
| Every day | “Did anything not work as expected? What happened?” | Failure frequency, failure types, recovery behavior |
| Days 1, 7, 14 | “Rate your overall satisfaction with your smart home today (1-5). Why?” | Satisfaction trajectory over time |
| Days 3, 10 | “Did you create, modify, or delete any automation today? What and why?” | Automation adoption and abandonment |
| Days 5, 12 | “Did you help another household member with a smart device today? What did they need?” | Tech support burden |
Weekly prompts for all household members:
| Prompt | What it captures |
|---|---|
| “Which smart devices did you use this week, if any?” | Secondary user engagement |
| “Was there a moment this week when a smart device frustrated you? What happened?” | Friction from non-primary users’ perspective |
| “Did anyone else in the household control a device in a way that affected you? How?” | Control conflicts and shared experience |
| “Is there anything the smart home does that you wish it didn’t?” | Imposed technology resentment |
| “Is there anything you wish the smart home could do that it doesn’t?” | Unmet needs from non-primary users |
Photo/video prompts (optional, high-value):
- “Take a photo of how your smart devices are set up in your most-used room” (reveals placement choices)
- “Record a 30-second video of your morning routine with smart devices” (reveals habitual interactions)
- “Screenshot an automation you set up and explain what it does” (reveals automation comprehension)
Analyzing diary study data
Day-by-day engagement tracking. Plot daily entries per household member over the 14 days. Look for:
- The novelty cliff: when daily engagement drops sharply (typically days 3-5)
- The habit plateau: when engagement stabilizes at a lower but consistent level (typically days 7-10)
- The abandonment signal: when entries become minimal (“nothing today”) or stop entirely
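The three signals above can be flagged programmatically once diary entries are tallied per day. The sketch below is a hypothetical first pass, assuming you have a list of daily entry counts per household member; the thresholds (a 40% day-over-day drop for the cliff, three identical days for the plateau) are illustrative, not standard values.

```python
# Hypothetical sketch: flagging the novelty cliff and habit plateau from a
# household member's daily diary-entry counts. Thresholds are illustrative.

def engagement_phases(daily_counts, cliff_drop=0.4, plateau_window=3):
    """Return (cliff_day, plateau_day) from entry counts (index 0 = day 1)."""
    cliff_day = plateau_day = None
    for i in range(1, len(daily_counts)):
        prev, cur = daily_counts[i - 1], daily_counts[i]
        # Novelty cliff: first day with a sharp relative drop in entries
        if cliff_day is None and prev > 0 and (prev - cur) / prev >= cliff_drop:
            cliff_day = i + 1
        # Habit plateau: first day opening a stretch of identical counts
        if cliff_day and plateau_day is None and i + plateau_window <= len(daily_counts):
            window = daily_counts[i:i + plateau_window]
            if max(window) == min(window):
                plateau_day = i + 1
    return cliff_day, plateau_day

# Exploration on days 1-3, sharp drop on day 4, one entry a day from day 6
counts = [5, 5, 4, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1]
print(engagement_phases(counts))  # (4, 6)
```

A run of zero-entry days detected the same way would mark the abandonment signal.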
Cross-member comparison. For each household, compare the primary user’s experience to secondary members’. The gap between them is the “household experience gap,” and it is your most actionable research finding.
Thematic coding. Code diary entries across households for recurring themes:
- Control and ownership (“I set it up, so I manage it”)
- Confusion and learned helplessness (“I don’t know how to change the lights, I just ask [tech person]”)
- Privacy and surveillance (“I feel like the house is watching me”)
- Delight and convenience (“I love that the coffee starts when my alarm goes off”)
- Failure and frustration (“The lights didn’t respond to my voice again”)
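For the coding pass, a keyword pre-sort can triage entries before a researcher reads them. This is a hypothetical sketch, not a substitute for human coding: the `THEME_KEYWORDS` lists are invented examples keyed to the five themes above, and any real study would build them from its own data.

```python
# Hypothetical first-pass thematic tagger. Keyword lists are illustrative;
# human researchers still do the actual coding.
THEME_KEYWORDS = {
    "control_ownership": ["set it up", "i manage", "my system"],
    "learned_helplessness": ["don't know how", "just ask", "no idea"],
    "privacy_surveillance": ["watching", "listening", "camera"],
    "delight_convenience": ["love", "convenient", "automatically"],
    "failure_frustration": ["didn't respond", "didn't work", "broken"],
}

def tag_entry(text):
    """Return the set of theme codes whose keywords appear in an entry."""
    lowered = text.lower()
    return {theme for theme, kws in THEME_KEYWORDS.items()
            if any(kw in lowered for kw in kws)}

print(tag_entry("The lights didn't respond to my voice again"))
```

Entries that match no theme, or several, are exactly the ones worth reading first.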
How to research smart home household dynamics
The three household roles
Research across smart home households consistently reveals three roles, first identified by the Nielsen Norman Group (NN/g) and confirmed in subsequent studies:
The Enthusiast (setup person). Bought the devices, configured the ecosystem, manages automations, troubleshoots problems. They experience the smart home as a project they built and maintain. Their satisfaction is high because they understand how everything works and have customized it to their preferences.
The Casual User. Uses the smart home through the interfaces the Enthusiast set up (voice commands, app controls, automations) but does not configure or troubleshoot. They experience the smart home as a set of features that mostly work, sometimes confuse, and occasionally frustrate. Their satisfaction depends entirely on how well the Enthusiast set things up and how intuitive the product is without configuration knowledge.
The Resistor. Uncomfortable with smart home technology due to privacy concerns, complexity aversion, or preference for manual control. They may actively avoid smart devices (manually flipping light switches instead of using voice commands) or feel surveilled (uncomfortable with cameras and microphones). Their experience is often invisible in research because they do not volunteer for smart home studies.
Household dynamics research protocol
Step 1: Household screening. During recruitment, identify all household members and their approximate roles:
- “Who purchased and set up the smart home devices?”
- “Who else in the household uses them?”
- “Is anyone in the household uncomfortable with the smart home technology?”
Step 2: Individual interviews (30 minutes each, separate sessions).
Interview each household member alone. Combined interviews produce social desirability bias: the Casual User will not say “I hate the smart home” with the Enthusiast sitting next to them.
Enthusiast interview questions:
- “Walk me through how you set up your smart home. What was your vision?”
- “How often do other household members ask you for help with the smart home?”
- “Have you ever changed a setting or automation because someone else complained?”
- “What would happen to the smart home if you were away for a month?”
Casual User interview questions:
- “How do you interact with the smart home on a typical day?”
- “Is there anything the smart home does that you do not understand how to change?”
- “When something goes wrong with a device, what do you do? Do you fix it yourself or ask [Enthusiast]?”
- “If you could change one thing about how the smart home works, what would it be?”
Resistor interview questions:
- “How do you feel about the smart devices in your home?”
- “Are there any devices you avoid using? Why?”
- “Have you ever turned off, unplugged, or covered a smart device? What prompted that?”
- “What would make you more comfortable with the smart home?”
Step 3: Shared observation (1-2 hours, all members present).
After individual interviews, observe the household together during a routine activity (dinner, evening, morning). Watch for:
- Who controls shared devices (TV, speakers, lights)?
- Do members negotiate or just override each other?
- Does the Casual User defer to the Enthusiast for any device interaction?
- Does the Resistor participate or withdraw?
Common household dynamics findings
The “tech support” burden. Enthusiasts spend 2-5 hours per month maintaining, troubleshooting, and reconfiguring the smart home for other household members. This burden is invisible to the product team because it does not generate support tickets. It happens within the household.
The control asymmetry. The Enthusiast has full control over the ecosystem (they know the app, the automations, the settings). Everyone else has partial control (they can use voice commands and basic app features). This asymmetry creates dependency: if the Enthusiast is unavailable, the household’s smart home capability degrades.
The override cycle. Casual Users learn to override automations they do not understand (manually turning on lights that were supposed to stay off). The Enthusiast notices the override and reconfigures. The Casual User overrides again. This cycle is a UX failure that diary studies capture but interviews miss because participants do not remember individual overrides.
The privacy negotiation. Households negotiate privacy boundaries around smart devices: cameras in common areas but not bedrooms, smart speakers muted during sensitive conversations, location tracking accepted for safety but resented for surveillance. These negotiations happen once and are rarely revisited, even as new devices are added.
The children factor. Children interact with smart home devices in ways adults do not anticipate: accidentally triggering routines, asking smart speakers inappropriate questions, discovering features through experimentation. Research with families reveals both delight (children learning through technology) and concern (children accessing content or controls they should not).
How to measure smart home engagement over time
The engagement trajectory
Smart home engagement follows a predictable pattern that diary studies reveal:
| Phase | Timeline | User behavior | Research focus |
|---|---|---|---|
| Novelty | Days 1-3 | High exploration, trying every feature, setting up automations | What features do users discover and try first? |
| Configuration | Days 4-7 | Adjusting settings, refining automations, solving initial problems | Where do users get stuck? What requires troubleshooting? |
| Habituation | Days 8-14 | Usage stabilizes, some features become routine, others are abandoned | What sticks? What gets abandoned? Why? |
| Steady state | Week 3+ | Consistent daily patterns with occasional exploration | Is the user satisfied? Are there unmet needs they have accepted? |
| Drift | Month 2+ | Gradual decline in feature usage, automations stale, devices forgotten | What causes long-term disengagement? What re-engages users? |
Engagement metrics from diary data
| Metric | How to calculate from diary entries | What it reveals |
|---|---|---|
| Daily device interaction count | Count devices mentioned per daily entry | Which devices are “daily drivers” vs. forgotten |
| Feature breadth | Unique features mentioned across all entries / total available features | How much of the product is actually used |
| Automation lifecycle | Days from automation creation to modification to deletion | Whether automations are stable or constantly tweaked |
| Help-seeking frequency | Count entries mentioning asking for help or consulting docs | When the product exceeds the user’s ability |
| Frustration density | Negative sentiment entries / total entries per week | Whether frustration increases or decreases over time |
| Household engagement ratio | Active household members / total household members | Whether the smart home serves one person or the household |
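Two of these metrics can be computed directly from exported diary data. The sketch below assumes a hypothetical entry structure (member, devices mentioned, coded sentiment); adapt the field names to whatever your diary tool exports.

```python
# Hypothetical sketch computing frustration density and the household
# engagement ratio from diary entries. The record format is an assumption.
entries = [
    {"member": "alex",  "devices": ["thermostat", "lights"], "sentiment": "neg"},
    {"member": "alex",  "devices": ["lights"],               "sentiment": "pos"},
    {"member": "jamie", "devices": [],                       "sentiment": "pos"},
]
household_size = 3  # alex, jamie, and a non-participating third member

# Frustration density: negative-sentiment entries / total entries
frustration_density = sum(e["sentiment"] == "neg" for e in entries) / len(entries)

# Household engagement ratio: members who logged device use / household size
active = {e["member"] for e in entries if e["devices"]}
engagement_ratio = len(active) / household_size

print(round(frustration_density, 2), round(engagement_ratio, 2))  # 0.33 0.33
```

Tracked per week, frustration density shows whether friction compounds or fades; a low engagement ratio is the "serves one person" signal in numeric form.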
How to research smart home automation adoption
Automations (routines, scenes, schedules) are the feature that separates basic smart home usage from integrated smart home living. Research shows that fewer than 30% of smart home users create custom automations beyond defaults.
Automation research protocol
In diary studies, track:
- When was the first custom automation created? (The later it happens, the higher the barrier)
- How many automations are active at day 14? How many were created and deleted?
- Which automations survive the full study period? What do they have in common?
- Who creates automations? (Almost always the Enthusiast. If Casual Users create them, the UI is working)
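The "how many automations are active at day 14" question reduces to replaying a small event log. A minimal sketch, assuming diary entries are coded into created/modified/deleted events (the log format below is invented for illustration):

```python
# Hypothetical sketch: replay coded automation events to find which
# automations are still active on a given study day.
events = [
    {"day": 2, "automation": "morning lights",   "action": "created"},
    {"day": 3, "automation": "night thermostat", "action": "created"},
    {"day": 5, "automation": "night thermostat", "action": "deleted"},
    {"day": 9, "automation": "morning lights",   "action": "modified"},
]

def active_at(day, events):
    """Automations created on or before `day` and not yet deleted."""
    state = set()
    for e in sorted(events, key=lambda e: e["day"]):
        if e["day"] > day:
            break
        if e["action"] == "created":
            state.add(e["automation"])
        elif e["action"] == "deleted":
            state.discard(e["automation"])
    return state

print(active_at(14, events))  # {'morning lights'}
```

Comparing `active_at(14, ...)` with everything ever created gives the survival rate; short create-to-delete gaps are the "if it breaks, I delete it" pattern in the data.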
In interviews, ask:
- “Walk me through an automation you set up. Why that one?”
- “Have you ever set up an automation that did not work as expected? What happened?”
- “Is there an automation you want to create but have not figured out how?”
- “Has another household member ever been confused by an automation you set up?”
Common automation findings
The “if it breaks, I delete it” pattern. Users who create an automation that misfires once often delete it rather than debug it. The debugging UX for automations is almost universally poor across smart home platforms.
The “I didn’t know I could” gap. Many users do not know what automations are possible. They use voice commands and manual controls because they have never seen what automations can do. Discoverability, not capability, is the automation adoption barrier.
The “too many steps” wall. Automation creation that requires more than 4-5 steps loses most users. Users who successfully create automations often describe them as “surprisingly easy,” which means they expected them to be hard. The perception barrier is as real as the actual barrier.
How to recruit for smart home diary studies
Screening criteria
- How many smart home devices do you currently have in your home? (Minimum 3 for ecosystem research, minimum 1 for single-device studies)
- Which smart home ecosystem do you use? (Alexa, Google Home, Apple HomeKit, SmartThings, other)
- How many people live in your household? (Minimum 2 for dynamics research)
- Who set up the smart home devices? (Identifies the Enthusiast)
- Are all household members willing to participate? (Mandatory for dynamics research)
- How long have you had smart home devices? (Under 3 months = still in novelty. Over 1 year = established patterns. Both are valuable but for different questions)
Incentive structure for household diary studies
| Study component | Incentive | Notes |
|---|---|---|
| 14-day diary (primary user) | $200-300 | Daily entries, 2 min each |
| 14-day diary (secondary member) | $100-150 per member | 2-3 entries per week |
| Follow-up interview (per member) | $50-75 | 30-45 minutes, after diary period |
| Total per 2-person household | $400-600 | Both diaries plus a follow-up interview per member |
| Total per 4-person household | $700-1,050 | Each participating member incentivized for diary and interview |
Household incentive tip: Offer a shared household bonus (“$50 extra if all members complete the full 14 days”) to encourage the primary user to keep secondary members engaged. The Enthusiast often acts as the study coordinator within their household.
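The per-household totals follow directly from the component ranges: one primary diary, a secondary diary for each remaining member, and a follow-up interview per member. A small budgeting helper (ranges taken from the table; the function name is ours):

```python
# Budget helper summing the per-component incentive ranges from the table.
# Ranges are (low, high) in dollars.
PRIMARY_DIARY = (200, 300)
SECONDARY_DIARY = (100, 150)
INTERVIEW = (50, 75)

def household_incentive(members, bonus=0):
    """Total (low, high) incentive for one household: one primary diary,
    secondary diaries for the rest, one follow-up interview per member."""
    low = PRIMARY_DIARY[0] + (members - 1) * SECONDARY_DIARY[0] \
        + members * INTERVIEW[0] + bonus
    high = PRIMARY_DIARY[1] + (members - 1) * SECONDARY_DIARY[1] \
        + members * INTERVIEW[1] + bonus
    return low, high

print(household_incentive(2))            # (400, 600)
print(household_incentive(4, bonus=50))  # (750, 1100) with completion bonus
```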
Where to find participants
- Smart home communities. Reddit r/smarthome, r/homeautomation, r/googlehome, r/amazonecho
- Home improvement communities. Broader reach, catches casual adopters
- CleverX verified panels. Pre-screened households filtered by device count, ecosystem, and household size
- Neighborhood platforms. Nextdoor, local Facebook groups for geographic diversity
- Your own user base. Companion app recruitment for existing customers
For general participant recruitment strategies, see our recruitment guide.
Frequently asked questions
How is this different from the IoT research methods guide?
The IoT UX research guide covers the full methodology toolkit for all IoT products (smart home, industrial IoT, connected vehicles, etc.) with a comparison table of 9 methods. This guide goes deep on two specific approaches, diary studies and household dynamics research, that are uniquely valuable for smart home products because smart homes are shared, longitudinal experiences that single-session research cannot capture.
Can you do smart home diary studies remotely?
Yes, and for most studies you should. Remote diary studies (participants log entries via app or form from home) are more natural than in-person observation because the researcher’s presence changes household behavior. Supplement with one in-home visit at the beginning (ecosystem mapping) or end (contextual follow-up interview) of the study for richer context.
How do you handle diary study dropout?
Expect 15-20% dropout over a 14-day study. Mitigate by: over-recruiting by 20%, sending daily reminders (automated, not manual), keeping prompts under 2 minutes, paying partial incentive at day 7 (motivates completion of the second week), and sending a “we miss your entries!” message after 2 consecutive missed days. Do not guilt participants. A gentle “no pressure, but your perspective is valuable” works better than “you committed to daily entries.”
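A slightly more careful version of the over-recruiting rule divides the target by the expected retention rate rather than adding a flat 20%, since dropout applies to the larger recruited pool. A one-function sketch:

```python
import math

# Households to enroll so that the target count survives expected dropout.
def enroll_count(target_households, dropout_rate=0.20):
    """Recruit target / (1 - dropout_rate), rounded up."""
    return math.ceil(target_households / (1 - dropout_rate))

print(enroll_count(10))  # 13 households recruited to finish with 10
```

At 20% dropout, finishing with 10 households means recruiting 13, not 12; the difference matters most for small household samples.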
How many household members should participate?
At minimum, the setup person and one other member. Ideally, all members who interact with smart devices regularly (even if “regularly” means accidentally triggering an automation). For families with children under 12, interview the parents about children’s interactions rather than enrolling children directly (simpler consent, more practical).
What is the most surprising finding from smart home household dynamics research?
The “tech support” burden. Enthusiasts consistently underestimate how much time they spend maintaining the smart home for others. When asked in interviews, they say “a few minutes a week.” Diary studies reveal 2-5 hours per month of troubleshooting, reconfiguring, and explaining. This invisible labor affects their satisfaction and represents a product opportunity: smart homes that require less human maintenance.
Should you study households with one ecosystem or multiple?
Both, but separately. Single-ecosystem households (all Alexa, all Google) reveal within-ecosystem friction. Multi-ecosystem households (Alexa + HomeKit + SmartThings) reveal integration friction, which is often worse. If your product operates within a specific ecosystem, prioritize that ecosystem. If your product bridges ecosystems, multi-ecosystem households are essential.