Legal tech user research guide: UX research for law practice technology
How legal tech companies do user research within attorney-client privilege and bar compliance constraints. Covers methods, recruiting lawyers, and ethical safeguards.
Legal technology serves one of the most risk-averse, regulation-heavy professions in existence. Lawyers do not adopt new tools because the interface looks clean. They adopt tools when they trust that the tool will not create malpractice liability, breach client confidentiality, or violate their state bar’s ethics rules.
That makes user research for legal tech fundamentally different from researching any other B2B product. Every research session carries the risk of exposing privileged information. Every prototype test must account for the fact that your participants are bound by ethics rules that govern how they talk about their work, their clients, and their decision-making. And the users themselves, attorneys and paralegals, are trained to be skeptical and precise in ways that change how you ask questions and interpret answers.
This guide covers how legal tech companies conduct effective user research while respecting attorney-client privilege, maintaining bar compliance, and earning the trust of a profession that treats new technology with caution by default.
Key takeaways
- Attorney-client privilege can be waived if sensitive client information is shared during research sessions. Use mock data, NDAs, and isolated testing environments to prevent accidental disclosure
- State bar rules vary significantly. AI disclosure requirements, unauthorized practice of law (UPL) restrictions, and competence standards differ by jurisdiction and directly affect how you design research
- Lawyers are hard to recruit and harder to keep engaged. Sessions must be short (30-45 minutes), highly relevant to their practice area, and scheduled around billable hour pressures
- Use one-on-one interviews over focus groups. Lawyers are reluctant to discuss workflows in front of peers from other firms due to competitive and confidentiality concerns
- Have legal counsel review your research protocol before recruiting. A privilege review ensures your screener, consent form, and task design do not create risks for participants
- Contextual inquiry in law firms reveals workflow realities that interviews miss, but requires firm approval and strict confidentiality agreements
How do legal tech companies do user research?
Legal tech teams use the same core research methods as any product team: interviews, usability testing, surveys, and analytics. The difference is in how those methods are adapted for a profession where confidentiality is not just preferred but legally mandated.
One-on-one interviews (the primary method)
User interviews are the most valuable method for legal tech research. They allow deep exploration of workflows, pain points, and decision-making processes in a private setting.
Why interviews work better than focus groups for lawyers:
- Lawyers will not discuss client matters, firm strategies, or workflow inefficiencies in front of attorneys from competing firms
- One-on-one settings make it safer to share frustrations with existing tools
- Confidentiality is easier to guarantee with one participant than with a group
Adapting interviews for legal professionals:
- Keep sessions to 30-45 minutes. Lawyers bill in 6-minute increments, so every minute spent in research is a minute not billed
- Ask about workflows and processes, never about specific clients or matters
- Frame questions around efficiency and risk reduction, not “user experience.” Lawyers respond to practical language, not UX jargon
Usability testing with prototypes
Usability testing is essential for legal tech because the cost of confusion is high. A lawyer who misinterprets a search result, files the wrong document, or misses a deadline because of a UX issue faces malpractice risk.
Safeguards for legal tech usability testing:
- Use mock data in all prototypes. Never use real client names, case numbers, or legal documents
- Create realistic but fictional scenarios (“You represent a small business in a contract dispute with a vendor in California”)
- Test in isolated environments that do not connect to real legal databases or case management systems
- Record sessions only with explicit consent and store recordings on encrypted, access-controlled systems
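The mock data requirement above is easiest to enforce if fictional scenarios are generated programmatically rather than improvised. A minimal sketch, assuming a simple matter record; all names, matter types, and the `TEST-` case-number prefix are invented for illustration, not drawn from any real system:

```python
import random
import string

# Fictional building blocks -- every value here is invented,
# never sourced from real client or case data
FIRST_NAMES = ["Avery", "Jordan", "Morgan", "Riley", "Casey"]
LAST_NAMES = ["Calloway", "Brentford", "Okafor", "Vantrell", "Marquez"]
MATTER_TYPES = ["contract dispute", "lease negotiation", "trademark filing"]

def mock_matter(seed=None):
    """Generate a fictional client matter for a prototype or test environment."""
    rng = random.Random(seed)
    client = f"{rng.choice(FIRST_NAMES)} {rng.choice(LAST_NAMES)}"
    # Case numbers carry an obviously fake prefix so they cannot be
    # mistaken for (or collide with) real docket numbers
    case_no = "TEST-" + "".join(rng.choices(string.digits, k=6))
    return {
        "client": client,
        "case_number": case_no,
        "matter_type": rng.choice(MATTER_TYPES),
    }

print(mock_matter(seed=42))
```

Seeding the generator makes scenarios reproducible across sessions, so every participant sees the same fictional matter and findings stay comparable.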
Contextual inquiry (observing real workflows)
Watching lawyers work in their actual environment reveals things interviews miss. How they switch between 5 open applications. How they print documents to review them because the screen layout is not scannable. How a paralegal manually copies data between systems because the integration does not exist.
Getting access for contextual inquiry:
- Requires firm-level approval, not just individual lawyer consent
- NDA between your company and the firm is mandatory
- Observers must agree not to look at client-identifying information on screens
- Schedule during less sensitive work (administrative tasks, research) rather than during client meetings or depositions
Surveys (limited but useful)
Survey research works for measuring satisfaction, benchmarking feature priorities, and reaching lawyers at scale. Keep surveys short (under 5 minutes), focused on workflows and tools (never client matters), and distributed through trusted channels (bar associations, legal tech communities, not cold email).
Analytics and behavioral data
Product analytics reveal what lawyers actually do versus what they say they do. Track:
- Feature adoption rates by practice area and firm size
- Task completion rates and error rates for critical workflows
- Time-to-completion for common tasks (document review, search, filing)
- Drop-off points in multi-step processes
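The drop-off metric above can be computed from raw product events. A minimal sketch assuming a hypothetical event log of `(user_id, step)` pairs for a three-step filing workflow; the step names are placeholders for your product's own analytics events:

```python
from collections import defaultdict

# Hypothetical event log for a 3-step filing workflow.
# Step names are assumptions -- substitute your product's real events.
events = [
    ("u1", "upload"), ("u1", "review"), ("u1", "file"),
    ("u2", "upload"), ("u2", "review"),
    ("u3", "upload"),
]

FUNNEL = ["upload", "review", "file"]

def funnel_counts(events, funnel):
    """Count how many distinct users reached each step, in funnel order."""
    reached = defaultdict(set)
    for user, step in events:
        reached[step].add(user)
    return {step: len(reached[step]) for step in funnel}

counts = funnel_counts(events, FUNNEL)
print(counts)  # {'upload': 3, 'review': 2, 'file': 1}

# Drop-off rate between consecutive steps
for a, b in zip(FUNNEL, FUNNEL[1:]):
    rate = 1 - counts[b] / counts[a]
    print(f"{a} -> {b}: {rate:.0%} drop-off")
```

Segmenting the same computation by practice area or firm size (the first bullet above) is a matter of grouping the event log before counting.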
How does attorney-client privilege affect user research?
Attorney-client privilege protects confidential communications between a lawyer and their client. This privilege can be waived if the information is disclosed to a third party, which includes a user researcher.
What can go wrong
During a research session, a lawyer might:
- Mention a client’s name while describing a workflow
- Show a real case file on their screen during a screen share
- Describe a legal strategy that reveals privileged information
- Input real client data into a prototype or test environment
Any of these could constitute a privilege waiver, creating liability for both the lawyer and your company.
How to prevent privilege issues
| Safeguard | What it does | When to implement |
|---|---|---|
| NDA with privilege acknowledgment | Legally binds researcher to confidentiality and acknowledges privilege risks | Before recruitment begins |
| Mock data requirement | All tasks use fictional clients, cases, and documents | During prototype and task design |
| Pre-session briefing | Remind participants to avoid sharing real client information | Start of every session |
| Privilege review by counsel | Your legal team reviews research protocol for privilege risks | Before research plan is finalized |
| Isolated test environment | Prototypes do not connect to real legal databases | During usability testing |
| Recording consent with scope limits | Consent form specifies what will be recorded and who will access it | Before session recording begins |
| Data purging protocol | Delete any accidentally captured privileged information immediately | During and after sessions |
What to include in your consent form
Standard research consent forms are not sufficient for legal tech. Add these clauses:
- Participant should avoid sharing client-identifying information
- If privileged information is accidentally disclosed, it will be immediately deleted and not included in findings
- Recordings are stored on encrypted systems with restricted access
- The researcher acknowledges that information shared may be subject to attorney-client privilege
Have your legal counsel draft or review the consent form. Do not use a generic template.
How to navigate bar compliance in legal tech research
Each state bar has its own rules governing attorney conduct, technology use, and confidentiality. These rules directly affect how you design research with legal professionals.
Key bar compliance areas that affect research
Competence (ABA Model Rule 1.1). Lawyers must be competent in the technology they use. If your research involves testing a new tool, participants may be concerned about whether using it (even in a test) creates a competence obligation they are not ready for.
Confidentiality (ABA Model Rule 1.6). Lawyers must protect client information. Your research protocol must make it impossible for participants to accidentally breach this duty.
Unauthorized practice of law (UPL). If your product provides legal guidance, templates, or recommendations, your research tasks must not put participants in the position of giving legal advice through your tool, even to simulated clients, without appropriate safeguards.
AI disclosure requirements. A growing number of state bars require lawyers to disclose when AI is used in legal work. If your product uses AI, your research should explore how participants understand and comply with these disclosure rules.
Jurisdiction-specific considerations
| Jurisdiction | Notable rule | Research implication |
|---|---|---|
| California | AI transparency requirements for legal work | Test how lawyers understand AI disclosure obligations in your product |
| New York | Strict UPL enforcement | Ensure test scenarios do not create unintended attorney-client relationships |
| Texas | Technology competence now explicit in bar rules | Research how comfort with technology varies across experience levels |
| Florida | Advisory opinions on AI in legal practice | Include questions about regulatory awareness in interviews |
| ABA Model Rules | Duty of technology competence (Comment 8 to Rule 1.1) | Frame research around how the tool supports compliance, not just efficiency |
Best practice: Before recruiting in any jurisdiction, have legal counsel review whether your research protocol complies with that state’s bar rules. What is acceptable in California may create issues in Texas.
How to recruit lawyers for user research
Lawyers are a hard-to-reach population for user research. They are time-constrained, skeptical of unfamiliar processes, and bound by professional rules that limit what they can share. For the full recruitment playbook, including screener questions, incentive benchmarks by firm size, outreach templates, and practice area segmentation, see our guide on how to recruit lawyers for user research.
Where to find lawyer participants
- Bar association events and mailing lists. State and local bar associations have practice-specific sections (litigation, corporate, IP) with engaged members
- Legal tech conferences. ABA TECHSHOW, Legaltech, ILTACON attendees are already interested in technology
- CLE (Continuing Legal Education) programs. Offer a research participation credit or tie recruitment to CLE events
- Law firm innovation teams. Large firms have innovation or legal ops teams that facilitate research participation
- CleverX verified B2B panels. Pre-screened legal professionals with role verification. Faster than sourcing through bar associations
- Legal communities on LinkedIn and Slack. Groups like Legal Hackers, Legal Ops, and practice-specific communities
For general B2B recruitment strategies, see our guide on recruiting participants for product research.
Recruitment tips specific to lawyers
Lead with relevance, not incentives. Lawyers earning $300-800/hour are not motivated by a $50 gift card. They are motivated by influencing a tool they will use, seeing new technology before competitors, or contributing to practice improvement. Frame the opportunity as “shape the future of [practice area] technology” not “earn a gift card for your time.”
Segment by practice area and firm size. A solo practitioner and a BigLaw associate have entirely different workflows, budgets, and technology needs. Research that mixes them produces insights that apply to neither. Define which segment you are targeting before recruiting.
Respect billable hour pressure. Schedule sessions before 9am, during lunch, or after 5pm. Send calendar invites 2-3 weeks in advance. Keep sessions to 30-45 minutes. Never run over time.
Use professional language in recruitment materials. Avoid UX jargon. “Usability study” means nothing to most lawyers. “30-minute conversation about how [tool type] fits into your workflow” is clear and relevant.
What legal tech research reveals that other B2B research misses
Legal tech research surfaces insights that are unique to the legal profession.
Trust is the primary adoption driver. Lawyers do not adopt tools because they are faster. They adopt tools they trust not to create liability. Research must explore trust formation, not just task efficiency.
Workflow integration matters more than features. A document automation tool that does not integrate with the firm’s document management system (iManage, NetDocuments) will not be used regardless of how good it is. Research the ecosystem, not just the product.
Risk perception varies by practice area. Litigation attorneys are more risk-averse about technology than transactional attorneys. Family law practitioners have different confidentiality concerns than corporate M&A lawyers. Segment your research by practice area.
The buyer is rarely the user. In law firms, technology purchasing decisions are made by managing partners, CIOs, or innovation committees. The daily users are associates and paralegals. Research both audiences, but do not conflate their needs.
Conservative does not mean resistant. Lawyers adopt technology when they understand the risk profile. What looks like resistance is often a rational assessment of liability that your product messaging has not addressed. Research helps you find the right language and framing to address these concerns.
Frequently asked questions
Can you use AI-generated synthetic respondents instead of real lawyers for legal tech research?
No. Legal workflows are too specialized, jurisdiction-specific, and risk-sensitive for synthetic respondents to replicate accurately. AI-generated participants cannot simulate the mental model of a lawyer evaluating whether a tool creates malpractice exposure. Use real lawyers for all substantive research. Synthetic data may work for early-stage UI layout testing on non-legal tasks.
How much should you pay lawyers for participating in user research?
$200-500 per hour for experienced attorneys, depending on their market rate and seniority. Solo practitioners and junior associates may accept $150-200. BigLaw partners may need $500+ or a non-monetary incentive (early access, advisory board membership). Standard $50-75 incentives that work for general B2B research will not attract quality legal participants.
Do you need IRB approval for legal tech user research?
Not typically. IRB approval applies to academic and clinical research, not commercial product research. However, if you are conducting research with vulnerable populations (incarcerated individuals accessing legal services, domestic violence survivors using legal tech) or publishing findings in academic journals, consult with an ethics review board.
How do you handle conflicting bar rules across different states?
Design your research protocol to meet the strictest jurisdiction’s requirements. If California requires AI disclosure and Texas does not, include AI disclosure in all sessions rather than creating jurisdiction-specific protocols. This is simpler to manage and eliminates the risk of accidentally applying the wrong protocol to the wrong participant.