How to research marketing software users: a guide for product marketing teams
How to research marketing software users for positioning, messaging, and go-to-market strategy. Covers buyer persona research, switching trigger analysis, competitive win/loss research, and methods that inform product marketing decisions.
Marketing software users evaluate, adopt, and abandon tools faster than almost any other B2B segment. A typical marketing team evaluates 3-5 alternatives before purchasing, decides within 2-6 weeks, and switches tools within 12-18 months if the product does not deliver measurable ROI. Understanding what drives these users' evaluations, what triggers switching, and what keeps them loyal is the foundation of effective product marketing for any martech or adtech product.
This guide is for product marketing managers, not UX researchers. The research methods here are designed to inform positioning, messaging, competitive strategy, and go-to-market decisions rather than interface design or usability improvements. The output is buyer personas that drive campaigns, competitive narratives that win deals, and switching triggers that inform retention strategy.
For UX-focused research on marketing technology products (usability testing, campaign builder testing, dashboard evaluation), see our martech research guide. For adtech-specific research, see our adtech research guide.
Key takeaways
- Research marketing software users by studying their buying journey, not just their product usage. The PMM research question is “why did they choose us (or not)?” not “can they use the product?”
- Switching trigger research is the highest-ROI PMM research activity. Understanding exactly why users leave competitors (and why they leave you) directly informs positioning and retention messaging
- Win/loss analysis with recently decided buyers produces more actionable data than satisfaction surveys with current users because the buying decision is fresh and the comparison set is clear
- Segment users by buying maturity (first-time buyer, tool switcher, stack consolidator) because each segment evaluates differently, responds to different messaging, and has different competitive context
- Product marketing research must include sales input. The sales team hears objections, competitive comparisons, and buying criteria daily. Systematic collection of this data is research
How to research marketing software buying decisions
Win/loss analysis
Win/loss analysis (interviewing buyers who recently chose your product or a competitor) is the single most valuable PMM research method for marketing software.
Who to interview:
- Won deals: Buyers who evaluated alternatives and chose your product (interview within 2-4 weeks of purchase)
- Lost deals: Buyers who evaluated your product and chose a competitor (interview within 2-4 weeks of decision)
- Churned customers: Users who left your product for a competitor (interview within 4 weeks of cancellation)
Interview structure (30 minutes):
| Phase | Questions | What it reveals |
|---|---|---|
| Context (5 min) | “What triggered the search for a new tool? What was the problem you were trying to solve?” | Switching triggers, pain points that drive evaluation |
| Evaluation (10 min) | “Which tools did you evaluate? What criteria mattered most? How did you narrow the list?” | Competitive set, evaluation criteria, decision framework |
| Decision (10 min) | “What ultimately made you choose [product]? Was there a specific moment or factor that tipped the decision?” | Winning differentiators, deal-closing moments |
| Reflection (5 min) | “Knowing what you know now, would you make the same choice? What would have changed your mind?” | Post-purchase validation, messaging opportunities |
For lost deals, adjust:
- “What ultimately made you choose [competitor] over us?”
- “Was there something we could have done differently during the evaluation?”
- “What was the single biggest factor against us?”
Target: 5-8 won, 5-8 lost, 3-5 churned per quarter. This produces enough pattern data to identify systemic themes without requiring massive sample sizes.
Switching trigger research
Switching triggers (the specific events or frustrations that cause a marketing team to start evaluating alternatives) are the most actionable PMM research output because they map directly to messaging and competitive positioning.
Common martech switching triggers (validate for your product):
| Trigger category | Examples | Messaging implication |
|---|---|---|
| Price shock | Contract renewal with 30-50% increase, hidden fees for features previously included | “Transparent pricing, no surprises at renewal” |
| Integration failure | Key integration breaks after an update, new tool in the stack is not supported | “Integrates with your entire stack, not just part of it” |
| Scale ceiling | Tool works for 10K contacts but breaks at 100K, reporting slows as data grows | “Built for growth, performs at any scale” |
| Feature stagnation | Roadmap has not delivered meaningful improvements in 12+ months | “Shipping improvements every [cadence], driven by user research” |
| Support decline | Response times increase, dedicated CSM removed, self-service docs inadequate | “Dedicated support at every plan level” |
| Acquisition/merger disruption | Vendor gets acquired, product direction changes, pricing model shifts | “Independent, focused, and not going anywhere” |
| Champion departure | The internal advocate who chose and managed the tool leaves the company | (Retention strategy: reduce single-champion dependency) |
| Compliance requirement | New regulation (GDPR enforcement, CCPA amendment) requires capabilities the current tool lacks | “Built for [regulation] compliance from day one” |
How to research triggers:
- Win/loss interviews: “What triggered your search for a new tool?”
- Churned customer interviews: “What was the final straw that made you decide to switch?”
- G2/Capterra review mining: analyze 1-star and 2-star reviews for trigger patterns
- Sales call analysis: what do prospects say about their current tool in the first discovery call?
- Reddit/community monitoring: r/marketing, r/martech, GrowthHackers threads about “switching from X to Y”
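Once reviews and call notes are coded against the trigger categories above, tallying them is straightforward. A minimal sketch, assuming you have already hand-coded each low-star review with the trigger categories it mentions (the data and category names here are illustrative, not real review data):

```python
from collections import Counter

# Hypothetical coded review data: each mined review is tagged with the
# switching-trigger categories it mentions (tags are illustrative).
coded_reviews = [
    {"source": "G2", "stars": 1, "triggers": ["price_shock", "support_decline"]},
    {"source": "Capterra", "stars": 2, "triggers": ["integration_failure"]},
    {"source": "G2", "stars": 2, "triggers": ["price_shock"]},
]

def trigger_frequencies(reviews, max_stars=2):
    """Count how often each trigger appears in 1- and 2-star reviews."""
    counts = Counter()
    for review in reviews:
        if review["stars"] <= max_stars:
            counts.update(review["triggers"])
    return counts.most_common()  # most frequent trigger first

print(trigger_frequencies(coded_reviews))
# price_shock ranks first here, with 2 mentions
```

The ranked output tells you which trigger to lead with in competitive takeout messaging; the coding itself still has to be done by a human reading each review.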
Buyer persona research for marketing software
PMM buyer personas are different from UX personas. UX personas describe how users interact with the product. PMM personas describe how buyers evaluate, purchase, and advocate for the product.
PMM persona framework for marketing software:
| Persona dimension | What to research | How to research |
|---|---|---|
| Buying role | Decision maker, influencer, champion, blocker, end user | Win/loss interviews: “Who was involved in the decision? What was each person’s role?” |
| Evaluation criteria | What factors matter most? (Price, features, integration, support, brand, peer recommendation) | Win/loss interviews + surveys: rank criteria by importance |
| Information sources | Where do they research tools? (G2, Gartner, peers, demos, free trials, content) | Survey: “How did you first hear about the tools you evaluated?” |
| Buying maturity | First-time buyer, tool switcher, stack consolidator | Screening question: “Is this your first tool in this category, or are you replacing an existing tool?” |
| Decision timeline | How long from trigger to purchase? | Win/loss interviews: “When did you start looking? When did you decide?” |
| Objection patterns | What concerns almost stopped the purchase? | Sales team input + lost-deal interviews |
| Champion profile | Who internally advocates for the purchase? What is their title, seniority, and department? | Win/loss analysis: “Who drove this evaluation internally?” |
Competitive intelligence research
Understanding how your product is perceived relative to competitors is essential PMM research.
Methods:
Competitive review mining. Systematically analyze G2, Capterra, TrustRadius, and Gartner Peer Insights reviews for your product and top 3-5 competitors. Code reviews for:
- Features praised vs. features criticized
- Support quality mentions
- Pricing satisfaction
- Integration experience
- Switching context (what they switched from and why)
Competitive usage interviews. Interview users who currently use a competitor: “Walk me through how you use [competitor] for [specific workflow]. What works well? What frustrates you?” This produces competitive positioning ammunition directly from the competitor’s own users.
Sales team competitive intelligence. Systematic collection of what prospects say about competitors during sales calls:
- Which competitors are mentioned most often?
- What do prospects praise about competitors?
- What do prospects criticize about competitors?
- What feature comparisons come up repeatedly?
Run a monthly “competitive intelligence roundtable” with 3-5 sales reps to aggregate these observations.
How to segment marketing software users for PMM research
Segmentation by buying maturity
| Segment | Characteristics | Research focus | Messaging angle |
|---|---|---|---|
| First-time buyers | No existing tool in this category. Solving the problem for the first time | Category education, use case validation, ROI justification for new spend | “Here’s why you need this category of tool” |
| Tool switchers | Replacing an existing tool. Have experience and expectations | Switching triggers, competitive comparison, migration friction | “Better than what you have, and here’s why” |
| Stack consolidators | Reducing the number of tools. Want fewer vendors, simpler integrations | Integration coverage, total cost of ownership, vendor reliability | “Replace 3 tools with 1 platform” |
| Power scalers | Current tool works but cannot handle growth. Need enterprise capabilities | Performance at scale, advanced features, enterprise support | “Built for where you’re going, not just where you are” |
Each segment requires different research questions, different interview scripts, and produces different positioning insights. Mixing them produces averaged findings that resonate with nobody.
Segmentation by company stage
| Stage | MarTech needs | Research angle |
|---|---|---|
| Startup (1-50 employees) | All-in-one tools, minimal budget, fast setup | “How do you choose tools when budget is tight and you need to move fast?” |
| Growth (50-500 employees) | Scaling existing stack, adding specialized tools, increasing data complexity | “What broke as you grew? Which tools could not keep up?” |
| Enterprise (500+ employees) | Consolidation, compliance, cross-team governance, vendor management | “What does your tool evaluation process look like? Who is involved?” |
How to research marketing software positioning
Message testing
Test your positioning statements with real marketing software buyers before committing to campaigns.
Method: Show 3-5 positioning variants to 15-20 marketing professionals and measure:
| Metric | How to measure | What it reveals |
|---|---|---|
| Comprehension | “In your own words, what does this product do?” | Whether your positioning communicates clearly |
| Relevance | “How relevant is this to your current challenges?” (1-7 scale) | Whether your positioning addresses real pain |
| Differentiation | “How different does this sound from other tools you have seen?” (1-7 scale) | Whether your positioning stands out competitively |
| Credibility | “How believable is this claim?” (1-7 scale) | Whether your positioning is trusted |
| Action intent | “Based on this, would you want to learn more?” (Yes/Maybe/No) | Whether your positioning drives evaluation |
Testing approach: Surveys for quantitative ranking across variants, followed by 5-8 interviews for qualitative depth on the top 2 variants. Ask: “What about this message caught your attention? What made you skeptical?”
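Scoring the quantitative pass can be as simple as averaging the 1-7 ratings per variant. A minimal sketch with made-up survey responses (the variant names and ratings are illustrative):

```python
from statistics import mean

# Hypothetical survey responses: one row per participant per variant,
# using the 1-7 metrics from the table above.
responses = [
    {"variant": "A", "relevance": 6, "differentiation": 4, "credibility": 5},
    {"variant": "A", "relevance": 5, "differentiation": 3, "credibility": 6},
    {"variant": "B", "relevance": 4, "differentiation": 6, "credibility": 4},
    {"variant": "B", "relevance": 5, "differentiation": 6, "credibility": 5},
]

def variant_scores(rows):
    """Average each 1-7 metric per positioning variant."""
    metrics = ("relevance", "differentiation", "credibility")
    by_variant = {}
    for row in rows:
        by_variant.setdefault(row["variant"], []).append(row)
    return {
        variant: {m: round(mean(r[m] for r in group), 2) for m in metrics}
        for variant, group in by_variant.items()
    }

print(variant_scores(responses))
```

With 15-20 respondents per variant these averages only separate the clear winners from the clear losers; the follow-up interviews are what explain why a variant scored the way it did.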
Value proposition research
Interview 10-15 current customers who are active advocates (high NPS, referral sources, case study participants):
- “How would you describe our product to a colleague who asked about it?”
- “What is the single biggest thing our product has changed about how your team works?”
- “If you had to justify the cost to your CFO in one sentence, what would you say?”
The language your customers use to describe your product is more authentic and more persuasive than anything your marketing team writes. Capture their exact words and use them in positioning.
How to collect and systematize sales team insights
The sales intelligence system
Your sales team talks to marketing software buyers every day. That data is research gold if collected systematically.
Weekly sales insight form (2 minutes, after every lost deal or competitive mention):
- Prospect company size and industry
- Which competitors were mentioned?
- What was the prospect’s top evaluation criterion?
- What objection or concern came up most strongly?
- If we lost, what was the stated reason?
Monthly competitive intelligence review (30 minutes with 3-5 reps):
- Top 3 competitors mentioned this month
- New competitor entrants or positioning changes observed
- Most common prospect objections
- Feature requests that came from competitive comparison
- Messaging that resonated vs. messaging that fell flat
Quarterly synthesis: Combine sales insights with win/loss interviews, review mining, and survey data into a competitive intelligence brief that informs the next quarter’s positioning, content, and campaign strategy.
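The monthly review is easier to run if the weekly form entries are rolled up automatically. A minimal sketch, assuming the form data lands in a simple list of records (field names and entries here are hypothetical):

```python
from collections import Counter

# Hypothetical entries from the weekly sales insight form.
forms = [
    {"competitor": "CompetitorX", "objection": "pricing", "lost": True},
    {"competitor": "CompetitorY", "objection": "missing integration", "lost": False},
    {"competitor": "CompetitorX", "objection": "pricing", "lost": True},
]

def monthly_rollup(entries, top_n=3):
    """Summarize competitor mentions, objections, and lost deals for the review."""
    return {
        "top_competitors": Counter(e["competitor"] for e in entries).most_common(top_n),
        "top_objections": Counter(e["objection"] for e in entries).most_common(top_n),
        "lost_deal_count": sum(1 for e in entries if e["lost"]),
    }

print(monthly_rollup(forms))
```

The rollup gives the roundtable a shared starting point (top competitors, top objections, loss count); the reps then supply the context the counts cannot.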
How to recruit marketing software buyers for PMM research
Recruitment approach
PMM research recruits buyers and decision-makers, not daily users. This requires different channels and framing.
For win/loss interviews: Your CRM and sales team are the primary source. Identify recently closed-won and closed-lost deals. Have the sales rep introduce the research (“Our product team wants to understand your experience”) or have a neutral third-party researcher reach out to reduce bias.
For competitive and market research: Recruit through:
- LinkedIn targeting by title (VP Marketing, Director of Demand Gen, Marketing Ops Manager) and tool usage
- Marketing communities (GrowthHackers, MarketingProfs, CMO Alliance)
- CleverX verified B2B panels filtered by marketing role, company size, and tool stack
- G2 reviewer outreach (people who left detailed reviews are often willing to discuss their experience)
Incentive benchmarks
| Participant type | Rate range | Notes |
|---|---|---|
| Marketing practitioner (30-min interview) | $100-175 | Standard for competitive or market research |
| Marketing director/VP (30-min interview) | $200-350 | Higher seniority = higher rate |
| Won-deal customer (30-min win interview) | $75-125 | Lower because they have a relationship with your product |
| Lost-deal prospect (30-min loss interview) | $150-250 | Higher because they chose a competitor and have less motivation to help you |
| Churned customer (30-min exit interview) | $125-200 | Medium because they have recent experience but may feel negative |
Screening for PMM research
- Have you evaluated or purchased marketing software in the last 12 months? (Yes/No. Primary filter)
- What was your role in the evaluation? (Decision maker / Key influencer / End user evaluator / Not involved)
- Which category of marketing software? (Marketing automation, CRM, analytics, ad platform, email, social, content, other)
- How many tools did you evaluate? (1-2, 3-5, 6+)
- What was the outcome? (Purchased a new tool / Stayed with existing tool / Still evaluating)
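The screener above reduces to a simple pass/fail rule. A minimal sketch of that logic (field names are hypothetical, and the minimum-tools threshold is an assumption, not part of the screener as written):

```python
# Roles from question 2 that qualify a participant for PMM interviews.
QUALIFYING_ROLES = {"decision_maker", "key_influencer"}

def qualifies(answers):
    """Apply the screening questions above as a pass/fail filter."""
    return (
        answers.get("evaluated_last_12_months") is True
        and answers.get("role") in QUALIFYING_ROLES
        # Assumption: require a real comparison set (2+ tools evaluated).
        and answers.get("tools_evaluated", 0) >= 2
    )

print(qualifies({"evaluated_last_12_months": True,
                 "role": "decision_maker",
                 "tools_evaluated": 4}))  # True
```

Encoding the rule this way keeps screening consistent across recruiters and makes the qualification criteria auditable when you report the sample composition.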
PMM research metrics and outputs
What PMM research produces (that UX research does not)
| Output | What it is | How it is used |
|---|---|---|
| Buyer personas (PMM version) | Profiles of buying roles, evaluation criteria, decision timelines, and objection patterns | Inform campaign targeting, content strategy, and sales enablement |
| Competitive positioning map | How your product is perceived vs. competitors across key evaluation criteria | Inform messaging, battle cards, and competitive content |
| Switching trigger inventory | The specific events and frustrations that cause buyers to evaluate alternatives | Inform demand gen campaigns, competitive takeout messaging, and retention strategy |
| Win/loss patterns | Systematic trends in why deals are won and lost | Inform product roadmap priorities, sales training, and pricing strategy |
| Value proposition language | Exact customer words that describe your product’s value | Inform website copy, ad creative, and sales scripts |
| Message effectiveness data | Which positioning statements resonate with which segments | Inform campaign creative, landing pages, and sales decks |
Frequently asked questions
How is PMM research different from UX research for marketing software?
UX research asks “Can users accomplish their goals with the product?” PMM research asks “Why do buyers choose this product (or not)?” UX research informs interface design and feature development. PMM research informs positioning, messaging, competitive strategy, and go-to-market decisions. Both are essential. They answer different questions with different methods for different stakeholders.
How often should you run win/loss analysis?
Continuously. Target 5-8 won and 5-8 lost interviews per quarter. Do not batch them into annual studies because buying dynamics, competitive positioning, and market conditions change too quickly. Monthly synthesis of ongoing win/loss data keeps your competitive intelligence current.
Should PMMs conduct the research themselves or use a third party?
Lost-deal and competitive interviews produce more honest data when conducted by a neutral third party (researcher, consultant, or agency) rather than someone from the company. Buyers are more candid about why they rejected your product when talking to someone who is not personally invested in the outcome. Won-deal interviews and customer research can be conducted internally with less bias risk.
How do you research marketing software users who will not talk to you?
Mine the data they leave behind: G2/Capterra reviews, Reddit discussions, community forum posts, social media complaints, and support ticket themes. This passive research captures candid sentiment that interview participants often filter. Combine with competitive product usage data (feature comparison, pricing analysis, integration coverage) for a complete picture even without direct access.
How do you turn PMM research into messaging?
Extract the exact language buyers use. When a lost-deal prospect says “Your competitor showed me ROI in the first demo, and you talked about features,” that becomes messaging direction: lead with ROI, not features. When a won-deal customer says “I chose you because the integration with Salesforce just worked,” that becomes proof point copy. The research-to-messaging pipeline is: collect buyer language, identify patterns, test variants, and deploy the winner.