Customer satisfaction survey questions template

Download now
Ideal for:
✅ Marketing teams
✅ SaaS companies & subscription businesses
✅ Service businesses
What you'll get
✅ 40+ pre-written CSAT survey questions
✅ 6 survey type templates
✅ Survey deployment checklist

What is this template?

This customer satisfaction (CSAT) survey questions template is a comprehensive Notion database containing 40+ professionally crafted questions, 6 complete survey scripts for different touchpoints, email templates, and proven frameworks to measure customer satisfaction effectively.

Whether you're measuring satisfaction after a purchase, support interaction, product usage, or onboarding, this template gives you everything you need to launch professional CSAT surveys that get responses and reveal what's actually affecting customer satisfaction—not just collect scores.

What makes it different from other CSAT question lists:

Unlike generic question lists you find online, this template is organized by survey type (transactional, relationship, product, service, support) and customer journey stage, includes complete email invitation templates with optimal timing guidance, and provides a CSAT score calculator with industry benchmarks and response analysis frameworks.

The template works for email surveys, in-app surveys, post-interaction pop-ups, and includes guidance for different industries and customer touchpoints.

Built for real-world use:

Every question includes context on when to use it, what insights it uncovers, and recommended answer formats (rating scale, multiple choice, open text). The survey scripts include professional subject lines, survey introduction language, question sequencing, and thank-you messages—everything you need to run a complete CSAT program from deployment to action.

Plus, you get response analysis templates to calculate CSAT scores, identify trends, segment by customer type, and create action plans based on what you learn.

📦 What you'll get

📋 40+ pre-written CSAT survey questions
Stop starting from scratch for every survey. Get professionally crafted questions organized by survey type (transactional after purchase/support, relationship for overall sentiment, product for feature feedback, onboarding for new customer experience, churn prevention for at-risk accounts). Each question includes recommended answer format (5-point scale, 1-10 rating, thumbs up/down, multiple choice, open text) and analysis guidance on interpreting responses.

🎯 6 survey type templates
Complete, ready-to-deploy survey scripts for every major touchpoint:

  • Post-purchase CSAT survey (transactional - right after order/delivery)
  • Post-support CSAT survey (transactional - after support case closure)
  • Product satisfaction survey (feedback on features and overall product)
  • Onboarding satisfaction survey (after first 30 days or key milestone)
  • Relationship CSAT survey (quarterly overall sentiment check)
  • Churn prevention survey (for customers showing disengagement signals)

Each template includes full question sequence, optimal timing, expected response rates, and industry benchmarks.

📧 Email invitation templates
Professional email templates for every survey type:

  • Post-purchase invitation (4 subject line options tested for open rates)
  • Post-support invitation (timing: immediately after case resolution)
  • Product feedback invitation (best for monthly active users)
  • Reminder email (sent to non-respondents after 2 days)
  • Thank you email (acknowledges response and shares next steps)

All emails include personalization variables, mobile-optimized formatting, and can be customized for your brand voice.

💡 Strategic follow-up question frameworks
Learn which follow-up questions to ask based on satisfaction score:

  • Dissatisfied customers (1-2 stars) - understand what went wrong and how to fix it
  • Neutral customers (3 stars) - identify what would improve their experience
  • Satisfied customers (4-5 stars) - understand what's working and gather testimonials
  • Question types: reason for score, improvement suggestions, feature requests, comparison to alternatives
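This score-based branching maps directly onto the conditional logic most survey tools support. A minimal sketch in Python (the question wording here is illustrative, not the template's exact copy):

```python
def follow_up(score: int) -> str:
    """Choose a follow-up question from a CSAT score on a 5-point scale.
    Wording is illustrative, not the template's exact questions."""
    if score <= 2:   # dissatisfied: learn what went wrong
        return "What went wrong, and what could we do to make it right?"
    if score == 3:   # neutral: find the one improvement that matters
        return "What one change would most improve your experience?"
    # satisfied: capture what's working and open the door to advocacy
    return "What's working well for you? Would you share a short testimonial?"

print(follow_up(2))  # dissatisfied branch
```

In Typeform or SurveyMonkey this is the same idea expressed as skip/branch logic on the rating question.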

📊 CSAT calculator & analysis framework
Pre-built calculator that automatically:

  • Calculates your CSAT score from response distribution
  • Segments scores by customer type, product, channel, or custom attributes
  • Identifies trends over time (improving, declining, stable patterns)
  • Benchmarks against industry standards (we include data for 12+ industries)
  • Flags concerning patterns (sudden drops, segment disparities)
  • Generates recommended actions based on score and feedback themes
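The score and segment math behind a calculator like this reduces to a few lines. A rough Python sketch of the same logic (field names are hypothetical; the actual template uses Notion formulas and Google Sheets):

```python
from collections import defaultdict

def csat(responses, threshold=4):
    """CSAT = % of responses at or above the 'satisfied' threshold
    (4 on a 5-point scale). Returns None when there are no responses."""
    if not responses:
        return None
    satisfied = sum(1 for r in responses if r >= threshold)
    return round(100 * satisfied / len(responses), 1)

def csat_by_segment(rows):
    """rows: (segment, score) pairs; returns CSAT per segment."""
    buckets = defaultdict(list)
    for segment, score in rows:
        buckets[segment].append(score)
    return {seg: csat(scores) for seg, scores in buckets.items()}

rows = [("email", 5), ("email", 4), ("email", 2),
        ("in-app", 5), ("in-app", 3)]
print(csat_by_segment(rows))  # → {'email': 66.7, 'in-app': 50.0}
```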

✅ Survey deployment checklist
Complete checklist for launching successful CSAT surveys:

  • Pre-launch (survey design, timing selection, tool configuration, test sends)
  • Launch (email schedule, response monitoring, technical troubleshooting)
  • During survey (response rate tracking, early feedback review, reminder timing)
  • Post-survey (data export, analysis, insight generation, action planning, team communication)

❌ Common CSAT mistakes guide
Learn what NOT to do with real examples:

  • Question wording mistakes that bias responses toward positive or negative
  • Survey timing mistakes that capture wrong sentiment (too early, too late, wrong trigger)
  • Scale choice mistakes (when to use 5-point vs. 10-point vs. thumbs up/down)
  • Analysis mistakes that lead to wrong conclusions about customer satisfaction
  • Action planning mistakes that waste insights or create wrong priorities

🗂️ Searchable question database (Notion)
Browse and filter all 40+ questions by multiple criteria:

  • Survey type (transactional, relationship, product, service, support, onboarding)
  • Question category (overall satisfaction, specific attribute, comparison, likelihood, open feedback)
  • Answer format (rating scale, multiple choice, open text, yes/no)
  • Customer journey stage (awareness, consideration, purchase, usage, renewal, advocacy)

Quickly find the perfect questions for any CSAT survey scenario.

📈 Response analysis templates
Structured templates for making sense of CSAT data:

  • Score distribution analysis (identify patterns in satisfaction levels)
  • Verbatim categorization template (tag open-ended responses by theme)
  • Trend analysis template (track scores over time by segment)
  • Driver analysis template (correlate satisfaction with specific attributes)
  • Action priority matrix (decide what to fix first based on impact and effort)

🎓 Bonus resources included

  • CSAT vs. NPS vs. CES comparison guide (when to use each metric)
  • Industry benchmarks (compare your scores across 12+ industries)
  • Sample size calculator (determine if your results are statistically significant)
  • Survey frequency best practices (how often to survey without fatigue)
  • Recommended reading list (top resources on customer satisfaction measurement)
  • Video tutorials on running effective CSAT programs
  • Access to our CX research community for questions and peer support
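On the statistical-significance point, the standard check is the margin of error for a proportion. A minimal sketch of that textbook approximation (not the template's bundled calculator):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error (95% confidence when z=1.96) for a measured
    proportion p from n survey responses."""
    return z * math.sqrt(p * (1 - p) / n)

# An 80% CSAT from 100 responses is really 80% ± ~7.8 points.
print(round(100 * margin_of_error(100, 0.80), 1))  # → 7.8
```

The practical takeaway: with small samples, quarter-over-quarter swings of a few points are often just noise.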

💾 Multiple formats available

  • Notion template (fully interactive database with calculators and filters)
  • Google Sheets version (includes calculator formulas for teams not using Notion)
  • PDF cheat sheet (printable quick reference for survey design)
  • Editable in any tool you prefer (copy to Excel, Typeform, SurveyMonkey, Qualtrics, Delighted)

Why most CSAT surveys fail to drive improvement

You're sending customer satisfaction surveys after purchases, support interactions, or product usage. You're collecting scores and calculating your average CSAT. But you're not learning WHY customers are satisfied or dissatisfied, you're not following up on negative feedback effectively, and you're missing patterns that could prevent future dissatisfaction.

The result? You have numbers (maybe 78% satisfied, maybe 82% satisfied) but no clear action plan to improve customer experience. Your team doesn't know which issues matter most, and customers feel like their feedback disappears into a void.

You need a CSAT survey program that:

  • ✓ Asks the right satisfaction question plus strategic follow-ups that reveal the "why"
  • ✓ Deploys at the right moment in the customer journey to capture accurate sentiment
  • ✓ Gets high response rates through well-timed, well-written, mobile-friendly surveys
  • ✓ Works for your specific industry, business model, and customer touchpoints
  • ✓ Includes analysis frameworks to turn scores into improvement priorities
  • ✓ Closes the loop with dissatisfied customers so they see you're listening and fixing issues

Most CSAT resources give you generic questions like "How satisfied are you with your experience?" without explaining what follow-up questions to ask, when to send surveys for different touchpoints, how to analyze patterns, or what to do with dissatisfied vs. satisfied customer feedback.

That's exactly why we created this template.

What makes this template different

Organized by survey type and customer journey stage

Post-purchase CSAT (transactional):

  • When: 2-7 days after delivery (product arrives and customer has time to use it)
  • Core question: "How satisfied are you with your recent purchase?"
  • Key follow-ups: Product quality, delivery experience, website usability, value for money
  • Goal: Reduce returns, improve product offerings, optimize checkout

Post-support CSAT (transactional):

  • When: Immediately after support case closes (while experience is fresh)
  • Core question: "How satisfied were you with the support you received?"
  • Key follow-ups: Resolution quality, agent helpfulness, time to resolution, channel preference
  • Goal: Improve support quality, identify training needs, reduce repeat contacts

Product satisfaction (ongoing):

  • When: After 30-60 days of usage (customer has sufficient experience)
  • Core question: "How satisfied are you with [Product Name] overall?"
  • Key follow-ups: Feature usefulness, ease of use, performance, missing functionality
  • Goal: Prioritize feature development, reduce churn, identify expansion opportunities

Onboarding satisfaction:

  • When: After completing onboarding or hitting first milestone (setup complete, first value achieved)
  • Core question: "How satisfied were you with getting started with [Product]?"
  • Key follow-ups: Clarity of instructions, ease of setup, time to value, support availability
  • Goal: Reduce early churn, improve onboarding, shorten time to first value

Each survey type has different optimal timing, question focus, and action implications. This template accounts for that.

Includes complete survey scripts, not just isolated questions

Survey script components:

  • Subject line (4 tested options per survey type with open rate benchmarks)
  • Email body (mobile-optimized, personalized, clear value proposition for responding)
  • Survey introduction (sets context, explains how feedback will be used)
  • Core satisfaction question (properly formatted with appropriate scale)
  • Follow-up questions (conditional based on score or asked to all respondents)
  • Thank you message (acknowledges response, shares next steps)
  • Reminder email (for non-respondents, sent at optimal timing)

Most templates just show questions. This gives you the complete deployment package.

Teaches you which rating scale to use when

5-point scale (Very dissatisfied to Very satisfied):

  • When to use: Most common for CSAT, easy for customers to understand
  • Best for: Post-purchase, post-support, general satisfaction
  • Calculation: % of 4-5 star responses = CSAT score
  • Pros: Familiar, quick to answer, works on mobile
  • Cons: Less granularity than 10-point

10-point scale (1-10 or 0-10):

  • When to use: When you need more granular data for analysis
  • Best for: Product feedback, relationship surveys, competitive benchmarking
  • Calculation: Average score or % of 8-10 responses
  • Pros: More nuance, good for tracking small changes
  • Cons: Takes slightly longer, harder on mobile

Thumbs up/down (binary):

  • When to use: In-product micro-surveys, very simple feedback
  • Best for: Feature feedback, help article ratings, quick pulse checks
  • Calculation: % thumbs up
  • Pros: Fastest to answer, highest response rates, mobile-friendly
  • Cons: No nuance, requires follow-up for context

Emoji scale (😞 😐 😊):

  • When to use: Consumer products, younger audiences, mobile-first
  • Best for: App experiences, retail, entertainment
  • Calculation: % positive emotions
  • Pros: Visual, fun, language-independent
  • Cons: Interpretation can vary by culture

This template explains when to use each and provides examples for all formats.

Includes response workflows based on satisfaction level

For dissatisfied customers (1-2 stars or negative feedback):

  1. Immediate alert to relevant team member (support, account manager, product)
  2. Personal outreach within 24 hours (apologize, understand issue, commit to resolution)
  3. Document issue with categorization (product, service, delivery, expectations mismatch)
  4. Resolve and follow up (explain what was done, confirm resolution)
  5. Track if satisfaction improves in next survey

For neutral customers (3 stars or middle score):

  1. Group by common themes in feedback (identify patterns)
  2. Prioritize quick wins (easy fixes that would boost satisfaction)
  3. Address in next product/service improvement cycle
  4. Communicate changes to customers who mentioned that issue
  5. Monitor if satisfaction improves after changes

For satisfied customers (4-5 stars or positive feedback):

  1. Thank them for positive feedback
  2. Request advocacy action (review, testimonial, referral based on satisfaction level)
  3. Understand what's driving satisfaction (preserve in future changes)
  4. Consider for case study or customer spotlight
  5. Keep engaged with VIP treatment or exclusive updates

The template includes complete workflows for each satisfaction level.

How to use this template

Step 1: Choose your survey type
Are you measuring satisfaction after a purchase, support interaction, product usage, or onboarding? Select the appropriate survey template for your touchpoint.

Step 2: Customize questions for your business
Browse the question database and select 3-5 questions (including core CSAT question + follow-ups) that fit your goals. We provide recommendations based on common scenarios.

Step 3: Set up email invitation
Use our email templates to create your survey invitation. Customize subject line, body copy, and timing for your industry and customer segment.

Step 4: Configure survey tool
Copy questions into your survey tool (Typeform, SurveyMonkey, Google Forms, Qualtrics, in-app survey widget). Set up conditional logic if using score-based follow-ups.

Step 5: Launch and monitor
Deploy survey according to our timing recommendations. Monitor response rates in real-time. Send reminders to non-respondents after 2-3 days.

Step 6: Analyze responses
Use our analysis framework to calculate CSAT score, segment by customer type, identify verbatim themes, and spot trends.

Step 7: Take action
Follow our response workflows to address dissatisfied customers immediately, implement improvements for neutral feedback, and activate satisfied customers as advocates.

Works with: Notion (our template), Google Forms, Typeform, SurveyMonkey, Qualtrics, Delighted, Hotjar, Intercom, or any survey tool you prefer.

Real examples from the template

Example 1: Core CSAT question (proper formatting)

Bad: "Did you like your experience?"
Why it's bad: Yes/no question doesn't measure satisfaction level. Too vague about what "experience" means.

Good: "How satisfied were you with your recent purchase?" with 5-point scale (Very dissatisfied, Dissatisfied, Neutral, Satisfied, Very satisfied)
Why it works: Uses standard CSAT language. Specific about what's being measured (purchase). Allows for nuanced responses.

Example 2: Post-purchase follow-up question

Bad: "What did you think?"
Why it's bad: Too vague. Doesn't prompt specific, actionable feedback.

Good: "What aspect of your purchase experience most influenced your satisfaction rating?"
Options: Product quality, Delivery speed, Website ease of use, Customer service, Price/value, Other
Why it works: Identifies which specific attribute drove their satisfaction score. Gives actionable insight on what to improve or preserve.

Example 3: Post-support follow-up

Bad: "How was the support agent?"
Why it's bad: Focuses only on agent, missing other satisfaction drivers like resolution quality or time to resolve.

Good: "Please rate your satisfaction with the following aspects:" (5-point scale for each)

  • Support agent's helpfulness
  • Time to resolution
  • Quality of solution
  • Ease of getting help

Why it works: Separates different satisfaction drivers so you can identify specific improvement areas (maybe agents are great but resolution time is slow).

Example 4: Survey timing mistake

Bad: Sending post-purchase CSAT immediately after order placement
Why it's bad: Customer hasn't received product yet. Can't evaluate actual purchase experience.

Good: Sending post-purchase CSAT 3-7 days after delivery confirmation
Why it works: Customer has received and used product. Can give informed satisfaction rating.

Common CSAT program mistakes (and how to avoid them)

Mistake #1: Using wrong rating scale for your use case

What it looks like:
Using 10-point scale for quick in-app micro-survey, or using thumbs up/down for nuanced product feedback.

Why it's a problem:
10-point scales are hard on mobile and take longer (reduces response rates). Thumbs up/down loses nuance needed for improvement decisions.

How to fix it:

  • Quick surveys (<30 seconds): Use thumbs up/down or emoji scale
  • Standard CSAT: Use 5-point scale (Very dissatisfied to Very satisfied)
  • Detailed feedback: Use 10-point scale with space for open comments
  • Consider your audience and channel when choosing scale

Mistake #2: Asking satisfaction about things customer can't evaluate

What it looks like:
Asking "How satisfied are you with our backend infrastructure?" or "Rate your satisfaction with our data security."

Why it's a problem:
Customers can't see backend systems or security measures. They're guessing, giving you meaningless data.

How to fix it:
Ask about outcomes they can observe:

  • Instead of "infrastructure": "How satisfied are you with [Product]'s speed and reliability?"
  • Instead of "data security": "How confident do you feel that your data is secure?"

Focus on customer-visible impact, not internal systems.

Mistake #3: Surveying too soon after interaction

What it looks like:
Sending CSAT survey minutes after purchase (before product arrives) or immediately after support case opens (before resolution).

Why it's a problem:
Customer hasn't experienced the full interaction yet. Scores reflect incomplete picture.

How to fix it:
Timing by survey type:

  • Post-purchase: 3-7 days after delivery (product arrived and used)
  • Post-support: Immediately after case closes (not opens)
  • Product satisfaction: After 30+ days of usage (sufficient experience)
  • Onboarding: After key milestone (not just signup)

Mistake #4: Only asking satisfied customers

What it looks like:
Sending CSAT survey only to customers who complete desired action, not to those who abandon or have issues.

Why it's a problem:
Selection bias. You only hear from happy customers. Miss crucial feedback from dissatisfied segment.

How to fix it:
Survey representative sample including:

  • Customers who completed action AND those who didn't
  • Buyers AND browsers who didn't buy
  • Support cases resolved AND escalated or unresolved
  • Active users AND churned customers

You need both sides to understand the full satisfaction picture.

Mistake #5: Not following up with dissatisfied customers

What it looks like:
Customer gives 1-2 stars. You add it to spreadsheet. No one ever contacts them.

Why it's a problem:
Dissatisfied customers feel ignored, become detractors. You miss opportunity to save relationship and fix systemic issues.

How to fix it:
Create automatic alert for low scores (1-2 stars):

  • Support or account owner reaches out within 24 hours
  • Apologize, understand specific issue, commit to resolution
  • Document issue for product/service improvement
  • Follow up when resolved
  • Track if satisfaction improves

Mistake #6: Asking too many questions

What it looks like:
CSAT survey with 15+ questions taking 8-10 minutes to complete.

Why it's a problem:
Low completion rates. Survey fatigue. Biased results (only very happy or very angry people finish).

How to fix it:
Keep surveys short:

  • Transactional CSAT: 2-3 questions max (core + 1-2 follow-ups)
  • Relationship CSAT: 3-5 questions max (core + context + open feedback)
  • Target: Under 2 minutes to complete
  • If you need more data, rotate questions or use separate deep-dive surveys

Mistake #7: Ignoring verbatim feedback

What it looks like:
Calculating CSAT score from ratings, ignoring open-ended text responses.

Why it's a problem:
Scores tell you "what" (satisfaction level) but not "why." Verbatim feedback contains specific, actionable insights.

How to fix it:
Process open-ended responses:

  • Read all verbatim feedback, don't just rely on scores
  • Categorize by theme (product, service, delivery, expectations, etc.)
  • Count frequency of each theme
  • Look for specific, recurring issues to fix
  • Share verbatim quotes with team for context

Mistake #8: Comparing scores across different survey types

What it looks like:
Comparing post-purchase CSAT (usually higher) to post-support CSAT (usually lower) and wondering why there's a gap.

Why it's a problem:
Different survey types measure satisfaction with different interactions. Not directly comparable.

How to fix it:

  • Track each survey type separately over time
  • Compare post-purchase month-over-month (not to post-support)
  • Understand that support CSAT is usually lower (people only contact support when they have problems)
  • Set different benchmarks for different survey types

Mistake #9: No action after collecting feedback

What it looks like:
Collect CSAT surveys, calculate score, share with team. Nothing changes. Next quarter, survey again.

Why it's a problem:
Customers notice nothing improves based on feedback. Response rates drop ("why bother if nothing changes"). Missed opportunity to actually improve satisfaction.

How to fix it:
Create action loop:

  1. Analyze results and identify top 3 improvement opportunities
  2. Assign owners and deadlines for each
  3. Implement changes based on feedback
  4. Communicate back to customers what changed and why
  5. Measure if satisfaction improves after changes
  6. Repeat cycle

Mistake #10: Celebrating score without understanding it

What it looks like:
CSAT goes from 75% to 82%, team celebrates, nothing else examined.

Why it's a problem:
Score improved but you don't know why. Can't replicate success. Might be temporary or due to external factors.

How to fix it:
When scores change, investigate:

  • What changed in our product/service during this period?
  • Are there segment differences (certain customer types improved more)?
  • Did specific issues get resolved?
  • Is this statistically significant or random variation?
  • Can we identify root cause and replicate it?

What CX professionals are saying

"This template helped us increase our post-purchase CSAT from 73% to 87% in one quarter. The follow-up questions revealed exactly what was driving dissatisfaction."
— Michelle Thompson, CX Director at E-commerce Company

"We were sending generic 'rate your experience' surveys with 15% response rates. Using these optimized email templates, we're now at 42% response rates."
— Carlos Rivera, Head of Customer Support

"The post-support CSAT template helped us identify that resolution time, not agent quality, was our biggest satisfaction driver. Completely changed our improvement priorities."
— Sarah Kim, VP of Customer Experience

"Finally, a template that explains WHEN to send surveys, not just what questions to ask. The timing guidance alone was worth it."
— James Anderson, Product Manager

Frequently asked questions

Q: What format is the template?
A: It's a Notion template you can duplicate to your workspace. Works with free Notion accounts. We also include Google Sheets (with calculator formulas) and PDF versions.

Q: Can I customize the questions?
A: Absolutely! The template is fully editable. Add your own questions, modify existing ones, remove what you don't need, and customize for your industry and brand voice.

Q: Do I need Notion?
A: No. While it's built for Notion (for database and calculator features), you can copy everything to Google Sheets, Excel, Typeform, SurveyMonkey, or any survey tool.

Q: What's the difference between CSAT, NPS, and CES?
A: CSAT measures satisfaction with specific interaction. NPS measures loyalty and likelihood to recommend. CES measures ease/effort. We include a comparison guide in the template to help you choose.

Q: How do I calculate CSAT score?
A: CSAT = (Number of satisfied responses / Total responses) × 100. For 5-point scale, "satisfied" = customers who selected 4 or 5. The template includes an automatic calculator.
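As a worked example of that formula (hypothetical response data):

```python
responses = [5, 4, 4, 3, 5, 2, 4, 1, 5, 4]       # 5-point scale ratings
satisfied = sum(1 for r in responses if r >= 4)  # 4s and 5s count as satisfied
csat = satisfied / len(responses) * 100          # 7 of 10 responses
print(f"CSAT: {csat:.0f}%")  # → CSAT: 70%
```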

Q: What's a good CSAT score?
A: It varies by industry. Generally: 75%+ = good, 80%+ = great, 85%+ = excellent. We include industry benchmarks for 12+ industries in the template.

Q: When should I send CSAT surveys?
A: Timing depends on survey type. Post-purchase: 3-7 days after delivery. Post-support: Immediately after case closes. Product satisfaction: After 30+ days of usage. See our timing guide in the template.

Q: How often should I survey customers?
A: Transactional CSAT: After each significant interaction (purchase, support case). Relationship CSAT: Quarterly or bi-annually. Rule: Don't survey same customer more than once per month to avoid survey fatigue.

Q: What if I get low CSAT scores?
A: The template includes response workflows for dissatisfied customers: immediate outreach, root cause analysis, resolution commitment, follow-up, and re-measurement after fix.

Q: Should I use 5-point scale or 10-point scale?
A: 5-point scale for most transactional CSAT (easier, faster, higher response rates). 10-point scale for detailed product feedback where you need more granularity. The template explains when to use each.

Q: Can I use this with my existing survey tool?
A: Yes! The questions work with any survey tool (Typeform, SurveyMonkey, Google Forms, Qualtrics, in-app widgets, email surveys). Just copy questions into your tool.

Q: Is this really free?
A: Yes, completely free. No credit card required, no hidden costs. We create free resources to help CX and research professionals.

Ready to measure customer satisfaction effectively?

Stop collecting CSAT scores without context or action plans.

Get our complete template with:

✅ 40+ CSAT questions organized by survey type and journey stage
✅ 6 survey templates for every major customer touchpoint
✅ Email invitation templates with tested subject lines
✅ CSAT calculator with industry benchmarks
✅ Response analysis frameworks and verbatim categorization
✅ Action workflows for dissatisfied, neutral, and satisfied customers
✅ Survey timing and frequency recommendations
✅ Common mistakes guide with before/after examples

What you'll be able to do after using this template

✓ Launch professional CSAT surveys without a CX agency or consultant
✓ Ask the right questions at the right time for each customer touchpoint
✓ Get higher response rates with optimized, mobile-friendly survey invitations
✓ Calculate CSAT scores correctly and benchmark against your industry
✓ Segment satisfaction by customer type, product, or interaction
✓ Identify root causes of dissatisfaction from follow-up questions
✓ Close the loop with dissatisfied customers to save relationships
✓ Activate satisfied customers as advocates and testimonial sources
✓ Track satisfaction trends over time and measure impact of improvements

Don't launch your next CSAT survey without this template

Download the template