TL;DR
- A good NPS response rate is 15-30% for most channels. Email surveys typically get 15-25%, SMS 40-50%, in-app 20-35%, and website surveys 8-15%.
- Response rates below 15% usually signal delivery problems (spam folders, broken links) or design issues (survey too long, bad timing, survey fatigue).
- The diagnostic framework maps your rate to the root cause: Below 5% means delivery problem. 5-15% means design or timing problem. 15-30% is healthy range. Above 30% requires a bias check.
- Channel matters more than industry. SMS and in-app surveys consistently outperform email and website surveys regardless of what business you're in.
- Transactional NPS surveys get 8-12 points higher response rates than relational NPS because they're contextual and sent right after an interaction.
You sent 500 NPS surveys last month. 82 people responded. Is that good? Bad? Should you be worried?
You might know how to design surveys, which questions to ask, and how to calculate your score. But you may not know how many responses to actually expect back, or what it means when you're not hitting that number.
That's what this article covers. Response rate benchmarks by channel, what your current rate actually means, and strategies to improve it without introducing bias into your data. The most useful part is the diagnostic framework. It tells you exactly what's broken based on your response rate range.
Quick clarification: this is about response rates (how many people complete your survey), not NPS scores (what they rate you). Those are separate metrics. If you're looking for NPS score benchmarks by industry, that's covered in what is a good NPS score.
What Is a Good NPS Survey Response Rate?
A good NPS survey response rate is 15-30% for most channels. Email NPS surveys typically achieve 15-25%, SMS 40-50%, in-app 20-35%, and website surveys 8-15%. Rates below 15% often signal delivery, design, or timing issues. Rates above 30% are healthy but warrant data quality checks.
Response rates measure how many people who receive your NPS survey actually complete it. Unlike NPS scores (which measure loyalty), response rates measure engagement. Whether customers are willing to give you feedback in the first place.
Why response rates matter: Statistical validity depends on sample size. Low rates mean small samples, which reduce confidence in your NPS score. Response bias is the other risk. If only your happiest or angriest customers respond, your score doesn't reflect reality. And declining response rates often precede drops in NPS. Customers disengage before they churn.
What affects response rates most: Channel is the biggest factor. SMS gets higher rates than email, in-app higher than website. Survey type also plays a role. Transactional NPS gets higher rates than relational because it's contextual and sent immediately after an interaction. Industry has less impact than most people think. A healthcare org and a SaaS company using email NPS will see similar response rate ranges (15-25%), even though their actual NPS scores will differ significantly.
Survey design matters. Embedded questions beat linked surveys. Timing affects everything. Immediately after an interaction beats days later. And survey length is critical. One question beats five.
NPS Response Rate Benchmarks by Channel
The channel you use to send NPS surveys has the biggest impact on response rates. SMS and in-app surveys consistently outperform email and website surveys because they're harder to ignore and often embedded. One tap to respond vs. clicking through to a separate form.
Here's what to expect by channel, based on data from multiple 2024-2025 industry research studies.
| Channel | Expected Response Rate | What Drives It | When to Use It |
| --- | --- | --- | --- |
| Email (embedded) | 15-25% | Question visible in email, no click required | Relational NPS, B2B, post-purchase |
| Email (linked) | 6-15% | Requires click-through, adds friction | When you need multi-question follow-ups |
| SMS | 40-50% | High open rates (98%), mobile-first format | Transactional NPS, time-sensitive feedback |
| In-app (mobile) | 27-36% | Contextual, appears during product use | SaaS, product-led growth, feature feedback |
| In-app (web) | 20-27% | Triggered by user actions, contextual | SaaS web apps, feature research |
| Website pop-up | 8-15% | Interrupts browsing flow, generic timing | Anonymous traffic, research panels |
| WhatsApp | 30-50% | Conversational format, high engagement | International markets, mobile-first audiences |
- Email surveys: Studies show that embedded email NPS surveys (where the question appears directly in the email) achieve 15-25% response rates, while linked surveys (requiring a click-through to a separate page) get 6-15%. One vendor reported a 21% internal response rate for its B2B customer base. Other research reports B2B email NPS surveys averaging 12.4%, with a range of 4.5% to 39.3%.
- SMS surveys: Research shows text message surveys get response rates between 45-60%. The same research found that surveys sent within 2 hours of an event get 32% more completions than surveys sent later. SMS significantly outperforms email because customers can reply with a single number, and SMS open rates approach 98%.
- In-app surveys: Refiner's report, which analyzed 1,382 surveys with 5+ million views, found that mobile app surveys average 36.14% response rates while web app surveys average 26.48%. The overall in-app average across both platforms was 27.52%. For NPS specifically, Refiner found that in-app NPS surveys achieved 21.71% response rates.
- Website surveys: Response rates for website pop-ups and feedback tabs are lower, ranging from 8-15%. Research suggests that surveys shown to anonymous website visitors get just 3-5% response rates because these visitors have no relationship with your brand yet.
- WhatsApp surveys: Research shows WhatsApp surveys achieving 30-50% response rates. The conversational format and high platform engagement make WhatsApp particularly effective in markets where it's the primary communication channel.
NPS Response Rate Benchmarks by Survey Type
Whether you send transactional or relational NPS surveys affects response rates almost as much as channel choice. The difference is timing and context.
- Transactional NPS (sent immediately after a specific interaction like a support case, purchase, or onboarding call) consistently gets 8-12 points higher response rates than relational surveys. Expected range: 25-40% depending on channel. Why it works: The survey arrives while the experience is fresh, customers know exactly what you're asking about, and the feedback feels relevant. According to SurveySparrow's research, event surveys sent within 2 hours get 32% more completions than delayed surveys.
- Relational NPS (sent on a schedule, quarterly or post-milestone, to measure overall loyalty) gets lower response rates. Expected range: 15-25% depending on channel. Why it's lower: The question is broader and less tied to a specific moment. Customers need to think harder about their answer. The feedback feels less urgent. CustomerGauge's research shows companies see a 5.2% increase in retention when customers are surveyed every quarter, but response rates for these relationship surveys average around 12-15% for B2B.
The trade-off: Transactional surveys give you more responses but measure narrow interactions. Relational surveys give you fewer responses but measure the broader relationship. Most businesses run both. For more on when to use each, see our guide to relationship vs transactional NPS.
What Your Response Rate Actually Means
A response rate isn't just a number. It's a diagnostic signal. The framework below maps your rate to the most likely root cause and tells you where to start fixing.
a. Below 5%: Delivery Problem
This isn't a survey design issue. Your surveys aren't reaching inboxes. Check spam folder placement using tools like Mail-Tester or GlockApps. Verify survey links work on mobile and desktop. Confirm email authentication (SPF, DKIM, DMARC) is configured. Test SMS delivery if using that channel. Review suppression lists to ensure you're not accidentally blocking active customers.
Common causes: Spam filter triggers (too many links, promotional language, blacklisted sending domain). Broken survey links. SMS shortcode not approved. Sending to invalid or outdated contact lists.
b. 5-15%: Design or Timing Problem
Surveys are reaching customers but something is stopping them from completing. This range signals friction in the survey experience or poor timing. Check survey length (should be 1-2 questions for NPS). Review send timing (immediately after interaction vs. days later). Audit question wording (is it clear what you're asking?). Test mobile experience (does the survey work on phones?). Check if you're sending too many surveys (survey fatigue). Review whether you're asking the right audience (sending NPS to brand-new customers who have no opinion yet).
According to studies, every extra question drives down response rates by 5-15%. If your survey has 5+ questions, that's likely the problem.
c. 15-30%: Healthy Range
This is where most well-designed NPS programs land. You're doing the fundamentals right: good deliverability, clean survey design, appropriate timing, reasonable frequency. The goal now is optimization, not repair. Test embedded vs. linked formats. Experiment with send time (morning vs. evening, weekday vs. weekend). Personalize survey invitations with customer name and specific interaction details. Add one reminder to non-responders 3-5 days after initial send.
Reports show that a well-timed reminder can add up to 50% more responses on top of the initial send (example: 21% response from the first email, plus 13% more from a reminder sent 7 days later).
d. Above 30%: Check for Response Bias
Very high response rates can be great, but they can also signal that only your most engaged (or most frustrated) customers are responding. This creates sample bias. Segment your responses by NPS score. If you're getting mostly 9-10s and 0-3s with very few middle scores, that's a red flag. Check if high-value accounts are responding at the same rate as low-value accounts. Review whether specific customer segments are over- or under-represented. Test whether incentives are driving the high rate (and potentially biasing the data).
A healthy 30%+ rate should show a representative mix of promoters, passives, and detractors that roughly matches your actual customer base distribution.
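The four bands above map cleanly to a simple lookup. Here's a minimal sketch of the diagnostic framework as code; the function name and cause strings are illustrative, not from any particular tool:

```python
def diagnose_response_rate(sent: int, completed: int) -> tuple[float, str]:
    """Map an NPS response rate to the most likely root cause band."""
    rate = completed / sent * 100
    if rate < 5:
        cause = "delivery problem: check spam placement, links, SPF/DKIM/DMARC"
    elif rate < 15:
        cause = "design or timing problem: check length, send timing, mobile UX"
    elif rate <= 30:
        cause = "healthy range: optimize format, timing, personalization"
    else:
        cause = "check for response bias: segment scores vs. your customer base"
    return round(rate, 1), cause

# The 500-surveys / 82-responses example from the opening: 16.4%, healthy band.
print(diagnose_response_rate(500, 82))
```

Note the boundaries are conventions from this framework, not statistical thresholds; adjust them per channel (a 16% rate is healthy for email but a red flag for SMS).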
12 Strategies to Improve NPS Survey Response Rates
These tactics work across channels. Apply them systematically rather than all at once, so you can measure what actually moves the needle.
1. Use Embedded Survey Formats
Embedded surveys (where customers can answer directly in the email, SMS, or in-app prompt without clicking through to a separate page) consistently outperform linked surveys by 10-15 points. For email NPS, include clickable rating buttons (0-10) directly in the email body. For SMS, allow customers to reply with a single number. For in-app, use a native survey widget that feels like part of the product, not a pop-up that blocks the interface.
Studies show embedded email formats achieving 15-25% response rates compared to 6-15% for linked surveys.
2. Personalize Survey Invitations
Generic survey invitations get ignored. Personalized ones get opened. Include the customer's name in the subject line and email body. Reference the specific interaction that triggered the survey ("How was your support experience with Sarah on March 10?"). Mention account-specific details (product tier, tenure, recent activity). Use dynamic content to tailor the survey question to the customer's situation.
Personalization signals that this isn't a mass email. It's a request for feedback on something that actually happened to this specific customer.
3. Optimize Send Timing by Channel
Timing affects response rates more than most teams realize. For transactional NPS, send within 2 hours of the interaction. Surveys sent within this window get 32% more completions. For email surveys, Tuesday through Thursday, 10 AM to 2 PM local time performs best for B2B. For SMS surveys, send between 12 PM and 8 PM local time to avoid early morning or late night interruptions. For in-app surveys, trigger after a success moment (feature used successfully, task completed, positive interaction) rather than during an active workflow.
Test timing with A/B splits. What works for your audience might differ from generic best practices.
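One hedged sketch of how to evaluate such an A/B split: a standard two-proportion z-test on the response counts of each send-time variant. The counts below are made up for illustration:

```python
import math

def two_proportion_z(resp_a: int, sent_a: int,
                     resp_b: int, sent_b: int) -> tuple[float, float]:
    """Two-sided two-proportion z-test for an A/B send-time split.

    Returns (z statistic, p-value). A p-value below 0.05 suggests the
    difference in response rates is unlikely to be random noise.
    """
    p1, p2 = resp_a / sent_a, resp_b / sent_b
    pooled = (resp_a + resp_b) / (sent_a + sent_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical split: morning send 110/500 (22%) vs. evening send 80/500 (16%).
z, p = two_proportion_z(110, 500, 80, 500)
print(round(z, 2), round(p, 4))  # p < 0.05 here, so the gap is significant
```

With 500 sends per arm, a 6-point gap clears significance; smaller gaps need larger samples, which is why timing tests should run for a full send cycle before you call a winner.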
4. Keep Surveys to 1-2 Questions Maximum
Survey length directly affects completion rates. Every extra question drives down response rates by 5-15%. For NPS, stick to the core question ("How likely are you to recommend us?") plus one optional follow-up ("What's the primary reason for your score?"). Anything beyond that should be conditional. Only ask follow-ups if the initial answer suggests they're relevant.
Single-question surveys perform well, and some research finds 4-5 question surveys perform best overall when designed carefully, but most teams should default to shorter.
5. Send One Reminder to Non-Responders
One reminder works. More than one creates survey fatigue. Send the reminder 3-5 days after the initial survey, only to customers who haven't responded yet. Change the subject line to make it clear this is a follow-up, not a duplicate. Keep the reminder shorter than the original invitation. One well-timed reminder can add up to 50% more responses on top of the initial send.
Do not send reminders for transactional surveys sent immediately after an interaction. The moment has passed.
6. Suppress Customers Who Recently Received Any Survey
Survey fatigue kills response rates. If a customer received any survey (NPS, CSAT, product feedback) in the last 30 days, suppress them from your next NPS send. This applies across all survey types, not just NPS. Use a 30-day suppression window as your baseline, then adjust based on your send volume and customer feedback. High-touch accounts with frequent interactions might need a 14-day window. Low-touch accounts might need 90 days.
Track how often each customer is being surveyed across all channels and teams. Uncoordinated survey programs create fatigue without anyone realizing it.
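The suppression rule above reduces to a date comparison against a shared survey history. A minimal sketch, assuming a simple mapping from customer id to the date of their most recent survey of any type (names and data are illustrative):

```python
from datetime import datetime, timedelta

def eligible_for_survey(customers, last_surveyed, window_days=30, now=None):
    """Return customers NOT surveyed (any survey type) within the window.

    `last_surveyed` maps customer id -> datetime of their most recent survey;
    customers absent from the map have never been surveyed.
    """
    now = now or datetime.now()
    cutoff = now - timedelta(days=window_days)
    return [c for c in customers
            if c not in last_surveyed or last_surveyed[c] < cutoff]

# Illustrative data: "a" was surveyed 9 days ago, so a 30-day window excludes it.
history = {"a": datetime(2025, 2, 20), "b": datetime(2025, 1, 1)}
print(eligible_for_survey(["a", "b", "c"], history, now=datetime(2025, 3, 1)))
# -> ['b', 'c']
```

The key design point is that `last_surveyed` must be fed by every survey program (NPS, CSAT, product feedback), or the suppression window silently fails for the surveys it can't see.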
7. Use Conversational Language, Not Corporate Speak
Survey invitations written in stiff, formal language get lower response rates than those written like a human asking a question. Compare: "We would appreciate your participation in our customer satisfaction survey" vs. "Quick question: How'd we do?" The second version gets higher open rates and response rates because it feels like a person asking, not a system sending.
Write like you're talking to a colleague, not filing a legal document. Use contractions. Ask direct questions. Keep sentences short.
8. Test Mobile Experience Obsessively
Most NPS surveys are opened on mobile devices, but many are designed for desktop. Test your survey on iPhone and Android. Check that rating buttons are large enough to tap easily. Verify that text input fields work with mobile keyboards. Ensure the survey loads quickly on 4G connections. Make sure customers can complete the entire survey without zooming or horizontal scrolling.
If your survey isn't mobile-optimized, you're losing 40-60% of potential responses from customers who open it on phones and immediately close it because it's too difficult to use.
9. Make Survey Purpose Clear in Subject Line
Subject lines like "We want your feedback" are vague and get ignored. Instead, be specific about what you're asking and why it matters. Good examples: "How was your support experience today?" (clear, specific, timely). "Quick NPS check-in: 30 seconds" (sets time expectation, uses familiar term). "One question about your onboarding" (specific topic, quick commitment).
Avoid: "Customer satisfaction survey", "Tell us how we're doing", "We value your opinion". These are generic and don't tell customers what you're actually asking about or why they should respond.
10. Close the Loop Visibly
Customers are more likely to respond to future surveys if they see that you acted on previous feedback. After someone submits an NPS survey, send a follow-up email (within 48 hours for detractors, within 7 days for promoters) acknowledging their response and explaining what you're doing about it. For detractors, this is your recovery opportunity. For promoters, this is your chance to ask for a referral or testimonial.
Research shows that businesses see an 8.5% increase in retention when they close the loop with all customers, not just detractors. And customers who receive follow-up communication respond to future surveys at rates 12-18 points higher than customers who receive no follow-up.
11. Use Survey Logic to Reduce Perceived Length
Even if you need multiple questions, you can make the survey feel shorter by using conditional logic. Only show follow-up questions based on the initial NPS score. For promoters, ask "Would you be willing to write a testimonial?" For detractors, ask "What specifically disappointed you?" For passives, ask "What would make you more likely to recommend us?"
This way, every respondent only sees 2-3 questions maximum, but you're gathering different insights from each segment. The survey adapts to the customer rather than asking everyone the same battery of questions.
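The branching described above is just a score-to-question routing table. A minimal sketch using the standard NPS segment cutoffs (promoters 9-10, passives 7-8, detractors 0-6); the question strings mirror the examples in this section:

```python
def follow_up_question(nps_score: int) -> str:
    """Route the single follow-up question by NPS segment (0-10 scale)."""
    if not 0 <= nps_score <= 10:
        raise ValueError("NPS score must be between 0 and 10")
    if nps_score >= 9:   # promoters: 9-10
        return "Would you be willing to write a testimonial?"
    if nps_score >= 7:   # passives: 7-8
        return "What would make you more likely to recommend us?"
    return "What specifically disappointed you?"  # detractors: 0-6
```

Most survey tools express this same branching as display logic or skip logic in their editors; the point is that each respondent's path stays at two questions even though the program collects three different kinds of insight.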
12. Share What Happens with Feedback
Include a line in your survey invitation that explains what you do with responses. Examples: "Your feedback goes directly to our product team and helps prioritize our roadmap." "We review every response within 48 hours and personally follow up when there's an issue." "Your input has already influenced 3 feature releases this quarter."
This gives customers a reason to respond beyond "the company asked me to." They see that their feedback has real impact, not just disappearing into a dashboard nobody reads.
NPS Response Rate Benchmark by Industry
Industry affects response rates less than channel or survey design, but some patterns exist. This section provides light context only, not full industry-by-industry tables. For NPS score benchmarks by industry (what scores are typical by sector), see our comprehensive guide to what is a good NPS score.
- B2B SaaS: Email response rates trend higher (18-25%) due to engaged user bases and product-led survey triggers. Typical B2B brands can expect response rates between 4.5% and 39.3%, with an average of 12.4%.
- Healthcare: Lower rates (12-20%) due to patient survey fatigue. Hospitals send satisfaction surveys for every touchpoint.
- Financial services: Moderate rates (15-22%), higher for relational surveys sent by personal advisors vs. automated bank emails.
- Retail and E-commerce: Website and post-purchase email surveys see lower rates (10-18%) due to transactional customer relationships.
- Hospitality: Post-stay email surveys achieve 18-25% if sent within 24 hours of checkout. Rates drop sharply after 48 hours.
Are NPS Response Rates Declining? (The Macro Trend)
Yes. Response rates have been declining for years, and the trend continues. The cause is survey fatigue. Customers receive more surveys than ever, from every app, every purchase, every support interaction. The solution isn't to chase higher rates. It's to focus on data quality over quantity.
The evidence is clear across multiple sources. The Bureau of Labor Statistics notes that survey response rates have not recovered to pre-pandemic levels, and declines in response rates actually predate the pandemic. A 2024 study published in the Journal of Survey Statistics and Methodology found that one hypothesis to explain declining survey response rates over time is that individuals are receiving more and more survey requests. Research from Oxford Academic confirms this frequent survey request hypothesis.
Why this is happening: Survey saturation. Customers are surveyed by every company they interact with. Research suggests the average consumer receives 12+ survey requests per month. Inbox overload is another factor. Email volumes are up significantly. Survey invitations compete with work emails, newsletters, and promotional clutter. Privacy concerns also play a role. Customers are more cautious about sharing feedback, especially on channels they don't trust.
Even national statistics offices are struggling. The UK's Labour Force Survey response rate collapsed to near 13%, delaying data releases.
Why declining rates are okay if you adapt: A 20% response rate with a representative sample is more valuable than a 30% rate dominated by outliers. Statistical significance doesn't require high response rates. It requires sufficient sample size. For most NPS programs, 100-200 responses per segment per quarter is enough.
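To put the "100-200 responses per segment" rule in numbers: if you treat each segment share (say, the fraction of promoters) as a proportion, the usual 95% margin of error formula shows roughly what precision that sample buys you. A hedged sketch, using the worst-case p = 0.5:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion estimate from n responses.

    p = 0.5 is the worst case (widest interval); z = 1.96 is the
    two-sided 95% normal critical value.
    """
    return z * math.sqrt(p * (1 - p) / n)

# ~150 responses gives roughly a +/- 8 percentage point margin per segment share.
print(round(margin_of_error(150) * 100, 1))  # -> 8.0
```

Doubling the sample only shrinks the margin by a factor of about 1.4 (it scales with the square root of n), which is why chasing ever-higher response rates yields diminishing statistical returns compared to fixing sample bias.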
Focus on data quality (bias prevention, sample representativeness) over response rate as a vanity metric. According to the American Association for Public Opinion Research (AAPOR), the relationship between response rates and survey quality has become much less clear. Consumers of survey results should treat all response rates with skepticism and pay attention to other indicators of quality.
How to adapt: Use the tactics in this guide to maintain rates above your channel benchmark. Monitor response rate trends. Declining equals problem, stable equals fine. Accept that some customers won't respond, and that's their right.
Your Response Rate Problem Might Not Be a Response Rate Problem
Remember those 500 surveys and 82 responses from the opening?
If that's a 16% response rate on an email survey, you're within benchmark. If it's an SMS survey, something's broken. If those 82 responses are all from your biggest accounts and none from your smallest, your rate doesn't matter because your sample is skewed. The number alone doesn't tell you anything. The diagnostic framework in this guide exists to tell you what it means.
Response rates matter, but sample quality matters more. A 20% rate with promoters, passives, and detractors proportionally represented beats a 40% rate skewed toward extremes. Focus on whether your respondents look like your customer base, not whether more people clicked.
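Whether respondents "look like your customer base" is testable. One common approach is a chi-square goodness-of-fit test comparing the respondent promoter/passive/detractor counts to the mix you'd expect from your base; this sketch implements it with stdlib only, and all the counts and shares below are illustrative:

```python
def mix_is_representative(observed: dict, expected_share: dict,
                          critical: float = 5.991) -> tuple[bool, float]:
    """Chi-square goodness-of-fit of respondent mix vs. the known base mix.

    5.991 is the 95% chi-square critical value for 2 degrees of freedom
    (three segments: promoters, passives, detractors).
    """
    n = sum(observed.values())
    chi2 = sum(
        (observed[k] - n * expected_share[k]) ** 2 / (n * expected_share[k])
        for k in observed
    )
    return chi2 <= critical, round(chi2, 2)

# Hypothetical base mix and a sample of 100 respondents.
base = {"promoter": 0.5, "passive": 0.3, "detractor": 0.2}
print(mix_is_representative({"promoter": 60, "passive": 25, "detractor": 15}, base))
# -> (True, 4.08)
```

A failed check doesn't invalidate the data; it tells you to weight the segments or dig into why one group is over-responding before acting on the headline score.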
Use the strategies above to improve your NPS response rate systematically. Don't implement all of them at once. Pick three, test them, measure what moves. Then add three more.
And remember: customers who don't respond are sending a signal too. They're indifferent enough not to care, or engaged enough elsewhere that your survey doesn't register. Both are feedback. Both matter.
The real question isn't "how many people responded?" It's "what are you doing with the responses you get?" Your NPS program only works when you act on feedback. That starts with knowing how to improve NPS, understanding NPS detractors and how to convert detractors into advocates. The response rate gets you the data. What happens next determines whether it actually drives change.
Sources cited in this article: SurveySparrow 2025 Mobile Engagement Report, Clootrack 2025 Survey Response Rate Research, CustomerGauge NPS Response Rate Study and B2B Benchmarks Report, Refiner 2025 In-app Survey Response Rate Report (analyzing 1,382 surveys with 5+ million views), Delighted 2024 Analysis, Stripo 2025 Research, SightMill Data, Qualtrics XM Institute, AskYazi Research, Event Marketing Institute 2024 Study, US Bureau of Labor Statistics Survey Response Rates (linked), Journal of Survey Statistics and Methodology 2024 (linked), American Association for Public Opinion Research/AAPOR (linked), UK Office for National Statistics Labour Force Survey (linked).