TL;DR
- Email survey response rates have dropped to 15–25%. SMS pulls 40–55%, and most responses come back within minutes.
- Salesforce doesn't send SMS natively. You need a connected survey tool. Salesforce handles the trigger and stores the data; the SMS platform handles delivery and response capture.
- Start with post-case CSAT. One trigger, one question, responses mapped to the Case record. That's the fastest way to prove the model before expanding.
- US businesses must complete A2P 10DLC registration before sending. It takes 1–3 weeks. Plan for it.
- TCPA requires express written consent, opt-out instructions in every message, and send times between 8 AM and 9 PM local. Non-compliance runs $500–$1,500 per unauthorized message.
- Six months in, the signal should be clear: response rates above 35%, loop closure on low scores within 48 hours, CSAT trends by agent visible inside Salesforce.
Most teams defaulted to email surveys because the setup was easy. Already in Salesforce. Email already flowing. Someone pastes a survey link into the template, the case closes, and the email goes out. Done.
That worked well enough for a while. It doesn't anymore.
Average email survey response rates are sitting at 15–25% and still going down. Inbox overload, mobile friction, survey fatigue — customers are conditioned to ignore them. By the time someone opens an email about a case that closed two days ago, the experience has faded enough that the data is soft. And the teams that do notice the drop usually try fixing it with better subject lines or shorter surveys, which helps a little and then plateaus.
The issue isn't the email. It's the channel. SMS surveys flip the dynamic. A retail customer we work with switched from email-linked surveys to SMS triggered within 30 minutes of purchase. Response rate jumped from 14% to 48%. Same customers. Same timing. Different channel.
Quick note on something that trips people up early: Salesforce doesn't send SMS natively. Salesforce handles the trigger and stores the response data. A connected SMS survey tool handles delivery, survey rendering, and response capture. You need both. That distinction matters for how you set this up, and we'll come back to it in the walkthrough.
For a broader view of Salesforce survey setup across all channels, start with the complete Salesforce surveys guide.
Is a Salesforce SMS Survey Right for Your Team?
Not every team. SMS surveys are high-leverage for specific use cases, and running them where they don't fit costs real money and burns your opt-in list fast.
You'll get actual value from this if you're running support operations at meaningful volume and your CSAT data is thin, delayed, or both. If you're in healthcare, property management, field services, or financial services, where customers are already on mobile as a default. If you want pulse signals from key accounts between quarterly check-ins and email open rates have made that unreliable.
It's less useful for relationship NPS at enterprise accounts, where depth matters more than speed and a phone call often gets you better data. Or for use cases where the customer is already in a web session — in-app surveys win there because the friction is even lower than SMS.
If your instinct going into this is "post-case CSAT isn't giving us useful data" or "we're sending surveys and getting 12% response rates," that's the right instinct. Keep reading.
Salesforce SMS Survey Use Cases: Where It Actually Works
Spend a few minutes here before touching a single Flow. The teams that get the most out of Salesforce SMS surveys aren't the ones with the most sophisticated setup — they're the ones who picked the right starting moment. One trigger, tight scope, prove the model. Then expand.
Post-case CSAT. Start here. A support ticket closes in Salesforce, an SMS fires 20 minutes later, the customer taps a rating. Simple. But what makes it useful is that the response maps to the specific Case record, not just a Contact, so you get a score tied to a specific agent, issue type, and product area. That's what turns it from a dashboard vanity metric into something a support manager can actually use for coaching. It's also the simplest trigger to configure, which matters when you're still figuring out the integration.
Post-purchase NPS. Right after a purchase or subscription renewal, customers are actively thinking about the decision they just made, which is exactly when NPS means something. Trigger from an Opportunity stage change to Closed Won, map the score to the Account record, and your CS team has a signal before their next check-in call. One timing note worth making twice: wait two weeks, not two days. Let the first real product interaction happen before you ask someone how likely they are to recommend you.
CES during onboarding. This one is underused. If a new customer is struggling with setup in week one, you want to know before they quietly churn six weeks later. A CES survey after key onboarding milestones (first login, first workflow activated) surfaces exactly where things break down. For SaaS teams already tracking onboarding stages in Salesforce, you can route low CES scores automatically to a customer success manager. That's the gap between "customer is frustrated and saying nothing" and "someone actually does something" — closed before the relationship sours. Attribution questions work here too — triggered during onboarding, they tell you which channels are actually bringing in customers worth keeping.
Trial conversion signals. Two moments consistently produce useful data. One or two days before trial expiry gauges purchase intent while you can still act on it. One day after expiry captures why they didn't convert. Trial users respond at higher rates than almost any other segment because they're actively evaluating and they have opinions. One question here can change how sales prioritizes follow-ups for the rest of the week.
Appointment-driven industries. Healthcare, financial services, property management. A short CSAT survey right after an appointment catches the experience while it's still fresh. These customers are already on mobile, so the channel fits the moment. And because responses map back to the relevant Salesforce record, you end up with a real interaction history over time instead of scattered one-off surveys that never connect to anything.
SMS vs. Email Survey Response Rates: Why the Gap Keeps Growing
Email has a sequencing problem that's hard to solve from within email. The customer finishes a support call, the email arrives, they're on mobile about to do something else. They see it. "I'll do that later." They don't. By the time they notice it again — if they do — the memory has faded enough that whatever they submit isn't really the data you wanted. That's not a hypothetical. That's what 15–25% response rates look like in practice.
SMS doesn't have that problem because the phone is already in their hand. The message arrives in the same moment people are paying attention. No inbox to dig through. Survey right there. Tap a number, maybe type a line, done. Surveys sent within two hours of an event see roughly 32% more completions than delayed ones, and on SMS the timing advantage compounds because the open rate is so high to begin with.
There's a second piece here that's worth being specific about. When SMS responses map directly to Salesforce Contacts, Cases, Accounts, or Opportunities, nothing gets stuck in a separate tool or waiting on an export. Sales sees the NPS score before a renewal call. Support managers see CSAT by agent, not just aggregate. Ops spots patterns across locations. The intelligence reaches the people who can act on it, which is the part that most survey programs get wrong — not the collection, the routing.
| | SMS Surveys | Email Surveys |
| --- | --- | --- |
| Response rate | 40–55% | 15–25% |
| Speed of response | Minutes | Hours to days |
| Survey length | 1–3 questions max | 5–15 questions possible |
| Best for | Post-interaction, transactional, time-sensitive | Detailed relationship surveys, quarterly check-ins |
| Personalization | Merge fields, short format | Rich HTML, branding, embedded questions |
| Cost per survey | Higher (SMS delivery fees) | Lower (email is essentially free) |
| Compliance | TCPA opt-in, 10DLC registration, restricted hours | CAN-SPAM, less restrictive |
| Ideal metric | CSAT, CES (transactional) | NPS (relationship), detailed CSAT |
For most teams, running both channels makes more sense than picking one. SMS for the immediate post-interaction signal; email for relationship surveys or when a mobile number isn't on the Contact. They cover different moments and they don't compete. For the email side of this setup: embedding surveys in Salesforce emails.
How to Set Up SMS Surveys in Salesforce: Step-by-Step
Most teams are live within a week. The process isn't complicated, but there are two or three points where people make decisions that come back to haunt them. I'll flag those as we go.
Step 1: Choose a Salesforce SMS survey tool
Start here because this choice affects everything downstream — including steps you haven't thought about yet. Salesforce doesn't handle SMS delivery natively, so you need a third-party survey platform with a Salesforce integration. That platform owns delivery, survey rendering, and response capture.
The things that actually matter when evaluating SMS survey software:
- Native Salesforce integration with managed field mapping. Tools that only connect via Zapier or webhooks will cause you debugging pain when field mapping breaks and you have no way to trace it inside Salesforce.
- Built-in SMS delivery infrastructure, so you're not also managing a separate Twilio setup on top of this.
- Automation trigger support, so surveys can fire from Case status changes, Opportunity updates, or custom object events without needing a developer.
Zonka Feedback is what we build: native Salesforce integration, managed and custom field mapping, SMS and WhatsApp distribution, NPS/CSAT/CES templates, AI analysis that writes back to Salesforce records. For a broader comparison of options: Salesforce survey tools roundup.
Step 2: Connect to Salesforce and configure field mapping
Connecting to Salesforce and authenticating your org usually takes about ten minutes. Field mapping is where you need to slow down. This is the decision that determines where survey responses actually land inside Salesforce, and getting it wrong is a pain to fix after you have data.
The basic logic: Contact or Lead fields for individual respondent data. Case fields for post-resolution feedback — this is what makes agent-level CSAT reporting possible. Account-level rollups for aggregate satisfaction scores. Custom objects if your workflow needs a dedicated feedback record type.
Map as close to the trigger as possible. If a Case triggered the survey, the response should land on that Case. That's what keeps data in context and makes Salesforce reports actually useful instead of just present. In Zonka Feedback, it's either Managed Mapping (one-click for standard objects) or Custom Mapping, where you define the object, conditions, and triggers yourself. The Salesforce survey mapping types guide covers the specifics if you're not sure which to use.
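To make "map to the triggering record" concrete, here's a simplified sketch of the logic: a survey response being shaped into an update on the Case that fired the survey, rather than just the Contact. The field API names (`CSAT_Score__c`, `CSAT_Comment__c`) are hypothetical custom fields for illustration, not Zonka Feedback's or Salesforce's actual schema.

```python
# Illustrative only: the custom field names below are hypothetical,
# not a specific tool's real mapping schema.

def to_case_update(response: dict) -> dict:
    """Shape a survey response into an update on the Case that triggered it,
    so the score stays tied to the agent, issue type, and product area."""
    return {
        "Id": response["case_id"],  # the triggering Case, not just the Contact
        "CSAT_Score__c": response["score"],
        "CSAT_Comment__c": response.get("comment", ""),
    }

print(to_case_update({"case_id": "500XYZ", "score": 4, "comment": "Fast fix"}))
```

The point of the sketch: the response carries the Case Id from the trigger all the way back, so reports can slice CSAT by anything already on the Case.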
Step 3: Build the SMS survey
Short. One metric question (NPS, CSAT, or CES) and an optional open comment field. That's it. The 160-character message limit matters more than people realize: anything longer splits across multiple texts, which looks broken on some devices and tanks completion rates. Keep the message itself well under that limit after the merge fields are populated.
Personalization consistently moves response rates more than any other single variable. Generic messages underperform personalized ones every time:
SMS message example (Salesforce merge fields):
Hi {!Contact.FirstName}, thanks for reaching out. How satisfied were you with support on Case #{!Case.CaseNumber}? [survey link] Reply STOP to opt out.
The merge fields pull directly from the Salesforce record that triggered the survey. No manual personalization. No "Dear Customer."
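If you want a guardrail for the 160-character rule, a pre-send length check is easy to sketch. This is illustrative Python, not how any particular tool renders merge fields — the actual `{!Contact.FirstName}` substitution happens inside the platform — but the check itself is the same: render with real values first, then measure.

```python
from string import Template

SMS_SINGLE_SEGMENT_LIMIT = 160  # GSM-7 single-segment limit

# Stand-in for the Salesforce merge-field template above.
template = Template(
    "Hi $first_name, thanks for reaching out. How satisfied were you "
    "with support on Case #$case_number? $link Reply STOP to opt out."
)

def render_sms(values: dict, link: str) -> str:
    """Render the message and fail loudly if it would split into segments."""
    body = template.substitute(values, link=link)
    if len(body) > SMS_SINGLE_SEGMENT_LIMIT:
        raise ValueError(f"Message is {len(body)} chars; it will split across segments")
    return body

print(render_sms({"first_name": "Dana", "case_number": "00412873"},
                 "https://zfb.ly/s/abc123"))
```

The key detail: measure after the merge fields are populated. A template that fits at 140 characters can blow past 160 once a long first name or case number lands in it.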
Step 4: Set up a Salesforce Flow trigger for SMS surveys
Flow is the right tool for most teams. Build a record-triggered Flow that fires when a Case closes, an Opportunity moves to Closed Won, or any custom field updates. The Flow calls your survey tool's API or kicks off a pre-configured SMS distribution. Process Builder still works for simple triggers, but Salesforce is deprecating it in favor of Flow — don't build anything new there.
Two things people get wrong with timing. First: firing the SMS the instant a case closes. Give it 15–30 minutes. Let the interaction finish landing before you ask about it. Second: not restricting send hours. TCPA requires 8 AM to 9 PM in the recipient's local time zone. Build that into the Flow logic before you go live, not after the first complaint.
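Both timing rules live in the Flow itself (a scheduled path for the delay, a decision element for the hours check), but it helps to see the quiet-hours logic spelled out. A minimal sketch, in Python rather than Flow, of "send now if inside the 8 AM–9 PM local window, otherwise defer to 8 AM local":

```python
from datetime import datetime, time, timedelta
from zoneinfo import ZoneInfo

QUIET_END = time(8, 0)     # no sends before 8 AM local
QUIET_START = time(21, 0)  # no sends after 9 PM local

def next_allowed_send(now_utc: datetime, recipient_tz: str) -> datetime:
    """Earliest TCPA-allowed send time for this recipient (returned in UTC)."""
    tz = ZoneInfo(recipient_tz)
    local = now_utc.astimezone(tz)
    if QUIET_END <= local.time() < QUIET_START:
        return now_utc  # already inside the allowed window
    # Defer to 8 AM local: today if it's before 8 AM, tomorrow if after 9 PM.
    send_day = local.date() if local.time() < QUIET_END else local.date() + timedelta(days=1)
    local_send = datetime.combine(send_day, QUIET_END, tzinfo=tz)
    return local_send.astimezone(ZoneInfo("UTC"))

# Example: a case that closes at 11 PM Eastern gets its survey at 8 AM next day.
now = datetime(2025, 6, 1, 3, 0, tzinfo=ZoneInfo("UTC"))  # 11 PM EDT, May 31
print(next_allowed_send(now, "America/New_York"))
```

One assumption worth flagging: this needs a reliable recipient time zone. If your Contact records don't store one, area-code inference is a rough fallback, and the conservative move is to restrict sends to a window that's legal across all US time zones.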
Step 5: Test end-to-end before touching production
Create a test Case in your sandbox. Trigger the status change. Confirm the SMS arrives with correct merge fields. Complete the survey. Verify the response lands on the right record with the right fields populated. Run it on both iOS and Android with a few team members first. Then go live on one queue or product line and watch it for a week before expanding. A week of real data on a narrow scope will tell you things that no amount of sandbox testing can.
TCPA and 10DLC Compliance for Salesforce SMS Surveys
This is the part people scan first and then set aside to deal with later. Don't. You can't go live without it and two of the three items have lead times that will push your launch if you wait.
A2P 10DLC Registration for Salesforce SMS
What it is: The US carrier system that requires businesses to register their brand and messaging campaigns before sending business SMS at scale. Skip it and carriers filter or block your messages regardless of how clean your opt-in list is. It doesn't matter how well your Flow is configured.
What you need to do:
- Register your brand with The Campaign Registry (company details, use case)
- Register each messaging campaign separately (surveys count as one campaign type)
- Most SMS platforms handle this during onboarding and walk you through it
Timeline: 1–3 weeks for approval. If you build the full Flow automation and then start 10DLC registration, you're waiting. Start this the same day you start the integration.
TCPA Compliance for SMS Surveys
The Telephone Consumer Protection Act covers all business SMS in the US. Survey texts are included.
Required:
- Express written consent before adding anyone to an SMS survey list — a checked checkbox or keyword opt-in both qualify
- "Reply STOP to opt out" in every message
- Opt-outs processed within 10 business days (the FCC tightened this from 30 days in April 2025 — automate it, don't rely on someone doing it manually)
- Send times between 8 AM and 9 PM recipient local time, no exceptions
- Consent records stored in Salesforce (a custom field like SMS_Survey_Consent__c) so you have documented proof
Penalties: $500–$1,500 per unauthorized message. TCPA class actions went up over 100% year-over-year in 2025. Keep this inside Salesforce where it's auditable, not in a spreadsheet.
GDPR, International, and HIPAA
For EU and UK contacts: you need a lawful basis for processing mobile data and sending SMS. Consent is the most common. Add an SMS opt-in field to your Contact records and have your Flow check it before sending. No opt-in, skip SMS and fall back to email. That fallback logic is worth building into the Flow from the start.
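That fallback is a single decision element in the Flow, but here's the shape of it as a sketch. `SMS_Survey_Consent__c` is the custom consent field suggested in the TCPA section; `MobilePhone` is the standard Salesforce Contact field. The function itself is illustrative, not any tool's actual API:

```python
def pick_channel(contact: dict) -> str:
    """Channel fallback: SMS only with an explicit opt-in AND a mobile number
    on the Contact; otherwise fall back to email."""
    if contact.get("SMS_Survey_Consent__c") and contact.get("MobilePhone"):
        return "sms"
    return "email"

print(pick_channel({"SMS_Survey_Consent__c": True, "MobilePhone": "+15555550100"}))   # sms
print(pick_channel({"SMS_Survey_Consent__c": False, "MobilePhone": "+15555550100"}))  # email
```

Note the AND: a mobile number without recorded consent is not enough, and treating it as enough is exactly the kind of shortcut that shows up later in a TCPA complaint.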
Healthcare use cases: verify your SMS platform is HIPAA-compliant before connecting it to patient records in Salesforce. Not all of them are.
What Actually Improves SMS Survey Response Rates in Salesforce
Timing is the highest-leverage variable, which is worth saying again even though it came up in the setup section. Surveys sent within two hours of an event see roughly 32% more completions. For post-case CSAT specifically, 15–30 minutes after closure tends to be the sweet spot. Close enough that the experience is fresh, far enough that the agent has actually wrapped up. Waiting until the next morning costs you a third of your responses before you've done anything else wrong.
Question count matters more than most teams expect. One metric question with an optional comment field consistently beats surveys with three or four. If you need more depth from specific respondents, route them to a longer email follow-up or a callback workflow after they respond — don't try to get everything in one text. The 160-character constraint makes this a forcing function anyway, but the temptation to ask one more question is real and worth resisting.
Frequency is easy to miscalibrate. More than once every 30 days via SMS and opt-out rates climb. Track last survey date on each Contact record and build a suppression window into your automation. This is one of those things that seems fine when you're first setting it up and then becomes a problem six months later when your opt-in list has quietly shrunk. Easier to build it in from the start.
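The suppression window is simple enough to sketch. In practice you'd build it as a Flow decision element or a filtered query, not literal Python, and `Last_SMS_Survey_Date__c` is a hypothetical custom field you'd add to the Contact — but the logic is just this:

```python
from datetime import date, timedelta

SUPPRESSION_DAYS = 30  # at most one SMS survey per contact every 30 days

def eligible_contacts(contacts: list[dict], today: date) -> list[dict]:
    """Filter out contacts surveyed within the suppression window.

    Each dict mirrors a Salesforce Contact with a hypothetical
    Last_SMS_Survey_Date__c custom field (None if never surveyed).
    """
    cutoff = today - timedelta(days=SUPPRESSION_DAYS)
    return [
        c for c in contacts
        if c["Last_SMS_Survey_Date__c"] is None
        or c["Last_SMS_Survey_Date__c"] <= cutoff
    ]

contacts = [
    {"Id": "003A", "Last_SMS_Survey_Date__c": date(2025, 5, 20)},  # 12 days ago: suppressed
    {"Id": "003B", "Last_SMS_Survey_Date__c": date(2025, 3, 1)},   # well outside the window
    {"Id": "003C", "Last_SMS_Survey_Date__c": None},               # never surveyed
]
print([c["Id"] for c in eligible_contacts(contacts, date(2025, 6, 1))])  # ['003B', '003C']
```

The other half of the mechanism: every successful send must write today's date back to that field, or the window never moves and the check does nothing.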
If your tool supports two-way SMS, set up conversational follow-ups. Customer responds with a low CSAT, the tool automatically sends back: "We're sorry to hear that. Can you tell us more about what happened?" That single message captures qualitative data that would otherwise disappear and signals to the customer that someone's actually reading the responses. We've seen this show up in opt-in retention rates over time, which makes sense — people stay opted in when they feel like the survey isn't going into a void.
For field service teams operating in areas with spotty coverage: pair SMS surveys with offline surveys synced to Salesforce. Responses captured without connectivity sync back once a signal is available.
How to Measure Salesforce SMS Survey Performance: What Good Looks Like at 6 Months
Most guides end at "you're set up." A lot of programs stall because nobody defined what success looks like past that point. Here's what to actually watch — and what early failure looks like before it compounds.
Response rate. Healthy post-case CSAT programs typically hit 35–50% within the first 60 days once timing and personalization are dialed in. Under 25% after two months usually means one of three things: survey going out too late (more than two hours after case close), generic message (no merge fields), or cold opt-in list (customers didn't expect to hear from you via SMS). Fix timing first. It's the fastest variable to move and has the biggest impact.
Loop closure rate. Most teams don't track this and it's the one metric that tells you whether the program is actually working — not just collecting. What percentage of low scores (CSAT below 3, NPS detractors) triggered an automated follow-up? Of those, how many got a human response within 48 hours? If you're collecting feedback and nothing routes automatically from a bad score, the data is sitting idle. And worth saying plainly: customers notice when they submit a low score and nothing happens. It's worse than not asking.
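If you want to compute loop closure outside a dashboard, the metric itself is simple. A sketch with illustrative field names (this assumes CSAT on a 1–5 scale, with "low" meaning below 3, as above):

```python
from datetime import datetime, timedelta

FOLLOW_UP_SLA = timedelta(hours=48)

def loop_closure_rate(responses: list[dict]) -> float:
    """Share of low scores (CSAT < 3) that got a human follow-up within 48h.
    Field names here are illustrative, not a real schema."""
    low = [r for r in responses if r["csat"] < 3]
    if not low:
        return 1.0  # nothing to close
    closed = [
        r for r in low
        if r["followed_up_at"] is not None
        and r["followed_up_at"] - r["responded_at"] <= FOLLOW_UP_SLA
    ]
    return len(closed) / len(low)

responses = [
    {"csat": 2, "responded_at": datetime(2025, 6, 1, 10), "followed_up_at": datetime(2025, 6, 2, 9)},  # closed in 23h
    {"csat": 1, "responded_at": datetime(2025, 6, 1, 11), "followed_up_at": None},                     # never followed up
    {"csat": 5, "responded_at": datetime(2025, 6, 1, 12), "followed_up_at": None},                     # not a low score
]
print(loop_closure_rate(responses))  # 0.5 — one of two low scores closed in time
```

A rate of 0.5 here would be the failure signal the section above describes: feedback is being collected, but half the unhappy customers never hear back.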
CSAT trends by agent inside Salesforce. By month three, support managers should be looking at agent-level CSAT trends in the same reports they already use for case volume and resolution time. If that data still lives in a separate dashboard, the field mapping setup needs attention. We mentioned this in the mapping section — it's worth repeating because it's where a lot of programs lose their audience internally. The intelligence needs to reach the people who can act on it, inside the tool they're already using.
360-Degree View of Customers: The teams that get the most out of this setup eventually stop thinking about SMS surveys as a support tool and start thinking about them as a shared data layer. Marketing uses positive feedback signals to identify which customer segments are engaged enough for referral campaigns or upsell outreach. Sales uses that same data to prioritize accounts worth expanding — customers who scored 9 or 10 on NPS are a different conversation than customers who scored 6. Service uses low scores to flag at-risk accounts before they escalate. None of this requires building something new. The data is already in Salesforce. It just needs to reach the right team, in the right report, at the right time.
What failure looks like: Response rates that start strong and drop steadily usually mean survey fatigue — sending too often, or to segments that haven't really opted in. Low scores that never generate follow-up Tasks usually mean loop closure wasn't configured. A CSAT dashboard nobody checks usually means the reports aren't tied to anything anyone is accountable for. All of it is fixable. Easier at month two than month six.
The iteration pattern: One trigger. 60 days. Review response rates, loop closure rates, and whether scores line up with what support managers already know from direct customer feedback. Then expand to a second trigger — onboarding CES, post-purchase NPS — and repeat. Teams that try to run five triggers simultaneously from day one almost always end up with noisy data and no clear owner for any of it.
Open-text comments are where most of the real intelligence lives and the part most teams skip. Reading hundreds of two-line SMS comments manually doesn't scale. AI feedback analysis groups those comments by theme automatically — instead of "CSAT dropped this week" you get "four customers mentioned long hold times after being transferred," which is something you can actually do something about. Zonka Feedback's AI analysis handles this and maps the output back to Salesforce records so it reaches the right team without an export step in between.
Ready to connect SMS surveys to your Salesforce workflows?
See exactly how Zonka Feedback's Salesforce integration handles triggers, field mapping, and AI analysis — and get your first SMS survey live within a week.