Customer Satisfaction Survey Questions
Ask the right questions in your CSAT survey
Whilst variations on the statistic exist, it has long been held that happy customers are prone to tell 3 people, whilst unhappy customers tell 10. In today’s digital world, where customers can communicate with more people, more easily, these numbers are likely even higher.
If your customers are going to tell anyone about your product, service or organisation, you ideally want them to be telling you, too. Running CSAT surveys helps capture and quantify these opinions.
But an effective customer satisfaction survey needs careful attention to the questions you ask, and the order in which you ask them (customer satisfaction survey templates can help with this). It’s easy to introduce bias or to simply steer customers toward leaving positive feedback rather than capturing the insight you need to improve the experience for both current and future customers.
Tips for effective CSAT questions
Don’t make the survey too long
Your customers’ time and patience are not inexhaustible, especially if they’ve been left less than happy with their experience. Don’t fall into the trap of asking too many questions.
Brevity will give you a better chance of capturing completed surveys, but do ask enough to make the feedback meaningful. If you need more detail you can always consider running a follow-up survey, or a survey of another type, such as a customer loyalty survey or a questionnaire exploring customer engagement. A great way to collect more feedback is to use skip logic and piping to insert a response from a previous question into a follow-up question, for example: “Thanks for giving us a 5! Could you tell us why you gave us that score?”
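The skip-logic-and-piping idea can be sketched in a few lines of code. This is a minimal illustration, not any particular survey tool’s API: the function name, score thresholds and question wording are all assumptions for the example.

```python
# Illustrative sketch: pipe a previous answer (the score) into a
# follow-up question, and branch ("skip") on the score's value.

def follow_up_question(score: int) -> str:
    """Return a follow-up prompt based on a 1-5 satisfaction score."""
    if score >= 4:
        # Happy respondents: pipe the score into a "why" question.
        return f"Thanks for giving us a {score}! Could you tell us why you gave us that score?"
    elif score <= 2:
        # Unhappy respondents: skip to a question about improvement.
        return f"We're sorry your experience only rated a {score}. What could we do better?"
    else:
        # Neutral respondents: a general open-ended prompt.
        return "What would have made your experience better?"

print(follow_up_question(5))
```

Real survey platforms configure this behaviour through their survey builders rather than code, but the branching logic is the same.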
Do keep individual questions succinct
It’s not just the overall survey that should be kept to an appropriate length. Individual questions should be clear and easily understood on their first read-through. Instead of asking:
Having purchased an upgrade to [service] from [Our Company Name], how would you rate your satisfaction with the pricing, delivery and installation, using the following scale?
…a better approach would be:
How satisfied are you with the upgrade to [service]?
You’ll see we’ve also simplified the scope of the question from “pricing, delivery and installation” to “the upgrade to”. If you want an accurate response concerning their satisfaction with other aspects of the order, from pricing to installation, these should be asked as individual questions (as a customer could have differing levels of satisfaction between each element of the product or service).
Don’t use leading questions
Your questions should be objective and avoid prompting the respondent to think or answer in a particular way. For example, the question:
How positively would you rate your experience of the Customer Champion handling your query?
This is a leading question due to the subtle suggestion that the experience was to some extent “positive”, and the use of the name “Champion”. Even if this is how the team members are commonly referred to, it could influence the respondent’s feelings about the question.
In this example, a more objective question would be:
How satisfied are you with the customer service agent handling your query?
Do ensure respondents can answer
If your survey questions don’t allow the respondent to answer how they wish, they’ll get frustrated and may even abandon the questionnaire. Imagine the survey relates to your support team, and a customer has been trying to cancel a membership. If the survey asks them “What were you trying to do today?” and this option isn’t listed, how should they respond?
Should they answer inaccurately, or try to move on with the question left unanswered (at which point you’ll no longer know what their subsequent responses relate to)? Or should they just abandon your survey completely, feeling frustrated?
So, if the possibility exists that a respondent’s answer isn’t listed, give them a way to answer anyway: provide a choice of “Other” with an open text field where they can elaborate.
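The “Other” pattern can be sketched as follows. The option list and helper function are hypothetical, chosen to match the membership-cancellation example above:

```python
# Illustrative sketch: record a closed-ended answer, keeping the
# respondent's own words when they choose "Other".

OPTIONS = [
    "Get help with a product",
    "Ask a billing question",
    "Cancel a membership",
    "Other",
]

def record_answer(choice: str, other_text: str = "") -> dict:
    """Store the selected option; keep any free text when 'Other' is chosen."""
    if choice not in OPTIONS:
        raise ValueError("Choice must be one of the listed options")
    answer = {"choice": choice}
    if choice == "Other":
        answer["detail"] = other_text  # the respondent's own words
    return answer

print(record_answer("Other", "Update my delivery address"))
```

The key point is that the free text travels with the “Other” selection, so later analysis still knows what the respondent was trying to do.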
Whilst we’ve related these dos and don’ts to their use within a customer satisfaction survey, this advice does of course hold true for most customer surveys.
Using open-ended and closed-ended survey questions
For some questions it makes sense to offer the respondent a set of pre-selected answers to choose from, perhaps using multiple-choice selection, or asking for a single rating on a scale. The latter approach is a good way to start your survey, beginning with a question such as:
How would you rate your overall satisfaction with our product/service?
…and offering responses ranging from ‘Very satisfied’ through ‘Satisfied’, ‘Neutral’ and ‘Unsatisfied’ to ‘Very unsatisfied’.
Open-ended questions invite the respondent to give more detail, and are useful where the answer could be something unique rather than a pre-defined selection. Following on from our opening question you might ask something like:
Please tell us a little bit more about why you gave us this rating
Follow this with a free-text field, so the respondent can provide more detailed feedback.
As we can see, closed-ended questions are especially useful when we want quantifiable information, with customer effort surveys and the Net Promoter Score using this approach. Open-ended questions are qualitative, and useful when exploring responses in customer experience surveys or when capturing the Voice of the Customer.
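To show what “quantifiable” means in practice, here is a small sketch of one common CSAT convention (assumed here, as the article doesn’t prescribe a formula): count ‘Satisfied’ and ‘Very satisfied’ answers as positive, and report them as a percentage of all responses.

```python
# Illustrative sketch: turn closed-ended Likert responses into a
# single CSAT percentage (share of "top two box" answers).

SCALE = ["Very unsatisfied", "Unsatisfied", "Neutral", "Satisfied", "Very satisfied"]

def csat_score(responses: list) -> float:
    """Return the share of positive responses as a percentage."""
    positive = sum(1 for r in responses if r in ("Satisfied", "Very satisfied"))
    return 100 * positive / len(responses)

answers = ["Very satisfied", "Satisfied", "Neutral", "Unsatisfied", "Satisfied"]
print(csat_score(answers))  # 3 of 5 positive responses -> 60.0
```

Open-ended answers, by contrast, can’t be reduced to a single number like this; they need qualitative analysis, which is why the two question types complement each other.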