According to just over half (52 per cent) of Canadian CEOs, data about customer preferences is critical for making decisions that affect long-term success. Still, 51 per cent say the biggest obstacle to getting better data is customers’ unwillingness to share information with them. Meanwhile, the Canada Customer Experience Index, 2021, shows that emotion is still key to customer experience success.

All this adds up to the fact that understanding customer satisfaction is necessary for your company’s success. You need to harvest accurate, actionable, relevant data while making it fast and easy for your customers to supply it. A survey can be a great way to do just that — when you create a good one. 

 

 


 

What Is a Customer Satisfaction Survey and How to Get Started 

A customer satisfaction survey is a balancing act between generating accurate, relevant, actionable data and being sensitive to your customers’ privacy concerns. To find the right balance, answer the following four questions.

 


 

What’s your goal? 

Pick one specific goal, and make sure it aligns with what stakeholders on and off your team expect. It’s tempting to throw in a ton of questions while you have a customer’s attention, but that’s how you wind up with long, grueling surveys that respondents don’t finish. Possible goals include gaining insight into a specific metric, such as customer service call times or upsell conversion rates. 

 

Who should take the survey? 

Knowing who you want insights from can help you narrow your target audience. Are you looking for feedback from a broad group of customers, or just those who needed help troubleshooting your product and reached out to your customer service team? Knowing who you’re targeting can also help you determine what kinds of questions to ask.  

 

How will you distribute the survey? 

More and more companies are creating online surveys because they’re convenient, inexpensive, and flexible. But if you already have phone or in-person touchpoints, you may see higher response rates through those channels. Mailed surveys, meanwhile, have the lowest response rates, which can introduce nonresponse bias.  

 

How will you use survey results? 

Depending on the kind of data you collect, making it shareable and actionable could mean creating a spreadsheet, graphs and charts, customer experience maps, and more. Make sure you can create these deliverables quickly and accurately so you can act as soon as possible. 
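For instance, if your survey tool can export raw responses, a few lines of Python can turn them into a shareable spreadsheet summary. This is only a sketch: the field names ("question", "score") and the output file are hypothetical placeholders for whatever your platform actually exports.

```python
# A minimal sketch: turn raw survey responses into a shareable CSV summary.
# The field names ("question", "score") are hypothetical; adapt them to
# whatever your survey platform actually exports.
import csv
from collections import defaultdict

responses = [
    {"question": "How satisfied are you with our company?", "score": 5},
    {"question": "How satisfied are you with our company?", "score": 4},
    {"question": "How likely are you to recommend us?", "score": 9},
]

# Group scores by question, then write one summary row per question.
scores_by_question = defaultdict(list)
for r in responses:
    scores_by_question[r["question"]].append(r["score"])

with open("survey_summary.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["question", "responses", "average_score"])
    for question, scores in scores_by_question.items():
        writer.writerow([question, len(scores), round(sum(scores) / len(scores), 2)])
```

From there, the CSV can feed charts, dashboards, or a customer experience map in whatever tool your stakeholders already use.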

 

How to Write Screener Questions 

Your first question should be a screener question, which will act like the sorting hat in Harry Potter, placing respondents into the “target audience” house or the “not target audience” house. Those not in your target audience should be thanked for their time without further questions. Those in your target audience should continue through the rest of the survey so you can collect their answers. 

Your screener question will depend on who your target audience is. Here are three examples: 

  • Have you shopped for a [product] in the last six months? 

  • Have you used our service in the last month? 

  • Have you used our customer service in the last three months? 

Your target audience may be more complex than “a recent customer.” In that case, you may need more than one screener question to keep each question simple. 

Instead of: 

  • Are you a parent of at least one child under the age of 18, do you use our product at least weekly, and have you spoken to a customer service representative in the last three months? 

This could be two screener questions: 

  • Are you a parent of at least one child under the age of 18, and do you use our product at least weekly? 

  • Have you used our customer service or help desk in the last three months? 

In some cases, you won’t need any screener questions. If a purchase or service interaction auto-triggers a survey and you can tie that back to a ticket or CRM (customer relationship management) record, it may already be obvious that the respondent falls into your target audience. Don’t waste time with questions you already have the answer to unless you absolutely need to verify the information. 
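If you’re scripting the survey flow yourself rather than using an off-the-shelf tool, screener routing is just a branch at the top of the flow. A minimal sketch, with hypothetical question text and a simple yes/no check:

```python
# Minimal sketch of screener routing: respondents outside the target audience
# are thanked and exited; everyone else continues to the full survey.
# Question text and the yes/no check are illustrative only.

SCREENERS = [
    "Have you shopped for a [product] in the last six months? (y/n): ",
    "Have you used our customer service in the last three months? (y/n): ",
]

def passes_screeners() -> bool:
    """Return True only if the respondent answers yes to every screener."""
    for question in SCREENERS:
        answer = input(question).strip().lower()
        if answer not in ("y", "yes"):
            return False
    return True

if passes_screeners():
    print("Great, on to the survey.")   # continue to the main questionnaire
else:
    print("Thanks for your time!")      # screen out politely, no further questions
```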

 


 

Customer Satisfaction Questions 

The bread and butter of customer satisfaction surveys is a data point called CSAT, or Customer Satisfaction. As the name implies, this score measures whether a customer is happy with your product, service, company, or a specific experience. For example: 

  • How satisfied are you with our company? 

  • [Product or service] made it easy to [accomplish my goal]. 

      • Strongly agree 

      • Agree 

      • Neither agree nor disagree 

      • Disagree 

      • Strongly disagree 

Many companies create quick, easy surveys with a single CSAT question in the form of a website pop-up or automated phone survey. Because it’s such a small ask, they can pose the question often with less risk of intruding on customers’ time. 
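To report CSAT as a single number, a common convention (an assumption here, not something every team follows) is the percentage of respondents who choose one of the top two ratings on a five-point scale:

```python
# CSAT as the share of "satisfied" responses: ratings of 4 or 5 on a 1-5 scale.
# The top-two-box threshold is a common convention, not a universal rule.
ratings = [5, 4, 3, 5, 2, 4, 5, 1, 4]

satisfied = sum(1 for r in ratings if r >= 4)
csat = 100 * satisfied / len(ratings)
print(f"CSAT: {csat:.0f}%")  # 67% for the sample ratings above
```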

 

Avoid biased phrasing. 

If you want accurate answers, avoid phrasing that builds a bias into the question. Examples of biased CSAT questions to avoid: 

  • How much do you love this product? 

  • What do you like most about our company? 

  • How would you rate our award-winning customer service? 

 

Net Promoter Score 

While CSAT asks about satisfaction, Net Promoter Score (NPS) takes it a step further. It asks whether the customer would recommend your company to someone else.  

  • How likely are you to recommend this product to a friend or colleague? 

  • Based on your interaction with our support team, how likely are you to recommend our company to a friend or colleague? 

  • How likely are you to recommend this app to other small business owners? 

Answers should be single-selection multiple choice or on a Likert scale. The classic NPS question uses a zero-to-ten scale, with scores of 9 or 10 counted as promoters and 0 through 6 as detractors, but you can also ask respondents to choose a phrase (such as not likely, likely, very likely) or even an emoji.  

Many surveys ask just this one question, which people can answer in a few seconds. Others add an open-ended follow-up question to draw more detail out of the respondent. 
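On the classic zero-to-ten scale, the score itself is the percentage of promoters minus the percentage of detractors. A quick sketch of that arithmetic:

```python
# NPS = % promoters (scores of 9-10) minus % detractors (scores of 0-6).
# Passives (7-8) count toward the total but neither add nor subtract.
scores = [10, 9, 8, 7, 6, 10, 3, 9, 5, 10]

promoters = sum(1 for s in scores if s >= 9)
detractors = sum(1 for s in scores if s <= 6)
nps = 100 * (promoters - detractors) / len(scores)
print(f"NPS: {nps:+.0f}")  # ranges from -100 to +100; +20 for the sample scores
```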

 

Keep it neutral or make it dynamic. 

A neutral follow-up question reads naturally whether or not the customer said they’d recommend you: Why or why not? 

If your survey platform can respond dynamically based on the answer to the first question, you can ask a more specific follow-up question; a simple branching sketch follows the examples below. 

For example, when a respondent indicates they’d recommend your company, consider these follow-ups: 

  • What would you say to your friend to recommend us? 

  • What do you like about our company? 

  • Which features of [product] do you value the most? 

If they wouldn’t recommend you, consider these questions: 

  • What can we do better? 

  • What was missing from our service? 
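If you’re wiring up that dynamic behaviour yourself, the branching is straightforward. A minimal sketch, assuming an NPS-style zero-to-ten first question and the hypothetical follow-up wording above:

```python
# Minimal sketch of a dynamic follow-up: choose the open-ended question based
# on the respondent's 0-10 likelihood score. Thresholds follow the usual NPS
# bands; the question wording is illustrative.
def follow_up_question(score: int) -> str:
    if score >= 9:                # likely promoter
        return "What would you say to a friend to recommend us?"
    if score <= 6:                # likely detractor
        return "What can we do better?"
    return "Why or why not?"      # passive: keep the follow-up neutral

print(follow_up_question(10))  # promoter follow-up
print(follow_up_question(4))   # detractor follow-up
```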

 


 

Deeper Questions

Open-ended follow-up questions are a great way to dig deeper into a customer’s experience, but the answers can be hard to quantify and take more effort for customers to write. If there are specific products, services, or experiences you want to know more about, consider including some closed-ended questions that dive a little deeper; these ask respondents to pick from a fixed set of answers, often just yes or no. 

  • Are you considering future purchases from our company? 

  • How did you learn about us? 

  • Is this service priced fairly? 

  • Did anyone help you make this purchase? 

 

Keep it short. 

The longer and more challenging your survey, the more likely people are to exit partway through and leave questions unanswered, which may leave you with too little information. Think of every question as taking up valuable real estate, because it is.  

 

Demographic Questions 

Often, you’ll want to learn how well you’re serving different customer segments, or who your best customers are so you can find more of them. That’s where demographic questions come in. When they’re relevant to your goal, you could ask about topics such as gender, age, location, profession, education level, religious affiliation, or income.  

 

Be inclusive. 

Help respondents answer accurately and in a way they’re comfortable with. When respondents feel included, they’ll often continue to the next question. Conversely, if a respondent is forced to pick an answer that doesn’t accurately describe them and then keeps going, your data may be inaccurate. 

For example, if you’re asking about gender, don’t stop at male or female. Adding “non-binary” or “Other gender identity” will help make these customers feel seen and not force them to choose an inaccurate answer to move forward in the survey. 

  • What is your gender?  

      • Man 

      • Woman 

      • Other gender identity 

 

Give them a way out. 

Some demographic questions are uncomfortable for some customers. Some people may not want to reveal their income, exact age, or location, for example. To prevent customers from dropping out to avoid sharing more than they want to, give them a way out by including “Prefer not to say” or “I’m not sure.” 

For example: 

  • What is your ZIP code? 

      • [Key in response] 

      • I’m not sure or I prefer not to say 

 

Avoid ambiguity. 

To avoid confusing or frustrating your respondents, make sure only one answer can apply to them in a single-select multiple-choice question. For example, if you ask about income or age and provide overlapping ranges in the answer options, customers won’t be sure which one to select. Make sure your ranges don’t overlap, as in the examples below; a quick programmatic check follows them. 

  • What is your annual income? 

      • $25,000 or below 

      • $25,001 to $50,000 

      • $50,001 to $75,000 

      • $75,001 to $100,000 

      • $100,001 or above 

  • What is your age? 

      • Under 18 

      • 18-24 

      • 25-34 

      • 35-44 

      • 45-54 

      • 55-64 

      • 65 or older 
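If you generate answer options programmatically, a quick check can catch overlapping ranges or gaps before the survey goes out. A small sketch, assuming the options are stored as inclusive (low, high) pairs:

```python
# Sanity-check that single-select ranges neither overlap nor leave gaps.
# The (low, high) pairs are inclusive dollar amounts; None means no upper bound.
income_ranges = [(0, 25_000), (25_001, 50_000), (50_001, 75_000),
                 (75_001, 100_000), (100_001, None)]

for (low, high), (next_low, _) in zip(income_ranges, income_ranges[1:]):
    assert high is not None and next_low == high + 1, f"Gap or overlap after {high}"

print("Ranges are contiguous and non-overlapping.")
```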

 

Conclusion 

Creating a survey that’s easy on your customers and provides great insights for you is no easy task. It’s even harder if you try to do it all on your own. Lean on your colleagues or a small group of customers to test your survey before distributing it widely. You may go through a few iterations, but the result will be a better experience for respondents and more accurate, reliable, and actionable insights for you.