Besides the instructions, the questionnaire and the “submit” button, every customer satisfaction survey goes out with a little bit of hope inside it.
One of the hopes is that enough customers will fill out the survey to make the results statistically valid.
There is also the hope that you’ll get the kind of feedback you can act upon to improve the customer experience you deliver.
For most brands, though, there’s also a more secret hope: that the feedback you get from your customers will be positive, or at least not excessively negative.
Customer satisfaction surveys have long been a core part of what are often called Voice of the Customer (VoC) programs. As the term implies, it’s all about the brand reaching out and capturing the opinions, emotions and ideas from the people who purchase its products and services.
In a way, VoC surveys can be thought of as a form of active listening to your customers. That means you should be prepared to hear them out, whether the sentiment reflects a strong relationship or one in serious jeopardy of ending badly.
The only risk with this kind of research is that the survey questionnaire is designed in such a way that you don’t get the real story.
Depending on what you ask, it can be relatively easy to ensure customers focus on the sunnier aspects of their experience – or at least avoid where the experience somehow went wrong.
Brands don’t always do this intentionally. Most marketing strategies are focused on using a positive, uplifting tone of voice. That can creep into surveys to the point where questions aren’t worded in an objective way.
Use the following process (or adapt it based on the needs of your particular business) to develop customer surveys that are as unbiased as possible. In doing so you’ll also gather data that leads to more tangible business results.
The bias may not always come from the way a brand words a question; it can also stem from a survey that isn’t well structured or is too complex.
As anyone who’s been asked to fill out a customer survey knows, the shorter and simpler the questions, the better. If you have to stop and figure out what you’re really being asked, you’re less likely to provide the most complete answer.
The same principles apply in how answers are captured. If all you give customers is text boxes to fill out, they may share more details than are relevant, or simply not share enough. If you only use multiple choice answers, you may not be giving them the scope to share insights that don’t neatly fit into an A, B, or C.
Offer a variety of response types that best suit the nature of the question you’re asking, always putting yourself in the role of respondent. Is it really a “yes” or “no” question, or something less absolute and more nuanced?
Would you say this is the best blog post you’ve read all week, or the best blog post you’ve read in your entire life?
The leading nature of this question probably jumps out at you immediately. Of course the answer will be biased, because the question was written to maximize the likelihood of a rave response! When we’re asking questions about products we market and sell, though, we can be blind to leading questions.
The same goes for loaded questions, which embed an assumption about the person answering. For instance, a survey question like “Why do you always recommend our products to friends and family?” is getting ahead of itself. First, you should ask how often they recommend your products, or how willing they would be to do so.
Double-barrelled questions are another common pitfall in customer surveys. Make sure that, in your efforts to keep the survey short and simple, you don’t cram too much into a single question.
Imagine someone who sees a question like, “How satisfied are you with our product selection and customer support?” These are two very different areas, so make them separate questions.
Bias has a way of making us doubt the person asking for feedback. We wonder if the person (or brand) just wants us to tell them what they want to hear. Although this is not what you want when your survey goes out in the field, it can be a surprisingly helpful attitude when you’re testing it beforehand.
Find some people on your team, or a trusted partner or supplier, to put your questionnaire through its paces. Divide them into two groups. One should complete the survey like a real-world customer. The other should approach it more skeptically, almost like they were playing a game of “spot the bias.”
Another trick to use in testing is to pretend you were intentionally introducing bias. Go through your survey and imagine how you’d reword it to make some questions leading, loaded, confusing or double-barrelled. If it would take a drastic rewrite, you may have a solid questionnaire for your research.
Finally, remember the ultimate intentions of customer satisfaction surveys. The first is usually to get comprehensive feedback on how well you’re delivering on your brand promise. The second is to get ideas for improvement.
When you test your survey, are you simply getting reactions, or data that will inspire action? Hopefully your research will give you some helpful numbers, but what really matters is what those numbers mean for your strategic plan to change your customer experience for the better.