Removing Bias In Your Customer Surveys

Written by Baesman | May 28, 2021

Lately, many brands have been focusing on “voice of the customer,” or VoC, programs to generate insights, produce intelligence, and build action plans that align with their customers. VoC programs are extremely beneficial given the constant state of change and innovation that has crept into every sector of customer marketing.

From retail to finance, every customer-facing channel has been shaken by changes in customer expectations.

A fundamental cog in VoC programs is the customer survey. It’s one of the easiest ways to produce quantitative metrics at scale and show the breadth of sentiment across a large cross-section of the customer base.

As marketers, we can’t place enough value on customer feedback, but we need to be careful not to make customer surveys a self-fulfilling prophecy to soothe brand egos. Often, our own bias creeps into the surveys through copy, creative, leading questions, and incentives—and self-fulfilling surveys don’t help anyone.

A biased survey can provide a false foundation, and when the metrics later suffer, the root cause remains murky. It could be poor strategy, faulty execution, fragmented channels, or a host of other issues, but often it comes back to the foundation everything was built upon.

It’s imperative to remain neutral when developing surveys; that’s why pollsters spend so much time and money removing bias from political polls. Neutrality is essential if a fact-based strategy is to move forward.

Here are some strategies you can focus on to help improve your customer surveys:

Find the Root

The biggest issue with surveys is that the questions are usually too complex. They don’t get to the root emotion or behavior the brand is trying to identify. The further a question strays from that root behavior, the more variables marketers must contend with, and the more false red flags appear.

Within each question, strip away how marketers, retailers, and data analysts think about it, and instead approach it from the customer’s perspective. What information are you trying to glean? Is there a simpler way to ask the question? And what variables might confound the response?

Narrow Focus

Keep it simple. No matter how good marketers might be, they aren’t experts at polling, and even the experts get it wrong quite a bit. Narrow each question to one or two elements, and make sure the questions complement one another so that together they build a clearer picture of one very specific purpose.

Failsafe

One of the best strategies to validate findings and ensure variables are removed is to insert failsafe questions. We’ve all seen these before in surveys we’ve taken—often obvious and poorly disguised. Essentially, it’s the same question asked in a different format or context.

It’s one of the best ways to gain confidence that your findings are actionable and not built on flawed data points.
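As a quick illustration of how those paired answers might be checked afterward, here’s a minimal sketch in Python. The question names, the five-point scale, and the one-point tolerance are assumptions for the example, not features of any particular survey platform.

```python
# Minimal sketch: flag respondents whose answers to a failsafe pair diverge.
# Assumes both questions use the same 1-5 scale; question names are hypothetical.

from typing import Iterable

def flag_inconsistent(responses: Iterable[dict],
                      primary: str = "q3_sweater_satisfaction",
                      failsafe: str = "q9_sweater_satisfaction_reworded",
                      tolerance: int = 1) -> list[dict]:
    """Return responses where the primary and failsafe answers differ
    by more than `tolerance` points, suggesting the answers may not be reliable."""
    flagged = []
    for r in responses:
        try:
            gap = abs(int(r[primary]) - int(r[failsafe]))
        except (KeyError, ValueError):
            # Missing or non-numeric answers are also worth a second look.
            flagged.append(r)
            continue
        if gap > tolerance:
            flagged.append(r)
    return flagged

# Example: the second respondent answers 5 and then 1 to the same idea,
# so their record gets flagged for review before it feeds any metric.
sample = [
    {"id": 1, "q3_sweater_satisfaction": 4, "q9_sweater_satisfaction_reworded": 4},
    {"id": 2, "q3_sweater_satisfaction": 5, "q9_sweater_satisfaction_reworded": 1},
]
print(flag_inconsistent(sample))  # -> [{'id': 2, ...}]
```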

Don’t Forget Lifecycle Copy

Marketers often spend so much time on the survey questions themselves that they forget to review the copy that leads into the survey. That copy can prime the customer, putting them in a specific mindset before they answer a single question.

If you’re using an offer, which is often essential to generate enough feedback, make sure the offer doesn’t correlate with the questions in the survey. For instance, if the survey is meant to identify how customers feel about sweaters, don’t offer 20% off sweaters as the incentive.

You may only get customers who are already interested in sweaters to fill out the form, skewing results positively when the reality is not as sunny.
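One way to sanity-check for that kind of skew is to compare respondents who redeemed the offer against everyone else; a wide gap hints that the incentive, not the product, shaped the results. The sketch below is a minimal Python illustration, and the field names and 1–5 sentiment scale are assumptions for the example.

```python
# Minimal sketch: compare average sentiment for offer-driven respondents vs. the rest.
# The field names and the 1-5 sentiment scale are assumptions for illustration.

from statistics import mean

def incentive_skew(responses: list[dict],
                   offer_field: str = "redeemed_sweater_offer",
                   score_field: str = "sweater_sentiment") -> dict:
    """Return average sentiment for each group and the gap between them."""
    with_offer = [r[score_field] for r in responses if r.get(offer_field)]
    without_offer = [r[score_field] for r in responses if not r.get(offer_field)]
    summary = {
        "with_offer_avg": mean(with_offer) if with_offer else None,
        "without_offer_avg": mean(without_offer) if without_offer else None,
    }
    if None not in summary.values():
        summary["gap"] = summary["with_offer_avg"] - summary["without_offer_avg"]
    return summary

# Example: offer redeemers rate sweaters 4.5 on average, everyone else 2.0;
# a gap that wide suggests the incentive, not the product, is driving the score.
sample = [
    {"redeemed_sweater_offer": True, "sweater_sentiment": 5},
    {"redeemed_sweater_offer": True, "sweater_sentiment": 4},
    {"redeemed_sweater_offer": False, "sweater_sentiment": 2},
]
print(incentive_skew(sample))
```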

K.I.S.S.

In the end, if there’s one area marketers can focus on to make their surveys more beneficial, it’s simply to keep them simple. The more complex we try to get, the higher the chance of failure.

What’s worse is that with surveys, we often don’t know the survey failed; everything looks great on paper until the program is executed, and even then the failure is often shrouded by layers of initiatives built on top of it. So, next time you’re building a customer survey, remember the key to a good survey: KISS, or Keep It Simple, Stupid.