What a Survey Can’t Tell You
Check out the reference guide at the end of this post!
This is a true story.
A large IT services company sold a complex suite of products in a highly competitive market. It was losing customers at an alarming rate, so an internal research team was tasked with finding out why. The team set to work designing a fairly standard customer satisfaction web survey to be emailed to current customers.
Response rates were respectable – their customers wanted to give feedback.
Concise and comparable customer satisfaction numbers were generated.
The problem was defined: attrition was due to poor product performance and low confidence in the company’s future.
With a burst of activity, the company set out to improve the numbers. Where were we rated low? Fix those areas! Do whatever you think we need to do! Make the numbers go up! All hands were on deck, ready to take on the challenge.
Fast forward a year. Attrition remained high, and the next wave of customer satisfaction numbers was just as low. What gives?
Some Even Wrote Full Sentences!
The problem was the method used to collect feedback (a survey), not the data. It was the wrong tool for the job.
For example, take the open-ended questions and comment boxes littered throughout the survey. The company recognized it needed some depth, so they felt those would do the trick.
But those types of questions don’t work on web surveys. Why?
- Mobile Rules. There is a good chance the customer was completing the survey on a mobile device (based on Mary Meeker’s 2014 trends, it’s a pretty good bet). Who wants to type a paragraph of feedback on a smartphone?
- Brevity Rules. Most people are not willing to devote more than 5 minutes to a web survey, which means questions that require long-form responses don’t stand a chance.
When we met with this company, we asked the project manager why they chose surveys.
“They were cheap and fast,” the PM said. “And besides, last year we had great comments. Some people even wrote full sentences!”
In fact, the PM went on to explain, those brief comments raised more interest than any other survey finding. Collectively, the company started saying, “We want to know more. What exactly do they mean by ‘fix your cloud services’?”
When you have to fill in the blanks yourself, you’re not capturing the voice of the customer.
Is it any surprise that’s where they gravitated? They were trying to understand complex issues, which required understanding nuances, not just percentages and volume data. But because the research tool didn’t give them what they needed, “fix your cloud services” had to be extrapolated out into an action plan inevitably based more on the company’s perception of the problem than on the customer’s.
Aim higher than a full sentence when trying to understand your customers!
Use the Right Tool for the Job
And herein lies the problem with surveys: a survey can tell you what and how many, but not why. Even if you throw in a few comment boxes.
The survey results showed which general areas needed attention, but left the company in the dark about what exactly to fix and how.
If the company wanted to know the demographics of their buyers, or how often they used their products, a survey would have been a great choice. For what they wanted, though, an in-depth discussion was the better choice. An experienced interviewer, for example, can ask “why” until they find the root causes. (The 5 Whys really work wonders.)
How to Pick the Right Tool
Of course, there are many more research options—focus groups, customer panels, random polling, observing, etc.—all with their own merits. If your choices are narrowed down to web surveys or interviews (over the phone or in person), below are my guidelines for making the right decision.
And remember: you can do both. At Primary Intelligence, we capture the best of both methods by first sending buyers a web survey with the straightforward questions and then following up with a phone interview for the more in-depth questions.