Customer Survey? Check. Now what?

Guest post by: Barry Dalton

Customer Satisfaction: a measure of how products and services supplied by a company meet or surpass customer expectations.

Survey: a method for collecting quantitative or qualitative information about items in a population

OK, so now we’ve got the basic definitions established, at least according to Wikipedia. Mash up these definitions and we get a reasonable definition of a customer satisfaction survey: “a method for collecting and measuring quantitative or qualitative information about how products or services supplied by a company meet or surpass customer expectations”.

It’s hard to argue against the potential value of asking our customers how we can do better. I think we’d all agree it’s a reasonable thing on which to spend time and resources. What is less clear, at least to me, is how companies justify continuing to invest in this process without strategic imperatives or processes that let them act on that information to drive change.

Despite this, however, the state of the customer experience, and of the relationship between companies and customers in general, continues to erode. Companies continually fail to understand the value of these surveys from the perspective of their customers. As a result, the “CSat survey” ends up being a check box on the customer service performance management dashboard. For whatever reason, I’ve encountered few true role models when it comes to ingesting survey data into the organization and delivering the requisite changes that drive measurable improvements in the customer experience. Many companies that run surveys fail to act on the stated requests of customers, or to use surveys as a way to excite customers and improve the customer experience.

To prove (or disprove) my hypothesis, I make it a point to respond to every survey I’m offered, whether it’s via email, post-call IVR, or the web. I’m on a mission in 2010 to collect enough data points to draw some reasonable conclusions about the processes behind these surveys. Stay tuned for more details as this project develops. In the meantime, here’s an example of what happened with a survey I completed just today.

I called the contact center of my mortgage holder, which happens to be one of the largest global banks, whose name is a homophone of the word describing a large urban center. Two days later, I received an email request to participate in a post-interaction survey. So, I clicked through. I was asked a total of 10 questions. One was in nine parts! So, I guess technically it was 18 questions. I intentionally rated specific items at the extremes of the 11-point Net Promoter scale. The long and short? Perhaps I haven’t quite struck the right combination of answers that triggers the “potential to defect” business-rule red flag. But if this were my contact center, and a customer indicated that they were in the market for a big new mortgage and that they would be very unlikely to do business with me, I’d have some sort of workflow to route that information to the right person and take appropriate action. In this case, nothing happened.
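The kind of workflow I have in mind doesn’t need to be elaborate. As a minimal sketch (every function name, threshold, and action label here is a hypothetical illustration, not any vendor’s actual system), a detractor-flagging rule might look like:

```python
# Hypothetical sketch of a survey-response routing rule.
# Thresholds follow the standard Net Promoter bands (detractors 0-6,
# passives 7-8, promoters 9-10); action names are made up for illustration.

def route_survey_response(nps_score: int, in_market: bool) -> str:
    """Return a follow-up action for a completed post-interaction survey.

    nps_score: the 0-10 likelihood-to-recommend answer.
    in_market: did the customer indicate an upcoming purchase?
    """
    if nps_score <= 6 and in_market:
        # A detractor who is actively shopping: highest defection risk,
        # so route straight to a human who can intervene.
        return "escalate_to_retention_team"
    if nps_score <= 6:
        return "queue_for_service_recovery_callback"
    if nps_score >= 9:
        return "invite_to_advocacy_program"
    return "log_only"  # Passives (7-8): record the result, no outreach.


# A customer like the one above: extreme detractor score, big purchase pending.
print(route_survey_response(2, True))  # escalate_to_retention_team
```

The point isn’t the specific thresholds; it’s that a response like mine should trigger *something* other than silence.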

Unfortunately, this seems to be a consistent response, or lack thereof, that I’ve seen with respect to surveys. In fairness, there may be some internal action being taken based on those results. But from the customer’s perspective, it generally appears to fall into a black hole.

With the waning attention span of today’s consumer, anything less than full commitment to following up on the survey can only damage the customer relationship. Even if action is being taken, communicating back to your customer is critical. Customers need to know that, if they’re offering their opinions, those opinions are being heard. If you don’t intend to follow up and change in a meaningful way, don’t waste the money and resources conducting a survey. In today’s experience economy, a survey delivers value to customers and businesses only as a strategic dialogue backed by full commitment to execution. If you’re not prepared for that level of commitment, don’t ask.

I welcome you to follow the further exploration of this topic over on my blog as I attempt to quantify the benefits of customer satisfaction surveying. Along the way, I’m sure we’ll have a little fun responding to, and hopefully receiving some interesting feedback from, various customer satisfaction surveys.

2 Replies to “Customer Survey? Check. Now what?”

  1. Thanks for the opportunity, Kristina. Here’s an interesting tidbit about customer sat surveys. According to the Customer Rage Survey (the fact that such a survey exists speaks volumes on its own), 68% of people surveyed reported being ‘extremely or very upset’ over a consumer problem they had in the last 12 months. 77% complained to the company. Of those, 57% decided never to do business with that company again.

    I don’t have the quantitative data in front of me, but in the hundreds of contact centers I’ve been in that measure CSAT, I’ve never seen an internally reported CSAT score that reflects these numbers. And quality management scores? I can’t remember the last time I saw those below 90%.

    So, where’s the disconnect?

  2. Thanks for such a great post, Barry. Such great insight. I believe it is truly summed up by your sentence – “If you’re not prepared for that level of commitment, don’t ask.”

    So many companies devalue their relationships with customers by asking but then not looking at, or even considering, the responses.

    The most valuable information companies will ever receive is the feedback generated by those who directly keep the business alive – the customer.
