Holding Customer Research Firms Accountable For Misleading Research
It’s been a bad week in a bad year for customer research and the companies that produce information about customer and consumer behavior. While there’s a torrent of research findings coming out of firms like Forrester, The Temkin Group, TOA, Zendesk, and others, much of that research is badly designed, or reported in ways that mislead both the reading public and purchasers of the resulting reports.
Point: Zendesk Infographics
In a blog post dated July 5, customer service company Zendesk posted a rather good-looking infographic about the importance of customer service. What’s striking is that nowhere on the page is there an explanation of where the many numbers come from, nor any disclaimer about the limitations and accuracy of those numbers.
Typical of customer research coming from firms that sell services in the customer service space, all of the “data” reported show how important customer service is.
Even though we don’t know how the numbers were collected, we can still ask: are they accurate? Are they based on properly collected, properly interpreted data? In this case at least some of the numbers are misleading, which calls into question the credibility of both the information and Zendesk itself.
The infographic states that 85% of customers “would pay up to 25% more to ensure a superior customer service experience”. Other survey-based studies have come up with different numbers around the same theme.
But here’s the catch. In social science research we know that survey responses are very POOR predictors of what people actually do. We also know that even small alterations in the phrasing of survey items can completely change how people respond.
As you will see, Zendesk appears to make the same mistake that almost all customer research firms make: assuming that survey responses tell us not only what customers SAY, but what they will actually DO.
In fact, survey research can only tell us what people SAY they will do, not what they do. And even that assumes customers understand the survey items the way the writers do, AND that customers actually know how they make their decisions (see How We Decide by Jonah Lehrer). Often that assumption simply doesn’t hold.
If you’d prefer to think in terms of whether this infographic is consistent with what customers DO, consider this: every day millions of customers pump their own gas to save a few dollars, while millions more shop at low-cost, almost despised companies like Walmart, no-service warehouse outlets like Costco, no-frills grocery stores, and so on. Clearly customers are NOT that willing to pay more for customer service or the customer experience, at least not judging by their daily behavior.
As a further example, Zendesk says: “95% of customers have taken action as a result of a bad experience”, with 79% indicating they told others about their experience. Intuitively sensible, but wait. WHAT actions did they take? As for the 79% who told others, remember that telling someone and having one’s words heeded and acted upon are two different things. Having a voice is not the same as being able to affect behavior, which is what counts. After all, if that 79% told their toddlers about the bad experience, it’s hardly something companies should act upon.
Zendesk is not unusual here. It’s the norm. But who cares? At this writing, the Zendesk post has been shared close to 200 times on social media platforms and embedded in other websites, relaying what is almost certainly faulty and misleading information about customer behavior.
Point: TOA Technology
In a press release appearing on its own site, dated June 14, 2011, TOA Technology makes some sweeping statements about customer behavior and customer service. Here are a few quotes:
“…today released the results of its study on customer behavior and the use of Twitter in customer service. The survey found that more than 1 million people per week view Tweets related to customer service experiences with a variety of service providers and that more than 80% of those Tweets reflect a critical or negative customer experience.”
They go on to talk about the implications of this “finding”, which contradicts several earlier studies of the exact same issue. Those studies indicate that over half of brand mentions are informational, not judgmental, and that among the judgmental tweets, praise is more common than criticism. In research one always looks at what came before for context.
Be that as it may, where did their numbers come from? After all, their broad statements refer to “customers”, meaning all customers on Twitter. Does their data support that?
No. Later in that very press release the following text appears:
“The statistical sampling of over 2,000 Tweets was collected during the period of February 25 to May 2, 2011 and focused on terms that included “the Cable Guy” and “installation appointments,” among other terms. TOA’s study found that during the selected time period, 82% of the surveyed Tweets contained negative (or somewhat negative) sentiments about customers’ cable appointment experience…”
They sampled only 2,000 tweets, AND they sampled only tweets on a narrow topic: the “cable guy” and installation appointments. The sample is biased because it looks at only one small segment, using keywords almost always used in conjunction with criticism. So, while their results “might” apply to a small subset of Twitter users, they certainly DO NOT accurately represent “customers” in general. Yet that’s not what they say in their press release, OR in the actual report.
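To see why keyword-filtered sampling misleads, here is a minimal simulation sketch. The numbers are invented for illustration, not TOA’s data: it simply assumes that negative tweets are far more likely to contain complaint keywords than neutral or positive ones, and then shows what happens when you sample only on those keywords.

```python
import random

random.seed(42)

# Illustrative assumption (NOT TOA's data): 30% of all brand mentions
# are negative, but negative tweets mention complaint keywords like
# "cable guy" 60% of the time, versus 5% for non-negative tweets.
population = []
for _ in range(100_000):
    negative = random.random() < 0.30
    has_keyword = random.random() < (0.60 if negative else 0.05)
    population.append((negative, has_keyword))

# Keyword-filtered sample, mimicking the TOA study design.
keyword_sample = [neg for (neg, kw) in population if kw]
biased_rate = sum(keyword_sample) / len(keyword_sample)

# The rate in the population as a whole.
true_rate = sum(neg for (neg, _) in population) / len(population)

print(f"True negative rate:          {true_rate:.0%}")    # about 30%
print(f"Keyword-filtered 'finding':  {biased_rate:.0%}")  # about 84%
```

With these invented numbers, the keyword-filtered sample shows roughly 84% negative sentiment even though the true rate is 30%. The point is not that the real rate was 30%; it’s that a keyword-conditioned sample can produce almost any headline number, depending on which keywords you pick.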
There are other issues with this research, which we’ll leave out for brevity’s sake. In a nutshell, this study tells us NOTHING, and the claims about the report are not only wrong, but so wrong on the surface that most people should recognize it as fundamentally worthless. But they don’t. We don’t know how many, but I’ve seen links to this study tweeted and retweeted at least fifty times just among the people I track on Twitter: people who work in customer service, and who should know better.
Point: The Temkin Group
There’s a fair amount of very interesting material coming from the Temkin Group, headed by former Forrester employee Bruce Temkin. In a post abstracting some findings from their for-purchase report ($195.00), entitled “The Current State Of Customer Experience”, Temkin summarized research he describes as based on 140+ large North American companies. It makes sense that we are not presented with the full methodologies used, since presumably those would be contained in the report if one purchases it. The post sounds credible and professional, and reflects the generally high-quality presentation and research one appears to find on the Temkin Group site.
But the TITLE. Does this research actually reflect the STATE of customer experience, as the title states? From the abstracted findings, it appears not. It represents something quite different: the PERCEPTIONS of the people and companies involved as respondents. Not that these perceptions are trivial. Perceptions are important, but they are NOT behavior, and they are not the reality.
If you want to measure what companies are doing, you have to do so much more directly, by looking at what they do, not what they say they do. A hard task, but the ONLY way the findings have value.
To illustrate the point, there is a fair amount of research comparing customer perceptions of customer service with the CEO’s (or other executives’) perceptions of the service their company provides. Guess what: CEOs rate their companies as providing much better customer service than their customers do, and the gaps are often huge. BOTH sets of respondents offer PERCEPTIONS, not reality. Again, that’s not to say perceptions are unimportant. They matter. It’s just that they are NOT indicative of behavior.
Others Both Large and Small
American Express conducted a study, and appears to have confused survey results with customer behavior. In fact, the majority of research reports on customer service and/or social media make a similar confusion, or interpret basic survey data in faulty ways, making claims that are not derivable from the actual data. On top of that, the actual details of the research (the how, why, who, context, previous research, caveats, and so on) are often not easy to find. The headlines, though, can be found, are found, and are disseminated, often widely.
But often the headlines are wrong.
Conclusion and Some Questions
One has to wonder where the problems lie in the customer research industry. Is it because the people doing the research are incompetent in research logic and design? Is it because there’s a bias at work because most research companies provide other services that are based on a particular slant on customer service or social media? Is it sloppy copywriting?
There’s no way to tell, and I make no comment about the particular companies above, except for what appear to me to be basic flaws in how the research is reported. In addition to the “why” questions above, here are a few more.
- What are the implications and costs for businesses if they take action based on “research” reports that provide incorrect conclusions?
- What level of accountability should these companies have? (Journal research tends to hold researchers accountable in various ways, while research companies in customer service don’t appear to be accountable to research experts applying research standards.)
- What are the other consequences of poor conclusions being circulated hundreds or thousands of times online by people who either don’t read the source documents, or lack the ability to critically assess the research?