Ask the right questions when looking at statistics to decide whether the data is a complete waste of time or genuinely worthy of attention.
We grew up believing that numbers don’t lie. Numbers were always objective. Words could get twisted, but numbers were carved in stone. Two plus two always equaled four.
I still believe in numbers, but I am amazed at how often they get misinterpreted or manipulated to reflect a non-objective “fact”.
We see different pollsters surveying at the same time and reaching diametrically opposed findings. We see conflicting, confusing numbers in marketing research, public opinion surveys, advertising claims, political polls, news stories and more.
People often mistrust numbers unless those numbers validate an opinion they already hold; the suspicion deepens when the numbers contradict a deeply held belief.
In an era of “fake news” and news media that are pointedly non-objective, the mistrust of statistics is spilling over into the business world.
As a marketing researcher, I find that, of late, I am spending almost as much time defending data as presenting it. I have sat in meetings where board members questioned the validity of the numbers presented by management.
Mistrust of numbers in the public sphere is creeping into a distrust of numbers in business.
How can we look at and objectively evaluate the statistics that get presented to us? We can still trust numbers and statistics, as long as we know what to look for. Here are some questions we should be asking whenever we get presented with statistical data:
1. What questions got asked?
When looking at findings from a study, it is essential to know what questions were asked. Were the questions unbiased, or did they predetermine the response? Do the questions allow for all possible responses, or only specific ones? Are the questions clear enough to be answered without ambiguity?
2. Who participated in the study?
Was the study sample random or specifically recruited? If it was recruited, what were the selection criteria? Does the sample allow for the conclusions that were drawn? Do the findings acknowledge the makeup of the sample and the rationale for selecting it?
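To see why the sampling question matters, here is a minimal Python sketch of how a recruited, non-random sample can skew a survey result. The population, its size and the approval rates are entirely invented for illustration:

```python
# Hypothetical illustration: a recruited (convenience) sample vs. a random one.
# All figures below are invented. Population of 10,000 people; 30% approve
# overall, but the 2,000 "loyal customers" approve at 80%.
import random

random.seed(42)  # fixed seed so the sketch is reproducible

population = (
    [("loyal", True)] * 1600 + [("loyal", False)] * 400
    + [("other", True)] * 1400 + [("other", False)] * 6600
)

def approval(sample):
    """Share of respondents in the sample who approve."""
    return sum(1 for _, approves in sample if approves) / len(sample)

# A random sample of 500 tracks the true 30% approval.
random_sample = random.sample(population, 500)

# A sample recruited only from loyal customers reflects their 80% approval.
loyal_customers = [p for p in population if p[0] == "loyal"]
loyal_sample = random.sample(loyal_customers, 500)

print(f"Random sample approval:    {approval(random_sample):.0%}")   # close to 30%
print(f"Recruited sample approval: {approval(loyal_sample):.0%}")    # close to 80%
```

Both surveys asked the same question of 500 people; only the recruitment differed, and the headline number roughly doubles.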
3. What do all the numbers say, not just selected numbers?
If there is 38% approval, what do the other 62% say? If sales grew by 15%, what were baseline sales, and by how much did the overall market grow? A 15% sales increase in a market that grew 25% is actually a loss of market share.
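The growth question can be made concrete with a few lines of Python. The sales and market figures below are invented for the example; the point is only the arithmetic:

```python
# Hypothetical illustration: headline sales growth can hide a loss of
# market share. All figures are invented.

def market_share_change(sales_before, sales_after, market_before, market_after):
    """Return (sales growth %, market growth %, share change in points)."""
    sales_growth = (sales_after - sales_before) / sales_before * 100
    market_growth = (market_after - market_before) / market_before * 100
    share_before = sales_before / market_before * 100
    share_after = sales_after / market_after * 100
    return sales_growth, market_growth, share_after - share_before

# Sales grew 15%, but the overall market grew 25%:
growth, market, share_delta = market_share_change(
    sales_before=100, sales_after=115,
    market_before=1000, market_after=1250,
)
print(f"Sales growth:  {growth:.0f}%")          # 15%
print(f"Market growth: {market:.0f}%")          # 25%
print(f"Share change:  {share_delta:.1f} pts")  # -0.8 points
```

The same 15% headline reads very differently once the market's 25% growth is placed next to it: share fell from 10% to 9.2%.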
4. Who is interpreting the findings?
Are data being analysed by somebody who is objective and has no vested interest in the outcome? Are data being interpreted by somebody secure in the knowledge that regardless of what the data show, the messenger will not be “shot”? Does the information help us to make decisions?
Statistics don’t lie if we are asking the right people the right questions and looking at all the answers rather than looking for specific answers.