Data and sources don’t guarantee accuracy – that’s your job

Times have changed with regard to email hoaxes. In the old days, people would forward the most outrageous stories to you (and one hundred of their other friends) in an attempt to keep everyone out of harm’s way.

Typically, after carefully reading the email and applying some common sense, I could tell that it was most likely untrue. To check my thinking, though, I’d often look up the story on Snopes.com. In the last five years, I’ve found only one story to be true. The rest were hoaxes.

Recently, I’ve noticed something new with these emails. Many now start out with a sentence saying, “This has been confirmed on snopes.com” and often provide a link to the story on Snopes.

Yet surprisingly, when I click the link, Snopes often says that the rumor is false! Sometimes the link isn’t even a real Snopes link in the first place. It seems that simply citing Snopes is enough for most people to believe the story.

Unfortunately, this doesn’t just happen with urban legends and emails. I see business leaders make similar mistakes.

Once during a presentation, I made the point that customer satisfaction scores were dropping. Several people were skeptical and asked to see the data. I put up a line chart showing scores decreasing over the course of six months. I asked if they were satisfied. They were. I then told them that I had shown them a graph from the prior year (I later showed them the current year’s graph, which supported my statement). I asked why my original statement was suspect but the graph was taken at face value. Most of them didn’t have an answer.

It seems that we don’t trust people as much as we trust graphs and tables. If something doesn’t make sense when it is said, why would it suddenly make sense when you see the data?

Several years ago, I was asked to confirm the validity of a model being used in a proposal. The model suggested certain causal relationships between how people learned something and how well they retained it. It didn’t make sense to me intuitively. Luckily, it had a source attributed to it. I contacted the organization that created the model. On my first attempt, no one in the organization had ever heard of it. After two or three months of trying, I finally found one person who recognized it. She sent me an email explaining that the “model” had been created during a brainstorming session and was loosely adapted from research done in the 1960s in a totally different field of study. She said she would never use it as a definitive source.

I went back to the people who had asked me to verify the model. I told them what the woman said and explained why the model didn’t really make sense. They said that as long as they had the correct citation, they felt the model was fine. They never took it out of the presentation. Several years later, I still saw the model being used and quoted as fact.

Finally, a 2008 study by Greg Miller titled “Neuroimaging: Don’t Be Seduced by the Brain” (Science, 13 June 2008, 320:1413) found that the presence of MRI images in a report led undergraduates to more readily accept far-fetched conclusions. One of the reports in the experiment erroneously asserted that the same areas of the brain that are activated when doing math are also active when we watch television. It then concluded that watching TV improves math skills. Readers were more likely to accept this conclusion as true when a photo of the brain accompanied the article.

Data should not replace common sense or experience. We need to learn to use the two as checks and balances against one another.

Perhaps this blind acceptance of “data” (or the illusion of data) is due to the current, relentless focus on evidence-based and data-driven management. Both of those are important practices for a leader. However, they are meant to supplement good thinking, not replace it.

It’s not enough to confirm that a statement or conclusion is backed by data and sources. The data and sources themselves must be scrutinized for their own accuracy and bias.
——–
PS: Did you Google the Greg Miller study to see if it actually existed? It does. But if you didn’t check for yourself, you left yourself open to poor thinking.

Brad Kolar is the President of Kolar Associates, a leadership and workforce productivity consulting firm. He can be reached at brad.kolar@kolarassociates.com.
