Why are we so afraid of subjective data?

Data and analytics continue to dominate the agenda of most businesses. Somewhere along the line, though, we’ve become myopic about what we consider valid data. In particular, I’ve noticed a backlash against “subjective” data. To some, it is the enemy of good decision making. To others, it plays second fiddle to “real” (i.e., numeric) data. I think both attitudes are mistaken.

Data consist of any and all information you receive. Sometimes that information takes a numeric form; at other times it arrives as words and opinions. Both are important and necessary for fully understanding your situation. Both can also lead you equally astray if you don’t look at them critically.

There are three problems with this artificial distinction between objective and subjective data:

1) Confusing quantification with objectivity
2) Using a single data point to tell a story
3) Misaligning outcomes and how those outcomes are measured

Confusing quantification with objectivity

There is a general belief that if you can assign a number to something, it is objective. On the surface this seems true. After all, there is universal agreement on how we count. Four equals four anywhere in the world and it is one more than three and two less than six. How can that not be clear?

The problem is that as soon as you decide to quantify something, you have to create rules about how and what gets quantified.  That process is subjective. 

Consider the 2000 presidential election and the manual tallying of votes in Florida. Depending on who was counting and how much a chad was “hanging,” four might not equal four.

In a prior entry, I wrote about a company that boasted a 99.6% safety rating for its rockets even though 8.5% of them exploded on launch. The company explained the discrepancy by noting that it did not include explosions caused by human error in its safety calculations. So was its objective number correct?
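
To see how much the counting rules matter, here is a minimal sketch with hypothetical launch counts (the post doesn’t give the underlying numbers); the same launch history yields very different “safety ratings” depending on which failures you decide to count.

```python
# Hypothetical counts for illustration only -- not the company's actual data.
launches = 200
explosions_total = 17          # ~8.5% of launches exploded
explosions_human_error = 16    # failures attributed to human error and excluded

# Counting every explosion as a failure.
rate_all = 1 - explosions_total / launches

# Counting only the explosions not attributed to human error.
rate_excluding = 1 - (explosions_total - explosions_human_error) / launches

print(f"Counting every explosion:      {rate_all:.1%}")        # 91.5%
print(f"Excluding 'human error' cases: {rate_excluding:.1%}")  # 99.5%
```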

Finally, consider statistical significance. People love to cite statistical significance as the ultimate objective criterion for assessing their data. But there’s a catch. Statistical significance is not absolute. It requires an opinion – the level at which you want to measure significance. Something that is not significant at a .001 level might very well be significant at a .1 level. The person doing the analysis has to choose a level of significance at which to assess the data. More often than not, that choice is subjective.
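
A minimal sketch with a hypothetical p-value shows how the “objective” verdict flips depending on the threshold the analyst chose:

```python
# Hypothetical test result; the p-value itself doesn't change,
# but the "significant or not" verdict depends on the chosen alpha.
p_value = 0.03

for alpha in (0.1, 0.05, 0.01, 0.001):
    verdict = "significant" if p_value < alpha else "not significant"
    print(f"alpha = {alpha:<5} -> {verdict}")
```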

Counting is objective.  But, choosing what to count is subjective.  Numbers that appear absolute often have a story or set of assumptions behind them.  Those assumptions can often drastically alter what the data is actually telling you.  If you don’t believe me, just read the footnotes on a company’s financial statements sometime.

Using a single data point to tell a story

One of the arguments against using subjective data is that it often doesn’t represent the whole picture. For example, one customer can only account for his or her experience. What if they happen to interact with your company on a bad day? What if they got the customer service representative who was your lowest performer? These tend to be the arguments against using subjective or anecdotal evidence.

They are partially correct. However, this isn’t an issue limited to subjective data. It’s a sampling issue. Any data you use must fully represent the subject area you are trying to understand. Sales data taken from only one store in a chain is just as misleading an indicator of the entire chain’s performance as one customer’s opinion is of your service overall.

In addition, any argument that is based on a single data point is probably going to be suspect and hard to sell. This is true for subjective and objective data.

I once had a person push back on me for using employee focus group data in making an argument about a particular issue in her department. The issue had to do with problems with supervisors and managers. She said that focus group data wasn’t reliable because one person raising an issue didn’t make it a problem. She was confusing the use of a single data point with a conclusion based upon multiple data points across a sample.

I told her I agreed. I would never extrapolate a single data point (subjective or objective) into a trend or finding.  However, in this case, the same issue consistently came up multiple times in multiple focus groups, across diverse audiences, more than any other issue.  The issue was real, even though we didn’t have any “hard” data to support it.

Ideally, your story should be supported by both objective and subjective data. At a minimum it should be supported by disparate data sources. In the case of the focus groups, while we didn’t have any “hard” data, the issue was corroborated by discussions with leaders who said they had observed similar behaviors in their supervisors and managers.

If you are relying on a single data point to tell your story, it doesn’t matter whether it’s objective or subjective.  In either case, your story is likely to be flawed.

Misaligning outcomes and how those outcomes are measured

The final issue that leaders face is mismatching their data and outcomes. Some outcomes, especially those involving your customers, employees, or other people, are subjective. If your outcome is subjective, you need subjective data to assess it. If your outcome is objective, you need objective data to assess it. Misaligning the two will give you an incomplete and possibly incorrect story.

A marketing department had a goal to improve the clarity of their communications. I asked them how they were going to measure that. They provided a list of objective measures: overall length in words, FOG index, use of acronyms, etc. All of those were good measures and were certainly related to clarity. However, none of them actually measured clarity. Clarity is subjective. The only way to measure clarity is to ask the person reading the message whether it was clear.

When I suggested that to the managers, they pushed back. They were concerned that such data would only provide opinions. And what if one person’s definition of what was clear differed from another person’s? I suggested that their readers’ perceptions were exactly what mattered. If half of your audience doesn’t think your communication is clear, it isn’t clear, regardless of their (or your) definition of clarity. The team finally reconsidered.
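
As a rough sketch (my own illustration, using a crude syllable count and a made-up reader survey), the contrast looks like this: a readability score such as the FOG index is a property of the text itself, while clarity can only be read off what the audience reports.

```python
import re

def rough_syllables(word: str) -> int:
    """Very crude syllable estimate: count groups of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fog_index(text: str) -> float:
    """Gunning FOG estimate: 0.4 * (avg sentence length + % of 3+ syllable words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    complex_words = [w for w in words if rough_syllables(w) >= 3]
    return 0.4 * (len(words) / len(sentences) + 100 * len(complex_words) / len(words))

message = "We will operationalize the synergistic realignment of our communication cadence."

# Objective measure: a property of the text itself.
print(f"FOG index: {fog_index(message):.1f}")

# Subjective measure: what readers actually report (hypothetical survey responses).
responses = ["unclear", "unclear", "clear", "unclear"]
print(f"Readers who found it clear: {responses.count('clear') / len(responses):.0%}")
```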

In an age of sophisticated process measures and techniques for gathering process data, we sometimes forget that process data only measures the process.  It doesn’t tell you what people think of that process.  That is a subjective assessment.

Of course, not everything is so black and white. Sometimes an outcome is fuzzy. For example, suppose you are trying to improve decision making in your organization. Ultimately your decisions affect objective measures around quality, efficiency, and results. However, those connections aren’t always direct, and improvements can take a while to translate into results. In those cases, a combination is needed. Look at the objective measures, but also ask people (individuals, their bosses, their peers) whether decision making is improving. The combination of objective and subjective will give you a much better (and quicker) understanding of what is really happening.

The story of your business happens regardless of whether you have the tools and metrics needed to quantify it objectively. By limiting yourself to only “objective” data, you are limiting yourself to seeing only half of the picture.

All data is helpful but all data is incomplete and subject to problems.  Be smart.  Take advantage of all of the information available to you.  Just be critical, understand each data set’s strengths and limitations, and use multiple data sources to tell your story. 

——————
Brad Kolar is an Executive Consultant, speaker, and author. He can be reached at brad.kolar@kolarassociates.com.
