The modern workplace has become increasingly data-driven, with a focus on creating rich statistical reports to empower businesses. The people creating these reports are savvy with numbers and understand how powerful a statistic can be, not only at telling a story about data but also at potentially swaying people’s opinions.
The onus is on us as report readers to improve our data literacy, arming ourselves with the right knowledge to carefully consider the information we’re presented with so we aren’t fooled by purposefully or inadvertently deceptive statistics. So we’ve compiled several questions you should ask yourself when reading statistical reports to help you determine how much you can trust the data.
Question #1: What’s the Objective?
All reports have a goal in mind. Depending on what information is included and how it’s presented, reports can be angled in a way that feeds us exactly what the writer thinks we want to—or should—hear.
An important step in sniffing out this kind of behavior is to first identify the overarching message of a report. Are the statistics strongly supporting one side of an argument much more than another? Do they seem to be encouraging a particular action to be taken? What is the creator’s point of view? For example, a report whose message is that a company is entering a period of unprecedented growth should be interpreted differently depending on whether it was written by that company or by an outside organization with no ties to that business. Just because information is quantitative doesn’t mean it is objective.
Considering these types of questions is a great way to start determining what message a report is trying to send. The better you understand the goal of a report, the harder it is for that report to mislead you.
Question #2: What’s the Context?
“Context is key.” We must consider all of the information surrounding what is being said because it will help us interpret a statistical report with greater accuracy.
Context includes everything from the scope of a report to its sample size to the way the data was collected. One regularly reported statistic that is rarely placed in proper context is the U.S. unemployment rate. In 2016, reports claiming that the jobless rate was finally back to “normal” after the long post-recession rebound were only telling part of the story. While unemployment was down, labor force participation was also down. If millions of Americans have dropped out of the workforce altogether, the picture of U.S. employment changes.
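The arithmetic behind this is easy to check for yourself. The headline unemployment rate only counts people actively looking for work, so when discouraged workers leave the labor force entirely, the rate can fall even though nobody found a job. A minimal sketch, using made-up numbers (these are not actual BLS figures):

```python
# Hypothetical numbers showing how the unemployment rate can fall
# while labor force participation also falls. All figures are invented.

def unemployment_rate(employed, unemployed):
    """Unemployed as a share of the labor force (employed + unemployed)."""
    return unemployed / (employed + unemployed)

def participation_rate(employed, unemployed, population):
    """Labor force as a share of the working-age population."""
    return (employed + unemployed) / population

population = 1000  # working-age adults (hypothetical)

# Before: 600 employed, 60 unemployed -> labor force of 660
before_u = unemployment_rate(600, 60)               # ~9.1%
before_p = participation_rate(600, 60, population)  # 66.0%

# After: 30 unemployed workers stop looking and leave the labor force.
# Nobody gained a job, yet the headline unemployment rate drops.
after_u = unemployment_rate(600, 30)                # ~4.8%
after_p = participation_rate(600, 30, population)   # 63.0%

print(f"unemployment:  {before_u:.1%} -> {after_u:.1%}")
print(f"participation: {before_p:.1%} -> {after_p:.1%}")
```

Both rates moved, but a report that cites only the first one paints a rosier picture than the data supports.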
Question #3: What’s Missing?
Sometimes it’s important to focus on what’s not there rather than solely on what is. In fact, a statistic could be presented as a red herring meant to pull your attention away from vital information that the report might lack. The absence of a comparison or a control test, for example, can easily make a mountain out of a molehill.
There are times when a writer may choose to keep an outlier out of a report to avoid confusion, in which case there ideally would be transparency about the decision. But often what’s left out is based on the desire to paint a specific picture. For example, a report about a company’s employee diversity that includes hiring stats but not retention data is only telling part of the story.
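The reason a single excluded (or included) outlier matters so much is that some summary statistics are far more sensitive to extremes than others. A quick sketch with invented salary data shows how one extreme value can dominate a mean while barely moving a median, which is exactly why a report should disclose how it handled outliers:

```python
# Made-up salary data illustrating why outlier handling should be disclosed:
# one extreme value drags the mean far more than the median.
from statistics import mean, median

salaries = [52_000, 54_000, 55_000, 58_000, 61_000]
with_outlier = salaries + [450_000]  # add one executive salary

print(mean(salaries), median(salaries))          # 56000, 55000 -> close together
print(mean(with_outlier), median(with_outlier))  # mean jumps past 120k; median barely moves
```

A writer who quietly drops (or keeps) that one value can report a "typical salary" that differs by a factor of two, using the same underlying data.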
Question #4: Am I Being Biased?
We all have biases that can affect our decision-making. As important as it is to cross-reference and double-check the work of others, it’s equally imperative to do so for your own thinking. More than that, you need to check your assumptions. When we read and synthesize information, we often fill in gaps with our own presumptions, and this can easily affect the way we interpret a particular statistic or report. Intellectual honesty, however, calls for us to look at the information and see where it leads, as opposed to where we want it to go.
Let’s say you’re considering launching a new product. You suspect that the market for that product is growing, and the reports you read confirm your belief, so you put the full weight of your marketing team behind the launch. But it’s a bust. Turns out your customers weren’t clamoring for the product. In this case, you found reports and interpreted their data in a way that supported your preconceived ideas.
Question #5: Can I Trust the Report Publisher?
Data literacy isn’t just about picking out the human errors and subjective choices that go into creating reports and statistics. It’s also about the data itself.
This is more difficult to determine. Reading a report, it’s tough to gauge data completeness, whether recorded values were approximations rather than exact figures, or whether the data processing was error-free. But it’s relatively easy to do some quick web or social media searching to find out how the report, or at least the source behind it, is viewed by others. What is its reputation? Is it considered trustworthy by outsiders?
There’s a sense of precision and authority in numbers and statistics, but they don’t always deserve the weight they’re given. That said, by fostering a healthy skepticism and learning what questions to ask, we can develop greater confidence in our ability to consume data responsibly.
This post was written by Quincy Smith, part of the marketing team at Springboard, an online learning company focused on closing the world’s skill gap through courses like their Cybersecurity Career Track.