I recently worked with a client who was dealing with an emergency. The details aren’t critical for this discussion – to relate, just pick one of the “my house is on fire” moments you’ve had to handle in your own organization (they crop up for all of us eventually).

Long story short, I needed information to respond: which locations were burning, and which were likely to catch next. These answers, though simple enough on their own, required complex calculations across a number of variables. I requested a simple report to understand the landscape. That was not the report handed to me.

Instead, I got a data dump of those many variables. In an emergency, with locations on fire in real time, that dump left me to crunch the numbers myself and weed out all of the safe locations just to pick out the burning few. In other words,

There’s Plenty of Room for Improvement
How do we get data analysis so wrong considering we spend so much time working with data? I believe it ties back to the daily data wrangling tasks we’re mired in. Extracting data from our warehouses and data lakes, cleansing and consolidating it, building a single, unified data set to work with... Data, data, data, data. This is a real grind. We sometimes become so focused on data management that we can’t see the forest for the trees. We resort to summarizing data rather than using it to create valuable information.

Before sending that next report, perhaps we should make sure it is worth sending to begin with. How? By asking ourselves,

  • What challenges do our stakeholders need to solve with it?
  • What question does the report answer? 
  • What is driving us to spend the time – a known need, or simply “that’s the report we always ship out Monday morning”?
  • What value does it add?

Don’t think about analytics in terms of the tactical elements or simply pass around chunks of data. Instead, be the voice of that data.

What Are You Trying to Say?
We’re between a rock and a hard place in becoming this voice. On one hand, we have an overdose of raw figures. On the other, we have oversimplified visualizations. This means two things:

We lose when stakeholders have to dig into data to interpret reports. How many stakeholders could fill in the blanks when we don’t? If our audience is a member of the C-suite, I’d venture to say very few – that isn’t their role. Besides, they have their hands full dealing with the crisis du jour. Regardless, asking them to complete the analysis is asking them to do our job.

We also lose when those reports are unclear or ambiguous. We shouldn’t rush to simplify data through visualizations to the point of being meaningless. Visualizations should make interpreting facts easier, not alter the interpretation. I don’t need to delve into data visualization sins, as we’ve all seen them before. However, as an example:

It is incredibly easy to misinterpret a data visualization without clear parameters and an understanding of the underlying data. Both examples above show a rising trend, so there is some truth to each. However, the sharp rise on the right is an artifact of moving the Y axis starting point, not a feature of the data.
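The effect is easy to quantify. Here is a minimal sketch – the five data points and the axis ranges are hypothetical, not taken from any real report – showing how the same modest trend climbs a very different share of the chart’s height depending on where the Y axis starts:

```python
# Hypothetical monthly values with a modest upward trend.
values = [100, 101, 102, 103, 105]

def apparent_rise(values, y_min, y_max):
    """Fraction of the chart's height the trend appears to climb
    when the Y axis runs from y_min to y_max."""
    return (values[-1] - values[0]) / (y_max - y_min)

# Left-style chart: axis starts at zero.
full_axis = apparent_rise(values, 0, 110)
# Right-style chart: axis truncated to hug the data.
truncated_axis = apparent_rise(values, 99, 106)

print(f"Full axis: trend climbs {full_axis:.0%} of the chart height")
print(f"Truncated axis: trend climbs {truncated_axis:.0%} of the chart height")
print(f"Visual exaggeration: roughly {truncated_axis / full_axis:.0f}x")
```

Same data, same 5-point rise – but the truncated axis makes the change occupy most of the chart instead of a sliver of it, which is exactly why the right-hand example reads as a dramatic spike.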

Get Better
To get better, we need to understand what our stakeholders need and deliver exactly that – no less and no more. Don’t just pass around data others will use to find answers. Be the one who provides those answers.


Brian Seipel
