The Perils of Context-Free Data: Lessons from the Chernobyl Disaster

I really enjoyed the HBO miniseries “Chernobyl” that aired recently. You can watch the whole series through HBO if you have access (or through an HBO free trial via Amazon).

One of my favorite and most meaningful quotes from Don Wheeler (as I've shared in my book Measures of Success) is:

“No data have meaning apart from their context.”

There are many applications of this concept. One is the idea that we shouldn't overreact to every up and down in a metric. If we report that a metric is down 22%, that might sound bad. But if we draw a run chart or a “Process Behavior Chart” that shows more data and more context, we might learn that the metric routinely goes up and down by 15%, 20%, or 25% in a typical month. That context matters.
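To make that concrete, here's a minimal Python sketch of the math behind a Process Behavior Chart (an “XmR chart”). The monthly values are made up purely for illustration; 2.66 is Wheeler's standard scaling factor for the average moving range:

```python
# Process Behavior Chart (XmR chart) limits -- a minimal sketch.
# The monthly metric values below are invented for illustration.
values = [104, 89, 97, 112, 85, 101, 94, 108, 90, 99, 86, 105]

mean = sum(values) / len(values)

# Moving ranges: absolute differences between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits: mean +/- 2.66 * average moving range
# (2.66 is Wheeler's standard XmR constant).
lower = mean - 2.66 * avg_mr
upper = mean + 2.66 * avg_mr

print(f"Mean: {mean:.1f}")
print(f"Natural process limits: {lower:.1f} to {upper:.1f}")

# A new point only signals a real change if it falls outside the limits.
this_month = mean * 0.78  # a "22% drop" from the average
if lower <= this_month <= upper:
    print("Routine variation -- don't overreact.")
else:
    print("Signal -- something has changed; investigate.")
```

With these made-up numbers, the “22% drop” lands inside the natural process limits, which is exactly the kind of context Wheeler is talking about.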

In the first episode of “Chernobyl,” there's a gripping scene where data is taken out of context to suit an agenda (or it's done out of denial) — bad language warning — don't blare this at work:

https://www.youtube.com/watch?v=eckdfSzN07Y

In a shorter clip from the scene, the deputy chief engineer, Dyatlov, is told that the radiation reading is:

“3.6 roentgen, but that's as high as the meter…”

He's cut off. 3.6 is as high as the meter goes. But Dyatlov, who has already said “RBMK reactors can't explode” (one just did), is clearly in denial, and he's processing everything through that lens.

He says:

“3.6… not great, not terrible.”

The actual number must have been higher, of course.

Dyatlov then goes and reports the faulty number upward. Is he lying, or just blinded by his bias and mental models? The number on the dosimeter WAS 3.6… but Dyatlov doesn't share the context that “that's as high as the meter will go.” That makes the situation seem less dire than it really is (although it looks plenty dire to those who can see the reactor burning from a distance).
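In data terms, a pegged meter gives you a censored measurement: the instrument can only tell you the value is at least 3.6. Here's a toy Python sketch of what honest reporting could look like; the Reading class and its fields are hypothetical, not modeled on any real dosimeter API:

```python
# A toy sketch of reporting a saturated (pegged) meter honestly.
# The Reading class is hypothetical, not any real dosimeter API.
from dataclasses import dataclass

@dataclass
class Reading:
    value: float      # what the meter displays
    meter_max: float  # the top of the meter's scale

    @property
    def saturated(self) -> bool:
        # At the top of the scale, the true value is unknown;
        # we only know it is at least meter_max.
        return self.value >= self.meter_max

    def report(self) -> str:
        if self.saturated:
            return f">= {self.meter_max} roentgen (meter pegged; true value unknown)"
        return f"{self.value} roentgen"

reading = Reading(value=3.6, meter_max=3.6)
print(reading.report())  # ">= 3.6 roentgen (meter pegged; true value unknown)"
```

Dyatlov reports “3.6” without the “meter pegged” context, and that one omission changes every decision made upstream.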

The whole Chernobyl disaster is a situation where “going to the gemba” (the actual place) is quite deadly. But when we rely on data instead, we need to be careful that we're not misled.

He tells someone higher up:

“I'm told the number is 3.6 roentgen per hour”

and is told:

“Well that's not great, but that's not horrifying.”

Higher-ups are told that things are “well under control” when that is clearly not the case.

Not long after, they're told the number is 200 roentgen. The response:

“Another faulty meter.”

An employee who was at the gemba tells Dyatlov that he saw graphite in the rubble (graphite that could only have come from inside the reactor core). Dyatlov replies,

“No you didn't… because IT'S NOT THERE!”

He says this from a conference room.

Denial.

It's easy to deny things that fly in the face of your mental models.

Later, in a scene from Episode 2, a dosimeter that reads up to 15,000 roentgen arrives. When the senior military officer is told that “lead shielding might not be enough,” he says, “I'll do it myself” (a great example of Leaders Eat Last-style servant leadership):


“What does that number mean?” That's a question that tries to set context.

“That means the core is open… the fire is giving off more than twice the radiation of the bomb at Hiroshima… 40 bombs worth by now… it will not stop…”

The actual number was later estimated to be something more like 30,000 roentgen, a very deadly number.

The official death toll of 31 seems to be another example of “data without context.”

The whole series illustrates what happens in a culture of fear, except the culture in question isn't a company or a hospital but an entire country. Everybody feared sending bad news upward to the central committee and to Gorbachev, even though he was a reformer.

The video and the Chernobyl story show that “data-driven decisions” can be terrible if the data are faulty or the context is missing.

What do you think about the need for more context with our data?

