I really don't enjoy writing the type of “Throwback Thursday” post that gets triggered by something in the news reminding me of something I've blogged about in the past.
Sometimes, it seems people are doomed to keep repeating the same preventable errors instead of learning from others, improving systems, and mistake-proofing things in life.
Pathology Mixups
One type of incident is the repeated mixup of patient specimens in anatomic pathology, the type of preventable error (a process problem) that leads to one patient getting an incorrect cancer diagnosis (and delays another patient from getting the right diagnosis). I've blogged about this before, including this “it happened again” post:
Yet Again – A Patient Harmed as Hospital Lab Mixes Up Specimens
Sadly, that post includes links to other posts about the same problem occurring in different hospitals. I've been in enough anatomic pathology labs to see how this process is often not error-proofed. When I once pointed out the risk of mixed-up specimens, a lab manager cheerfully said, “It's OK, my people are very careful.” Being careful is not enough.
I haven't blogged about it as much in recent years, but the mix-ups keep occurring.
Tearing Down the Wrong Buildings
Back in 2016, I wrote a second blog post about the wrong building being demolished… can you imagine? Here's another case where being careful isn't enough.
#Throwback Thursday: Wrong House Gets Demolished (And It Happened Again)
There was another case like this in 2010… and it will probably happen again.
Should Have Posted a “Caution: Don't Knock Our House Down” Sign?
This year, I've seen two buildings in the Orlando area that were set for demolition. One was a series of connected “strip mall” buildings and one was a house. Both had DEMO spray-painted in big letters. That's probably more practical than spray-painting “NO!” on every other building :-)
There's always the risk that somebody spray-paints DEMO on the wrong building, but marking buildings that way probably reduces the risk of error, as long as employees are trained to ONLY knock down buildings that are marked as such, and as long as people take precautions to error-proof against marking the wrong building in the first place.
Flashback Friday: Demolition Errors, Mistake Proofing, and Healthcare
THIS IS NOT A DRILL. Or is it?
You might remember the incident from earlier this year where a test of the emergency messaging system in Hawaii mistakenly sent out a warning about an inbound nuclear missile that said: “THIS IS NOT A DRILL.”
The Response to the Hawaii False Alarm Can't End With Firing Someone
Well, it was a drill. A combination of “bad process” and “bad technology” led to that erroneous message and all of the stress it caused. Society likes to blame individuals and fire people, though. Does that really help prevent future errors? If you improved the systems and kept the same people, they probably wouldn't (or, ideally, couldn't) make the same mistake.
The other day, I saw a news alert about an “active shooter” at Walter Reed Hospital in Maryland. Oh no… this happens far too often…
It turns out it was ANOTHER drill error…
Navy: Notification Error Led to Walter Reed Shooter Report
From the article:
“In a statement, the Navy said that the Naval Support Activity Bethesda notification system was inadvertently activated while preparing for an upcoming drill, without the words ‘exercise' or ‘drill.' People who saw the notification contacted security at the Maryland base, which launched an active shooter response.

“NSA Bethesda spokesman Jeremy Brooks said the incident ‘was an accident. It was not something that was planned.'”
There was about an hour of stress before officials realized there was no shooter.
“A nurse at Walter Reed, Mary Lock, said she and other employees remained locked down in a second-floor clinic for an hour after hearing this repeated announcement over a loudspeaker: ‘Active shooter, this is not a drill!'”
I hope the response here involves asking “what happened?” and “how did that happen?” instead of asking “who screwed up?”
How can other organizations learn from these mistakes so they aren't repeated again somewhere else?
“Who screwed up?” is useful for leaders because it sustains the power and fear that they deem necessary for effective leadership and organizational control. Asking “What happened?” and “How did that happen?” doesn't fulfill that need, and it makes leaders feel weak in their own eyes and as seen by their peers and others. This is a centuries-old tradition that persists despite all the education and training. That continuing pattern of negative leadership behavior suggests the true nature of the problem has been poorly understood, hence the ineffectiveness of the education and training. Traditions of the past are ever-present and remain a major barrier to more widespread adoption of Lean management.
We humans aren’t always comfortable with ambiguity. We like quick, definitive answers about what caused such a catastrophic event, which often leads to a person becoming the scapegoat. Who made the error? When that person is fired or written up, it gives others a sense of relief that something has been done to fix the problem. It’s not fun to hear, or easy to explain, that rather than fixing the problem right now, they need to take the time to find the holes in the multiple layers of Swiss cheese that led to the catastrophic event and fill those holes. That takes more time and energy than just writing someone up and hoping they won’t make the same error again. It’s a shame, because if they did take that time, they might find that the person is unintentionally a “single point of failure” and that the outcome rests on their shoulders. That’s a lot of unnecessary pressure on one person to not make an error.