In healthcare, it's a well-known problem that people often don't speak up to point out risks or to report near misses. It's an organizational culture problem… people are afraid of being blamed, punished, or retaliated against for speaking up.
A 2012 “report released… by the inspector general of the U.S. Department of Health and Human Services found that more than 80 percent of hospital errors go unreported by hospital employees.” (Read More)
And that doesn't even count the near misses and unsafe conditions that never get reported at all.
The Problem With Underreporting Problems
Aviation is supposed to be a field that has fixed this culture (through things like Crew Resource Management and mandatory non-punitive reporting systems). Hear two podcasts I've done on those topics here and here.
But this isn't always the case around the world, especially in cultures with large power differentials between people of different ages or rank. See this article from the WSJ: In Asia's Skies, Mistakes Go Unreported.
Highlights from the story:
Without proper accounting, experts say, airlines can't learn from mistakes and regulators can't properly assess safety risks. Even smaller incidents–like component failures and near misses on the runway–are key bellwethers for major crashes. Left undocumented, botched procedures are left to grow endemic.
…
The ICAO [United Nations' International Civil Aviation Organization] has “recognized (underreporting) as an issue” and is seeking ways to improve data collection, including tracking media reports of incidents that may not have been officially reported, said Anthony Philbin, a spokesman for the U.N. agency. The agency uses its incident reports to help set safety guidelines for the industry.
Adjusted for the number of flights, countries like Indonesia report far fewer incidents than Australia does. As we see in healthcare, a low (or ZERO) rate of reported incidents or sentinel events does NOT mean that the hospital is actually safer. It's more likely an underreporting issue.
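To make that comparison concrete, here's a minimal sketch (my own illustration with made-up placeholder numbers, not the WSJ's actual data) of how you'd normalize reported incidents by flight volume before comparing countries:

```python
# Illustrative sketch only -- the counts below are placeholders, not real data.
def reports_per_100k_flights(reported_incidents: int, flights: int) -> float:
    """Normalize reported incidents by flight volume so countries can be compared."""
    return reported_incidents / flights * 100_000

# Hypothetical numbers, just to show the mechanics of the comparison:
indonesia_rate = reports_per_100k_flights(reported_incidents=50, flights=800_000)
australia_rate = reports_per_100k_flights(reported_incidents=400, flights=900_000)

print(f"Indonesia: {indonesia_rate:.1f} reports per 100k flights")
print(f"Australia: {australia_rate:.1f} reports per 100k flights")
# A much lower reported rate doesn't mean fewer actual incidents --
# it may just mean fewer incidents get reported.
```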
If you don't report problems, you can't learn from them and, therefore, you can't improve.
There seems to be more fear in Asia (and more reason for it):
Another concern is that Asian nations have a history of stiff punishments for people involved in safety incidents — a disincentive for employees to speak up when mistakes occur [or “a culture of saving face” makes it hard to report mistakes.]
After a Henan Airlines crash in 2010 that killed 44, the captain, who survived, was jailed.
Harsh penalties are less common in the U.S., which uses a confidential national reporting system that guarantees aviation employees won't be punished by the Federal Aviation Administration if they report air-safety incidents within 10 days–unless the violators were deliberately negligent or inebriated. U.S. carriers have similar systems in-house that keep reporters' names confidential from airline managers.
Since 1979, the U.S.'s national confidential reporting program has processed over a million anonymous tips. It published 4,565 incident reports last year.
As I've blogged about before, I think that jailing nurses and pharmacists for their roles in systemic errors is NOT the way to improve safety and quality. Even Chris Jerry, whose daughter Emily died as the result of a medication error, doesn't think jailing the pharmacist was the right thing to do (listen to my podcasts with him here and here).
When Warnings Aren't Heeded in Aerospace
Here's another article from the WSJ, this time about a mishap with a Virgin Galactic test spaceship: Ex-Consultant: FAA Didn't Act on Warnings.
From the article:
Federal Aviation Administration officials repeatedly failed to act on safety warnings about an experimental rocket ship backed by billionaire British entrepreneur Richard Branson that crashed in 2014, according to a former agency consultant.
The spaceship broke apart when the pilot mistakenly pulled a handle or lever that released a movable tail surface at the wrong time, throwing off the aerodynamics of the craft.
The design was “inadequate” and lacked mistake proofing (which is, of course, a core Lean mindset — to mistake proof designs and processes). The ship was:
“…lacking fail-safe protections against a pilot mistakenly releasing a movable tail surface at the wrong time–led to the October 2014 event that broke the spaceplane apart roughly 10 miles high and killed the co-pilot.”
The article blames the FAA, but why didn't Richard Branson and company heed warnings or consider that risk?
The extensive collection of documents and other interview summaries released by the NTSB underscores that Virgin Galactic, Scaled Composites and the FAA all recognized the potential for a catastrophic event caused by what is known as a single-point human failure. But over the years, the design remained unchanged and the FAA, without a request from Scaled, issued a waiver in 2013 from its own regulations.
Did they tell the pilot and co-pilot to “be careful” and “don't pull that handle at the wrong time”?
That's not how you ensure safety.
Did they have a warning sign in the cockpit like this?
As I blog about here, warning signs often don't work. We need a better strategy than that.
When Warnings and Recommendations are Ignored in Food Production
Here's a sad, and again deadly, story about the biggest Texas producer of ice cream, Blue Bell. From the WSJ again: The Lessons of a Deadly Ice Cream Recall. The listeria outbreak, tied to Blue Bell products, killed three people in Kansas earlier this year.
Far too often, people in healthcare will excuse deaths that occur due to hospital-acquired infections. They'll say things like, “Well, that's bound to happen; our patients are really sick.” Hospital infections kill far more than a handful of people each year. Yet there's far less talk about it in the media, and no hospital executives are being threatened with prosecution the way Blue Bell execs are (which, again, probably isn't helpful).
Federal records show that Blue Bell failed to follow practices recommended by government and industry groups that might have prevented listeria contamination of ice cream at all three of its main plants. At the same time, some food-safety professionals say the crisis is indicative of insufficient attention, beyond Blue Bell, of the risks of listeria.
In 2008, the FDA issued guidelines.
In 2009, inspection reports showed that there were problems at Blue Bell.
In 2013, Blue Bell KNEW they had listeria in their plants and they didn't do enough:
Beginning in 2013, Blue Bell repeatedly found listeria in its Broken Arrow, Okla., facility–including on floors, a drain and at equipment that fills half-gallon containers with ice cream–indicating the company didn't do enough to identify the underlying cause or eliminate the source, said David Acheson, a food-safety consultant who was previously associate commissioner for foods at the FDA.
Then, people died.
Maybe I'm wrong. Maybe some executive SHOULD go to jail?
Thinking back to a story about peanuts…
“A man who knowingly sold tainted peanut butter that killed 9 people could be sentenced to life in prison”
That's the punishment federal officials have recommended for Stewart Parnell, the former CEO of the Peanut Corporation of America, who knowingly sold truckloads of the condiment from his Georgia plant even though he allegedly knew they were contaminated with Salmonella.
This sort of thing seems to happen way too often. What is society going to do about it? Just jail executives after people die?
Experts say Blue Bell is not the only ice cream maker that has these problems or risks. I hope those other companies are being more proactive.
When Warnings or Concerns Aren't Shared in Healthcare
You might have previously read about the outbreak of a “superbug” at UCLA Medical Center that was first linked to two deaths in February.
In some of these types of cases, where a bug was spread through contaminated scopes, my first thought is that the hospital wasn't following proper protocols. That wasn't the case here, as UCLA was following the recommended cleaning procedures as given by Olympus, the manufacturer of the duodenoscopes in question.
A design flaw in the duodenoscopes made them, apparently, inherently hard to clean.
The LA Times has a new, in-depth story about all of this: A Killer on the Loose.
One day after the initial reports, the FDA reacted quickly:
The incident prompted the FDA to issue a safety alert to all U.S. hospitals the next morning, Feb. 19, warning them to take extra caution when cleaning the scopes.
Was “taking extra caution” really enough if the scopes were so hard to clean? It seems like it was not. Maybe the safety alert should have told hospitals and doctors to stop using them, but that could have jeopardized some lives in a different way (since the scopes are very helpful).
A month later, there was a change to what I'd call the “standardized work” for cleaning the scopes:
U.S. hospitals received new cleaning instructions for duodenoscopes a month later, in March, from the leading manufacturer, Olympus, which controls 85% of the market. The company also recommended a new cleaning brush.
The article includes these chilling details:
After investigating previous outbreaks and receiving injury reports, U.S. health officials also were aware of an infection risk from contaminated scopes and had been working on new cleaning standards since 2011.
So the powers that be were just going to accept that known risk? I understand this mother's reaction:
“It's just shocking that so many people were affected by this without anything being done about it,” said Lori Smith, Young's mother and herself a nurse. “Does business go on as normal and people keep getting infected?”
Food and Drug Administration officials counter that it took time to evaluate the evidence and develop a response to a complex problem.
Back to the “cleaning brush”:
Olympus had taken similar steps more than two years earlier, in Europe. The company warned European hospitals and doctors in January 2013 that lethal bacteria could become trapped at the tip of its scopes. It had issued no warning in the U.S., however.
Didn't Olympus and its leaders have an ethical obligation, if not a legal one, to warn American hospitals and doctors (and the rest of the world outside of Europe)?
When a problem is known and the world stands by and watches while people die as a result of that problem… again, maybe somebody should go to jail, since these problems aren't “accidents” – they're the result of somebody's choices and decisions.
What do you think?
A related story: the river that turned yellow in Colorado (blame the closed mine or blame the EPA, I guess). The EPA was slow to own up to the spill they caused:
Read more:
http://www.wsj.com/articles/mine-busters-at-the-epa-1439336495
Great post, Mark! It’s a complex issue with a lot of psychology mixed up with “common sense.” A hospital executive team just about fell off their chairs when one of the KPIs I suggested for an improvement project to reduce medication errors at the administration stage was the number of actual errors. Sounds like a good metric, right? They were stunned when I suggested we wanted to see the number go UP. After they heard me out, they agreed and prayed the decision wouldn’t find its way into local media.
Sure enough, patient harm went down as a result of REPORTED medication errors going up. Because we finally knew what we were dealing with. And we could finally do decent root cause analysis.
Fear is the enemy of improvement. Blame is the enemy of improvement. And, unfortunately, today’s media and our litigious society are both enemies of improvement.
Yes, the environment of fear and “name, blame, and shame” sure does get in the way of speaking up, which then prevents improvement.
Those hospital executives should have been ready to step up and educate the local media. Otherwise, they’re not being leaders…
A related problem that annoys me is when the written (safety) procedure is never followed but is then used to blame whichever person happened to not follow it when the accident took place. “Standard work” isn’t about having some written document that tells how things are supposed to be done; it is about having procedures that are safe and actually followed. And if the written guidelines are unreasonable, they need to be examined and fixed into something that is safe and will actually be done.
On the flip side, something I really like is mistake-proofing processes. It is so wonderful when we design processes to avoid errors (especially when errors can lead to serious injury or death).
What you described there, “when the written (safety) procedure is never followed but is then used to blame whatever person happened to not follow it when the accident took place” — that’s a HUGE problem in hospitals.
I remember the CMO of Cedars-Sinai beating his chest (after the Quaid twins were given the wrong medication) about how there’s “no excuse” for people not following procedures.
OK, that’s on you as senior managers to make sure there’s an environment where people ARE following procedures… each and every day.