What Didn’t Happen After This Preventable (and Potentially-Fatal) Medication Error


Here is a story that was sent to me by a blog reader, who needs to remain anonymous. I know the blog reader fairly well and believe what's written here to be true. The blog reader is a career-long Lean practitioner who has done a lot of work in healthcare. This isn't a masked story about me and my sister, by the way.


I wanted to share a story about a medication error that took place last week. This story is about my sister, who is a nurse at a large hospital in a major American city.

My sister works the night shift in the Neonatal Intensive Care Unit (NICU). A few nights ago, she was flushing some lines that led to a little baby's IV.

There were two syringes next to each other. 

One syringe had a dose of Fentanyl, an extremely powerful opioid. The other syringe had the saline she was going to use to flush the line.

Another nurse was talking to her while she was performing the work, and she mistakenly picked up the Fentanyl syringe without noticing.

She flushed the line with Fentanyl and immediately realized her mistake. She was able to flush out the Fentanyl with the saline syringe before the patient received a massive dose.

That quantity of Fentanyl would have amounted to over 15 times the intended dose that was supposed to be delivered via the pump. Had this mistake gone unnoticed, the baby likely would not have survived.

I asked her later whether she notified anyone of the error. 

She told me that she didn't say anything because she knew that they would have immediately fired her, without question. 

This really struck me because it's a real-life example from a person very close to me who responded just as you might expect (protecting herself) when working in what is, unfortunately, a professionally unsafe environment.

Unfortunately, in this case, no one learned from the incident and the system remains the same (with the same risks for future patients) as a result of the lack of professional safety. 

I spent some time with her trying to walk through how she might be able to share this information with others so that learning (and improvement) could take place.

Unfortunately, she was so certain of the personal consequences that she decided not to share anything internally. This same organization is rife with physicians who yell at their nursing staff for even daring to challenge them, which is further evidence of the lack of professional safety.

I wanted to share this story to continue to drive home the importance of our Lean culture change work, and of the role we all share in creating the environment that supports and celebrates transparency and a focus on the process.


My thoughts: Whew. It's scary to think how slim the margin of error can be — both for that baby and for the nurse.

The engineer (and Lean practitioner) in me asks, “How is it even possible to grab the wrong syringe? Why are both syringes in the field of work?”

I'll assume that the syringes were labeled. Even then, it's far too easy for a mistake to happen, especially with the distraction factor of another nurse talking to her during this critical work time.

It's easy to sit back and say things like, “Well, she should have double checked the syringe immediately before administering it. There's no excuse!”

Well, we're all human. That's not an excuse. When people ask, “Well, what can we do about human error?” the answer is “Design better systems!”

A better system would incorporate mistake proofing.

More importantly, a better system is one where people can speak up about risks, potential errors, and mistakes like this that could have caused great harm or death.

There's a reason that many hospitals are still working toward a “Just Culture” model (which has a high degree of overlap with Lean thinking). See more posts that reference Just Culture.

What's described above is also referred to as a lack of “psychological safety” by some. You can listen to my podcast with Amy Edmondson on this topic:

https://www.leanblog.org/2020/01/podcast-356-amy-c-edmondson-on-psychological-safety-and-the-fearless-organization/

When people are pressured into covering up errors, it's impossible to learn. Instead of creating a better system where an error like this is less likely to occur, there's the same risk that the error won't be caught immediately next time.

It's easy to say things like, “The nurse should speak up… it's their professional obligation!!!”

That's easy to say when it's not your paycheck, your family, or your mortgage on the line.

Sadly, if the nurse here hadn't caught the mistake, she might have not only been fired… she might have been prosecuted or even convicted for her role in what's arguably a systemic error, as has happened to others.

There are pockets of healthcare that are changing… but we don't have a culture of safety (as a reality, not a vision) in every hospital and health system out there. I'm fortunate to be working with one right now that is working seriously to make a culture of safety a reality.

I'll end the post with two quotes — one from Toyota and one from British healthcare:

“You respect people, you listen to them, you work together.  You don't blame them.  Maybe the process was not set up well, so it was easy to make a mistake.”

Gary Convis, Former President of Toyota Kentucky

And

“Human error is inevitable. We can never eliminate it. We can eliminate problems in the system that make it more likely to happen.”

Sir Liam Donaldson, former Chief Medical Officer of England

What do you think? What is your organization doing to create a safety culture and a safer environment for patients?







4 COMMENTS

  1. I’m reminded of the Aviation Safety Reporting System, run by NASA for the FAA as an independent, anonymous reporting system for aviation-related near-misses. The idea is that because the FAA is a regulatory body, and NASA has no power to fine or punish, NASA can be trusted to protect those reporting issues. When near-misses are reported, trends can be tracked and countermeasures can be put in place.

  2. Hello Mark,

    I really enjoyed reading this blog post because it gives me insight into the healthcare industry and the immense pressure that employees have to experience on a daily basis, on top of ensuring that patients receive the best care possible. My name is Nancy Alvarez and I’m currently taking a Lean Six Sigma Green Belt Certification class at URI, where we are encouraged to read blog posts regarding Lean principles. It is evident that without transparency and a safe environment to share previous mistakes with others, it is impossible to improve on an existing system. When working with a company that has such issues/barriers, what do you feel is the most effective way of demonstrating that transparency will only benefit the industry/company in the long run?
