Featured Articles

Using External Errors to Signal a Clear and Present Danger

Chances are you’ve scanned the headlines and read many of the stories about medication errors published in the ISMP Medication Safety Alert!, particularly the tragic errors: the death of a young child who received a massive overdose of zinc due to a miscommunicated compounding process; the readmission and prolonged hospitalization of an elderly man with deep vein thrombosis and pulmonary emboli because he had not been instructed to continue taking warfarin upon discharge a week earlier; the death of an infant who received 5 mg of morphine instead of the prescribed “.5 mg” dose when the naked decimal point was not seen. You’ve learned about many recurring errors: the death of another young man after intrathecal administration of IV vincristine; the death of another elderly patient from misprescribing of a fentanyl patch; the death of another young mother in labor after an epidural bupivacaine infusion was administered IV. There has been no shortage of harmful medication errors for us to relate in our newsletter.

As you’ve read these stories, you’ve probably felt surprised or startled, saddened, anxious, unsettled, and perhaps even angry or frustrated. These initial gut feelings make you “leery” about errors, even if you can’t put your finger on the exact cause of your unease; the airline industry calls them “leemers” for that reason.1 Unfortunately, we have a tendency to gloss over these initial gut feelings and treat many errors as inconsequential in our own lives and work.1 The stories you hear about tragic medication errors may be compelling, but they can feel irrelevant to your practice—a sad story, but not something that could happen to you or at your hospital. People tend to “normalize” the errors that have led to tragic events, and subsequently they have difficulty learning from them.

There are several biases that lead to normalization of errors and thwart our learning from mistakes, particularly the mistakes of others. First, we have a tendency to attribute good outcomes to skill and bad outcomes to sheer bad luck.2 We have a relatively fragile sense of self-esteem and a tendency to protect our professional self-image (and the image of our workplace) by believing the same errors we read about could not happen to us. We tell ourselves it was just terrible luck that led to the bad outcome in another organization—an event soon forgotten by all except the few who were most intimately involved.

Next, we tend to be too optimistic and overconfident in our abilities and systems,2 particularly when assessing our vulnerability to fatal events. We seek confirmation of our expectation that the tragic errors we read about could not happen in our workplace, while avoiding any evidence of serious risk.1-2 We may go through the motions of examining our abilities and systems to determine whether similar errors might happen in our organizations, but in the end we tend to overlook any evidence that suggests trouble (much like confirmation bias, in which we see what we expect to see on a medication label and fail to notice any disconfirming evidence). We subconsciously reach the conclusions we want to draw when it comes to assessing whether our patients are safe.2

To best promote patient safety, it is crucial to seek out information about external errors, to hold on to your initial feelings of surprise and uncertainty when you read about them, and to resist the temptation to gloss over what happened.1 Significant learning can occur only in the brief interval between the initial surprise and unease of reading about an external error and the normalization of that error—convincing yourself that it couldn’t happen to you. If you wait too long, you can easily be convinced that there is nothing applicable to learn. Most opportunities for learning come in brief “ah-ha” moments that must be captured while still connected to your initial feelings of surprise and unease if you are to learn from them and take action.

In our February 25, 1998, and January 13, 2005, newsletters,3-4 we suggested making a New Year’s resolution to learn from published reports of errors, anticipate the same risks in your organization, and make substantial improvements in patient safety. We repeated this recommendation in our November 29, 2007, newsletter5 after writing about repeated mix-ups between heparin vials of varying concentrations (10 units/mL and 10,000 units/mL), which led to the deaths of numerous infants. James Conway, senior vice president of the Institute for Healthcare Improvement, recently published an article in Healthcare Executive outlining the importance of learning from other organizations’ errors.6 Mr. Conway lists ISMP, FDA, The Joint Commission (TJC), the Pennsylvania Patient Safety Authority, and the National Quality Forum (NQF) as reliable sources of information on external errors. The ISMP and Conway articles outline the steps organizations can take to establish and maintain a system for ongoing learning from external tragic medical events. Table 1 summarizes these steps.

Table 1. Steps for Learning About External Errors1-6
- Leaders should convey that high-profile, external events offer necessary learning and should be reviewed.6
- Define the types of events you should monitor and know about (e.g., serious event definition from the NQF).6
- Develop reliable sources of information (e.g., ISMP, FDA, TJC) to determine how and why the errors occurred.5-6
- Assign a specific professional(s) to routinely search the literature for published error experiences.5-6
- Establish group (unit-level, safety committees, administrative team, board) responsibility, with standing space on agendas, to ensure review of published external errors.4-6
- Establish a systematic way to review information about external errors and assess the organization’s vulnerability to similar errors.4-6
- Ask yourself, “Could this event happen here?”6
- Listen to updated reports about events, particularly updates concerning why the errors occurred, and learn how the affected organizations are handling the events, if possible.6
- Determine a workable action plan to address vulnerabilities, and assign staff to ensure the action occurs.5
- Reassess vulnerabilities after the action plan has been implemented.
- Use error stories as persuasive tools to drive improvements.6
- Let others know that you consider an external error to be a “clear and present danger” in your organization, and share the steps you have taken to prevent such an occurrence.6

The only way to make significant safety improvements is to challenge the status quo, to inspire and encourage all staff to track down “bad news” about errors and risk—both internal and external—and to learn from that “bad news” so targeted improvements can be made. We need to shatter the assumption that systems are safe until proven dangerous by a tragic event. No news is not good news when it comes to patient safety. Each organization needs to accurately assess how susceptible its systems are to the same errors that have happened in other organizations, and to acknowledge that the absence of similar errors is not evidence of safety. Personal experience is a powerful teacher, but the price is too high to learn all we need to know firsthand. Learning from the mistakes of others is imperative.

References:

  1. Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.
  2. Montier J. The limits to learning. In: Behavioural Investing: A Practitioner’s Guide to Applying Behavioural Finance. New York, NY: John Wiley & Sons; 2007:65-77.
  3. ISMP. It’s not too late for one more New Year’s resolution. ISMP Medication Safety Alert! 1998;3(4):1.
  4. ISMP. Looking forward: make pro-change your New Year’s Resolution. ISMP Medication Safety Alert! 2005;10(1):1-2.
  5. ISMP. Another heparin error: learning from mistakes so we don’t repeat them. ISMP Medication Safety Alert! 2007;12(24):1-2.
  6. Conway J. Could it happen here? Learning from other organizations’ safety errors. Healthcare Executive. November/December 2008:64, 66-67.