Featured Articles

Safety Requires a State of Mindfulness

In an effort to make healthcare safer, many healthcare organizations are attempting to adopt the characteristics of high-reliability organizations (HROs), which have achieved impressive safety records despite operating in unforgiving environments. Examples of HROs include nuclear power plants, air traffic control systems, naval aircraft carriers, and wildland firefighting crews. HROs consistently navigate complex, dynamic, and time-pressured conditions in a nearly error-free manner.1,2 Research suggests that HROs achieve this exceptional performance through a collective behavioral capacity to detect and correct errors and to adapt to unexpected events in a changing environment.3-6 

Reliability in healthcare

In healthcare, errors and adverse events are often viewed as deviations from established practices and as system failures. To increase reliability, organizations strive for wider adoption of best practices and improved system performance. We attribute unreliability to unwanted variability in how tasks are performed, and reliability to consistency with established routines. Efforts to improve reliability have therefore focused primarily on error prevention, which requires identifying lapses in care, understanding their causes, and implementing strategies that keep lapses from recurring or causing harm. A lapse in care can be defined broadly as anything that has gone wrong in the care of a patient, regardless of the outcome. While this approach to reliability is certainly worth continuing, it addresses only part of the problem: lapses in care have been found to be highly variable, novel, and often unexpected, which makes preventing them ahead of time difficult.7  

Reliability in HROs

HROs approach reliability from a different angle. They believe that variability in practice, in the form of timely adjustments and moment-to-moment adaptations to work, is exactly what improves reliability.8 While HROs have established procedures to guide their work, they argue that requiring strict compliance with a single standard of performance at all times may not help workers cope with an unexpected event. To deal with unexpected events, HROs remain alert to the possibility of errors and share the collective mindset necessary to detect, understand, and recover from unexpected events before they cause harm.1,3,7 These cognitive processes are driven by a deep, chronic sense of unease that arises from admitting the possibility of failure even with familiar, well-designed, stable procedures.1,3 People in HROs expect surprises and consider them a valuable resource because they encourage learning and discovery and discourage complacency and inertia.1 Workers are empowered to act on surprises to achieve reliable outcomes (first-order problem solving). They are also encouraged to go beyond first-order problem solving by taking action to prevent problem recurrence.9 This second-order problem solving includes reporting the problem to those who are in a position to address its underlying causes, and it is required for lasting improvement.9

Principles of mindfulness

At the core of HROs is a set of principles that enable organizations to focus attention on evolving problems and to address those problems before they escalate.10 These principles, collectively termed mindfulness, affect reliability in a manner different from the strategies traditionally employed by healthcare organizations.1 This state of mindfulness embodies five cognitive processes that capture the essence of HROs:

  • Preoccupation with failure
  • Reluctance to simplify interpretations
  • Sensitivity to operations
  • Commitment to resilience
  • Deference to expertise.1-6

Here, in Part I of a series of articles on mindfulness, the first two elements are explained. Subsequent newsletters will cover the remaining elements, along with the concept of sensemaking and the differences between reliability and repeatability.

Preoccupation with failure

A chronic worry about system failure is a distinctive attribute of HROs.1-6,8,10,11 People in HROs are naturally suspicious of “quiet periods” and reluctant to engage in any activities that are not sensitive to the possibility of error.1 They ask, “What happens when the system fails?” not, “What happens if the system fails?”4 Workers in an HRO possess an intelligent wariness about their work and an enhanced sense of error wisdom and risk awareness.8 They have moved from a mindset of ‘no harm, no foul’ to searching out and reviewing close calls or near failures to address areas of potential risk and prevent future adverse events.10 Clinical applications of a preoccupation with failure include immediate post-code debriefings to continuously identify potential failure points that require correction, or change-of-shift discussions of the most likely ways each patient may decompensate or suffer complications so staff remain on guard.10

This preoccupation with failure is a rather interesting phenomenon given that it runs counter to various human cognitive biases—those glitches in our thinking that cause us to make questionable decisions, err in judgment, and draw incorrect conclusions.12 For example, a normalcy bias makes it difficult for us to engage in “worst-case” thinking and plan for a serious failure or disaster.13 A normalcy bias causes us to assume that, although a catastrophic event has happened to others, it will not happen to us; if it does, we are shocked and unable to cope with it effectively, often underestimating its full effects. Other challenges that make it difficult to maintain a preoccupation with failure include: an optimism bias, which leads to overestimation of favorable outcomes; a valence effect, which causes people to expect that good things are more likely to happen than bad things; and the ostrich effect, which is the tendency to avoid unpleasant information.13

Actual failures in HROs are very rare. With little data about actual failures, HROs encourage and reward error and near-miss reporting. They clearly recognize that the value of remaining fully informed about safety far outweighs any perceived benefit of disciplinary action. Landau and Chisholm5 emphasized this point more than two decades ago when describing a seaman on a Navy nuclear aircraft carrier who broke a vital rule: he did not keep track of all his tools while working on the landing deck. He subsequently found one of his tools missing and immediately reported it. All aircraft en route to the carrier were redirected to land bases until the tool was found. The next day, the seaman was commended for his disclosure during a formal ceremony—a very different response than one might expect if, for example, a lost sponge were reported after an operative procedure, delaying or postponing other scheduled procedures.

HROs work hard to extract the most value from the data they have. They pay close attention to near misses and can clearly see how close they came to a full-blown disaster; less safe organizations consider close calls to be evidence of their successful ability to avoid a disaster.1 HROs work on the assumption that what seems to be an isolated event is likely caused by the confluence of numerous upstream errors.8 Less safe organizations also tend to localize failures (e.g., the problem is in the ICU, so changes are needed in the ICU). HROs generalize even small failures and consider them a lens to uncover weaknesses in other vulnerable parts of the system.1,3 HROs also acknowledge that the accumulation of small failures increases the risk of large failures.

Because HROs focus on failures, they avoid many of the dysfunctional temptations that arise from success, such as complacency, overconfidence, and inertia.4 HROs do not expect success to breed success, and managers do not attribute success to their own abilities or to the organization as a whole. Instead, they are wary of the potential to drift into rote routines during periods of success. Less safe organizations might call this efficiency, but HROs consider this drift a failure because continuous adjustments to changing conditions might not occur.1,4 A preoccupation with success encourages largely mindless behavior, such as rote work habits and overconfidence.1-5

Reluctance to simplify interpretations

Organizations typically handle complex issues by simplifying them, thus ignoring certain aspects. HROs, however, attempt to suppress simplification because it limits their ability to envision all possible undesirable effects as well as the precautions necessary to avoid them.1,4,5 They take nothing for granted; otherwise, every seemingly inconsequential detail that is ignored can accumulate and come rushing to the forefront as a complex problem.4 Rather, HROs pay attention to detail and actively seek to know what they previously did not know.1 They do not concentrate on things that seem certain, factual, explicit, and agreeable to all. Instead, they attempt to uncover things that might disconfirm their hunches and that are unpleasant, uncertain, and disputed. Workers are conditioned to notice more and to strip away the stereotypes that conceal differences hidden in the details. Clinical applications of this principle include resisting the tendency to ascribe only one cause to incidents and errors, and frequently revisiting broad differential diagnoses to determine whether more focused diagnoses can be identified.10

HROs also resist simplification by seeking out different points of view, because differences, not commonalities, hold the key to detecting potential failures.1-4 Diversity also takes the form of checks and balances, from hiring new employees with varied prior experience to novel forms of redundancy. Most often, redundancy involves duplication of work, but it also takes the form of healthy skepticism driven by wariness about claimed competencies and a respectful mindfulness about safety.1 Such skepticism is also deemed necessary to counteract the complacency that many typical redundant systems foster.

Diversity has a potential downside: miscommunication and conflict among workers with differing views. However, HROs are distinguished not only by their resistance to simplification through diverse viewpoints, but also by the way they manage workers with differing views.1,4 While diverse groups have more information on which to base decisions, HROs understand that failed communication and mistrust can lead to withheld information. Thus, HROs place a high value on interpersonal skills, mutual respect, norms that curb arrogance and self-importance, continual negotiation, teamwork, cultivation of credibility, and deference to expertise.1,2,4 HROs also promote trust among diverse groups by fostering the belief that humans are fallible, and that skepticism and diversity are necessary to improve reliability.1

References

  1. Weick KE, Sutcliffe KM, Obstfeld D. Organizing for high reliability: processes of collective mindfulness. Research in Organizational Behavior. 1999;21:81-123.
  2. Vogus TJ, Rothman NB, Sutcliffe KM, Weick KE. The affective foundations of high-reliability organizations. J Organiz Behav. 2014;35(4):592-6.
  3. Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA: Jossey-Bass; 2001.
  4. Leonard M, Frankel A, Simmonds T, Vega K. Achieving Safe and Reliable Healthcare: Strategies and Solutions. Chicago: Health Administration Press; 2004.
  5. Vogus TJ, Welbourne TM. Structuring for high reliability: HR practices and mindful processes in reliability-seeking organizations. J Organiz Behav. 2003;24(7):877-903.
  6. Weick KE, Sutcliffe KM. Managing the Unexpected: Resilient Performance in an Age of Uncertainty, 2nd ed. San Francisco, CA: Jossey-Bass; 2007.
  7. Blatt R, Christianson MK, Sutcliffe KM, Rosenthal MM. A sensemaking lens on reliability. J Organiz Behav. 2006;27(7):897-917.
  8. Reason J. Individual and collective mindfulness. In: Reason J, ed. The Human Contribution. Burlington, VT: Ashgate Publishing Company; 2010:239-63.
  9. Tucker AL, Edmondson AC. Why hospitals don’t learn from failures: organizational and psychological dynamics that inhibit system change. Calif Manage Rev. 2003;45(2):55-72.
  10. Christianson MK, Sutcliffe KM, Miller MA, Iwashyna TJ. Becoming a high reliability organization. Crit Care. 2011;15(6):314-8.
  11. Reason J. Managing the Risks of Organizational Accidents. Burlington, VT: Ashgate Publishing Company; 2000.
  12. Dvorsky G. The 12 cognitive biases that prevent you from being rational. October 9, 2013.
  13. Bryant B. Cognitive biases. Diamond Website Conversion. June 25, 2012.