healthcare columnist who died from a chemotherapy overdose, a Florida man who had the wrong leg amputated, and an infant in Texas who died from a drug overdose. Coming one right after the other, these tragedies jolted the healthcare community into action. In 1997, The Joint Commission began to require all healthcare organizations participating in its accreditation program to have a process in place for analyzing sentinel events, and it created a voluntary reporting program to gather information and increase knowledge about sentinel events (The Joint Commission, 2012).

The Report: To Err Is Human

In 1999, the secrecy surrounding medical errors was broken when the Institute of Medicine (IOM; now the National Academy of Medicine) published To Err Is Human: Building a Safer Health System. The report captured worldwide attention, and efforts to address the problem began in earnest. This signature report called for increased national awareness, together with research, practice, training, and collaboration, to reduce the incidence of accidental harm to patients (Bates & Singh, 2018; Mueller et al., 2019; Oster & Braaten, 2021). Although the report focused on hospitals, the IOM's definition of a medical error (the failure of a planned action to be completed as intended, or the use of the wrong plan to achieve an aim) is as relevant to behavioral health settings as to medical settings.

The report recounted two separate studies that documented as few as 44,000 and as many as 98,000 deaths annually from preventable medical accidents. More recent studies, drawing on data extracted from inpatient health and medical records, have estimated between 210,000 and 400,000 such deaths annually (Makary & Daniel, 2016; Panagioti et al., 2019). Even at the low end, these figures placed preventable medical accidents as the eighth leading cause of death in the U.S., ahead of car crashes (43,000 deaths annually), breast cancer (42,300 deaths annually), and AIDS (16,516 deaths annually). These IOM statistics did not include deaths in ambulatory care settings, behavioral health facilities, or long-term care settings, and they did not account for hospital-acquired infections. When deaths from these causes are added, preventable harm to patients in the U.S. healthcare system jumps to the third leading cause of death (Makary & Daniel, 2016).

To achieve safer care, the IOM cautioned that American medicine would have to look beyond its traditional knowledge base and learn from other disciplines, such as engineering, as well as from other high-risk industries, such as aerospace and aviation (Oster & Braaten, 2021). The report introduced concepts that have the potential to transform healthcare's approach to patient safety:
● Most harm to patients occurs because of flaws in the system, not because of individual performance.
● Harm will be reduced only as safer systems of care are designed.
● For safer systems of care to be designed, the culture of healthcare must change.

Mark Chassin, an author of the 1999 report, suggests that hospitals and healthcare systems have made critical efforts in the past two decades to reduce the incidence of unintended harm to patients. However, Chassin (2019) maintains that a “one size fits all” approach to systemic change is not realistic for widespread adoption. To elevate the patient safety movement, particularly over the next two decades, he suggests three areas of critical focus:
● Focus on zero harm
● Commit to restructuring organizational culture for uninhibited reporting of safety problems
● Incorporate process enhancement approaches that use Six Sigma, change management, and lean principles

The error and safety movement has since moved toward new concepts, including the idea that incidents of nonserious harm should not be minimized, because these nonserious incidents have the unfortunate potential to become serious incidents that lead to significant patient harm (Young et al., 2020). The movement assumes that humans are fallible and that errors are to be expected, even in the best organizations. The underlying premise of the systems approach is that the human condition (namely, that human beings are fallible and make mistakes) cannot be changed. However, the environment (i.e., the conditions under which humans work) can be modified by building mechanisms into the system that prevent harm or lessen the effects of human error.

Human Factors: A New Look at Error

The IOM report drew heavily from the field of human factors. Also called human factors research or human factors engineering, this multidisciplinary field draws on psychology, engineering, industrial design, statistics, and operations research to understand the interactions among people, technology, and work environments and to enhance human performance in the workplace. In other words, how do people interact with technology and process systems, and, most importantly, how can these interactions be studied, understood, and improved to reduce adverse events for patients? From a safety perspective, the field examines work processes, equipment, and devices, and then redesigns them to accommodate the physical and cognitive limitations of human beings.

The human factors approach does not excuse individual incompetence or negligence. Rather, it attributes harm to insufficient layers of protection embedded in work processes, emphasizing distraction, communication failures, and fatigue as major contributors to individual mistakes. According to Oster and Braaten (2021), the science of human factors considers a variety of factors, such as:
● Components of human–system interfaces
● Working environments: organizational, social, and physical
● The precise nature of the work being done
● Individual characteristics, including performance factors
Questions such as how the human factor influences the critical work done in healthcare are central to human factors thinking. For example, how do providers and frontline staff perform multiple tasks simultaneously without sacrificing accuracy and skill? This question is critical, given the automatic mental patterns that can govern the actions of staff who perform similar tasks over and over again. These professional habits can be quite effective, though they may need to be analyzed if they create conditions for error. The science of human factors attempts to circumvent habits and cognition that can create risky situations. One approach is a mindset of anticipation: consciously anticipating harm or mistakes is fundamentally different from the organizational pattern of simply reacting to error (Oster & Braaten, 2021). Another key concept derived from human factors is the active development of a new organizational mindset, one that identifies system failures rather than merely identifying human failures. Of course, people are responsible for their actions within a healthcare setting; however, the system, or lack thereof, is also a considerable determining factor in adverse patient events. Gravitating to a systems perspective is essential because healthcare in general, and nursing in particular, are operations that are perpetually exposed to frequent interruptions of all kinds. These interruptions create risk for patients, and a systems approach seeks to ameliorate this facet of the system. For example, according to Oster and Braaten (2021), nurses are interrupted approximately 12 times each working hour, and many of these interruptions occur at critical moments that can affect patient safety.