
Hughes RG, editor. Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Rockville (MD): Agency for Healthcare Research and Quality (US); 2008 Apr.

Chapter 3. An Overview of To Err is Human: Re-emphasizing the Message of Patient Safety

Introduction

On November 29, 1999, the Institute of Medicine (IOM) released a report called To Err is Human: Building a Safer Health System.1 The IOM released the report before the intended date because it had been leaked, and one of the major news networks was planning to run a story on the evening news.2 Media throughout the country recognized this opportunity for a headline story describing a very large number of hospital deaths from medical errors—possibly as many as 98,000 per year. The problem in other care settings was unknown, but suspected to be great.

The search was on to find out who was to blame and how to fix the problem. Congressional hearings were subsequently held. Governmental agencies, professional groups, accrediting organizations, insurers, and others quickly responded with plans to define events and develop reporting systems. Health care organizations were put on the defensive. Recognizing that individual accountability is necessary for the small proportion of health professionals whose behavior is unacceptable, reckless, or criminal, the public held organizational leadership, boards, and staff accountable for unsafe conditions. Yet imposing reporting requirements and holding people or organizations accountable do not, by themselves, make systems safer.

What was often lost in the media attention to hospital deaths from medical errors cited by To Err is Human was the original intent of the IOM Committee on Quality Health Care in America, which developed the report. That committee believed it could not address the overall quality of care without first addressing a key but almost unrecognized component of quality: patient safety. The committee’s approach was to emphasize that an “error” resulting in patient harm is not a property of health care professionals’ competence, good intentions, or hard work. Rather, the safety of care—defined as “freedom from accidental injury”3 (p. 16)—is a property of a system of care, whether a hospital, primary care clinic, nursing home, retail pharmacy, or home care, in which specific attention is given to ensuring that well-designed processes of care prevent, recognize, and quickly recover from errors so that patients are not harmed.

This chapter focuses on the principles described in the IOM report, many of which can be mapped to what are now called safe practices4 and all of which are valuable guides. This chapter is not intended to address the growing body of evidence; rather, the chapter summarizes the starting point—the IOM recommendations based on the literature and the knowledge of the committee members who developed the report.

Moving the Focus From Errors to Safety

Errors occur in health care, as they do in every other complex system that involves human beings. The message in To Err is Human was that preventing death and injury from medical errors requires dramatic, systemwide changes.1 Among three important strategies—preventing, recognizing, and mitigating harm from error—the first, implementing actions that prevent error, has the greatest potential effect, just as prevention does in public health.

The IOM committee recognized that simply calling on individuals to improve safety would be as misguided as blaming individuals for specific errors. Health care professionals have customarily viewed errors as a sign of an individual’s incompetence or recklessness. As a result, rather than learning from such events and using information to improve safety and prevent new events, health care professionals have had difficulty admitting or even discussing adverse events or “near misses,” often because they fear professional censure, administrative blame, lawsuits, or personal feelings of shame. Acknowledging this, the report put forth a four-part plan that applies to all who are, or will be, at the front lines of patient care; clinical administrators; regulating, accrediting, and licensing groups; boards of directors; industry; and government agencies. It also suggested actions that patients and their families could take to improve safety.

The committee understood the need to develop a new field of health care research, a new taxonomy of error, and new tools for addressing problems. It also understood that responsibility for taking action could not be borne by any single group or individual and had to be shared by health care organizations, groups that influence regulation, payment, legal liability, and education and training, as well as patients and their families. The report called on Congress to create a National Center for Patient Safety within the Agency for Healthcare Research and Quality to develop new tools and patient care systems that make it easier to do things right and harder to do things wrong. This handbook is a direct result of the implementation of those recommendations.

Improving Safety by Understanding Error

Every day, physicians, advanced practice nurses, nurses, pharmacists, and other hospital personnel recognize and correct errors and usually prevent harm. Errors, defined as “the failure of a planned action to be completed as intended or the use of a wrong plan to achieve an aim,”1 do not all result in injury or harm. Errors that do cause injury or harm are sometimes called preventable adverse events—that is, the injury is thought to be due to a medical intervention, not the underlying condition of the patient. Errors that result in serious injury or death, considered “sentinel events” by the Joint Commission (formerly the Joint Commission on Accreditation of Healthcare Organizations [JCAHO]),5 signal the need for an immediate response, analysis to identify all factors contributing to the error, and reporting to the appropriate individuals and organizations7 to guide system improvements.

The key question for the IOM, as for many health professionals now, was what could be done to improve safety. To differentiate between individual factors and system factors, the report distinguished between the “sharp” end of a process, in which the event occurs (e.g., administration of a fatal wrong dose of medication, a mishap during surgery), and the “blunt” end, in which many factors (called latent conditions), which may have seemed minor, have interacted and led to an error.6 These latent conditions may be attributable to equipment design or maintenance, working conditions, process designs that require too many handoffs, failures of communication, and so forth.7–9

Leape8 greatly enhanced our understanding of errors by distinguishing between two types of cognitive tasks that may result in errors in medicine. The first type of task occurs when people engage in well-known, oft-repeated processes, such as driving to work or making a pot of coffee. Errors may occur while performing these tasks because of interruptions, fatigue, time pressure, anger, distraction, anxiety, fear, or boredom. By contrast, tasks that require problem solving are done more slowly and sequentially, are perceived as more difficult, and require conscious attention. Examples include making a differential diagnosis and readying several types of surgical equipment made by different manufacturers. Errors here are due to misinterpretation of the problem that must be solved or to lack of knowledge. Keeping these two kinds of tasks in mind is helpful for understanding the multiple reasons for errors and is the first step in preventing them.

People make errors for a variety of reasons that have little to do with lack of good intention or knowledge. Humans have many intellectual strengths (e.g., large memory capacity and an ability to react creatively and effectively to the unexpected) and limitations (e.g., difficulty attending carefully to several things at once and generally poor computational ability, especially when tired).12 Improving safety requires respecting human abilities by designing processes that recognize human strengths and weaknesses.

There are many opportunities for individuals to prevent error. Some actions are clinically oriented and evidence-based: communicating clearly to other team members, even when hierarchies and authority gradients seem to discourage it; requesting and giving feedback for all verbal orders; and being alert to “accidents waiting to happen.” Other opportunities are broader in focus or address the work environment and may require clinical leadership and changing the workplace culture: simplifying processes to reduce handoffs and standardizing protocols; developing and participating in multidisciplinary team training; involving patients in their care; and being receptive to discussions about errors and near misses by paying respectful attention when any member of the staff challenges the safety of a plan or a process of care.

However, large, complex problems require thoughtful, multifaceted responses by individuals, teams, and organizations. That is, preventing errors and improving safety require a systems approach to the design of processes, tasks, training, and conditions of work in order to modify the conditions that contribute to errors. Fortunately, there is no need to start from scratch. The IOM report included some guidance based on what was known at the time, and other specific evidence has accumulated since then that can be put into practice today. Designing for safety requires a commitment to safety, a thorough knowledge of the technical processes of care, an understanding of likely sources of error, and effective ways to reduce errors.

A Report From the Trenches—Systems, not Shame

Nurses sometimes comment:

  • “We are really short-staffed. Sometimes I am so busy and distracted that I am sure I must make mistakes when calculating the doses of meds. I haven’t killed anyone, but I know when I’ve made a mistake. How can I make sure I don’t make errors?”
  • “I was supposed to administer chemotherapy to a patient. Even though I tried hard, I couldn’t figure out from the chart what kind of cancer the patient had. What can I do to make sure this sort of thing doesn’t happen again?”
  • “There is a piece of equipment on our unit that is an accident waiting to happen. The experienced staff knows about it and has learned how to work around it, but what happens when new staff are assigned?”

These types of questions are by no means unusual. Partly because of its sheer complexity and the number of different individuals with different training and approaches, health care is prone to harm from errors—especially in operating rooms, intensive care units (ICUs), and emergency departments where there is little time to react to unexpected events—and consequences can be very serious. Although most early studies focused on the hospital setting, medical errors present a problem in all settings, including outpatient surgical centers, physician offices and clinics, nursing homes, and the home, especially when patients and families are asked to use increasingly complicated equipment.

Patients should not be harmed by the health care system that is supposed to help them, but the solution does not lie in assigning blame or urging health professionals to be more careful. In what seems to be a simple example, an ICU nurse was wheeling a patient on a gurney to radiology when his knee struck a fire extinguisher hanging on the wall, resulting in the patient needing extra care. In response, the nurse may have been scolded by her supervisor and told to be more careful, or punished in some other way; everyone would feel the problem had been solved. Yet, would that make the hospital safer? Would it prevent other events that are similar but slightly different in circumstances from happening with other staff and patients in other units? The answer is an emphatic no.

Improving safety arises from attention to the often multiple latent factors that contribute to errors and, in some cases, to injury. In the above example, such factors included: 1) the nurse having to move the patient herself because transport had never arrived; 2) a change in hospital policy so that only one person instead of two guides gurneys; 3) the failure to mount the fire extinguisher in a recessed niche; 4) the decision to transport a seriously ill patient rather than having mobile equipment come to him, requiring extra “handoffs” and opportunities for injury; 5) poor gurney design, making steering difficult; and possibly still other factors.

The IOM’s Four-Part Message

The IOM committee sought what could be learned from other disciplines and applied in health care by clinical and administrative leadership. It described actions that health care professionals can take now in their own institutions, whether they are new trainees, experienced clinical leaders, or instructors. The major thrust of the report was a four-part plan intended to provide financial and regulatory incentives for a safer health care system and a systematic way to integrate safety into the process of care (the focus of this chapter). The four parts of the IOM recommendations are described below:

  • Part 1: National Center for Patient Safety – The IOM recommended the creation of a National Center for Patient Safety in the U.S. Department of Health and Human Services’ Agency for Healthcare Research and Quality (AHRQ), because health care is a decade or more behind other high-risk industries in its attention to ensuring basic safety, establishing national safety goals, tracking progress in meeting them, and investing in research to learn more about preventing mistakes. This center would also serve as a clearinghouse and source of effective practices to be shared broadly.
  • Part 2: Mandatory and Voluntary Reporting Systems – To learn about care associated with serious injury or death and to prevent future occurrences, the IOM recommended establishing a nationwide, mandatory public reporting system for the most serious events, with Federal legislation to protect the confidentiality of certain other information (e.g., reports of medical mistakes that have no serious consequences). The intent was to encourage the growth of voluntary, confidential reporting systems so that practitioners and health care organizations could learn about and correct problems before serious harm occurs.
  • Part 3: Role of Consumers, Professionals, and Accreditation Groups – The IOM believed that fundamental change would require pressure and incentives from many directions, including public and private purchasers of health care insurance, regulators (including the Food and Drug Administration), and licensing and certifying groups. A direct result was the announcement of new safety standards from the Joint Commission and the National Quality Forum report Safe Practices for Better Healthcare: A Consensus Report.10
  • Part 4: Building a Culture of Safety – The IOM urged health care organizations to create an environment in which safety becomes a top priority. The report stressed the need for leadership by executives and clinicians and for accountability for patient safety by boards of trustees. In particular, it urged that safety principles known in other industries be adopted, such as designing jobs and working conditions for safety; standardizing and simplifying equipment, supplies, and processes; and avoiding reliance on memory. The report stressed medication safety in part because medication errors are so frequent11 and in part because a number of evidence-based practices were already known and needed wider adoption. Though the levels of evidence for each category varied at the time of publication, the members of the committee believed that all were important places to begin improving safety.

The committee recognized that some actions could be taken at the national level, as described in the recommendations contained in Parts 1–3. Yet if patient safety were really to improve, the committee knew it would take far more than reporting requirements and regulations. Creating and sustaining a culture of safety (Part 4) would require continuing local action by thousands of health care organizations and the individuals working in these settings at all levels of authority. Hospital leadership must provide resources and time to improve safety and foster an organizational culture that encourages recognition of and learning from errors. A culture of safety cannot develop without trust, keen observation, and extensive knowledge of care processes at all levels, from those on the front lines of health care to those in leadership and management positions.

Basic Concepts in Patient Safety

Opportunities to improve safety have been drawn from numerous disciplines such as engineering, psychology, and occupational health. The IOM report brought together what had been learned in these fields and then applied the opportunities to health care, as described in the nine categories that follow.

1. User-Centered Design

Understanding how to reduce errors depends on framing likely sources of error and pairing them with effective ways to reduce them. “User-centered design” refers to designing processes and technologies that build on human strengths and avoid human weaknesses.12 The first strategy of user-centered design is to make things visible, including the conceptual model of the process, so that the user can determine what actions are possible at any moment: for example, how to return to an earlier step, how to change settings, and what is likely to happen if a step in a process is skipped. Another principle is to incorporate affordances, natural mappings, and constraints into health care. Although the terms are unfamiliar, they apply surprisingly easily to common everyday tasks, both in and out of the workplace.
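
The “make things visible” principle translates naturally into software. The minimal Python sketch below uses a hypothetical infusion-pump state model (every state and action name is invented for illustration) to show an interface that always displays the current state and the actions possible from it, so the user never has to guess the device’s conceptual model.

```python
"""Illustrative sketch: making system state and possible actions visible.
The infusion-pump states and actions are hypothetical."""

# Map each state to the actions available from it, so the interface can
# always show the user what is possible right now.
ACTIONS = {
    "idle":    ["load syringe"],
    "loaded":  ["start infusion", "unload syringe"],
    "running": ["pause infusion", "change rate"],
    "paused":  ["resume infusion", "stop infusion"],
}


def show_status(state: str) -> None:
    """Display the current state and every action possible from it,
    rather than leaving the user to infer the device's model."""
    print(f"state: {state}")
    for action in ACTIONS[state]:
        print(f"  available: {action}")


show_status("loaded")
# state: loaded
#   available: start infusion
#   available: unload syringe
```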

An affordance is a characteristic of equipment or workspace that communicates how it is to be used, such as a push bar on an outward-opening door that shows where to push, or a telephone handset that is uncomfortable to hold in any but the correct position. Marking the correct limb before surgery is an affordance that has been widely adopted. Natural mapping refers to the relationship between a control and its movement; for example, in steering a car to the right, one turns the wheel right. Other examples include using a louder sound or a brighter light to indicate a greater amount.

Constraints and forcing functions guide the user to the next appropriate action or decision. A constraint makes it hard to do the wrong thing. A forcing function makes it impossible to do the wrong thing. For example, one cannot start a car that is in gear. Forcing functions include the use of special luer locks for syringes and indwelling lines that have to be matched before fluid can be infused, and different connections for oxygen and other gas lines to prevent their being inadvertently switched. Removing concentrated potassium chloride from patient units is a (negative) forcing function because it should never be administered undiluted, and preparation should be done in the pharmacy.
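
For readers who also design or evaluate clinical software, the same idea can be sketched in code. The following minimal Python example, with entirely hypothetical names, treats two gas outlets as distinct types so that the wrong connection is rejected before anything flows: a software analogy to physically incompatible connectors, not a description of any real system.

```python
"""Illustrative sketch of a software "forcing function," by analogy with
incompatible gas-line connectors. All names here are hypothetical."""

from dataclasses import dataclass


@dataclass(frozen=True)
class OxygenOutlet:
    """Wall outlet that supplies oxygen only."""
    flow_lpm: float


@dataclass(frozen=True)
class MedicalAirOutlet:
    """Wall outlet that supplies medical air only."""
    flow_lpm: float


def connect_oxygen_mask(outlet: OxygenOutlet) -> str:
    """Accepts only an OxygenOutlet; a static type checker flags any
    other argument, and the runtime guard below rejects it as well."""
    if not isinstance(outlet, OxygenOutlet):
        raise TypeError("an oxygen mask connects only to an oxygen outlet")
    return f"mask connected at {outlet.flow_lpm} L/min oxygen"


print(connect_oxygen_mask(OxygenOutlet(flow_lpm=2.0)))  # correct connection

try:
    connect_oxygen_mask(MedicalAirOutlet(flow_lpm=2.0))  # wrong connection
except TypeError as err:
    print("blocked:", err)
```

As with the luer locks and gas fittings above, the design goal is that the wrong action is not merely discouraged but cannot be completed.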

2. Avoid Reliance on Memory

The next strategy is to standardize and simplify the structure of tasks to minimize the demand on working memory, planning, or problem-solving, including the following two elements:

  • Standardize process and equipment. Standardization reduces reliance on memory and allows newcomers who are unfamiliar with a given process or device to do the process or use a device safely. For example, standardizing device displays (e.g., readout units), operations, and doses is important to reduce the likelihood of error. Other examples of standardizing include standard order forms, administration times, prescribing protocols, and types of equipment. When devices or medications cannot be standardized, they should be clearly distinguishable. For example, one can identify look-alike, but different, strengths of a narcotic by labeling the higher concentration in consistent ways, such as by shape and prominent labeling.
    When developed, updated, and used wisely, protocols and checklists can enhance safety. Protocols for the use of anticoagulants and perioperative antibiotics have gained widespread acceptance. Laminated dosing cards that include standard order times, doses of antibiotics, formulas for calculating pediatric doses, and common chemotherapy protocols can reduce reliance on memory.13 (An illustrative sketch following this list shows one way such a standardized calculation can be encoded.)
  • Simplify key processes. Simplifying key processes can minimize problem-solving and greatly reduce the likelihood of error. Simplifying includes reducing the number of steps or handoffs that are needed. Examples of processes that can usually be simplified are writing an order, then transcribing and entering it in a computer, or having several people record and enter the same data in different databases. Other examples of simplification include limiting the choice of drugs and dose strengths available in the pharmacy, maintaining an inventory of frequently prepared drugs, reducing the number of times a day a drug is administered, keeping a single medication administration record, automating dispensing, and purchasing equipment that is easy to use and maintain.
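
To make the standardization point concrete in software terms, here is a minimal, hedged Python sketch of encoding a weight-based dosing protocol once, so the formula and its ceiling live in one standardized place rather than in each clinician’s memory. The drug name, per-kilogram dose, and maximum are invented for illustration and are not clinical guidance.

```python
"""Illustrative sketch: a weight-based dosing protocol encoded once.
The drug name and all numbers are hypothetical, not clinical guidance."""

from dataclasses import dataclass


@dataclass(frozen=True)
class WeightBasedProtocol:
    drug: str
    mg_per_kg: float  # standardized per-kilogram dose
    max_mg: float     # hard ceiling, acting as a built-in constraint

    def dose_mg(self, weight_kg: float) -> float:
        """Compute the dose, rejecting implausible inputs and capping the
        result so the protocol, not memory, enforces the limit."""
        if not 0 < weight_kg < 300:
            raise ValueError(f"implausible patient weight: {weight_kg} kg")
        return min(self.mg_per_kg * weight_kg, self.max_mg)


# Defined once, reviewed once, reused everywhere:
protocol = WeightBasedProtocol(drug="examplomycin", mg_per_kg=10.0, max_mg=500.0)

print(protocol.dose_mg(18.0))  # 180.0 -- straightforward weight-based case
print(protocol.dose_mg(80.0))  # 500.0 -- the encoded cap, not mental math, decides
```

The design choice mirrors the laminated dosing card: a single, visible, standardized statement of the rule that newcomers can apply safely without recalculating it from memory.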

3. Attend to Work Safety

Conditions of work are likely to affect patient safety. Factors that contribute to worker safety in all industries studied include work hours, workloads, staffing ratios, sources of distraction, and shift changes (which affect one’s circadian rhythm). Systematic evidence about the relative importance of these factors is growing, with particular emphasis on nurse staffing.14–16

4. Avoid Reliance on Vigilance

Individuals cannot remain vigilant for long periods of time. Approaches for reducing the need for vigilance include providing checklists and requiring their use at regular intervals, limiting long shifts, rotating staff, and employing equipment that automates some functions. The need for vigilance can also be reduced by using signals such as visual and auditory alarms, and well-designed equipment provides information about the reason for an alarm. There are pitfalls in relying on automation, however: a user may learn to ignore alarms that are often wrong, may become inattentive or inexpert in a given process, or may not see the effects of errors until it is too late to correct them.
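
The point that good equipment reports the reason for an alarm, rather than sounding an undifferentiated tone, can be sketched briefly in code. The monitor logic and every threshold below are hypothetical, chosen only to illustrate the design principle.

```python
"""Illustrative sketch: an alarm that states *why* it fired, so staff
need not stay vigilant and guess. All thresholds are hypothetical."""


def check_vitals(heart_rate: int, spo2: int) -> list[str]:
    """Return a human-readable reason for each limit violation instead
    of a single anonymous alarm condition."""
    reasons = []
    if heart_rate < 40:
        reasons.append(f"heart rate {heart_rate}/min below lower limit of 40")
    elif heart_rate > 140:
        reasons.append(f"heart rate {heart_rate}/min above upper limit of 140")
    if spo2 < 90:
        reasons.append(f"SpO2 {spo2}% below lower limit of 90%")
    return reasons


for reason in check_vitals(heart_rate=150, spo2=86):
    print("ALARM:", reason)
# ALARM: heart rate 150/min above upper limit of 140
# ALARM: SpO2 86% below lower limit of 90%
```

An alarm that explains itself also mitigates the automation pitfall noted above: a stated reason can be checked quickly, so false alarms are easier to recognize and true ones harder to ignore.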

5. Train Concepts for Teams

People work together throughout health care in multidisciplinary teams, whether in a practice; for a clinical condition; or in operating rooms, emergency departments, or ICUs. In an effective interdisciplinary team, members come to trust one another’s judgments and expertise and attend to one another’s safety concerns. Team training in labor and delivery and hospital rapid response teams are examples. The IOM committee believed that, whenever possible, training programs and hospitals should establish interdisciplinary team training.

6. Involve Patients in Their Care

Whenever possible, patients and their family members or other caregivers should be invited to become part of the care process. Clinicians must obtain accurate information about each patient’s medications and allergies and make certain this information is readily available at the patient’s bedside. In addition, safety improves when patients and their families know their condition, treatments (including medications), and technologies that are used in their care.

At the time of discharge, patients should receive a list of their medications, doses, dosing schedule, precautions about interactions, possible side effects, and any activities that should be avoided, such as driving. Patients also need clear written information about the next steps after discharge, such as followup visits to monitor their progress and whom to contact if problems or questions arise.

Family caregivers deserve special attention in terms of their ability to provide safe care, manage devices and medications, and respond safely to patient needs. Yet they may themselves be affected by physical, health, and emotional challenges; lack of rest or respite; and other responsibilities (including work, finances, and other family members).

Attention is now being given to problems resulting from limited patient and family health literacy. For example, information may be too complex to absorb, may be phrased in unfamiliar language (even for educated, English-speaking patients), and may be frightening. A simple example is rapidly given instructions on home care of a Foley catheter when, as often occurs, the patient is being discharged shortly after surgery and knows nothing about sterile technique or the design of the device. Another ubiquitous example is the warnings and dosage information on medication bottles, which many patients do not understand well enough to apply.

7. Anticipate the Unexpected

The likelihood of error increases with reorganization, mergers, and other organization-wide changes that result in new patterns and processes of care. Some technologies, such as computerized physician order entry systems (CPOE), are engineered specifically to prevent error. Despite the best intentions of designers, however, all technology introduces new errors, even when its sole purpose is to prevent errors. Indeed, future failures cannot be forestalled by simply adding another layer of defense against failure.17–19 Safe equipment design and use depend on a chain of involvement and commitment that begins with the manufacturer and continues with careful attention to the vulnerabilities of a new device or system. Health care professionals should expect any new technology to introduce new sources of error and should adopt the custom of automating cautiously, always alert to the possibility of unintended harm, and should test these technologies with users and modify as needed before widespread implementation.
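
One way software teams act on “test these technologies … before widespread implementation” is with automated regression tests that probe a new feature for the new errors it can introduce. The sketch below is hypothetical: a convenience feature that normalizes doses to milligrams, and a test written to expose the double-conversion error such automation could introduce.

```python
"""Illustrative sketch: testing new automation before deployment.
The order-entry function, units, and values below are hypothetical."""


def to_milligrams(dose: float, unit: str) -> float:
    """New convenience feature: normalize ordered doses to milligrams."""
    if unit == "g":
        return dose * 1000.0
    if unit == "mg":
        return dose
    raise ValueError(f"unknown unit: {unit}")


def test_no_double_conversion() -> None:
    """A dose already in milligrams must pass through unchanged; a naive
    version that multiplied every dose by 1000 would fail here -- exactly
    the new error class the new feature could introduce."""
    assert to_milligrams(250.0, "mg") == 250.0
    assert to_milligrams(0.25, "g") == 250.0


test_no_double_conversion()
print("all checks passed")
```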

8. Design for Recovery

The next strategy is to assume that errors will occur and to design and plan for recovery by duplicating critical functions and by making it easy to reverse operations and hard to carry out nonreversible ones. If an error occurs, examples of strategies to mitigate injury are keeping antidotes for high-risk drugs up to date and easily accessible and having standardized, well-rehearsed procedures in place for responding quickly to adverse events. Another strategy is to use simulation training, where learners practice tasks, processes, and rescues in lifelike circumstances using models or virtual reality.
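
In software, “making it easy to reverse operations” is often expressed as an undo or compensation pattern: each action records how to restore the prior state. The minimal Python sketch below is purely illustrative, with hypothetical names and values, and stands in for the clinical analogues of accessible antidotes and rehearsed rescue procedures.

```python
"""Illustrative sketch: designing for recovery by pairing every action
with a compensating action that reverses it. Names are hypothetical."""

from typing import Callable

# Each entry pairs a description with a function that undoes the action.
undo_stack: list[tuple[str, Callable[[], None]]] = []

pump = {"rate_ml_hr": 10.0}


def set_rate(new_rate: float) -> None:
    """Change the pump rate, first recording how to restore the old value."""
    old_rate = pump["rate_ml_hr"]
    undo_stack.append(
        (f"restore rate to {old_rate} mL/hr",
         lambda: pump.update(rate_ml_hr=old_rate))
    )
    pump["rate_ml_hr"] = new_rate


def undo_last() -> None:
    """The recovery path: reverse the most recent action."""
    description, compensate = undo_stack.pop()
    compensate()
    print("undone:", description)


set_rate(100.0)   # an erroneous tenfold change...
print(pump)       # {'rate_ml_hr': 100.0}
undo_last()       # ...is reversible because reversal was designed in
print(pump)       # {'rate_ml_hr': 10.0}
```

The design choice is the same as keeping an antidote at hand: recovery is planned before the error occurs, not improvised afterward.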

9. Improve Access to Accurate, Timely Information

The final strategy for user-centered design is to improve access to information. Information for decision-making (e.g., patient history, medications, and current therapeutic strategies) should be available at the point of patient care. Examples include putting lab reports and medication administration records at the patient’s bedside and putting protocols in the patient’s chart. In a broader context, information should also be coordinated over time and across settings.

Conclusion

Now, 7 years after the release of To Err is Human, extensive efforts have been reported in journals, technical reports, and safety-oriented conferences. That literature describes the magnitude of problems in a variety of care settings, the efforts to make change, and the results of those efforts in improving patient safety. Many of those studies are referenced and discussed throughout this book. Other authors have written incisively about what progress has and has not been made in the past 7 years and about the challenges in creating cultures of safety.20, 21 The greatest challenge we all face is to learn, use, and share better information about how to prevent harm to patients.

References

1.
Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington, DC: National Academy Press, Institute of Medicine; 1999. [PubMed: 25077248]
2.
National Academy of Sciences. News release: “Preventing Death and Injury from Medical Errors Requires Dramatic, System-Wide Changes.” Released for immediate release November 29, 1999.
3.
Reason JT. Human Error. New York, NY: Cambridge University Press; 1990.
4.
Safe Practices for Better Health Care. Fact Sheet. AHRQ Publication No. 04-P025. Rockville, MD: Agency for Healthcare Research and Quality; March 2005. Executive summary of the National Quality Forum report Safe Practices for Better Healthcare: A Consensus Report, available at www.ahrq.gov/qual/nqfpract.htm.
5.
The Joint Commission on Accreditation of Healthcare Organizations. Sentinel Event. http://www.jointcommission.org/SentinelEvents/ [accessed October 31, 2006].
6.
Cook RI, Woods D, Miller C. A tale of two stories: contrasting views of patient safety. Chicago: National Patient Safety Foundation; 1998.
7.
Reason J. Human error: models and management. BMJ. 2000;320:768–70. [PMC free article: PMC1117770] [PubMed: 10720363]
8.
Leape LL. Error in medicine. JAMA. 1994;272(23):1851–57. [PubMed: 7503827]
9.
Haberstroh CH. Organization, design and systems analysis. In: March JJ, editor. Handbook of Organizations. Chicago: Rand McNally; 1965.
10.
National Quality Forum. Executive Summary, Safe Practices for Better Healthcare: A Consensus Report. 2003. http://www.ahrq.gov/qual/nqfpract.pdf.
11.
Barker KN, Flynn EA, Pepper GI, et al. Medication errors observed in 36 health care facilities. Arch Intern Med. 2002;162:1897–903. [PubMed: 12196090]
12.
Norman DA. The Design of Everyday Things. New York: Doubleday/Currency; 1988.
13.
Leape LL, Kabcenell A, Berwick DM, et al. Reducing Adverse Drug Events. Boston: Institute for Healthcare Improvement; 1998.
14.
Savitz LA, Jones CB, Bernard S. Quality indicators sensitive to nurse staffing in acute care settings. In: Henriksen K, Battles JB, Marks ES, Lewin DI, editors. Advances in Patient Safety: From Research to Implementation. Programs, Tools & Products. Vol. 4. AHRQ Publication No. 05-0021-4. Rockville, MD: Agency for Healthcare Research and Quality; 2005. pp. 375–85. [PubMed: 21250026]
15.
Clarke SP, Aiken LH. More nursing, fewer deaths. Qual Saf Health Care. 2006;15:2–3. [PMC free article: PMC2563991] [PubMed: 16456201]
16.
Needleman J, Buerhaus PI, Stewart M, et al. Nurse staffing in hospitals: is there a business case for quality? Health Affairs. 2006;25(1):204–11. [PubMed: 16403755]
17.
Cook RI. Two Years Before the Mast: Learning How to Learn About Patient Safety. Presented at “Enhancing Patient Safety and Reducing Errors in Health Care”; Rancho Mirage, CA; November 8–10, 1998.
18.
Garg AX, Adhikari NK, McDonald H, et al. Effects of computerized clinical decision support systems on practitioner performance and patient outcomes: a systematic review. JAMA. 2005;293(10):1223–38. [PubMed: 15755945]
19.
Koppel R, Metlay JP, Cohen A, et al. Role of computerized physician order entry systems in facilitating medication errors. JAMA. 2005;293(10):1197–203. [PubMed: 15755942]
20.
Leape LL, Berwick DM. Five years after To Err is Human: what have we learned? JAMA. 2005;293:2384–90. [PubMed: 15900009]
21.
Wachter RM. The end of the beginning: patient safety five years after “To Err is Human.” Health Affairs. 2004;Suppl Web Exclusive:W4-534–43. [PubMed: 15572380]
