NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

StatPearls [Internet]. Treasure Island (FL): StatPearls Publishing; 2024 Jan-.

EMS Quality Improvement Programs

Author Information and Affiliations

Last Update: July 17, 2023.

Introduction

Quality improvement (QI) is the intentional process of making system-level changes to clinical processes, with continuous reassessment, to improve the delivery of a product. In Emergency Medical Services (EMS), this product is essentially the delivery of high-quality prehospital care. QI differs from quality assurance, which is more concerned with protocol, process, or policy compliance. Quality improvement programs typically work best in an environment that implements change through a robust, non-punitive education program. Effective QI programs are transparent; both administration and clinical staff understand the goals and methods of any ongoing quality improvement project. QI programs often use key performance indicators (KPIs) to measure ongoing clinical performance, identify areas for improvement, and assess the impact of process changes. EMS systems should build their KPIs on clinical evidence, a perceived system deficit, or an operational need. The goal of QI is to develop a high-reliability organization that operates in a relatively error-free state over a long period of time.

Quality Improvement in EMS

Quality improvement practices vary significantly among EMS agencies across the United States; however, a 2015 nationwide survey of EMS agencies revealed that 71% of agencies surveyed report having dedicated quality improvement personnel [1]. Examples of quality improvement projects include improving prehospital aspirin administration rates in patients with acute coronary syndromes, improving paramedic identification of STEMI, and decreasing peri-intubation hypoxia [2], among others. Developing EMS systems have also successfully implemented continuous QI programs to specifically address prehospital trauma care, with significant improvement in pre-specified KPIs following targeted education [3]. Each of these projects began with identifying a need for improvement, then developed a plan that included a process change and a means for assessing the impact of that change. When developing a plan, consider three questions before choosing key performance indicators and implementing any process improvement project:

  • What is the aim?
  • How and what should be measured?
  • What changes should be made to improve the process/system/outcome?

The aim should be very specific, evidence-based, and focused on patient-centered outcomes. Measurements and goals are also best when defined with a patient-centered focus and stated in specific, numeric terms. The changes to be made rely on a prediction that a given system or process change will result in achieving the previously defined aim [4].

Many EMS organizations elect to use the Institute for Healthcare Improvement's Model for Improvement: the Plan-Do-Study-Act (PDSA) cycle [5]. Effective PDSA cycles should be organized with staff involved in all aspects of the process being improved. For example, a PDSA cycle aiming to improve cardiac arrest survival should include field paramedics as well as staff from the medical director's office, administration, and logistics.

Plan-Do-Study-Act Cycle

Plan

The purpose of the “plan” step is to clearly and concisely define the objective of the project and align it with the aim and measurement statements previously defined. This step should also brainstorm solutions, pick one solution to try, and generate a plan to test and implement the proposed solution. The QI committee should define the problem using as much objective data as possible and be clear about how it will measure both the extent of the problem and whether the change is an improvement. For example, if a system is attempting to improve aspirin administration rates, a success measure could be “aspirin administration is documented in 95% of patient encounters with a chief complaint of chest pain.” The “plan” step also includes brainstorming potential solutions to the question “What intervention will lead to improvement?” After selecting a specific intervention, such as employee education, a plan for reevaluation must also be outlined. The plan should answer several questions, including “What is the problem?” “What is the intervention?” “How will we measure the problem, the change, and the outcome?” and “How do we know a change is an improvement?”
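As a hypothetical illustration of how such a numeric, patient-centered measure could be computed from encounter data, the sketch below tallies an aspirin-documentation rate against a 95% goal. The record structure and field names are assumptions for illustration, not an actual EMS data standard.

```python
# Hypothetical sketch: compute an aspirin-documentation rate against a 95% goal.
# The dictionary keys ("chief_complaint", "aspirin_documented") are assumed names.

def aspirin_documentation_rate(encounters):
    """Proportion of chest-pain encounters with aspirin administration documented."""
    chest_pain = [e for e in encounters if e["chief_complaint"] == "chest pain"]
    if not chest_pain:
        return None  # no eligible encounters, so the rate is undefined
    documented = sum(1 for e in chest_pain if e["aspirin_documented"])
    return documented / len(chest_pain)

# Invented sample data: three chest-pain encounters, two with aspirin documented.
encounters = [
    {"chief_complaint": "chest pain", "aspirin_documented": True},
    {"chief_complaint": "chest pain", "aspirin_documented": False},
    {"chief_complaint": "dyspnea", "aspirin_documented": False},
    {"chief_complaint": "chest pain", "aspirin_documented": True},
]

rate = aspirin_documentation_rate(encounters)
goal_met = rate is not None and rate >= 0.95
```

Defining the measure this concretely, including which encounters count in the denominator, is part of answering “How will we measure the problem?” before any intervention begins.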

Do

This is perhaps the least complex, but often the most difficult, step to accomplish. Once a plan is made, the “do” step is simply executing it. Pick a specific day in the immediate future to implement the plan. Instead of immediately implementing the plan across the entire system, first perform a small trial of the change. This small step, known as a “test of change,” allows the team to see whether the change has the desired effect. Often, this small test identifies unexpected issues that should be addressed before wider implementation. For example, if the change being tested is a checklist to improve intubation success, the checklist could be developed and trialed with one shift at a single ambulance station before deploying it across the entire system.

Study

The purpose of the “study” step is to determine whether the plan that was designed and implemented caused a change that was an improvement. This evaluation should reflect the aim defined in the “plan” step. During the “study” phase, participants should also look for any unintended outcomes. The team should discuss which aspects of the plan worked and which did not work as intended. The objective data necessary to evaluate change and improvement should be collected as defined in the “plan” step [4]. For the intubation checklist example above, this step could include evaluating intubation success rates before and after the checklist was introduced, as well as compliance with its use. The QI committee or staff should also get feedback on the checklist itself from end-users. Other data, such as on-scene time, cardiac arrest rates, or anything else that may be affected by a change in intubation practices, should be considered in this step as well. The most common tool for measuring the effects of these tests of change is the process control chart. These charts plot the proportion of cases that met the definition of success over time and include a marker showing the point at which the change was implemented.
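The arithmetic behind such a process control chart for proportions (a p-chart) can be sketched as follows. The monthly success counts are invented for illustration, and the conventional 3-sigma control limits are assumed; a real QI team would plot these points rather than just compute them.

```python
# Minimal p-chart sketch: per-period proportions, overall center line,
# and 3-sigma control limits. Counts below are invented for illustration.
import math

def p_chart(successes, totals):
    """Return the center line and per-period points with control limits."""
    p_bar = sum(successes) / sum(totals)  # overall proportion = center line
    points = []
    for s, n in zip(successes, totals):
        # Binomial standard error for a period of n cases
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        points.append({
            "p": s / n,                           # this period's proportion
            "ucl": min(1.0, p_bar + 3 * sigma),   # upper control limit
            "lcl": max(0.0, p_bar - 3 * sigma),   # lower control limit
        })
    return p_bar, points

# Example: monthly intubation first-pass successes out of 80 attempts;
# suppose the checklist was introduced before month 4.
successes = [60, 62, 58, 75, 78]
totals = [80, 80, 80, 80, 80]
center, points = p_chart(successes, totals)
```

A point above the upper control limit after the change marker (as in month 5 here) is the kind of signal the “study” step looks for, though sustained shifts matter more than any single point.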

Act

The “act” step is designed to take action on findings from the “study” step. The process change will either be deployed system-wide or readjusted prior to wider institution, depending on whether the “study” phase showed that the change produced the desired outcome [12]. Following the prior example, this might include improving the airway checklist based on end-user feedback or providing additional training. Once the “act” step is complete, the cycle begins again with planning: redeploy an improved checklist, evaluate success rates, deploy the change across the entire system, or gather additional feedback.

This PDSA cycle is continued in an iterative process until the desired improvement is achieved [5].

Key Components of a Quality Improvement Program

Non-Punitive Culture

A QI program must use a non-punitive approach. A “just culture” strategy is a common example. Just culture is an organizational method that emphasizes the accountability of both the individual and the organization in preventing errors and driving improvement [6]. It also acknowledges that errors are often caused by a combination of factors, including system factors. In a just culture, the organization is responsible for improving the systems and processes providers work within, while providers remain responsible for making safe choices. It considers “near misses” to be as significant as actual errors. A just culture approach encourages self-reporting of both near misses and actual errors by promoting education rather than punishment. Providers who come forward with a report should be able to remain anonymous, be included in a closed-loop synopsis of events, and be praised for reporting [7]. This approach promotes accountability for one's actions, ongoing education, an intolerance of ignorance, and a desire to continually improve the system for better safety and outcomes.

Education

Many quality improvement projects, especially clinical ones, will require education of some form to disseminate information about the intervention. An individual or team with an educational focus is likely to be beneficial in achieving the desired improvement outcomes.

Team-Based Approach

A quality improvement project should involve representatives from any part of the organization that may be affected by the changes made as part of the project. Involving individuals with many perspectives also increases the pool of unique ideas; the more ideas, the more likely the group is to find a successful change. The organization's culture must foster belief in QI programs at the highest levels (CEO, supervisors, etc.) to encourage change [4].

Clear Aims

The QI committee must select aims that are well defined and evidence-based. The timeline for action and PDSA changes must be outlined as well. The data to be measured must be appropriate for the defined aim [4].

Issues of Concern

Challenges to Quality Improvement Systems in the Prehospital Environment

There are several concerns specific to prehospital quality improvement programs. One common misunderstanding is that quality improvement is the same as quality assurance. Quality improvement, by nature, is designed to improve a problem or process; quality assurance functions to ensure compliance with protocols or policies. Quality assurance and quality improvement, however, can be intertwined through the use of key performance indicators and quality metrics [8]. For example, EMS systems may set a goal to use evidence-based interventions, such as aspirin in acute coronary syndromes, or bronchodilators in reactive airway disease patients. Through the quality assurance process, the system may discover that their medics are not using these interventions effectively or documenting them appropriately. A concurrent quality improvement program would evaluate the cause of these shortcomings from multiple perspectives and develop a plan for addressing them.

Another concern is selecting quality indicators and improvement projects that are meaningful for the patient. Previously, quality improvement in EMS often had a narrow, provider-centric rather than patient-centered focus. Clinically, this means quality indicators should be evidence-based as much as possible, with a specific focus on areas where EMS can make a difference in patient outcomes [9]. Some of these areas include high-quality CPR with early defibrillation for out-of-hospital cardiac arrest [10], administration of aspirin to patients with acute coronary syndromes, and use of bronchodilators in bronchoconstrictive disease, among many others. Non-clinical EMS QI projects may focus on alternative measures, such as on-scene time, cost-related concerns, public health outcomes [11], or even workforce safety and wellness.

Quality improvement projects often result in significant volumes of data, almost all of which require interpretation. Clinical data often come from patient care records, but improvement data may also come from dispatch information, hospital records, system financial records, surveys, and other sources, depending on the scope of the project. These data are typically plotted over time in control charts to give real-time feedback [5]. While it is tempting to jump to conclusions based on early changes, the team should look for sustained change. Small variations in data are natural (common cause variation) and likely do not represent a real change resulting from the quality improvement initiative; a genuine shift attributable to an identifiable cause is special cause variation. Fortunately, there are several ways to analyze data in order to identify true changes rather than normal variation [5][12]. The QI team should include individuals with experience in interpreting and managing data [13].
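Two of the conventional statistical process control rules for separating special cause from common cause variation, a point beyond the 3-sigma control limits and a sustained run of points on one side of the center line, can be sketched as follows. The thresholds (3-sigma limits, a run of 8) are the customary defaults, and the example values are illustrative assumptions.

```python
# Sketch of two common special-cause detection rules from statistical
# process control. Rule thresholds and example data are illustrative.

def beyond_limits(values, lcl, ucl):
    """Rule 1: indices of points outside the control limits."""
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]

def long_run(values, center, run_length=8):
    """Rule 2: True if `run_length` consecutive points fall on one side of center."""
    run = 0
    last_side = 0
    for v in values:
        side = 1 if v > center else (-1 if v < center else 0)
        if side != 0 and side == last_side:
            run += 1
        else:
            run = 1 if side != 0 else 0
        last_side = side
        if run >= run_length:
            return True
    return False

# Invented per-period proportions; one spike at index 4.
values = [0.70, 0.72, 0.69, 0.71, 0.96, 0.74, 0.73, 0.72]
outliers = beyond_limits(values, lcl=0.60, ucl=0.90)  # flags the spike
sustained = long_run(values, center=0.75)             # no sustained shift here
```

A single flagged point may still be noise; a sustained run on one side of the center line is stronger evidence that the process itself has shifted, which is why teams should wait for sustained change before declaring success.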

EMS-specific quality improvement projects face unique challenges, including difficulty obtaining desired data from healthcare records, both EMS records and follow-up data from the hospital, which limits evaluation of patient-centered outcomes. Such data can be time- and labor-intensive to obtain. Some challenges stem from the unpredictable nature of EMS. For example, a system may want to evaluate a way to improve the outcomes of critical trauma patients but may be unable to collect useful data due to the relative rarity of these events in its system. The system may also be limited by the availability of data from the associated hospital admission. EMS systems must also consider the multiple layers of EMS delivery. If attempting to minimize the time from 911 call to CPR in cardiac arrest patients, systems must consider not only the responding EMTs and paramedics but also the 911 call-takers and dispatchers when considering changes and solutions.

Clinical Significance

Legal requirements for medical directors vary by state. In most states, however, EMS medical directors are required to ensure EMT and paramedic compliance with system clinical guidelines and protocols as routine quality assurance. This alone does not ensure continuous quality improvement. While compliance is critical to ensuring baseline performance, these performance indicators can also point to changes that need to be implemented. Essentially, baseline protocol compliance and quality assurance can be paired with quality improvement projects and initiatives to improve clinical outcomes [14].

Some organizations have organized large-scale databases that can be used for both research and large-scale quality improvement. A commonly cited example is the Cardiac Arrest Registry to Enhance Survival (CARES) database, which is designed to collect data on cardiac arrest outcomes with the goal of improving cardiac arrest survival. Prehospital data are paired with hospital outcomes for review. These data can be used at the agency level or extrapolated to large-scale initiatives at the regional or state level through an integrated QI program [15], leading to significant improvements in EMS systems, reliability, and ultimately the quality of care.

Quality improvement may also result in better integration of care from the prehospital to the hospital environment by ensuring that appropriate, evidence-based treatments begin in appropriate patients. Operational improvement projects may also result in improved response times, improved on-scene times when appropriate, and a safer work environment for medics.

References

1.
Redlener M, Olivieri P, Loo GT, Munjal K, Hilton MT, Potkin KT, Levy M, Rabrich J, Gunderson MR, Braithwaite SA. National Assessment of Quality Programs in Emergency Medical Services. Prehosp Emerg Care. 2018 May-Jun;22(3):370-378. [PubMed: 29297735]
2.
Jarvis JL, Gonzales J, Johns D, Sager L. Implementation of a Clinical Bundle to Reduce Out-of-Hospital Peri-intubation Hypoxia. Ann Emerg Med. 2018 Sep;72(3):272-279.e1. [PubMed: 29530653]
3.
Scott JW, Nyinawankusi JD, Enumah S, Maine R, Uwitonze E, Hu Y, Kabagema I, Byiringiro JC, Riviello R, Jayaraman S. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis. Injury. 2017 Jul;48(7):1376-1381. [PubMed: 28420542]
4.
Silver SA, Harel Z, McQuillan R, Weizman AV, Thomas A, Chertow GM, Nesrallah G, Bell CM, Chan CT. How to Begin a Quality Improvement Project. Clin J Am Soc Nephrol. 2016 May 06;11(5):893-900. [PMC free article: PMC4858490] [PubMed: 27016497]
5.
McQuillan RF, Silver SA, Harel Z, Weizman A, Thomas A, Bell C, Chertow GM, Chan CT, Nesrallah G. How to Measure and Interpret Quality Improvement Data. Clin J Am Soc Nephrol. 2016 May 06;11(5):908-914. [PMC free article: PMC4858492] [PubMed: 27016496]
6.
Boysen PG. Just culture: a foundation for balanced accountability and patient safety. Ochsner J. 2013 Fall;13(3):400-6. [PMC free article: PMC3776518] [PubMed: 24052772]
7.
Gallagher JM, Kupas DF. Experience with an anonymous web-based state EMS safety incident reporting system. Prehosp Emerg Care. 2012 Jan-Mar;16(1):36-42. [PubMed: 22128906]
8.
Jarvis JL. How to Improve (and How to Tell). Key indicators can help you determine if an improvement really enhances performance. EMS World. 2017 Jun;46(6):36-40, 49. [PubMed: 29966075]
9.
Myers JB, Slovis CM, Eckstein M, Goodloe JM, Isaacs SM, Loflin JR, Mechem CC, Richmond NJ, Pepe PE., U.S. Metropolitan Municipalities' EMS Medical Directors. Evidence-based performance measures for emergency medical services systems: a model for expanded EMS benchmarking. Prehosp Emerg Care. 2008 Apr-Jun;12(2):141-51. [PubMed: 18379908]
10.
Gonzales L, Oyler BK, Hayes JL, Escott ME, Cabanas JG, Hinchey PR, Brown LH. Out-of-hospital cardiac arrest outcomes with "pit crew" resuscitation and scripted initiation of mechanical CPR. Am J Emerg Med. 2019 May;37(5):913-920. [PubMed: 30119989]
11.
Malta Hansen C, Kragholm K, Dupre ME, Pearson DA, Tyson C, Monk L, Rea TD, Starks MA, Nelson D, Jollis JG, McNally B, Corbett CM, Granger CB. Association of Bystander and First-Responder Efforts and Outcomes According to Sex: Results From the North Carolina HeartRescue Statewide Quality Improvement Initiative. J Am Heart Assoc. 2018 Sep 18;7(18):e009873. [PMC free article: PMC6222952] [PubMed: 30371210]
12.
Gupta M, Kaplan HC. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts. Clin Perinatol. 2017 Sep;44(3):627-644. [PubMed: 28802343]
13.
Lickiss P. How to Use Data and Technology in Quality Improvement and Training. EMS World. 2017 Jun;46(6):42, 44, 46. [PubMed: 29966076]
14.
O'Connor RE, Slovis CM, Hunt RC, Pirrallo RG, Sayre MR. Eliminating errors in emergency medical services: realities and recommendations. Prehosp Emerg Care. 2002 Jan-Mar;6(1):107-13. [PubMed: 11789638]
15.
Mears GD, Pratt D, Glickman SW, Brice JH, Glickman LT, Cabañas JG, Cairns CB. The North Carolina EMS Data System: a comprehensive integrated emergency medical services quality improvement program. Prehosp Emerg Care. 2010 Jan-Mar;14(1):85-94. [PubMed: 19947872]

Disclosure: Erin Lincoln declares no relevant financial relationships with ineligible companies.

Disclosure: Essie Reed-Schrader declares no relevant financial relationships with ineligible companies.

Disclosure: Jeffrey Jarvis declares no relevant financial relationships with ineligible companies.

Copyright © 2024, StatPearls Publishing LLC.

This book is distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0) ( http://creativecommons.org/licenses/by-nc-nd/4.0/ ), which permits others to distribute the work, provided that the article is not altered or used commercially. You are not required to obtain permission to distribute this article, provided that you credit the author and journal.

Bookshelf ID: NBK536982; PMID: 30725667
