By Ken Simpson, MedStar and Jimmy Pierson, Medic Ambulance Service
Scenario: You’re sipping your morning coffee when you notice a new email from your city manager in your inbox. The subject line is “ICMA Webinar Question.” The email explains that your city manager just participated in a webinar by the Center for Public Safety Management (CPSM) provided to the members of the International City/County Management Association (ICMA) about high performance/high value EMS delivery models. The email concludes with the following question:
“The presenters said that response time performance should be balanced with safety and operational metrics such as ‘hot’ response percentage, crash rate per 100,000 miles and ambulance unit hour utilization (UHU). What are your current metrics for these measures? Oh, and they also mentioned a process in which patient satisfaction is measured and reported by an outside agency. As you know, we’re in a very difficult financial season, and I’d like to know how we compare to the performance metrics from the Academy of International Mobile Healthcare Integration’s benchmarking study.”
The email provides a link to the Benchmarking Study which shows comparable data from numerous AIMHI member EMS agencies across the U.S. and Canada.
How would you respond? Are these metrics you currently track and report? Are your metrics comparable to the ones your city manager referenced? If not, where do you start?
In a recent webinar, AIMHI members Ken Simpson of MedStar Mobile Healthcare and Jimmy Pierson of Medic Ambulance in Solano County, California, outlined the various operational metrics their systems track and publicly report to continuously demonstrate high performance and high-value operations. For those who may have missed the webinar, here’s a link to a recording and a summary of the salient topics we covered.
Response times
Response times continue to be one of the most widely tracked and reported operational metrics for EMS. While there may be some customer satisfaction related to response times, a growing body of peer-reviewed research indicates that ambulance response times have very little impact on patient outcomes for the vast majority of EMS responses. This means we may be able to revise our approach and staffing for EMS responses by focusing on outcomes that really matter, such as compliance with evidence-based clinical bundles of care. But we will likely need to educate local communities by publishing data on those metrics as much as we do response times.
Response time reporting should be standardized and reported from the patient’s perspective: starting the clock when the call is answered in the PSAP and stopping it upon arrival on scene. This means including call processing, turnout and travel time to the scene. Response times should also be reported in terms of both average and fractile performance. Average, because it’s easily understood by the public; fractile, because it demonstrates a measure of reliability. For example, the MedStar system targets an ambulance fractile response time reliability of 85% in 11 minutes or less from the time the call is answered at the PSAP. The example report below, published in MedStar’s monthly metrics publication, shows they achieved an 11-minute-or-less response time on 87.0% of calls. This means that only 13% of total responses had a response time greater than 11 minutes, zero seconds. The average response time for that month was 7 minutes, 52 seconds.
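Both measures fall out of the same data. The sketch below, with made-up response times (in a real system these would come from your CAD export), shows how average and fractile compliance against an 11-minute threshold can be computed:

```python
from datetime import timedelta

# Hypothetical response times, in seconds, measured from call answered
# at the PSAP to arrival on scene. Real data would come from CAD records.
response_times = [412, 505, 388, 730, 651, 299, 845, 472, 560, 705]

THRESHOLD = 11 * 60  # 11 minutes, 0 seconds

# Average: total time divided by number of responses.
average = sum(response_times) / len(response_times)

# Fractile: share of responses at or under the threshold.
fractile = sum(1 for t in response_times if t <= THRESHOLD) / len(response_times)

print(f"Average response time: {timedelta(seconds=round(average))}")
print(f"Fractile compliance (<= 11:00): {fractile:.1%}")
```

Note that the two measures can tell different stories: a handful of very long responses barely moves the average but directly erodes the fractile number, which is why fractile reporting is the better reliability measure.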
Emergency medical dispatch compliance and hot responses
Emergency medical dispatch is an exceptionally valuable service. Not only does it provide life-saving instructions to callers, it also assists with the proper allocation of resources. Is this an ALS level response? Does it require a lights and sirens (hot) response? Does it require first responders? These are important decisions that need to be as accurate as possible. Measuring the performance of the PSAP is an important metric for value and improvement. Two of the performance metrics that should be tracked and reported are compliance with EMD protocol (measured as high compliance, compliant, partial compliance, low compliance and non-compliant) and call processing time (measured from the time the call is answered to the time the first unit is dispatched).
Example reports from the MedStar system are shown below.
Ambulance operations
Operating an ambulance is dangerous, and lives depend on it, not only during the response, but during transport. As we write this article, EMS1 reported an ambulance crash in which five people were injured, one critically: the two crew members, the pediatric patient they were transporting, the mother accompanying the patient, and the driver of the other vehicle. Preliminary findings indicate it was an intersection crash while the ambulance was operating hot.
Effective and compliant EMD processes can reliably assign response modes based on the likelihood that a hot response will make a clinical difference in the patient’s outcome. AIMHI member agencies use EMD effectively to reduce the frequency of hot responses and enhance safety for crewmembers and the public, with an average of only 53% of 911 calls classified by EMD as requiring a hot response.
There are many things that influence safe ambulance operations, such as speed, cornering, braking, seatbelt use and attentiveness, to name a few. Medic Ambulance of Solano County, California, uses active monitoring to identify unsafe ambulance operating practices. It’s a real-time reporting feature that notifies managers and supervisors of potential safety issues in the ambulance such as speed and inattentive operators. This system also creates a safety dashboard that is used to improve ambulance operation safety across the agency and with individuals.
Ambulance crashes should be tracked and reported to identify trends and opportunities for improvement. An example of a report used by MedStar for this purpose is shown below.
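The crash rate benchmark mentioned by the city manager normalizes crash counts by miles driven, so agencies of different sizes can be compared. A minimal sketch, using invented monthly figures for illustration:

```python
# Hypothetical monthly figures; real values would come from your fleet
# and safety reporting systems.
crashes = 3
miles_driven = 185_000

# Normalize per 100,000 miles driven so the rate is comparable
# across agencies with very different fleet sizes and call volumes.
crash_rate = crashes / miles_driven * 100_000

print(f"Crash rate: {crash_rate:.2f} per 100,000 miles")
```

Tracking this rate month over month, rather than raw crash counts, separates a genuine safety trend from simple growth in miles driven.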
Resource allocation and utilization
Unit productivity is a crucial measure of system performance. Productivity levels that are too high negatively impact crew safety and morale, as well as response times. Productivity levels that are too low result in high cost and low skill utilization, which may degrade clinical performance. Matching resource availability with anticipated response volume is a delicate and detailed science that requires careful analysis, forecasting and adjustment.
The two key data points necessary for matching supply to demand are response volume by hour of day, and cycle time (sometimes referred to as task time). The more responses you have, the more resources you need. Similarly, the longer your cycle time, the more resources you need. Unit hour utilization (UHU) is a measure of productivity: essentially, the percentage of time an on-duty unit is committed to a response or transport. A UHU of 0.450 means that, on average, a unit is on a call 45% of the time it is on duty. To determine your UHU, divide the number of responses for a given time period by the number of produced unit hours for the same period. For example, if you staffed four ambulances for 24 hours yesterday, you produced 96 unit hours. If the system generated 20 calls during that same 24-hour period, your UHU was 0.208, meaning, on average, on-duty ambulances were on a response 20.8% of the time.
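The arithmetic above can be sketched in a few lines, using the worked numbers from the text:

```python
# Worked example from the text: four ambulances staffed for 24 hours,
# with 20 responses generated in the same period.
ambulances = 4
hours_staffed = 24
responses = 20

unit_hours = ambulances * hours_staffed  # 96 produced unit hours
uhu = responses / unit_hours             # responses per produced unit hour

print(f"Produced unit hours: {unit_hours}")
print(f"UHU: {uhu:.3f}")
```

The same division works at any granularity: computing it per hour of day, rather than per 24-hour block, is what lets a system match staffing to demand peaks.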
While there is no standard UHU (because of the variability of cycle times), most of the AIMHI member agencies achieve a UHU between 0.228 and 0.476.
EMS accreditation
We cannot discuss quality clinical and operations metrics without mentioning the value of achieving agency accreditation. Accreditation is a rigorous process that essentially tells everyone that an agency has not only implemented numerous programs for achieving the highest quality standard, but has held itself accountable through verification of those standards by an outside accrediting body. The Commission on Accreditation of Ambulance Services (CAAS) provides accreditation for ambulance operations, and the International Academies of Emergency Dispatch (IAED) provides accreditation for dispatch centers as Accredited Centers of Excellence (ACE).
All 13 AIMHI members included in our recent Benchmarking Study hold dual accreditations from CAAS and IAED. Six hold additional accreditations or quality awards.
We hope that this article helps you generate the data to answer the city manager’s questions. If you have not actually received that e-mail, you may in the near future! And, it’s likely she has a copy of the AIMHI System Performance Report because it was offered to all participants in the webinar referenced at the start of this article, and will likely compare your metrics with those in the report. If you’d like any assistance with developing and evaluating your system performance metrics, contact AIMHI at Hello@aimhi.mobi.