Adventist Health System’s Intensive Drive to Address Patient Harms

Sept. 14, 2015
Loran Hauck, M.D., SVP for clinical effectiveness and CMO at the Florida-based Adventist Health System, has been helping to lead an intensive, years-long initiative to address hospital-acquired patient harms, with astonishing results in patient safety improvement.

A major patient safety initiative has been underway for a few years now at the Altamonte Springs, Fla.-based Adventist Health System, a 44-hospital-campus integrated health system operating in 10 states and encompassing 7,700 licensed beds, 9,050 physicians (of whom 1,436 are employed), and 55,000 employees. At Adventist Health System, Loran Hauck, M.D., senior vice president for clinical effectiveness and chief medical officer, has been helping to lead his colleagues in an evidence- and consensus-based ongoing initiative to reduce and eliminate hospital-based incidents of patient harm.

In early August, Dr. Hauck and his colleague, David Stockwell, M.D., an associate professor of pediatrics and a critical care specialist at Children’s National Health System in Washington, D.C., and vice president of clinical services at Pascal Metrics, a Washington, D.C.-based software vendor with which the Adventist Health System leaders have partnered in their initiative, presented a webinar sponsored by the Scottsdale Institute entitled “Real-Time ID of At-Risk Patients at Adventist Health System,” in which they described in detail Adventist Health System’s patient safety initiative. And a short time after that, HCI Editor-in-Chief Mark Hagland interviewed Dr. Hauck to get his further perspectives and insights on the initiative.

During the August 6 webinar, Dr. Hauck explained that “This journey really started in 1996, when our CEO hired me to begin to implement evidence-based practices. We began to go live in 2005 with Cerner Millennium. In 2006, we began doing an annual safety culture survey, and we’ve been doing it for nine years. Peter Pronovost [Peter Pronovost, M.D., Ph.D., director of the Armstrong Institute for Patient Safety and Quality at Johns Hopkins Medicine, as well as Johns Hopkins Medicine’s senior vice president for patient safety and quality] asked me in about 2009 whether we’d be interested in participating in a follow-up study to a Keystone Study on central line infection prevention originally done by the Michigan Hospital Association and published in the New England Journal of Medicine.  At the conclusion of the Adventist Health System study, they achieved an average of 0.4 infections per 1,000 device days or line days.

Loran Hauck, M.D.

"We implemented the IHI [the Cambridge, Mass.-based Institute for Healthcare Improvement] Global Trigger Tool methodology for 24 AHS hospitals in 2009. By 2010, after one year of data,” Hauck told webinar attendees, “we had identified the most frequent ways patients experienced harm: hypoglycemia induced by insulin therapy; over-sedation with opioids and sedatives; falls with injury; catheter-associated urinary tract infections – those were the biggest causes. So beginning in 2010, we started system-wide collaboratives around those most frequent harms, and began to implement system-wide work in those areas. We also began talking to hospital leadership at board and medical staff retreats about safety and harm. By 2012, all our hospitals were fully live on Cerner Millennium with electronic evidence-based physician electronic order sets and interdisciplinary plans of care. Then in 2012, we implemented barcoding at the point of medication administration.”

Dr. Hauck went on to say, “Then, somewhere in the fall of 2012 or winter of 2013, I met David Classen [David Classen, M.D., an assistant professor of medicine at the University of Utah, and currently CMIO of Pascal Metrics] at a national conference—he was a speaker and I was on a panel discussion.  In private we got to talking about some of the tools we had available in AHS that would lend themselves to extending the work that David had done at Intermountain Healthcare and later Kaiser Permanente of Northern California to use the EHR [electronic health record] to electronically identify risk factors that might be indicative of harm. We had four nurses reviewing 20 charts from 24 hospitals every month for four years; 21,000 charts were reviewed using the Global Trigger Tool Methodology. Pascal helped us to develop a more robust analytics database using the Pascal Health Data Bench, more robust than the homegrown tool we had started with. And, starting in January 2013,” he added, “we stopped using the IHI methodology and began building and moving towards real-time trigger detection.”

Among the contrasts that Dr. Hauck noted in the webinar, between retrospective safety information and real-time safety information based on patient clinical data, were the following: retrospective safety information involves interventions after the event; systems detect only a fraction of all events (from random sampling); and retrospective data-gathering is very labor-intensive. Meanwhile, real-time safety information based on clinical data involves automated detection of harm; and it has the ability to trend and prioritize, and lets its users learn from all identified defects (every patient, every day, real time). What’s more, in contrast to the use of retrospective safety data, which does not improve overall patient safety or create a culture of measurement and improvement, using real-time data offers a “socio-technical” approach to patient safety improvement, with a focus on learning systems and creating a culture of learning and safety, and the enablement of actionable safety insights.
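
As a rough illustration of how automated, real-time trigger detection can work, consider a rule engine that continuously scans incoming lab and medication events for patterns the IHI Global Trigger Tool treats as possible signals of harm, such as a very low blood glucose shortly after insulin administration, or a naloxone order suggesting opioid over-sedation. The sketch below is purely hypothetical; the event fields, codes, and thresholds are invented for illustration and are not a description of Adventist Health System’s or Pascal Metrics’ actual rules.

    # Hypothetical sketch of real-time trigger detection over a stream of
    # clinical events; field names, codes, and thresholds are illustrative only.
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Iterable, List

    @dataclass
    class ClinicalEvent:
        patient_id: str
        timestamp: datetime
        kind: str         # e.g. "lab" or "med_admin"
        code: str         # e.g. "glucose", "insulin", "naloxone"
        value: float = 0.0

    def detect_triggers(events: Iterable[ClinicalEvent]) -> List[str]:
        """Flag possible harm triggers as events arrive, rather than 60-90
        days after discharge as in retrospective chart review."""
        alerts = []
        last_insulin = {}  # patient_id -> time of most recent insulin dose
        for e in sorted(events, key=lambda ev: ev.timestamp):
            if e.kind == "med_admin" and e.code == "insulin":
                last_insulin[e.patient_id] = e.timestamp
            elif e.kind == "lab" and e.code == "glucose" and e.value < 50:
                # Hypoglycemia soon after insulin: possible medication-related harm
                dosed = last_insulin.get(e.patient_id)
                if dosed and e.timestamp - dosed < timedelta(hours=12):
                    alerts.append(f"{e.patient_id}: hypoglycemia after insulin")
            elif e.kind == "med_admin" and e.code == "naloxone":
                # Rescue reversal agent given: possible opioid over-sedation
                alerts.append(f"{e.patient_id}: naloxone given (possible over-sedation)")
        return alerts

In practice, alerts like these would feed a reporting and workflow platform (in Adventist Health System’s case, Pascal HealthBench) so that clinicians can review and intervene while the patient is still in the hospital.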

Among the types of harm that Dr. Hauck and his colleagues have been measuring are temporary harms that require intervention but do not increase lengths of stay; and severe harms, which fall into one of four categories: temporary harms that prolong hospitalization; permanent patient harms; harms that require life-sustaining intervention; and harms that contribute to patient deaths.

Among the initial results of the review of 20,952 records from 2009 to 2012 were the following: 15,564 records, or 74 percent, involved no patient harm. Among the 5,388 records, or 26 percent, that documented patient harm, 3,941, or 73 percent, involved one harm, while 1,447, or 27 percent, involved more than one harm. Meanwhile, of the individual harms identified, 4,280, or 57 percent, were temporary, while 3,191, or 43 percent, were deemed severe, per the four categories mentioned just above.

Using the Global Trigger Tool methodology, within three years, from 2009 through 2012, the Adventist Health System leaders had been able to document a 67.4-percent reduction in hospital-acquired adverse events.

Below are excerpts from the interview that Hagland recently conducted with Dr. Hauck, in reference to the August 6 webinar.

What were your organization’s main goals around this ambitious initiative?

To characterize this broadly, in 2009, the only well-standardized, documented methodology for measuring and accurately quantifying the rate of harm that occurs in hospitals was the IHI Global Trigger Tool methodology. The IHI had a very detailed white paper on its website, with trigger tools, so we used that to train our staff and quantify the rate of harm we might find here. That methodology is retrospective, based on data 60-90 days post-discharge. So that gave us a baseline measurement of harm. When we published our first paper on this topic in the Joint Commission Journal on Quality and Patient Safety, Roger Resar, M.D., and David Classen, M.D., wrote an editorial saying that most hospitals and health systems have been reluctant to measure the rate of harm, quantify it, and publish the results, and they congratulated Adventist Health System for being among the first hospital organizations to do this. And yes, we of course found harm. But it was not very actionable data, because the patient had already been discharged, and in a few cases, expired, so the opportunity to create real-time change did not exist. That began to change when I met David Classen in 2012, and we began a dialogue around whether we could automate this process, using tools within Cerner Millennium.

So we began the journey to move from retrospective quantification and measurement of harm, to near-real-time measurement, meaning within a few minutes to hours. We wanted to find out whether we could do something about harm while patients were in the hospital, and maybe even prospectively by predicting possible harm and intervening. And now we’re just entering the phase where we can get predictive.

Were you surprised at your findings?

No, not really. After the first year, for example, we found that we sometimes gave hyperglycemic patients too much insulin, and they became hypoglycemic (medication-related glycemic events). Another example involved over-sedation with sedative drugs or opioid narcotics, given in excess or in combination, that required rescue reversal; others were hospital-acquired catheter-associated urinary tract infections and hospital-acquired pressure ulcers. And no, none of those surprised us. I’m sure I would have predicted that those would have ended up on a possible list of the top ten even before we had started the work. I think the magnitude of the harm was what hit us—our reaction was, oh my goodness, we can’t be having that many adverse events.

David Classen had worked with another health system prior to working with us, and their senior management team was absolutely unwilling to publish that data; they didn’t want it in the press. But these adverse events are happening in every hospital, and we decided that we were willing to be publicly accountable.

In doing so, you and your colleagues were opening up a black box and peering in.

Yes, and it starts with senior leadership, the CEO, COO, and myself: we felt that if we were going to be a learning organization and use our learning to prevent adverse events in the future, we had to be transparent. And we were internally transparent. Pascal Metrics was extremely innovative in creating their Pascal HealthBench™ reporting platform. So our hospitals could go on there and look at the adverse events that had occurred in any quarter, looking at the rate and type, etc.—Pascal has very robust reporting functionality. So we focused on that. And we have a process now called the Clinical Close, a very structured process of monthly review of quality, safety, and patient experience data. And this data from the Global Trigger Tool fed into that process. So it starts at the top, with senior leadership committing to data transparency, a relentless focus on actively and accurately identifying the types of harm occurring in our hospitals, and developing very specific strategies to reduce and eventually eliminate harm.

Can you drill down a bit on the 67.4-percent figure, around the reductions in harm that that statistic represents?

Because we have such varying sizes of hospitals, we had to have a way of normalizing the data, so you normalize it by looking at serious adverse events per 1,000 patient days. In 2009, we were having 43 serious adverse events per 1,000 patient days, and by 2012, that number had fallen to 14 per 1,000 patient days, for a reduction of 67.4 percent. Meanwhile, among all 21,000 patients across all 24 hospitals, the percentage of patients who experienced an adverse event went from 28 percent to 17 percent in three years. And that represents a 39.3-percent reduction in the percentage of patients who had experienced either temporary or serious adverse events.
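
To make the arithmetic behind those figures explicit: the relative reduction is simply the change in the normalized rate divided by the baseline rate. A minimal check, using the numbers quoted above:

    # Relative reduction in serious adverse events per 1,000 patient days
    baseline_rate = 43   # serious adverse events per 1,000 patient days, 2009
    final_rate = 14      # serious adverse events per 1,000 patient days, 2012
    print(f"{(baseline_rate - final_rate) / baseline_rate:.1%}")   # 67.4%

    # Relative reduction in the percentage of patients who experienced an adverse event
    print(f"{(28 - 17) / 28:.1%}")   # 39.3%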

So let’s talk about some of the work that went into reducing harms.

In January 2009, we started doing the Global Trigger Tool review, which we described in the paper and editorial I discussed in the web conference. And then, from May 2009 to August 2011, we rolled out evidence-based computerized physician order entry (CPOE). CPOE was mandatory, and we had 100-percent use by our medical staffs; use of the evidence-based order sets was not mandatory, but they were widely used.

Meanwhile, 12 months into this, in January 2010, we had a year’s worth of data (about 5,200 charts had been reviewed), and we knew from the Pascal tool what our most frequent causes of harm were. So we began system-wide improvement collaboratives, run out of the corporate Office of Clinical Improvement and involving all 24 hospitals, in which we worked on improvement in catheterization, focused on strategies to improve our results around hypoglycemia, and began system-wide improvement processes.

And then we began a monthly detailed clinical performance review process (the Clinical Close), using a detailed scorecard of patient safety, quality, and patient experience data. And all the division CEOs, plus the members of the corporate C-Suite – basically our senior leadership – attend those meetings every month, and four hospitals report on their clinical close dashboards. We started that process in October 2010. In our health system, the degree and intensity of the drive toward clinical improvement has been focused on the items on that scorecard. So all the glycemic measures, hospital-acquired infections, patient experience measures, and so on, are on that scorecard. And when a hospital falls into the red on those measures in the scorecard, its leaders are required to develop action plans for improvement.
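
The scorecard mechanism Dr. Hauck describes can be thought of as a simple thresholding rule: any metric that crosses into the red obligates an action plan. The sketch below is purely illustrative; the metric names and thresholds are invented and are not Adventist Health System’s actual scorecard definitions.

    # Hypothetical Clinical Close scorecard check: any metric "in the red"
    # requires an action plan. Metric names and thresholds are invented.
    RED_THRESHOLDS = {
        "hypoglycemia_events_per_1000_patient_days": 2.0,   # higher is worse
        "catheter_associated_uti_rate": 1.5,
        "falls_with_injury_per_1000_patient_days": 0.8,
    }

    def metrics_needing_action_plans(scorecard: dict) -> list:
        """Return the scorecard metrics that exceed their red threshold."""
        return [
            name for name, value in scorecard.items()
            if name in RED_THRESHOLDS and value > RED_THRESHOLDS[name]
        ]

    # Example: one hospital's monthly scorecard
    print(metrics_needing_action_plans({
        "hypoglycemia_events_per_1000_patient_days": 2.6,
        "catheter_associated_uti_rate": 1.1,
        "falls_with_injury_per_1000_patient_days": 0.9,
    }))
    # ['hypoglycemia_events_per_1000_patient_days', 'falls_with_injury_per_1000_patient_days']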

Meanwhile, we’ve developed evidence-based electronic interdisciplinary plans of care, with tools to document nursing interventions based on physician orders. So our plan of care consists of evidence-based physician order sets, along with allied health interdisciplinary plans of care (IPOCs), that together create our standard of care for our patients. And a lot of the elements of that plan of care are designed to prevent harm, such as orders for DVT prophylaxis and antibiotic prophylaxis, and reminders to remove Foley drainage catheters when no longer necessary; a lot of the triggers of harm for patients are addressed within those evidence-based order sets and interdisciplinary plans of care.

So all of these things were going on during that four-year period of data collection, and all of those elements contributed to the change; obviously, in the aggregate, they produced a dramatic reduction in harm.

What have your organization’s biggest lessons learned been so far, and what are the implications of those, for the U.S. healthcare industry overall?

For me, one lesson learned is that if you leverage an electronic medical record appropriately, it can be a very significant tool for improving patient safety. If you just implement it and hope it achieves the desired effect, you won’t get these kinds of results. But if you begin with commitment from senior leadership, approach this rigorously, and implement it seriously, you can get these results. When I presented this same presentation in May in Boston at an AMIA meeting, my title for my talk was, “Leveraging Your HIT Investment to Improve Patient Safety.” That is what I believe we have accomplished.

What would your advice be to CIOs and CMIOs about participating in and helping to facilitate this kind of work?

I think you have to start with the end in mind. By that, I mean, when you have your electronic medical record fully implemented and deployed, with all the features I’ve described such as evidence-based order sets in CPOE, barcoding, and evidence-based interdisciplinary care plans, then it becomes more than just turning your paper chart into an electronic chart; it becomes a platform you leverage to drive quality and safety across the system. That’s our vision at Adventist Health System, and it’s proven, in my mind, to be the right vision, because we’ve gotten amazing results.
