The 2017 Healthcare Informatics Innovator Awards: Co-Second-Place Winning Team—Mercy-St. Louis

Oct. 2, 2017
Mercy made great strides once it turned to a machine learning application that uses advanced analytics to help it identify hidden patterns in its own data.

As part of a long-term effort to improve operational efficiency, the St. Louis-based Mercy Health system has spent years developing clinical pathways—a way to identify best practices for high-cost procedures such as total knee replacements and systematize them across the organization. Although the nonprofit, 45-hospital group had some success with that approach, Mercy made even greater strides once it turned to a machine learning application that uses advanced analytics to help it identify hidden patterns in its own data.

“We had a great EHR and tons of data,” says Vance Moore, president of business integration at Mercy, speaking of the organization’s electronic health record. “We had tried a couple of different data-mining solutions, and they showed promise, but they weren’t giving us what we were looking for. We had to find the truth within our data.”

The data-driven approach appears to have done that. In one example, an original care pathway developed manually at Mercy reduced the cost of total knee replacements by 7 percent. But the machine-learning approach cut an additional 5 percent off the cost of knee replacement, while improving or maintaining low rates of mortality and morbidity across all cases.
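Because the 5 percent comes off the already-reduced cost, the two savings compound rather than add. A minimal illustration, using a hypothetical baseline cost (the article reports only the percentages):

```python
# Illustrative arithmetic only; the baseline figure is hypothetical.
baseline = 100_000.0                     # assumed cost of a total knee replacement

after_pathway = baseline * (1 - 0.07)    # manual care pathway: 7% reduction
after_ml = after_pathway * (1 - 0.05)    # machine learning: a further 5% off the reduced cost

total_reduction = 1 - after_ml / baseline
print(f"${after_ml:,.2f}")               # $88,350.00
print(f"{total_reduction:.2%}")          # 11.65% overall, not 12%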

For this big-data breakthrough, the editors of Healthcare Informatics have selected Mercy as the co-second-place winning team in the 2017 Innovator Awards program. 

Mercy, which is the fifth-largest Catholic healthcare system in the United States with operations in four states, has been unified on Epic Systems’ EHR for almost 10 years and has worked to integrate that data with nonclinical data for analytics purposes. “Our clinical data set is extremely rich, so we have been doing multiple projects to try to operationalize the opportunities that come out of that,” says Todd Stewart, M.D., vice president of clinical integrated solutions. Among those efforts were the first steps to standardize care processes and the creation of care pathways, including the establishment of governance structures to operationalize best practices. Each care pathway had a specialty council assigned to work with peers on identifying variances in care and working through common solutions where possible.

Health IT leadership team at St. Louis-based Mercy Health system

Stewart also notes that Mercy keeps in touch with clinical leaders at other health systems working on the care pathways concept. “Our specialty council structure is modeled after work Mayo Clinic has been doing for years, and we have worked quite a bit with Intermountain Healthcare as well,” he adds.

Although the early work with care pathways was valuable, the executives noticed a few limitations holding them back. First, there were inefficiencies, because the typical pathway took up to six months to develop, and physicians found it difficult to take time away from patient care to attend quality improvement meetings. Second, the pathways were vulnerable to the biases of the clinicians involved. The best practices they identified reflected their own clinical experience, but there was no way to tell whether those practices were backed up by patient data. Finally, they found that at least 20 percent of Mercy clinicians failed to adopt care pathways because they were skeptical of the process, as no internal data was available to back up the best practices.

It is one thing for an administrative team to look at a best practice, set up an expert panel, and develop an optimal way to do something, Stewart says. “But anyone who has worked with a large group of physicians knows it is very difficult to motivate experienced clinicians who are driven by their own best practices and the way they were trained.” He says that they learned early on that they had to take a peer-driven approach. “If your peers are saying this is a better way to do a total knee replacement, and they are doing those procedures all day long, it is a different conversation than hearing it from an administrator who is just looking at data.”

Additionally, they had to show clinicians their own data, not industry-wide benchmark studies. “When you take that peer-to-peer process and combine it with our own data, and benchmark their results and costs against their peers internally, it is a very different discussion,” Stewart says. They can get down to the granular level of whether a scalpel tip that costs $100 more is really worth it.

In mid-2013, Mercy realized it had to find a better way to analyze its own data. Moore happened to be at a meeting in Silicon Valley, where he had a dinner conversation with Amy Chang, who was formerly in charge of Google Analytics. “I told her I have all this information, but I don’t know how to surface the truth out of it,” he recalls. She pointed him to a startup company called Ayasdi, founded by former Stanford University researchers. She told him that Ayasdi doesn’t start out with a theory and try to prove it; it starts with the unknown and presents you with patterns in your data that you should investigate. Moore set up a meeting with Ayasdi executives right away.

Ayasdi, which Healthcare Informatics profiled in 2016 as one of its “Up and Coming” companies, has created clinical variation management tools that leverage both machine learning and what it calls “topological data analysis” (TDA) to extract insights from millions of data points. TDA brings together machine learning with statistical and geometric algorithms to create compressed representations and visual networks that allow organizations to more easily explore critical patterns in their data.
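Ayasdi’s platform is proprietary, but the Mapper construction that underlies TDA can be sketched: project each data point through a filter function, cover the filter’s range with overlapping intervals, cluster the points that fall in each interval, and link clusters that share points. The result is the kind of compressed network described above. A toy version on one-dimensional data (the function names, parameters, and data here are illustrative, not Ayasdi’s API):

```python
def mapper_graph(points, filter_fn, n_intervals=4, overlap=0.25, gap=1.0):
    """Toy Mapper: cover the filter range with overlapping intervals,
    cluster the points inside each interval, and link clusters that
    share a point. Nodes are frozensets of point indices."""
    values = [filter_fn(p) for p in points]
    lo, hi = min(values), max(values)
    length = (hi - lo) / n_intervals
    nodes = []
    for i in range(n_intervals):
        start = lo + i * length - overlap * length
        end = lo + (i + 1) * length + overlap * length
        idx = [j for j, v in enumerate(values) if start <= v <= end]
        idx.sort(key=lambda j: points[j])
        # single-linkage clustering in 1-D: break where the gap is too wide
        cluster = []
        for j in idx:
            if cluster and points[j] - points[cluster[-1]] > gap:
                nodes.append(frozenset(cluster))
                cluster = []
            cluster.append(j)
        if cluster:
            nodes.append(frozenset(cluster))
    edges = {(a, b) for a in range(len(nodes)) for b in range(a + 1, len(nodes))
             if nodes[a] & nodes[b]}
    return nodes, edges

# Two well-separated groups of one-dimensional "patients"
# (e.g. length-of-stay values); the filter is the value itself.
data = [1.0, 1.2, 1.4, 9.0, 9.3, 9.6]
nodes, edges = mapper_graph(data, filter_fn=lambda x: x)
print(len(nodes), len(edges))  # 2 0 -> two disconnected patient groups
```

On real clinical data the filter is typically an outcome measure (cost, length of stay), so disconnected components of the graph correspond to patient subgroups worth investigating, like the surgeon group discussed later in the article.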

In a 2016 interview with Healthcare Informatics, Ayasdi CEO Gurjeet Singh noted that health systems want to wring out all the variation in their systems, so that they can determine which type of surgery is best for patients with specific co-morbidities. “A hospital system like Mercy believes that a system for discovering and operationalizing care paths could save them $50 million to $100 million over the course of three years,” he said.

In its initial work with Ayasdi, Mercy picked three care pathways it had already established to see if the machine learning could improve on what it had already done, or if it could at least validate the work Mercy had done. “Maybe it doesn’t work at all, and we can minimize our investment,” Moore remembers thinking. 

“It turns out that even with a procedure we had just completed, it was able to show an improvement of 5 percent,” Moore says. “And in the ones we had not done yet, it showed a savings of 15 percent. All of a sudden, that trial gave us the hope that we could extend the use of Ayasdi to the next level.”

As an example of the type of hidden insight Ayasdi helped discover, one group of surgeons’ patients consistently had a shorter length of hospital stay and a shorter time to ambulation than other total knee replacement patients across Mercy. These doctors prescribed a not-widely-used medication at an earlier postsurgical time than their peers did. The medication reduced patients’ pain so they could get out of bed and walk around sooner, improving their outcomes and reducing costs.

“One of the attractive things about Ayasdi was its ability to rapidly explore very large, complex data sets to find significant relationships and then generate hypotheses for you,” Stewart says.

Although Mercy has enthusiastically embraced the objectivity that the machine learning approach brings, one of the challenges has been getting comfortable sharing so much data with Ayasdi. “We have tended to say, ‘Tell me the data you want and we will send it over.’ Their approach is the opposite,” Moore explains. “They say, ‘Send us all the data and we will find patterns within it.’ That has been an internal struggle and one for Ayasdi as well. But we are going to be involved in more third-party relationships, so we are going to have to get more savvy about extraction, conditioning and transmission of data in both directions,” he says.

Another thing the work with Ayasdi helped Mercy realize was that some of its data is incomplete and potentially inaccurate. “They would ask if we knew a certain field was only being filled in 60 percent of the time,” Moore says. “We didn’t know that.” That led them to investigate what was happening in the care setting. In some cases, they realized they could stop collecting that data because it was wasting people’s time. In other cases, they had to mandate collecting it across the board because they were missing critical information.
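The kind of fill-rate audit Moore describes is straightforward to reproduce against any extract. A minimal sketch, where the records and field name are hypothetical, not Mercy’s schema:

```python
# Hypothetical records; the field name is illustrative only.
records = [
    {"patient_id": 1, "time_to_ambulation_hrs": 18},
    {"patient_id": 2, "time_to_ambulation_hrs": None},
    {"patient_id": 3},                                  # field never charted
    {"patient_id": 4, "time_to_ambulation_hrs": 24},
    {"patient_id": 5, "time_to_ambulation_hrs": 12},
]

def fill_rate(rows, field):
    """Fraction of rows where `field` is present and non-null."""
    filled = sum(1 for r in rows if r.get(field) is not None)
    return filled / len(rows)

rate = fill_rate(records, "time_to_ambulation_hrs")
print(f"{rate:.0%}")  # 60% -> flag: either stop collecting or mandate charting
```

Running an audit like this across every field in an extract surfaces exactly the choice Moore describes: retire fields that waste clinicians’ time, or mandate the ones carrying critical information.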

Stewart says the Ayasdi tool is valuable for monitoring adherence to the pathway as well as non-adherence. “We could be extremely rigid in saying that everyone has to use a pathway and be 100 percent compliant or their pay goes down,” he says, “but there has to be a balance with that because it is almost like a biological system. You have to allow for some mutations. We have learned that sometimes a group doing something different than the care pathway could have better outcomes or lower cost. That can cause you to re-examine your pathways.”

To date, Mercy has been able to work through approximately 35 care pathways, and plans to address as many as 80. “The 35 we have done make up a pretty substantial part of the care we provide,” Moore reports, “so we have to decide whether to prioritize refining the ones we have already built or continue building out the ones we haven’t done yet. Eventually we will do both.”

In the last year, the care pathways support team of six nurses and nurse practitioners has moved from Moore’s department to the quality department. “I am an operations guy, not a clinical guy, and initially we were approaching this almost as an engineering exercise: build a process that highlights variation and let the clinicians work it out,” he says. “Now that the process has been worked out, it has become more of a clinical and quality activity.”

What has been most beneficial about the new approach, Moore says, is that it brings objective information to the surface, which has led to a much more collaborative culture. “Everybody is now focused not only on reducing variation, but also reducing cost and improving quality. We are expanding our use of knowledge and tools to positively impact care.”