Uncovering Some of the Key HIT Challenges Inherent in ACO Development

Feb. 5, 2016
The Advisory Board Company's Nicole Latimer parses some of the intense strategic IT-related challenges inherent in moving forward in the accountable care sphere, in the wake of CMS's unveiling of the Next Generation ACO Program.

On Monday, Jan. 11, the federal Centers for Medicare & Medicaid Services (CMS) fully unveiled the new Next Generation ACO program, revealing both the details of the program’s parameters and the list of 21 accountable care organizations (ACOs) that had joined the program, which was officially launched on Jan. 1.

As noted in this publication’s report on the unveiling on Monday, the Next Generation ACO Program now joins the Pioneer ACO Program and the Medicare Shared Savings Program (MSSP) as programs that patient care organizations nationwide can join and participate in, around accountable care concepts and principles.

Among those closely tracking all the CMS-sponsored ACO programs are senior leaders at The Advisory Board Company, the Washington, D.C.-based research, technology, and consulting firm. Last week, HCI Editor-in-Chief Hagland spoke first to Rob Lazerow, the organization’s practice manager, Research and Insights, for his perspectives on the overall strategic issues facing the developers of accountable care organizations sponsored by CMS. Hagland then spoke with Nicole Latimer, The Advisory Board Company’s senior vice president, Performance Technologies, to get a sense of the broad strategic IT issues facing ACO leaders. Below are excerpts from that interview.

People often understand the very broadest strategic concepts around accountable care, but their implementation/execution is so much more difficult in practice, correct?

Yes. And when I look at the introduction of the Next Generation ACO Program, I agree with Rob [Lazerow] that it’s great that providers are getting greater flexibility. And it’s great that they have more flexibility about what they’re going to put into place and how they’re going to do it. On the other hand, because there are greater risks and rewards involved, there’s going to need to be much greater rigor and precision around the use of data. To get those greater rewards, you’ll need to manage your cost of care very effectively, and to coordinate care. There aren’t a lot of financial incentives for Medicare patients to stay inside a network, so it’s going to be up to the network to keep them in. And that coordination will include the sharing of patient records to see where the patient has been and what kinds of services they’ve received.


And lastly, how do you use your data to improve access? Because if your patient can’t get in to see a provider in a timely manner, they’re going to go somewhere else. So for me, I think implementation starts with making sure that your existing data sources are well-used and accurate. Particularly around costs, we see a lot of organizations that don’t have a robust cost accounting system, and they can’t get to an accurate cost per case, cost per episode, or cost per patient. So if you don’t have a good sense of your costs right now, your first step is figuring out: do I need a new cost accounting system? How will I implement that system? And how will I change clinical workflows to capture costs? Because right now, some of those costs are captured when nurses scan medications and add those scans to the chart, and somebody adds those to your bill. If there’s a more automated way to do that, you won’t need people going into every chart to scan those items into it.

Marrying clinical and claims data, and then analyzing that data, is difficult, correct? Even with prospective attribution now under the Next Generation ACO Program?

Yes, it’s definitely difficult, along multiple dimensions. First is the difficulty involved in the actual physical matching of clinical and claims data—you have to make sure you’re matching the right data for the right Mrs. Smith. We have a patient-matching algorithm that requires 12 separate elements to create a match; and I’m sure others have other algorithms. And those algorithms need to be made more sophisticated and successful.
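The multi-element matching Latimer describes can be sketched in miniature as follows. This is an illustration only, not The Advisory Board Company's actual algorithm: real matchers use many more elements (she cites 12) plus probabilistic scoring, and the four field names here are assumptions.

```python
# Minimal sketch of deterministic patient matching between a clinical
# record and a claims record. Illustrative only: production algorithms
# use many more elements and fuzzy/probabilistic comparison.

def normalize(value: str) -> str:
    """Lowercase and strip whitespace so formatting differences don't block a match."""
    return value.strip().lower()

def is_match(clinical: dict, claims: dict, fields: list) -> bool:
    """Return True only if every required field is present in both records
    and agrees after normalization."""
    for f in fields:
        a, b = clinical.get(f), claims.get(f)
        if a is None or b is None or normalize(a) != normalize(b):
            return False
    return True

# Assumed field names for this sketch; a real matcher would use far more.
REQUIRED_FIELDS = ["last_name", "first_name", "dob", "zip"]

clinical_rec = {"last_name": "Smith", "first_name": "Mary",
                "dob": "1948-03-02", "zip": "60614"}
claims_rec = {"last_name": "SMITH ", "first_name": "mary",
              "dob": "1948-03-02", "zip": "60614"}

print(is_match(clinical_rec, claims_rec, REQUIRED_FIELDS))  # True
```

The normalization step matters: the same Mrs. Smith often appears with different capitalization or stray whitespace across systems, and an exact string comparison would wrongly treat her as two patients.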

Then, once the data is merged, it’s hard to figure out what I should be looking for, absent a ton of knowledge about specific disease states. The classic area we talk about is heart failure readmissions. The first thing an emergency physician or cardiologist would ask is, what is the patient’s ejection fraction?—basically, the percentage of blood ejected out of the ventricle when your heart pumps.

And you may not even know that, right?

Yes, because ejection fractions are usually calculated through a test conducted in a cardiologist’s office. And you’ve got to be able to grab that information from the continuum. You’ve actually got to go out and get the data from the entire network, to get the most comprehensive view of the patient.

What are the big challenges providers face in tracking that data down?

The biggest operational challenge we face is that the data we need for comprehensive care management is a little bit ahead of where the policy and the informatics are. Eventually, standard packets of information will come out of every information system automatically and be usable. But we’re not there yet. So either you have to put all your physicians on a standard EMR, or you need to be able to work with them to set up a system where they’re sending a data extract every day or week over to your system, so you can digest their data and add it to your data warehouse to do your data analysis. And a lot of health systems have primarily thought about data warehouses around their inpatient systems, rather than around entities that aren’t even necessarily closely aligned with them. So you need to get the data from all different sources, and be able to match up not only patients, but also providers.
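The extract-feed arrangement described above, where a non-affiliated practice sends a flat file every day or week and the health system loads it into a warehouse, can be sketched like this. The file layout, column names, and table schema are assumptions made for illustration.

```python
# Sketch of ingesting a periodic flat-file extract from a partner
# practice into a central warehouse table, tagging each row with its
# source so patients and providers can later be matched across feeds.
# Schema and column names are illustrative assumptions.
import csv
import io
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for the real warehouse
conn.execute("""CREATE TABLE encounters (
    patient_id TEXT, provider_npi TEXT, service_date TEXT, source TEXT)""")

def ingest_extract(conn, source_name, csv_text):
    """Load one practice's extract, stamping every row with the feed it came from."""
    rows = csv.DictReader(io.StringIO(csv_text))
    conn.executemany(
        "INSERT INTO encounters VALUES (?, ?, ?, ?)",
        [(r["patient_id"], r["provider_npi"].strip(), r["service_date"], source_name)
         for r in rows],
    )

# A daily extract as it might arrive from an outside cardiology group.
extract = "patient_id,provider_npi,service_date\nP001,1234567890,2016-01-15\n"
ingest_extract(conn, "cardiology_group", extract)

count = conn.execute("SELECT COUNT(*) FROM encounters").fetchone()[0]
print(count)  # 1
```

Stamping each row with its source feed is what makes the later provider-matching step Latimer mentions possible: the same clinician may appear under different identifiers in different extracts.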

And I think for many health systems, they think the answer is, my EMR is going to do all this for me. I’ve invested billions in my EMR, and eventually it will be able to do everything. I think it’s unrealistic to think you’re going to be interacting only with providers on your EMR. You’re going to have to be getting data from everywhere, including from Walgreens, Wal-Mart, RiteAid, and commercial pharmacies. And not everybody’s going to go onto Epic.

Can you say a few words about interoperability in this context?

Here at The Advisory Board Company, we’re participating in a major project on interoperability, partnered with all the major EMR vendors and a number of other interested organizations around the country, supporting FHIR. We believe that interoperability will be incredibly important. We’re working on a FHIR-based solution [based on the FHIR standard], and we think it’s important to have that developed and implemented. We do think that that is the future. But it is years away. And so of course, there are a ton of challenges between here and there in how health systems create a modicum of interoperability, or begin to integrate all this information piecemeal, to get the best picture they can of all their data and systems.
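To make the FHIR discussion concrete: under the standard, a clinical value like the ejection fraction mentioned earlier travels as an Observation resource in a common JSON shape, which any participating system can parse. The sketch below shows that parsing step; the specific resource contents and the LOINC code used are assumptions for illustration.

```python
# Sketch: pulling an ejection-fraction value out of a FHIR Observation
# resource received from a partner system. The resource contents and the
# LOINC code shown are illustrative assumptions.
import json

observation_json = """{
  "resourceType": "Observation",
  "status": "final",
  "code": {"coding": [{"system": "http://loinc.org", "code": "10230-1",
                        "display": "Left ventricular Ejection fraction"}]},
  "subject": {"reference": "Patient/example"},
  "valueQuantity": {"value": 55, "unit": "%"}
}"""

obs = json.loads(observation_json)
codes = {c["code"] for c in obs["code"]["coding"]}
if "10230-1" in codes:  # assumed LOINC code for LV ejection fraction
    ef = f'{obs["valueQuantity"]["value"]}{obs["valueQuantity"]["unit"]}'
    print(ef)  # 55%
```

The point of the standard is exactly this: the receiving system does not need to know which EMR produced the record, only the agreed resource shape and code system.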

What do you think that CIOs and CMIOs in organizations developing ACOs or thinking of developing ACOs, should do, in the context of everything we’ve discussed here?

I would think of it as a series of questions to ask of their organizations and infrastructure. First, do my source systems provide me with the level of precision I need to manage costs, coordination, and access in the future? You want your source systems to be the best possible, of course. My second question: do I have a way of integrating my source systems, so that I can get a more comprehensive picture of the care I’m providing? And then the third question would be extending that to, can I integrate in data from outside the system, from across the continuum? In other words, can I deal with the heterogeneity of data? And then the last question would be, as I am considering new investments—new source systems, new applications—how advanced are they with regard to interoperability? If that’s the wave of the future, every investment is going to have to be moving down that path.

How do you figure out the answers to those questions?

First, how compliant are you with meaningful use standards already? Second, are you participating in industry-wide efforts around interoperability? What standards are you thinking of adopting? Then, how much integration have you done with other partners in the industry? If you have a source system that’s already partnered with two or three other vendors, that will give you great confidence around their ability to move forward on this, rather than being completely closed off.

Overall, are you optimistic about the ability of patient care organization leaders to leverage data and information systems to help their organizations make all these leaps? There’s so much to do here.

I’m optimistic that we’ll eventually get to an interoperability standard, and that will be the step-function improvement we need in the healthcare industry. But there’s a long way to go to get to that point, and I’m not optimistic that we’ll see healthcare costs meaningfully change until we get there. I would say that’s five to seven years off. It will be a big challenge for the next administration to take on. And how well they handle it may actually be a key factor in whether that administration lasts four years or eight. There are some powerful lobbies that will be fighting against it.