Will Financial Incentives to Radiologists Propel IT-Facilitated Quality Processes Forward?

March 17, 2015
Some of the financial imperatives facing practicing radiologists may prompt the adoption of IT-facilitated peer review—but forward progress remains slow

I read with considerable interest an article in Diagnostic Imaging online last month. The report, by Aine Cryts, covered the topic of data analytics solutions for optimizing patient scheduling in radiology practices and hospital radiology departments.

Among others, the article quoted Nadim Daher, principal analyst for medical imaging at the San Antonio-based Frost & Sullivan, as saying that there has been pressure in the past two years for radiology to align with the “new realities of healthcare, where it’s about cost efficiency, outcomes, payments, quality, and value. These are things that radiology has not been prepared for,” Daher said. “Instead, radiology has been rooted in the fee-for-service model, the ‘do more and earn more’ mentality.” Daher categorized radiology analytics solutions into three key areas—operational, financial, and clinical.

What’s more, the article quoted Tessa Cook, assistant professor of radiology at the Perelman School of Medicine at the University of Pennsylvania: “It feels like the specialty is being challenged in different ways: reimbursement cuts, the job market. Radiology is in the spotlight—and not in a good way,” she said. “Analytics give us the ability to really start to show what we [as radiologists] bring to the table in terms of contributing to patient care.”

Much of the rest of the DI article focused on patient scheduling and throughput issues. Reading it, though, it occurred to me that other policy and reimbursement trends are in play as well, and that radiologists’ need to optimize throughput and satisfy patients with convenient scheduling may also lead them to embrace other types of analytics more eagerly—most especially clinical peer review of outcomes quality within radiology groups.

For example, two years ago, I interviewed radiologist leaders at MultiCare Health System, a Tacoma, Washington-based health system with four acute-care hospitals and 20 sites of care for imaging services, as well as an employed physician group and two affiliated radiology groups.

As I noted in that article, “Radiologists in [the] two different radiology groups have been participating in an initiative that offers a great deal of potential for specialty medical management going forward. The initiative encompasses 15 of the 22 radiologists at Medical Imaging Northwest, and 23 of the 46 radiologists from Tacoma Radiological Association; as well as eight orthopedic physicians in various locations. Those physicians are participating in a program in which radiologists receive assigned radiological studies and review them in a quality review process.”

As I reported back then, “Using information technology from the Sarasota, Fla.-based PeerVue, those radiologists are ensuring that some core quality assurance/peer review processes that all radiologists should be engaged in, are performed, tracked, and analyzed, and that the information that comes out of that process is then plowed back into a continuous performance improvement cycle. The PeerVue solution supporting the radiologic study peer review process went live in November 2009.”

At the time, I interviewed Jim Sapienza, administrator for imaging services, MultiCare Health System, and Andrew Levine, M.D., chairman of the Executive Committee, Medical Imaging Northwest, and Medical Director of South King County Diagnostic Imaging Services. What Sapienza and Dr. Levine told me seemed very promising.

As Dr. Levine told me in that interview, “In the past, if problems arose in radiologic interpretations, addressing them would involve someone like myself who’s a medical director. I’d look at the case and talk to the radiologist who had made the mistake, but it would pretty much stop there; and there would be no follow-up or analysis examining why the same individual was making the same mistakes, or multiple people were making the same mistakes.”

What’s more, Levine said, “Trying to do all of this in a paper-based system was not user-friendly; it required the radiologist to pull out paper and make notes and give those notes to a technical or clerical person and then have that person give it to me. This way, we know that every week, a certain number of cases are reviewed on or by each person, and we can go in and do the tech QA [quality assurance] and also the retrospective and prospective stuff—someone might have had a significant miss. And someone can handle the software in the background, and we don’t have to deal with that stuff. Like other clinicians, radiologists want to do what we want to do, not clerical things. We want to have QA [quality assurance], but we want the process to be efficient.”

All of this seems self-evidently commonsensical. And yet this kind of IT-facilitated quality assurance process is still, two years later, only beginning to make serious inroads among radiologists. That’s why articles like the DI piece from February are heartening: the more intense the financial incentives facing radiologists become, the greater the chance that those specialists will also find themselves ready to adopt information technology to address other areas, such as radiological quality assurance. Given how important it will be for radiologists to engage in serious, meaningful quality assurance processes around interpretations and studies, all this ramping up of IT adoption for other reasons, including “purely business” reasons, will also pave the way for more clinically focused adoption—and that is all to the good.

In sum, radiologists—who after all are the most technologically oriented of all medical specialists to begin with—are finding that various types of information technology are increasingly going to be essential to their practice going forward. And that trend will only accelerate over time.
