Can Healthcare IT Leaders Help Radiology Leaders Improve Radiology Quality through New Peer Review Strategies?

Aug. 10, 2015
Innovative processes like retrospective time-limited radiological peer review absolutely call out for strong, smart IT facilitation and support

It was interesting to read a report written this spring about radiology peer review, in the context of the European Congress of Radiology (ECR), which was held March 4-8 in Vienna, Austria. To begin with, I was very intrigued by the headline of the report, written by Liza Haar, editor of Diagnostic Imaging. It was this: “Peer Review in Radiology Is Not a Punishment.”

Haar was commenting on an ECR 2015 presentation by David A. Koff, M.D., professor and chair of the Department of Radiology at McMaster University in Hamilton, Ontario, and chief of diagnostic imaging at Hamilton Health System. Dr. Koff had noted that a 2013 quality incident had led Health Quality Ontario, the provincial agency focused on healthcare quality, to announce that it would lead a province-wide physician peer review program in all facilities in which diagnostic imaging services are provided.

The key quote from Haar’s report on ECR 2015 was this: “Peer review is non-punitive. It’s an opportunity for quality improvement. It helps to identify trends,” she quoted Dr. Koff as saying, with him adding, “We have to identify and correct the systemic barriers to a quality product, and therefore improve everybody’s performance.” Dr. Koff went on to note that at Hamilton Health System, a pilot advanced a system combining retrospective and prospective quality assurance in radiology and nuclear medicine, with the goal of improving the entire process. That advancement included “making the review blinded to the identity of the institution and the radiologist; managing user-defined variables like frequency of double-reads and modalities to be automatically selected for review and by which reviewers, and highly customized metrics on enterprise error rate, modality-specific error types, and the incidence of these errors over time”; and using a scoring system developed by the American College of Radiology.


Here’s where things got particularly interesting. “Prospective” peer review means a double read: a second radiologist reads the same case and can send a message in a very timely way to the first radiologist if he or she sees something amiss. Meanwhile, the Hamiltonians have also made use of what’s being called “retrospective time-limited review,” which, as Dr. Koff explained in his presentation, means a review that takes place the same day: the first radiologist’s read is sent to the emergency department, and a second radiologist reads the study that same day and notifies the emergency department if she or he sees a discrepancy.

Dr. Koff had noted that such a system for quality improvement can only be undertaken by leveraging strong information technology, including randomizing case selection for strategies such as “retrospective time-limited review.” In other words, the system must be fair and must present itself to radiologists as fair. And the only way to do that is to intelligently leverage a combination of information systems, including peer review tracking and management systems.
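To make the fairness point concrete, the kind of randomized, blinded case selection Dr. Koff describes can be sketched in a few lines of code. This is a minimal illustration only, with hypothetical study records and field names that do not reflect Hamilton Health System’s actual software: a user-defined fraction of studies in chosen modalities is sampled at random, and identifying fields are stripped before the cases reach a reviewer.

```python
import random

# Hypothetical study records; the field names are illustrative, not drawn
# from any real radiology information system.
studies = [
    {"study_id": f"S{i:03d}", "modality": m, "radiologist": r, "institution": "Site A"}
    for i, (m, r) in enumerate(
        [("CT", "Dr. X"), ("MR", "Dr. Y"), ("CR", "Dr. X"), ("CT", "Dr. Z")] * 25
    )
]

def select_for_review(studies, rate=0.10, modalities=None, seed=None):
    """Randomly select a user-defined fraction of studies for peer review,
    optionally restricted to certain modalities, and blind each selected
    study by stripping radiologist and institution identifiers."""
    rng = random.Random(seed)
    pool = [s for s in studies if modalities is None or s["modality"] in modalities]
    chosen = rng.sample(pool, max(1, round(len(pool) * rate)))
    # Blinding: the reviewer sees only the study, not who read it or where.
    return [{"study_id": s["study_id"], "modality": s["modality"]} for s in chosen]

queue = select_for_review(studies, rate=0.10, modalities={"CT", "MR"}, seed=42)
```

Because selection is random and the reviewer never sees the reading radiologist’s name or site, no one can be singled out for extra scrutiny, which is exactly the kind of demonstrable fairness Dr. Koff argues the IT has to provide.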

This is where healthcare IT leaders and clinician leaders can be minor heroes to radiologists in clinical practice. When I speak with radiology leaders, I inevitably find them to be thoughtful and discerning. And their concerns are usually quite legitimate. For example, a reader who identified himself only as “Timothy” wrote this comment: “I agree with everything said here. As someone who has been and continues to be significantly involved with quality assurance, the problem is that selling the notion that peer review is nonpunitive is difficult if not impossible. We have all been faced with getting the news that we have made a mistake,” Timothy noted. “Many different feelings rush through us as we come to grips with the idea that we have indeed made a mistake. The first concern is that the mistake was minor and the patient did not suffer because of it. The overriding feeling most commonly seen and felt, however, is embarrassment. Next, feeling vulnerable and open to ridicule because of the mistake, and then depression in the feeling that we let ourselves and the patient down. Finally, if we’re lucky, acceptance that we are not perfect and that we do make mistakes, and hopefully learn from them.”

Timothy added that “I believe that if we want to really make peer review feel nonpunitive, we have to find a way to short-circuit the initial feeling of embarrassment. Perhaps we should first acknowledge that in this situation, it is normal to feel embarrassed. Then, we might be better able to discuss a specific mistake, and quality assurance and quality improvement in general, as colleagues.”

Can you see the opening here for collaboration between and among clinical informaticists, other informaticists, clinician leaders, and practicing specialists? I certainly can. I think this is one of those perfect examples in which the implementation, and the intelligent leveraging, of healthcare IT with useful, specialized functionality can make all the difference in the world—provided that end-user education, ongoing support, and the intelligent ongoing monitoring and evaluation of the use of such IT are all made part of the package deal.

The challenge, as so often is the case, will not be in implementing the technology per se, though there certainly will be a few nuances around usability, user-friendliness, and interoperability with other information systems that will need to be worked out.

Primarily, though, the challenge will be getting everyone on the same page, and especially getting end-user radiologist buy-in. But with incentives shifting and scrutiny of radiological practice for outcomes quality and cost-effectiveness growing anyway, now is definitely the time for those who would innovate in this area to do so.
