Peer Review of Radiologic Studies for Quality Assurance: One Health System’s Experience
MultiCare Health System is a not-for-profit health system based in Tacoma (Pierce County), Wash. It encompasses four acute-care facilities with 868 beds, seven ambulatory surgical centers, seven urgent care centers, and 20 sites of care offering imaging services. The system employs 400 physician FTEs in its MultiCare Medical Associates. In addition, radiologists from two different radiology groups have been participating in an initiative with considerable potential for specialty medical management going forward. The initiative encompasses 15 of the 22 radiologists at Medical Imaging Northwest and 23 of the 46 radiologists at Tacoma Radiological Association, as well as eight orthopedic physicians at various locations. Those physicians are participating in a program in which radiologists are assigned radiologic studies to review as part of a formal quality review process.
Using information technology from Sarasota, Fla.-based PeerVue, those radiologists are ensuring that core quality assurance/peer review processes in which all radiologists should be engaged are performed, tracked, and analyzed, and that the information that comes out of that process is fed back into a continuous performance improvement cycle. The PeerVue solution supporting the radiologic study peer review process went live in November 2009.
Jim Sapienza, administrator for imaging services, MultiCare Health System, and Andrew Levine, M.D., chairman of the Executive Committee, Medical Imaging Northwest, and Medical Director of South King County Diagnostic Imaging Services, spoke recently with HCI Editor-in-Chief Mark Hagland regarding the initiative taking place at MultiCare, and its implications for medical management going forward. Below are excerpts from that interview.
What made you decide to move forward into this area, strategically?
Jim Sapienza: It was our Quality Committee for Diagnostic Procedure Specialties (cardiology, radiology, and some others) feeling that there wasn't close enough review by the radiologists of the exams they were reading. And ultimately, any study review goes to that committee if concerns or issues coming out of a study that was read are escalated.
So, essentially, the review process is triggered when there's a problem with a particular radiologic study?
Sapienza: Yes, the desire to have case review, or peer review, of our imaging studies, was the initiator. Dr. Levine was doing research on this and found PeerVue.
Andrew Levine, M.D.: Part of what happened was that Lori Morgan, M.D., the head trauma surgeon and chair of that committee a number of years ago, had actually requested that we put together some kind of peer review process involving radiologists, surgeons, and so on; so it wasn't only radiologists, but other clinicians who had requested this. Let's say a trauma surgeon orders a total-body set of scans; the next day, the surgeons will look at the interpretations. In the past, if problems arose in radiologic interpretations, addressing them would involve someone like me, as a medical director. I'd look at the case and talk to the radiologist who had made the mistake, but it would pretty much stop there; there would be no follow-up or analysis examining why the same individual, or multiple people, kept making the same mistakes.
Trying to do all of this in a paper-based system was not user-friendly; it required the radiologist to pull out paper, make notes, give those notes to a technical or clerical person, and then have that person give them to me. This way, we know that every week a certain number of cases are reviewed by, and on, each person, and we can go in and do the tech QA as well as the retrospective and prospective work; someone might have had a significant miss.
And someone can handle the software in the background, and we don’t have to deal with that stuff. Like other clinicians, radiologists want to do what we want to do, not clerical things. We want to have QA [quality assurance], but we want the process to be efficient.
Among the benefits of developing an automated quality assurance program are timeliness, greater accountability, and transparency, then?
Levine: And closing the loop; that’s the big issue.
So you’re able to connect right away with the physician whose interpretations are problematic, right? Let's call him “Dr. Smith.”
Levine: Yes. It might be Tuesday, and I might review the case and put an addendum on it, but Dr. Smith is on vacation for a week, and then there's a gap. And medico-legally, you don't want to put this in e-mail. PeerVue is considered part of a hospital's quality assurance process, and is protected.
What kinds of changes have been made since implementing PeerVue?
Levine: I'm on the QI [Quality Improvement] Committee; every two months, we have 15 to 25 cases that fall out [of the norm] for some reason and are flagged for a group review. We look at these before our QA meeting and decide whether it's something that should have been picked up, or challenged, or may perhaps have been a protocol error or something. But in this process, we bring cases back to our group and say, this is the kind of thing people are missing. Sometimes things are 'one-offs': someone missed a lymph node or a fractured wrist. But a lot of times, you see patterns; so, for example, when we talk about the diagnosis of intracranial aneurysms, we'll ask a neuroradiologist to come talk to us about that.
It’s almost a kind of continuing medical education process, then, isn’t it?
Levine: Exactly. We could almost get CME credit for that.
Sapienza: We really can't compare it to before we had the tool. But per what Dr. Levine said, since we went live, we have had the ability to report out how well the radiologists are responding to requests to review their studies; and we've tracked that, for radiologists in both participating radiology groups, 99 percent of the studies they're asked to review are reviewed within 30 days. That helps us be complete on this randomized review.
Levine: And I can tell you that at the one hospital that's not yet online with PeerVue, the rate of completion [of radiologic study peer review] is considerably lower. Because it pops up on your screen every morning and says, 'You've got two cases or four cases to review.' And we can get absolute numbers on it, too. We can say, 'Dr. X, everybody else is at 100 percent and you're at 60 percent, and that's not acceptable.'
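To make that kind of reporting concrete, here is a minimal sketch, purely illustrative and not drawn from PeerVue's actual software, of how a per-radiologist 30-day completion rate might be computed from assignment records; the ReviewAssignment record and completion_rates function are hypothetical names introduced for this example:

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class ReviewAssignment:
    reviewer: str                        # radiologist assigned the case
    assigned_on: date
    completed_on: Optional[date] = None  # None = review still outstanding

def completion_rates(assignments, window_days=30, as_of=None):
    """Per-reviewer fraction of due cases completed within the window."""
    as_of = as_of or date.today()
    done, due = defaultdict(int), defaultdict(int)
    for a in assignments:
        deadline = a.assigned_on + timedelta(days=window_days)
        if a.completed_on is not None and a.completed_on <= deadline:
            done[a.reviewer] += 1
            due[a.reviewer] += 1
        elif a.completed_on is not None or as_of > deadline:
            due[a.reviewer] += 1  # completed late, or overdue and still open
        # cases whose 30-day window hasn't closed yet are not counted either way
    return {r: done[r] / due[r] for r in due}
```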
How big is the scope of the peer review process right now?
Levine: Both groups have a significant number of people participating at MultiCare, and they get two to three cases every week to review; if a case is in your specialty, you read it, and if it's not, you can refer it to a subspecialist, such as musculoskeletal or neuro [neuroradiology]. There are about 12 to 13 radiologists in my group, each getting about eight cases a month, an average of two to three a week. So over the course of the year, we're each reviewing about 100 cases; across all the participating physicians in both groups, that works out to four or five thousand cases being reviewed at random over the course of the year.
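As a thumbnail of the randomized assignment cadence Dr. Levine describes, the following is a brief sketch under assumptions of our own; it is not PeerVue's implementation, and assign_weekly_cases, the subspecialty-matching rule, and the load-balancing choice are all hypothetical:

```python
import random

def assign_weekly_cases(radiologists, case_pool, per_reviewer=3, seed=None):
    """radiologists: {name: specialty}; case_pool: list of (case_id, specialty).

    Draws a random sample of prior studies and spreads them across the
    group, preferring reviewers whose specialty matches the case. A real
    system would also exclude each case's original interpreting radiologist.
    """
    rng = random.Random(seed)
    n = min(len(case_pool), per_reviewer * len(radiologists))
    worklists = {name: [] for name in radiologists}
    for case_id, case_specialty in rng.sample(case_pool, n):
        # Prefer a subspecialty match (e.g., musculoskeletal, neuro);
        # otherwise any reviewer may take it and refer it onward.
        matches = [r for r, s in radiologists.items() if s == case_specialty]
        pool = matches or list(worklists)
        # Balance load: give the case to whoever has the shortest worklist.
        reviewer = min(pool, key=lambda r: len(worklists[r]))
        worklists[reviewer].append(case_id)
    return worklists
```

With per_reviewer set to two or three, a weekly draw like this yields roughly the 100-cases-per-reviewer-per-year pace described above.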
Is your peer review process a standard thing among radiology groups?
Levine: Yes, there’s an expectation by the Joint Commission and the ACR [American College of Radiology] that you do some version of QA.
What kinds of improvements in all your processes have taken place because of this?
Sapienza: The PeerVue tool and radiologist peer review are just a couple of components of our overall quality work, and PeerVue supports our work in other ways as well. Radiologist review, transcription review, and a number of other aspects are all involved.
Levine: PeerVue has really improved what we do. When we first started with PeerVue, one of the things that was lacking was that closed feedback loop, getting the feedback to the individual; and this has resolved that. And we can get comments into the QA process. The second advantage is that we have a section in there that allows radiologists to comment on the reason for the test, and that's been a bugaboo in radiology forever; that is, how do I provide you and your patient a good interpretation if I don't have good clinical information on your patient? Medico-legally it could be a problem, billing-wise it could be a problem; and the better the information I can get, the better I can do. And not every test we do is 100 percent without risk.
Do you have any advice for your peers in other groups?
Levine: I would say that this tool goes beyond the basics, like the RadPeer product from the ACR; that's a very mediocre, very cursory product that doesn't allow easy modification or tailoring to the needs of your practice, and doesn't provide a feedback loop. This allows any radiologist, without any great computer literacy, to handle a case in a minute or two at most; and it allows you to provide your partners with whatever information is needed. And we also specifically added an 'accolades' section to allow us to provide kudos as well.
Sapienza: Dr. Levine really hit the highlights here, in terms of core value and in terms of differentiation from other approaches.
Levine: And it's interesting: one hospital doesn't have PeerVue yet, and I gave a presentation there recently, and the head of the committee there, a non-radiologist who happens to chair that hospital's overall physician peer review committee, said he wished they had this in that hospital. And hopefully, in the next two or three months, we'll be fully on board with this in all MultiCare hospitals; right now, we're live at three of the four.