At Ohio State, a Breakthrough Leveraging AI in Pathology

Feb. 13, 2024
At The Ohio State University Comprehensive Cancer Center, pathologists, led by Anil Parwani, M.D., Ph.D., have made a major AI-fueled clinical process breakthrough.

Clinician leaders are rapidly developing more and more uses for artificial intelligence (AI) in the clinical realm in healthcare. One recent case study has been emerging at The Ohio State University Comprehensive Cancer Center/the Arthur G. James Cancer Hospital and Richard J. Solove Research Institute. There, Anil Parwani, M.D., Ph.D., director of the Division of Anatomical Pathology, has been using a powerful decision-support resource to confirm cancer diagnoses made by trained pathologists. Long-term, this use of AI could address a global shortage of pathologists and fill gaps in care.

In a press release published to the Cancer Center’s website on Jan. 19, Dr. Parwani noted that AI “machine-learning” processes are especially impactful for determining how high- or low-risk a cancer is, a concept known as “grading” that indicates how likely a cancer is to spread and how quickly. “There is no routine cancer, so an accurate diagnosis and an understanding of the specific molecular characteristics of each person’s tumor is critically important to match patients to the best treatment options for their specific cancers,” Parwani said in the press release. “We already see the positive impacts of this in breast cancer and now in prostate cancer, which are two of the most common forms of cancer.” And he added that AI technology also has the potential to pre-screen patients before the pathologist reviews the slides to make a diagnosis, so the team can prioritize possible high-risk cancers to speed up confirmation and start treatment sooner. “From a screening perspective, it is not unusual for me to have 40 cases to review in a day, and many of these are benign. Imagine if we could prescreen all these cases with AI to put the most concerning cases at the top of the stack,” he noted.

What makes this such a potentially important breakthrough, the press release explained, is that “Pathologists are trained to look at specific features of the cell that are seen through a microscope. Digital pathology took these cellular images – which are traditionally viewed on a series of single flat images on glass slides – and combined them into a single 3D image that can be studied from all angles on a computer. AI can find concerning features outside the cancer cell that are not obvious to the human eye. Parwani explains, these serve as signals to create a risk ranking (low to high) and give important information for guiding treatment decisions.”

“So today, we can use AI tools to help us predict these features in patient samples. AI does not make a diagnosis – but once I, as a pathologist, make a diagnosis, I can then use AI to find additional clues that will help the treating oncologist suggest a treatment plan for the patient,” Parwani said. “This is especially helpful in rare cancers or cancers with a specific genetic mutation that might benefit from a very specific therapy.” The OSUCCC-James is currently testing the use of AI in prostate, breast, gastric and other cancers for diagnosis validation and risk assessment.

Amid all that activity, Parwani recently spoke with Healthcare Innovation Editor-in-Chief Mark Hagland regarding this AI-facilitated breakthrough and what’s next in the digital pathology world. Below are excerpts from that interview.

To summarize this breakthrough, what you and your colleagues have developed here is a kind of AI-facilitated early-flagging process for abnormalities?

Absolutely, yes. Traditionally, and this is how I was trained, the tissue comes to the lab, and slides are made. And it normally takes 24 hours to get to a pathologist. We were the first hospital system in the country to adopt digital pathology, in March 2018. I was probably the first pathologist in the country to sign out my first case entirely on a monitor. That was the first step. At that time, our goal was to have this capability: we can review and collaborate; I can receive consults from other parts of the country or internationally. That took almost a year to implement. And we wanted to integrate that with the electronic health record right away, so I could click and review images from biopsies. We moved to an offsite lab, but we didn’t have to wait for the glass slides anymore; they could be digitized. Then we started looking at what else we could do. You’re basically converting a glass slide into millions of pixels, similar to what you do with Google Maps or tools that help you find objects.

And there were some startup companies focusing on building tools for breast, prostate, and colon cancer. And in 2020-2021, we started testing some of these tools. We have to make sure the tools are targeted for our patient demographics; tools built in Japan or Europe might not be the same. But we worked with a lab company in Sweden working with data from Switzerland. We’ve collected data on more than 800 patients so far.

And once that was done, we were able to bring in data from multiple labs and validate it. We wanted to see whether these tools could help a general pathologist who doesn’t specialize in prostate cancer; the more challenging cases are usually referred for an expert consult. What if we can use AI to help with the screening and to improve the quality of the diagnosis? We’re also a teaching hospital, so those tools can help there. We did a study of the eye movements of pathologists-in-training, to determine what areas they looked at compared to experts: focused review versus broader review. We’ve now tested the algorithms and are integrating them with the EMR and with the digital pathology dashboard. Right now, we have to feed the prostate biopsy images into the system and take the results back into the pathology report. But you have apps on your dashboard for prostate and colon cancer. We want to stay in the same system, as is done in radiology.

In other words, this process will flag priority cases for the pathologists, based on unusual findings?

Yes, that’s exactly right; it will flag unusual findings; meanwhile, the AI algorithm will also check the work of the pathologist-in-training. So, every time a patient goes for a prostate biopsy, the urologist takes tissue samples and puts them into jars. But it’s almost like looking for a football in a football field. Let’s say they did twelve core biopsies; each core is a tiny cylindrical piece of tissue, about one centimeter in length and one-tenth of a centimeter in diameter. Each review of 12 biopsies would take 30-40 minutes, and would involve counting the number of glands and estimating how much cancer is present; that’s what takes time. So we don’t want pathologists to be replaced by AI, but we can outsource the manual tasks of counting and synthesizing, and get the data into the report. This will save 20-25 percent of pathologists’ time. And we have a nationwide and global pathologist shortage. There are places in the world with only one pathologist per one million people.

So the goal is, ultimately, three functions for AI: one, to assist us in counting and measuring and assembling; two, the higher-level task of augmenting my ability as a pathologist to perceive abnormalities; and three, doing things autonomously, such as screening. So our goal is to get to the point where this is integrated into our workflow, where we can save 15-20 percent of our time, and then build on that. If a patient’s gene signatures are changing, digital pathology and AI algorithms can create risk signatures for each patient; that would not be possible without an individual patient profile. So can an image, fed into an algorithm, predict which treatment might be most useful for a patient? And then there are the time and the cost.

Right now, once patients are diagnosed with prostate cancer, we have to send the tissue to labs that perform genomic assays; we use expensive genetic assays, and it takes two weeks to get the results back, at a cost of thousands of dollars. And the results will help the oncologist and urologist treat the patient. Today, there are AI-based assays that will take the same image I’m looking at on my monitor and, using a comprehensive algorithm, provide valuable information about risk stratification: how will this patient be different from another patient, as this patient undergoes changes that can’t be visually detected? We are replacing a genomics-based assay with an image-based assay. And this is a journey that started in 2017-2018. In 2024, we will integrate these predictive, image-based assays into the EHR. These assays are essentially “image-omics,” assays arising from images. So all of this is part of a bigger strategy.

What have the biggest lessons been so far around process?

One of the limitations or barriers to adopting AI has been reimbursement. Currently, these assays are not reimbursable; but there are CPT codes—research codes that will become Category I clinical codes. So we’ve learned about cost and about AI integration; we’ve learned a lot about the process of working with an AI team, and about how to bring this into the workflow. The third thing we’ve learned is trust, which comes from relationships and learning. And because we are more advanced with digital pathology, pathologists are becoming more comfortable. So we’ve learned about costs, about IT challenges, and about people and change management.

Is there anything you’d like to add?

The final perspective I’d like to share is that I don’t want this to scare pathologists or other physicians; I want people to think of this as an ally or friend. AI will never replace humans; but a human pathologist working with AI will be a better pathologist. We have to learn how to use technology to help us. The sooner we can do that, the better. And AI has to be handled very carefully; we’re taking our time to do this safely and ethically. This is something we’re very, very passionate about. We want this to become an everyday assay available to physicians, so we’ve invested a lot in digitizing our workflow and testing algorithms, and now we’re in the process of integration, where we can start to see the fruits of our labor.
