Numerous subjects around artificial intelligence were discussed at the AI Preconference Forum during HIMSS25, being held at the Venetian Sands Convention Center in Las Vegas this week, all of them layered with complexity. But the final panel of the day on Monday, March 3, centered on a potentially very prickly subject: how patient care organization leaders should think about partnerships with information technology vendors.
Flying under the banner “Synaptic Sync: Building Strategic Technology Partnerships for Effective AI Integration: Technology Panel,” the panel was moderated by Neri Cohen, M.D., of The Center for Healthcare Innovation. He was joined by Deepti Pandita of the University of California Irvine Health; Punit Singh Soni of Suki; Nancy Beale, Ph.D., R.N., NI-BC, FAMIA; and Amy Zolotow of Catholic Health.
Early on in the discussion, Dr. Cohen asked his panelists, “How do we create the partnerships to align to improve care?” Beale said, “There are a lot of stakeholders in the space who don’t even understand what AI is. AI is not one thing, it’s many things. And we need to ask ourselves, what is the problem we’re trying to solve? And if AI is the right tool, then we should pursue it.”
Zolotow agreed with Beale, and further noted that “The AMA just did an update focusing on the documentation, the ambient listening. We’re seeing a lot of focus on lifting the administrative burden. We’re also focusing a tremendous number of efforts on diagnostics as well. And the data shows we’re seeing a significant increase in provider buy-in. So the opportunity is to focus on the people. And once we start investing in our people, that’s where we’re going to see the most opportunity taking shape.”
What about the differences in dynamics in academic medical centers and community hospitals? How are those dynamics playing out now in patient care organizations? “That’s a great question,” Pandita responded. “Academic institutions come with skepticism: researchers want to question and challenge what’s being provided and then find answers. So anything implemented in an academic setting comes with a hefty dose of skepticism. And so there’s innovation, versus third-party tools. So where do you draw the line between tools you build, versus externally validated tools, which in academia, you want to re-validate anyway? So there is a longer journey. And it’s not just one data source we’re wanting to use or implement. And that’s where robust governance comes into play.”
Cohen then turned to Singh Soni and asked him, “Punit, how do you talk to different organizations? And what are some of the misperceptions?” “How do I talk to them? With great difficulty,” Singh Soni said. In fact, he reported, “I started having these conversations seven years ago, but all I heard was crickets at first. In fact,” he said, “I don’t think of AI as a tool; I actually think of it as an epoch. We’re all talking about tools, but electricity was an epoch, the Internet was an epoch; AI is the same. You’re either going to be using AI tools, or nothing. Per academia,” he said, “you have to walk in with respect. And technologists are often perceived as arrogant, and clinicians think of technologists as simply working for vendors. So, respect. And then you have to agree on joint metrics that you’ll measure. You have to respect the academic study and research, and you have to move forward as partners, before you can accelerate.”
And, Singh Soni added, “Being called vendor makes me feel like I’m hawking vegetables in a market or something. I want to be a partner. And when you put a partnership framework in place, you start asking what matters to you? Efficiency? Satisfaction? Understanding burnout? Revenue? What types of metrics matter to you? You start there.”
Accountability moves in both directions
Cohen asked, “How do you hold your future partner accountable to deliver on those metrics?”
“I’ve been on all sides of the equation—clinician, vendor, consultant—and the way we hold our vendor partners accountable is that we have our own internal accountability; we have to hold up our part of the bargain,” Beale said. “So you as a vendor partner have to hear from me, in a timely fashion, if I have any issue, so we can map out a strategy together to resolve any issues. And I’ll also be holding you accountable. So it’s really about holding each other accountable.”
Language actually matters in that context, Zolotow interjected. “I don’t think we put enough emphasis on using the right language and defining things together,” she said. “Working in partnership means you develop appropriate language together. We have to be very intentional about the words, even to the point of getting the definitions right.”
So, Cohen asked, “How do you build in ethical governance that is respectful of both sides?”
“I want to place significant emphasis on getting a diverse group at the table,” Zolotow said. “Getting that governance committee to lay the groundwork is incredibly important. And on that earlier panel, Brenton [Hill] mentioned that we don’t hear enough from nurses.”
And how do we align the governance work with investigational use and true science? “In the context of AI governance,” Pandita emphasized, “those should not be in separate silos; governance should be nuclear, singular, and not divorced. The last thing you want to do is to separate research and clinical governance, because if you’re not aligned, you’ll end up with very different outcomes.”
Working forward fast in the real world of budgets and limited resources
Later on in the discussion, the subject of budgeting and financial resources came to the fore. Cohen noted, “When we were running 15-20-percent margins, we had the luxury of time. Nowadays, the vast majority of healthcare organizations are running at sub-1-percent margins or in the red. So how do we accelerate processes and think about scalability?”
“When I think about that in the context of systemness, it really builds on the foundation of systemness, to the extent you have it in your organization to start out with,” Beale said. “So what assumptions are being leveraged to drive the model? If you don’t have solid foundations, you’re buying new curtains for your house, but the foundation is crumbling. So you have to have a solid foundation, and it all goes back to, what is the problem you’re trying to solve? None of those things are really separate. AI is a tool, like all software.”

And, Pandita insisted, “Whether it be a small, medium, or large system, academic or non-academic, what does AI do? It needs to be part of all of those areas; it’s not a strategy on its own.”