At Northwell Health, IT Leaders Are Revamping the EHR with AI, NLP and Voice Tools
For physicians and clinicians, electronic health record (EHR) usability and the time spent on box-checking and data entry are oft-cited sources of frustration and stress. Numerous studies and surveys indicate widespread physician dissatisfaction with the time spent on EHR documentation and its impact on patient interactions.
A study published in the Annals of Family Medicine last fall found that primary care physicians spend nearly two hours on EHR tasks for every hour of direct patient care. Another time-and-motion study, published in the Annals of Internal Medicine in October 2016, found that, outside office hours, physicians spend an additional one to two hours of personal time each night on computer and other clerical work. What’s more, a Mayo Clinic study linked EHRs with physician burnout.
Some of the biggest names in technology, along with innovative teams at many large healthcare systems, are working to tackle this problem, with a focus on developing “smart” EHRs using artificial intelligence and voice recognition technology. As recently reported by CNBC, Google is exploring ways to use AI and voice recognition to improve patients’ visits to the doctor. CNBC reported that four internal job openings at Google describe building the “next gen clinical visit experience” and using audio and touch technologies to improve the accuracy and availability of care.
The project falls under the healthcare group within Google Brain, part of the company’s Google AI division, and is sometimes referred to internally as “Medical Brain,” according to CNBC. “The project would likely take advantage of the complex voice technologies Google already uses in its Home, Assistant, and Translate products,” CNBC reporters Jillian D’Onfro and Christina Farr reported.
Last fall, Stanford Medicine and Google Research announced a pilot project to study whether a digital scribe can replace a human scribe, saving physicians time on data entry and improving physician-patient interaction. The digital-scribe system uses speech recognition technology and machine learning tools to automatically enter information from the office visit into an EHR system.
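To make the idea concrete, here is a minimal, purely illustrative sketch of the general shape of such a pipeline, not Stanford’s or Google’s actual system: a visit transcript goes in, and a structured, EHR-ready note comes out. The toy keyword matching stands in for the trained speech-recognition and clinical-NLP models a real digital scribe would use.

```python
# Illustrative only -- a toy stand-in for a digital-scribe pipeline,
# not the Stanford/Google system. A real scribe would run trained
# speech-recognition and clinical-NLP models instead of keyword rules.
from dataclasses import dataclass, field

SYMPTOM_TERMS = {"headache", "fever", "cough", "fatigue"}
MED_TERMS = {"ibuprofen", "lisinopril", "metformin"}

@dataclass
class VisitNote:
    symptoms: list = field(default_factory=list)
    medications: list = field(default_factory=list)

def scribe(transcript: str) -> VisitNote:
    """Map a raw visit transcript to a structured, EHR-ready note."""
    note = VisitNote()
    for raw in transcript.lower().split():
        word = raw.strip(".,;:")
        if word in SYMPTOM_TERMS:
            note.symptoms.append(word)
        elif word in MED_TERMS:
            note.medications.append(word)
    return note

# Example: the office-visit audio has already been transcribed to text.
text = "Patient reports a headache and mild fever, currently taking ibuprofen."
print(scribe(text))
# VisitNote(symptoms=['headache', 'fever'], medications=['ibuprofen'])
```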
Earlier this year, Microsoft announced a collaboration with the University of Pittsburgh Medical Center (UPMC) on its Intelligent Scribe platform, a virtual AI assistant that “listens in” during a doctor’s visit and takes notes. The application analyzes a doctor’s conversation with a patient and then makes suggestions in the patient’s electronic health record.
Back in April, Vanderbilt University Medical Center unveiled a voice assistant, developed by VUMC’s biomedical informatics and health information technology innovations division, that lets clinicians interact with the hospital’s EHR system. Using natural language processing (NLP) and AI, the EHR Voice Assistant is designed to understand and fulfill verbal requests from clinicians and other hospital staff.
Startup companies are also developing solutions in this space: SayKara has built an AI-powered scribe for doctors, and Notable offers what it calls “voice powered healthcare,” using AI to automate and structure the patient-physician interaction.
At Northwell Health, a New York City-based health system with 23 hospitals, clinical and IT leaders collaborated on an innovation project, called EMRbot, that allows physicians and nurses to “talk” with a patient’s medical record. The technology enables users to interact with EHRs using voice, natural language text messages and an adaptive user interface using chatbots, according to the project’s leaders, Vishwanath Anantraman, M.D., Northwell’s chief innovation architect, and Michael Oppenheim, M.D., the health system’s chief medical information officer.
Northwell Health’s IT department plans to begin pilot testing the technology at several hospitals by the end of the summer.
The EMRbot was developed as part of Northwell Health’s third Innovation Challenge, which rewards innovative employee-led projects. The team behind the EMRbot project and another winning project team each received $500,000 to bring their concepts to market.
“EMRbot will completely revolutionize how clinicians interact with patient data and will restore the human ‘face-to-face’ interaction that EHRs have slowly eroded,” Dr. Anantraman said in a statement when the Innovation Challenge winners were announced back in May.
In developing the EMRbot technology, the project team sought to solve an ongoing challenge in healthcare: physician and patient dissatisfaction with how much time doctors spend with computers during the patient visit. “Computers have become very disruptive to the workflow that physicians and nurses want to have, which is at the patient bedside or talking with the patient, more face-to-face interaction. Physicians and patients want interactions that are more natural,” Anantraman says.
“The second problem we’re trying to solve is that, most of the time, there is too much information available in the EHR, and every time you want to find one piece of information, you have to click through several screens before you can find it. It’s time consuming, it can be highly dependent on how much training you’ve had in the EHR, and, often, it doesn’t give you the information you want quickly,” he says.
The project team focused on building a user interface on top of the EHR using chatbots, NLP and voice recognition technology.
“We’re layering the technology on top of several EHRs that will allow the user to use voice and text messaging effectively to get back answers to questions that they have about the patient,” he says. “It also uses several techniques to contextually provide the correct information. You can ask a question and get an answer, but if the program knows the context of the question and why it’s being asked, then it can give you a more meaningful answer, and can learn your own individual preferences. That’s the AI piece in the whole thing.”
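As a rough illustration of the layering Anantraman describes (and only that: the names, rules and record structure below are hypothetical, not Northwell’s system), the sketch maps a free-text question to an intent and lets context, here the user’s role, shape the answer pulled from a stand-in patient record.

```python
# Hypothetical sketch of a question-answering layer over an EHR.
# Keyword rules stand in for the NLP and AI models EMRbot would use.
PATIENT_RECORD = {
    "labs": {"potassium": "4.1 mmol/L, drawn 06:30 today"},
    "meds": ["lisinopril 10 mg daily", "metformin 500 mg BID"],
    "vitals": {"bp": "128/82", "hr": "76"},
}

def classify_intent(question: str) -> str:
    """Toy intent classifier: route the question to a record section."""
    q = question.lower()
    if "potassium" in q or "lab" in q:
        return "labs"
    if "med" in q:
        return "meds"
    if "blood pressure" in q or "vitals" in q:
        return "vitals"
    return "unknown"

def answer(question: str, role: str) -> str:
    """Answer a free-text question, shaping the reply by the asker's role."""
    intent = classify_intent(question)
    if intent == "labs":
        return "Latest potassium: " + PATIENT_RECORD["labs"]["potassium"]
    if intent == "meds":
        meds = PATIENT_RECORD["meds"]
        # Context in miniature: a physician gets the full list, other
        # roles get a summary. EMRbot's real context model (specialty,
        # setting, learned preferences) would be far richer.
        return "; ".join(meds) if role == "physician" else f"{len(meds)} active med orders"
    if intent == "vitals":
        v = PATIENT_RECORD["vitals"]
        return f"BP {v['bp']}, HR {v['hr']}"
    return "Sorry, I didn't understand that question."

print(answer("What was the last potassium?", role="physician"))
# Latest potassium: 4.1 mmol/L, drawn 06:30 today
```

Because the question-answering logic sits above the record rather than inside any one vendor’s EHR, the same layer could serve either a voice front end or a text-messaging interface, which is the point of the layering approach the team describes.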
He adds, “Technology now is moving so fast, particularly with natural language interactions and AI, and that gives you the ability to be more tailored and more specific to what every single provider wants.”
Anantraman acknowledges that the project is in its early stages and has not yet been pilot tested, but the goal is to eventually build specialty-specific modules. “There is a lot of variation in how providers interact with the systems and there is variation among specialties. The kinds of questions an obstetrician may have are very different from the questions asked in a general unit or an outpatient setting. This is not an easy problem to solve. But, over the next couple of months, we want to build out those specialty areas.”
When pilot testing begins, the project team plans to measure targeted outcomes in patient satisfaction and physician satisfaction, and to conduct time-and-motion studies to gauge the impact on productivity, he says.
“Being at Northwell is a big benefit for us because we have 23 different hospitals, everything from community hospitals to large academic medical centers, specialties to general practices, and from inpatient settings to outpatient settings, so we can pilot test this in different scenarios; it’s not a one-trick pony. We want to try to use Northwell to roll out different modules, and to roll it out in different settings, before we take this outside,” he says.
Ultimately, the goal is to use the EMRbot technology to create a more seamless patient experience. “We want to build these collaborative tools that will allow physicians and nurses to get the information they need quickly and collaborate with each other more effectively,” Anantraman says.