CMS Unveils Roadmap to Digital Quality Measurement

April 17, 2022
The federal agency has set an ambitious goal of transitioning to fully digital quality measures, leveraging FHIR APIs, which are already required for interoperability

The Centers for Medicare & Medicaid Services has created a plan to transition to fully digital quality measures. Its framework is built around four key domains: advancing technology; enabling measure alignment; improving the quality of data, such as standardized data elements and validation programs; and optimizing data aggregation.

During CMS’ quality conference last week, executives described the importance of this transition. They noted that digital quality measurement must leverage valid and reliable digital data captured within healthcare settings, clinical encounters, and critical data sources beyond clinical settings. Currently, however, there are limitations in standards adoption, inconsistent quality data capture, and data quality assurance.

As part of the transition, CMS noted that it is considering how to use established standardized sets of data, such as ONC’s USCDI, and how to support data standardization requirements while aligning with measurement needs and other use cases. Focusing on standardized data – FHIR, USCDI, and supplemental standards that allow for automated extraction – will enable the interoperable exchange of data across the healthcare data ecosystem, serving many data needs and driving toward a learning health system that effectively measures quality and improves patient care, CMS said.
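To make the "automated extraction" idea concrete, here is a minimal Python sketch of pulling a standardized data element out of a FHIR R4 Observation resource. The resource content is illustrative, not from the article; the key point is that because the code system (LOINC) and structure are standardized, extraction needs no site-specific logic.

```python
import json

# Illustrative FHIR R4 Observation for systolic blood pressure,
# coded with LOINC as USCDI-aligned exchange expects.
observation_json = """
{
  "resourceType": "Observation",
  "status": "final",
  "code": {
    "coding": [
      {"system": "http://loinc.org", "code": "8480-6",
       "display": "Systolic blood pressure"}
    ]
  },
  "valueQuantity": {"value": 128, "unit": "mm[Hg]"}
}
"""

def extract_loinc_value(resource: dict):
    """Return (LOINC code, value) if the Observation is LOINC-coded."""
    for coding in resource.get("code", {}).get("coding", []):
        if coding.get("system") == "http://loinc.org":
            return coding["code"], resource.get("valueQuantity", {}).get("value")
    return None

obs = json.loads(observation_json)
print(extract_loinc_value(obs))  # → ('8480-6', 128)
```

Because every conforming system emits the same `system`/`code` pair for the same concept, the same extraction code works against data from any EHR, HIE, or device feed.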

Michelle Schreiber, M.D., serves as deputy director of CMS’ Center for Clinical Standards and Quality (CCSQ). In addition, she serves as the group director for the Quality Measurement and Value-Based Incentives Group (QMVIG).

She noted that traditional quality measures were frequently abstracted by hand, a process that meant poring through sometimes illegible notes, mailed patient surveys, or claims data. “Quality measures have traditionally been retrospective, sometimes with three-year lookbacks, which can make them feel less relevant. With the rise of electronic medical records, we've seen electronic quality measures (eCQMs) with data obtained primarily from EHRs.”

ECQMs have been a great advance, as they capture rich clinical information, often reflect all-payer data, and can be more timely. “However, we recognize that they take work to build in an EMR, and often workflow or process change is required to be most effective,” Schreiber said. “We envision quality measures to be all fully digital, which means data is drawn from an electronic interoperable source, which may be the electronic medical record, but may also be from other digital sources.”


Digital data for quality measures can be derived from many sources, including administrative systems, health information exchanges, electronic medical records, medical devices such as ventilators, and wearable devices such as heart rate or blood sugar monitors, Schreiber explained. Other sources may include digital patient surveys, census data, public health data, or registry data. “All of these sources can eventually contribute to digital quality measures. This will enable robust information reflective of the patient journey across the continuum of care as well as across time. Data can be much more timely, can be leveraged to create a continuous learning system, and can also be analyzed using machine learning or other advanced tools for prediction as well as retrospection.”
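A hedged sketch of how a fully digital measure might pool multiple sources: the patient identifiers, readings, and the control threshold below are all hypothetical, but the shape of the computation, merging per-patient data from an EHR feed and a wearable feed before scoring, follows the multi-source vision Schreiber describes.

```python
from statistics import mean

# Hypothetical per-patient glucose readings (mg/dL) pooled from two
# digital sources: EHR lab results and a wearable glucose monitor.
ehr_readings = {"p1": [110, 145], "p2": [190, 201]}
wearable_readings = {"p1": [118, 122], "p2": [185]}

def controlled_fraction(sources, threshold=154):
    """Fraction of patients whose mean glucose across all sources
    falls below the threshold (an illustrative cutoff)."""
    patients = set().union(*(s.keys() for s in sources))
    controlled = sum(
        1 for p in patients
        if mean(v for s in sources for v in s.get(p, [])) < threshold
    )
    return controlled / len(patients)

print(controlled_fraction([ehr_readings, wearable_readings]))  # → 0.5
```

Because each source contributes readings rather than a pre-computed verdict, the measure reflects the patient journey across settings instead of a single point of care.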

CMS has been developing a strategic roadmap around four key domains: advancing technology; enabling measure alignment; improving the quality of data, such as standardized data elements and validation programs; and optimizing data aggregation. “There's a great amount of work that needs to be done by many different partners – measure developers, payers, providers, health systems, vendors, and third parties, as well as government agencies,” Schreiber said. “We will all need to align to move this work forward. But quality measures are a good use case to advance standardized interoperable data, which will also help reduce burden and assist with improving quality outcomes as we have more timely and robust information.”

Digital quality measures should flow seamlessly as a byproduct of clinical workflow so as to reduce burden, Schreiber said. “And not only does this inform health systems and providers, but it will also help inform individuals, patients and caregivers to have better access to their data and provide more robust information to assist with care choices. We are really excited about the digital transformation of measures, but recognize it will take all of us working together to achieve this.”

Grace Glennon, M.S., R.D., is a project lead for quality measurement programs and a health outcomes researcher at Yale New Haven Health Services Corp.’s Center for Outcomes Research and Evaluation (Yale-CORE). She has led the development of several quality measures under contract with the Centers for Medicare & Medicaid Services, and has supported data standardization advancements and strategies to help CMS identify how it can achieve its goal of fully transitioning to digital quality measurement. She spoke about why data standardization is the foundation of effective, lower-burden digital quality measurement.

“By standardizing the data into common, agreed-upon formats, including the data terminologies and the data models, we enable the data to flow across existing institutional silos,” she said. “This, in turn, provides a platform to improve the quality of data, as individuals in the healthcare ecosystem securely gain access to a wider array of information less constrained by those existing data silos.”

As the ecosystem establishes agreed-upon data standards, data requirements can be aligned across quality measures and programs within CMS, among other federal agencies, and beyond the domain of quality measurement, Glennon explained. That in turn allows the individual stages of the learning health system to consume and produce information that is understandable across all the stages. Data standardization also makes the data usable for multiple use cases, including patients’ access to their own health data, quality measurement, big data analytics, and research.

The ONC’s USCDI is a foundational set of data that must be made interoperable for patient care. This standard doesn't include all the necessary data for CMS quality measurement, Glennon said, but it sets a foundation, and CMS will continue to collaborate with ONC to align where possible. “By using and building on these common data elements and common formats, with aligned definitions as the basis for data necessary for quality measurement, it begins to enable the automation of data transmission. It also provides a common basis for valid and reliable data mapping, and supports auditing of data capture with less manual effort required. Finally, it limits the extent to which data capture, especially for the purpose of quality measurement, may interfere with appropriate routine clinical workflows.”
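The "valid and reliable data mapping" Glennon describes can be sketched in a few lines. The local lab codes and the value-set membership below are invented for illustration; the LOINC codes themselves are real. The point is that once a site maps its local codes to a standard terminology, checking whether a record counts as evidence for a measure becomes a mechanical lookup rather than manual abstraction.

```python
# Hypothetical mapping from a site's local lab codes to LOINC,
# the kind of standardization USCDI-aligned exchange relies on.
LOCAL_TO_LOINC = {
    "GLU-SER": "2345-7",   # Glucose [Mass/volume] in Serum or Plasma
    "HBA1C":   "4548-4",   # Hemoglobin A1c/Hemoglobin.total in Blood
}

# A measure's value set: the standard codes it accepts as evidence.
A1C_VALUE_SET = {"4548-4"}

def maps_into_value_set(local_code: str, value_set: set) -> bool:
    """True if the local code maps to a standard code in the value set."""
    return LOCAL_TO_LOINC.get(local_code) in value_set

print(maps_into_value_set("HBA1C", A1C_VALUE_SET))    # → True
print(maps_into_value_set("GLU-SER", A1C_VALUE_SET))  # → False
```

The same mapping table also supports the auditing Glennon mentions: any local code with no standard mapping is immediately visible as a data-quality gap.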

Faseeha Altaf, M.P.H., is project lead for the Digital Quality Measurement Division, Quality Measurement Programs, at the Yale/Yale New Haven Hospital Center for Outcomes Research and Evaluation (Yale-CORE). She elaborated on digital quality measurement’s role in the learning health system: sharing across the various functions of the system offers an opportunity for continuous improvement, and the aim is to ensure that data used for measurement live on to help the healthcare system provide better care to patients. “The transformation of the quality measurement enterprise will bring forth quality measures that are a seamless outgrowth of data generation from routine workflows,” Altaf said. The new quality measurement paradigm can be accomplished through digital measures combined with interoperability, ideally through FHIR API technology. This is an area where CMS is collaborating with ONC and other federal partners to expand the USCDI and contribute to the USCDI+ program and certification data requirements. CMS is also developing, and collaborating on the maintenance of, FHIR implementation guides.

She noted that in an ideal future state, measures across public and private payers and value-based payment programs would be highly aligned where feasible, providing a coherent and coordinated assessment of healthcare quality. “In this future state, across payers, programs and providers, we aim to use better-aligned patient-centered measure sets that would cover the highest-priority quality domains,” she said. This would allow providers to be evaluated in a more cohesive way that follows the patient care journey, with minimal reporting burden. Ultimately, this would better facilitate health system learning and improve patient outcomes. “The work will undoubtedly be phased and incremental and will build as we go forward together, and as we hear from stakeholders through the process.”
