In early 2008, the Massachusetts eHealth Collaborative began supporting the Beth Israel Deaconess Medical Center (BIDMC) and Beth Israel Deaconess Physician Organization (BIDPO) with their electronic health record (EHR) initiative. This was a jointly managed program to provide EHRs to approximately 300 affiliated physicians located in more than 170 physical offices, geographically dispersed across eastern Massachusetts.
Primary objectives for BIDMC-BIDPO were to enhance quality, foster clinical integration across the network of providers and construct a foundation for health information exchange and continuity of care. Additionally, there were specific goals for all eligible providers to achieve meaningful-use objectives and clinical quality measures (CQM), improve diagnostic coding and documentation processes and advance other pay-for-performance incentives focused on advanced diabetes care, cardiac disease, asthma, depression, bronchitis, appropriate radiology test ordering and use of e-prescribing.
BIDMC-BIDPO provided a centrally hosted, software-as-a-service (SaaS) EHR – the eClinicalWorks (eCW) application delivered to practices via the public Internet. However, because the majority of the practices are separate legal and business entities, they were set up on distinct, independent EHR databases so that each group could develop practice- and specialty-specific processes and isolated customization. To balance the objectives of clinical integration, practice and specialty autonomy and ease of deployment, it was essential that BIDMC-BIDPO adopt a uniform approach to EHR configuration and develop a standardized content management plan, collectively referred to as the “model office.”
Establishing a clinical standards work group
A clinical standards work group (CSWG) was formed to set priorities for the design and build of the application, identify minimum data and documentation standards and make policy recommendations for the EHR. The CSWG comprises key clinicians, administrative staff and subject matter experts who represent the needs of the constituent practices and specialties and provide insight into the clinical data and reporting requirements for the various internal and external initiatives. The CSWG is the primary advocate for garnering broad support for the objectives and outcomes, and the forum for communicating clinical data requirements and policies. Most hospital systems already have this type of functional group in place, as do many independent practice associations (IPAs) and physician hospital organizations (PHOs). In some cases, it makes sense to recast and reorient an existing team to focus on ambulatory systems. In other cases, you will want to solicit additional membership to better reflect the needs of the larger specialty community.
System design and build process
Ideally, all members of the CSWG will have had advanced training on the EHR application in time to make decisions on the design and build. However, we quickly learned that simply assembling and getting time commitments from these “high-influence” individuals was a challenge in itself. At this point, we were more interested in their ability to make recommendations and decisions on standards and content than in developing them into application super-users, so the group was given overview training that covered all the functional areas of the application. As an overall process, we found it more effective to create a virtual environment for reviewing and communicating the data sets among the working group. Using spreadsheets, Word documents and/or screenshots to present the various data tables allows for rapid distribution of content and feedback exchange without getting tied down in the mechanics of the system. Once consensus on content is gained and data tables are approved, they can be assigned to the design-build team for translation into an application staging environment. From there, the CSWG can review, validate form and function, comment on the look and feel, and determine whether it will meet the business and specialty requirements.
A data evaluation and migration plan was developed for each practice install. Wherever possible and appropriate, the project leveraged shared tables for referring providers, insurances, ICD-9/CPT tables, pharmacy data, visit and appointment codes, browse tables and pick lists, etc. One specific area that created downstream problems was the variability of insurance tables. Practices that utilized a third-party billing system or had elected to have a practice management (PM) interface needed to retain links to existing insurance tables. The resulting variability and mix of insurance codes, identifiers and clearinghouse vendors created an attribution dilemma down the road when attempting to track payer- and plan-specific activity for pay-for-performance (P4P) incentives. An external mapping and assignment process had to be developed as a bridge solution.
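To make the bridge concrete, the sketch below shows one way such an external mapping and assignment process could be organized; the practice identifiers, local insurance codes and canonical plan names are invented for illustration and do not represent the actual BIDPO tables.

```python
# Hypothetical bridge mapping of practice-specific insurance codes to a
# canonical payer/plan list used for P4P attribution. All codes, payer
# names and practice identifiers below are illustrative placeholders.

# Canonical payer/plan identifiers agreed on at the network level.
CANONICAL_PLANS = {
    "BCBS_HMO": "Blue Cross Blue Shield HMO",
    "HPHC_PPO": "Harvard Pilgrim PPO",
    "MEDICARE_FFS": "Medicare Fee-for-Service",
}

# Per-practice assignment tables: each practice's local insurance code
# (as it appears in its EHR/PM insurance table) maps to a canonical plan.
PRACTICE_MAPS = {
    "practice_001": {"BC01": "BCBS_HMO", "HP22": "HPHC_PPO"},
    "practice_002": {"BLUEHMO": "BCBS_HMO", "MCARE": "MEDICARE_FFS"},
}

def attribute_plan(practice_id, local_code):
    """Resolve a practice's local insurance code to a canonical plan, or None if unmapped."""
    return PRACTICE_MAPS.get(practice_id, {}).get(local_code)

# Example: two different local codes resolve to the same canonical plan,
# so P4P activity can be rolled up consistently across the network.
print(attribute_plan("practice_001", "BC01"))     # BCBS_HMO
print(attribute_plan("practice_002", "BLUEHMO"))  # BCBS_HMO
print(attribute_plan("practice_002", "UNKNOWN"))  # None -> needs manual assignment
```

The point of the bridge is that each practice keeps its existing insurance table intact, while the network maintains the assignment layer needed for consistent payer- and plan-level attribution.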
At a high level, certain specialty content areas can and should be designed in advance of any provider-specific customization. The primary focus should be on developing the right framework to facilitate quality- and specialty-relevant data capture. One lesson learned was that in some areas we were over-building the system, and it took some trial and error to find a balance between a blank-slate, out-of-the-box environment and fully prescribed, regimented progress notes and treatment plans. In the case of progress-note templates, we found that too much detail was being applied in advance; it became confusing and made it harder for providers to customize.
However, we did find value in developing other areas, such as clinical decision-support utilities – alerts (for prescriptions, labs, immunizations, etc.) that can be established for select patient populations. Also useful was setting up the framework for order sets, with a recommended starter set of treatment protocols for common visit types and disease conditions. Again, the pre-build process only makes sense if you have adequate specialty representation in the CSWG (e.g., pediatricians informing pediatric content). The EHR vendor can and will pre-load a lot of specialty content, but our experience suggests that a deliberate effort to provide a quality, relevant framework of content went a long way toward adoption.
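To illustrate the kind of population-targeted alert such a framework supports, here is a minimal sketch of an overdue-HbA1c rule; the rule logic, codes, lookback window and example data are assumptions for illustration, not the eCW alert configuration itself.

```python
from datetime import date, timedelta

# Illustrative decision-support rule: flag diabetic patients with no HbA1c
# result in the past six months. In practice such alerts are configured
# inside the EHR; this code only sketches the underlying logic.
DIABETES_CODES = {"250.00", "250.02"}   # example ICD-9 diabetes codes
HBA1C_LOINC = "4548-4"                  # LOINC code for hemoglobin A1c

def needs_hba1c_alert(problem_codes, lab_results, today=None):
    """Return True if the patient is diabetic and has no HbA1c result in 180 days."""
    today = today or date.today()
    if not DIABETES_CODES & set(problem_codes):
        return False  # patient is not in the target population
    cutoff = today - timedelta(days=180)
    recent = [r for r in lab_results
              if r["loinc"] == HBA1C_LOINC and r["date"] >= cutoff]
    return not recent

# Example: diabetic patient whose last HbA1c is over a year old -> alert fires.
print(needs_hba1c_alert(
    problem_codes=["250.00"],
    lab_results=[{"loinc": "4548-4", "date": date(2010, 1, 15)}],
    today=date(2011, 6, 1),
))
```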
Workflow optimization and training
The remaining application customization became a component of the practice-level workflow-optimization process with providers and clinical and administrative support teams. Through interview and observation, practice consultant teams performed workflow assessments, identified gaps, determined specialty and practice requirements and prepared future-state workflows and transition plans. The implementation team developed a set of best-practice workflow recommendations for key functions within the practice: registration/scheduling, new and established patient flow, e-prescribing and refills, referrals, doc-folder management, in-office testing, orders and lab/radiology management. Within each workflow, a deliberate effort was made to highlight key data-capture points and preferred entry methods, with specific emphasis on the data sets that fed quality and performance objectives. Finally, a master training plan was developed to support the workflow plan and to reaffirm the critical, high-value areas.
Quality measurement and data acquisition
It is important to assemble and organize the various quality reporting programs at stake – including CMS meaningful use, CMS PQRS, public health, NCQA and commercial payer P4P – so that you can compare and prioritize the individual quality measures for the group at large and for the individual practices. If possible, the group should conduct a thorough evaluation of the measures, their definitions, target populations, inclusion/exclusion criteria and reporting periods. Although there is substantial overlap between measures, very few have exactly the same definition across the criteria categories. Even so, you should compare the measure definitions, identify duplication and, if needed, develop a clear measure consolidation process.
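As a rough illustration of the comparison step, the sketch below groups hypothetical measures from several programs by a shared clinical concept so that near-duplicates can be reviewed side by side; the measure names, program labels and concept keys are invented for illustration and are not the BIDPO measure catalog.

```python
from collections import defaultdict

# Illustrative catalog of quality measures drawn from multiple reporting
# programs. All entries are placeholders used to show the grouping idea.
measures = [
    {"program": "CMS-MU",    "name": "Diabetes: HbA1c poor control",          "concept": "diabetes_hba1c"},
    {"program": "CMS-PQRS",  "name": "Diabetes mellitus: HbA1c > 9%",         "concept": "diabetes_hba1c"},
    {"program": "Payer P4P", "name": "HbA1c testing for diabetic members",    "concept": "diabetes_hba1c"},
    {"program": "CMS-MU",    "name": "Adult weight screening and follow-up",  "concept": "bmi_screening"},
]

# Group by clinical concept so overlapping measures can be compared and,
# where definitions truly align, consolidated into one data-capture plan.
by_concept = defaultdict(list)
for m in measures:
    by_concept[m["concept"]].append((m["program"], m["name"]))

for concept, entries in by_concept.items():
    print(concept)
    for program, name in entries:
        print(f"  {program}: {name}")
```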
Key objectives for quality measurement and reporting include:
- Establish measure definitions and specifications;
- Prioritize measures;
- Conduct a gap analysis between new and existing measures;
- Build upon synergies between meaningful use and other reporting requirements;
- Identify data-capture requirements for measures;
- Develop policies and procedures for capturing data and frequency for data capture;
- Determine format and structure of reports;
- Identify frequency of report generation; and
- Identify types of reports needed (quality and management reports).
For BIDPO, the initial priority measure set included:
- 44 meaningful-use measures;
- 24 PQRS measures; and
- 35 contract-incentive measures.
This is a critical process and represents a juncture for many health systems as they evaluate the need for investing in enterprise business intelligence, community analytics and quality data-management solutions. Regardless, the group will need to clarify the quality reporting requirements and develop a clear and concise data-acquisition strategy that can be executed at the practice EHR level. This will ultimately help synchronize practice outreach and case-management efforts and go a long way in reaffirming the “Why are we doing this?” conversation between practices and providers.
Once the priority quality measures have been defined, it is important to examine and catalog the specific data-capture points that will be required to support the quality reporting. You should define exactly how and where the data (inclusion and exclusion information) will be documented in the EHR for each quality measure. These become highlights in the practice-level workflow and training plans.
Keep in mind that using a meaningful-use-certified EHR system and recording structured data to meet the core and menu-set objectives does not always mean you are capturing enough data to support the clinical quality measures (CQM); assuming it does is a common mistake. In actuality, a significant amount of additional and/or non-routine data capture may be necessary to support the full spectrum of CQMs. Emphasis should be placed on increasing the documentation of labs (LOINC), e-prescriptions (RxNorm) and structured problem lists (ICD-9 or SNOMED CT) for all patients seen in the EHR, not just the minimum thresholds required in Stage 1.
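A simplified sketch of why structured coding matters when a CQM is actually computed: only coded entries reach the numerator, while the same information buried in free text or a scanned document does not count. The SNOMED CT and LOINC codes below are standard, but the measure logic and patient records are reduced to a toy example.

```python
# Simplified CQM computation over structured data only. A note saying
# "A1c was checked" in free text never reaches the numerator.
DIABETES_SNOMED = {"44054006"}   # SNOMED CT: diabetes mellitus type 2
HBA1C_LOINC = "4548-4"           # LOINC: hemoglobin A1c

patients = [
    {"id": 1, "problems": ["44054006"], "labs": [{"loinc": "4548-4", "value": 7.2}]},
    {"id": 2, "problems": ["44054006"], "labs": []},  # result exists only in a scanned PDF
]

denominator = [p for p in patients if DIABETES_SNOMED & set(p["problems"])]
numerator = [p for p in denominator
             if any(lab["loinc"] == HBA1C_LOINC for lab in p["labs"])]

print(f"denominator={len(denominator)}, numerator={len(numerator)}")  # 2, 1
```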
The BIDMC-BIDPO integration strategy has been built incrementally throughout the EHR initiative and continues to evolve. Major items completed to date include:
- Bidirectional lab and radiology interfaces with three local hospital systems.
- Bidirectional lab interfaces with two commercial reference lab systems.
- Master lab compendium – A single, mapped compendium was developed, allowing providers to see one orderable for all applicable labs and then make the routing determination (a rough sketch of the idea appears after this list).
- ED and discharge summaries (via CCD) from BIDMC are delivered as patient-assigned documents into provider-based queues within the EHR.
- “Magic button” patient record viewers from the EHR into the hospital systems. This is a view-only, Web-based application enabled from within the EHR that retains patient context and lets clinicians see hospital data – problems, medications, allergies, labs, radiology, tests, reports and notes. These viewers are temporary and were designed as an interim solution in advance of more robust health information exchange (HIE) options.
- Provider to provider (P2P) is an exchange solution that allows for the sharing of patient-specific information.
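For the master lab compendium item above, here is a minimal sketch of the single-orderable, per-destination routing idea; the orderable names, local codes and lab identifiers are hypothetical and do not reflect the actual compendium or interface design.

```python
# Illustrative master compendium: one network-level orderable maps to each
# performing lab's local code, so the provider picks a single order and the
# interface routes it to the chosen destination.
MASTER_COMPENDIUM = {
    "CBC_WITH_DIFF": {
        "hospital_lab_A": "CBCD",
        "hospital_lab_B": "85025",
        "reference_lab_X": "6399",
    },
}

def route_order(orderable, destination):
    """Translate a master orderable into the destination lab's local code."""
    return MASTER_COMPENDIUM[orderable][destination]

# The same provider-facing order translates to different local codes
# depending on where the specimen is routed.
print(route_order("CBC_WITH_DIFF", "hospital_lab_A"))   # CBCD
print(route_order("CBC_WITH_DIFF", "reference_lab_X"))  # 6399
```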
Developing the model office approach at BIDMC-BIDPO enabled disparate providers to have pre-loaded, relevant content and decision-support tools to ease some of the initial paper-to-EHR transition difficulties. It allowed the implementation team to develop scalable, best-practice workflows and more effective training plans that helped foster a consistent and quality patient experience across different practice locations. It allowed BIDMC-BIDPO to better measure performance at all of the practices, as if they were a single integrated entity, and it provided a foundation for a longer-term interoperability strategy.
About the author
Kevin Mullen is project director, Massachusetts eHealth Collaborative.