Managing so-called “Big Data” or “Data Inc.” is not as simple as collecting everything you can and sifting out what you don’t need later.
Well, it could be, but that approach is inherently wasteful, something all past, present and even future healthcare reform initiatives strive to prevent even as they concurrently attempt to collect large amounts of business and clinical data for analysis and archiving.
What seems like a Catch-22 actually is a noble start to a grand experiment.
From a software perspective, you have to invest in technology that runs needless algorithms to control and direct data traffic; from a staffing perspective, you have to dedicate employees (or hire contract consultants) who spend valuable time analyzing data for trends and conclusions nobody wanted. And then there are the bandwidth and archiving issues to wrestle with.
While electronically weeding out the data chaff and dross from the useful seeds can differ by facility, some standard categories geared toward healthcare reform and population health are worth noting.
In the second of a two-part series, Health Management Technology reached out to a group of executives in the data analytics space to provide guidance in wading through the bits and bytes amassing in databases, desktop PCs, laptop PCs, tablets and smartphones nationwide.
HMT: How can administrators and clinicians know what to do with all of the data they collect?
Administrators and clinicians need to follow the financial model in approaching data. How can you control costs? Stem patient out-migration to providers who offer the same services, or at least negotiate commissions on referrals; get control of overutilization to contain leakage of top-line revenue; and invent new ways of treating people (though this is an expensive proposition). Rather than investing in predictive analytics, providers would do better following a waterfall process that looks something like this:
- Understand your costs;
- Reduce out-migration from your network;
- Maximize pay-for-performance reimbursement;
- Identify early opportunities for utilization reductions;
- Support chronic care and disease management; and
- Predict who will develop issues.
Gaining financial control and implementing more efficient workflow solutions are the top rewards for using data.
There are resources for administrators and clinicians who wish to obtain actionable insight from the data that they collect. Not only are there organizations and companies that have created solutions that analyze this data, but the federal government has created quality reporting programs such as the Physician Quality Reporting System (PQRS) and the Electronic Health Record Incentive Program (a.k.a. meaningful use) that define the quality measures that should be reported. However, despite these regulatory programs, the choices of what to measure with the clinical data can be overwhelming for many. Explorys advocates that consumers of these analytics start with a small number of metrics, perhaps as few as five or six that discern performance differences between providers and practice sites. In fact, the Explorys starter set is designed to ease providers and administrators into the process and avoid “analysis paralysis.” Start with a few, then regularly add new metrics that are actionable, relevant, accurate and sustainable.
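To make “start small” concrete, here is a minimal sketch of how a handful of starter metrics might be represented and compared across practice sites. The measure names, populations and counts are hypothetical illustrations, not the actual Explorys starter set.

```python
# A minimal, hypothetical sketch of a "starter set" of quality metrics.
# Measure names and populations are illustrative, not the Explorys catalog.

starter_metrics = [
    {"id": "dm-a1c-control", "name": "Diabetes: HbA1c < 8%",
     "numerator": "patients with most recent HbA1c below 8%",
     "denominator": "adult patients with diabetes"},
    {"id": "htn-bp-control", "name": "Hypertension: BP < 140/90",
     "numerator": "patients with most recent BP below 140/90",
     "denominator": "adult patients with hypertension"},
    {"id": "crc-screening", "name": "Colorectal cancer screening up to date",
     "numerator": "patients with a timely colonoscopy or FIT",
     "denominator": "patients aged 50-75"},
]

def performance_rate(numerator_count: int, denominator_count: int) -> float:
    """Return the measure rate, guarding against an empty denominator."""
    return numerator_count / denominator_count if denominator_count else 0.0

# Example: compare two practice sites on the same measure (hypothetical counts).
print(performance_rate(412, 530))   # Site A: ~0.78
print(performance_rate(365, 545))   # Site B: ~0.67
```

A small, well-defined set like this keeps the discussion focused on a handful of actionable rates rather than on every number the warehouse can produce.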
Administrators and clinicians should look beyond short-term government incentives and mandates, and focus on creation of value. Of course, the incentives and mandates are intended to represent a step on that path, but they should not serve as the end goal. Stakeholders should focus on how to improve outcomes and reduce costs. Focus on how the data can be used for population health, predictive analytics and benchmarking to find areas of poor performance that can be bolstered.
Clinicians need tools to get the right information to the right people in a timely manner to enable value-based decisions. The key is to look at the information and identify where opportunities to enhance performance lie. That having been said, we can get even more basic. Before the right tools can be used, incentives to use those tools must be provided. Aligning incentives across the care continuum is where it all starts.
Payers, health systems and providers need to be more collaborative in their efforts and focus on contracts that incentivize all members of the system to improve cost and quality outcomes. There needs to be alignment through a collaborative payer who ties a significant portion of compensation to value-based initiatives, including cost, quality metrics, access, patient satisfaction and participation. Then when those metrics are met, all involved need to be rewarded for their work. After that structure is in place, it becomes a matter of having the right tools and information to make data actionable for better-informed and value-based decisions.
The copious amounts of data will be daunting at first. Without a doubt, it will take a focused effort on behavior change within the healthcare industry to begin to regularly monitor and parse through all the information, analyze it and convert it into actionable strategies. In order to be successful, there must be full buy-in from patients, providers and payers. Data is useless if no one commits to properly leveraging it. Analyzing accurate clinical data from patients will help providers gain insights into their patient populations’ larger health needs, identify challenges and opportunities for improvement, and work toward creating a healthier community.
There’s no doubt this is a mammoth task, and while we might not be there yet, we are certainly getting closer. There are still challenges ahead: organizations are learning lessons from the early adopters and trying to determine the best ways to cooperate and share data. Undoubtedly, the amount of investment required to make Big Data technologies work is more than any single segment of the market can afford. That means all stakeholders, including pharma, will have to work toward a common vision. But with accountable care organizations (ACOs) paving the way for payers and providers to work more closely together, we are heading toward success and, more importantly, better patient care.
The weakness of analytics solutions in general is that they flood administrators and clinicians with an overwhelming amount of raw data and reports but offer no process for making use of the data. Dashboards alone are not enough to make the information meaningful and to drive lasting results. Accomplishing this requires not only advanced analytics that can integrate data from multiple enterprise systems both inside and outside the organization, but also automated tools that make the resulting data quickly actionable and meaningful for clinicians and care managers.
Here are a few examples of such a solution in use at Northeast Georgia Physicians Group (NGPG) in metro Atlanta:
Phytel aggregates and normalizes data supplied by the group’s EHR so that the care coordinators can focus on providing exceptional patient care rather than waste time gathering the data they need from patient charts.
Care coordinators embedded in practice sites are able to look at the provider schedule and, right under each patient’s name, see opportunities for improving that patient’s care. They can see the patient is past due for an A1c test, hasn’t had a mammogram or hasn’t had a colonoscopy. The big advantage for the care coordinator is that clicking on a patient’s name takes them to a patient summary profile showing everything they need to know about that patient: the last blood pressure reading, the last A1c, etc. While that information is also available in the EHR, the nurses might have to click through six to eight screens to collect it. In addition, Phytel aggregates data from outside the EHR, such as patient-reported information and predictive risk scores.
Moreover, Phytel flags all patients who are high risk and in need of immediate attention, whether or not they have visited their provider recently. Without that type of prioritization, nurses are able to see the patients in front of them, but they may not easily be able to see Mr. Jones with an A1c of 14 who hasn’t been in for nine months. Using technology that continually mines that data provides care teams with important information to improve the health of patients.
In addition, Phytel can be used to launch a variety of interventions for different segments of the population. If NGPG wants to offer a diabetic education program, for example, the Phytel solution enables the care managers to email all of the patients who could benefit from such a program with the click of a button. And they can do the same for elderly patients who are due to come in for flu shots.
Phytel also prioritizes patients for care teams. It uses green-yellow-red identifiers so those patients who are poorly controlled and have the most care opportunities appear on top with a red icon next to them. Those who are better controlled and have fewer opportunities are at the bottom and are green. So if a care coordinator has four hours on a particular day to dedicate to care coordination patients, Phytel enables them to see which patients need their time the most so they can make a real difference in their lives.
NGPG’s physicians also use Phytel to manage their daily patient load. Physicians and their nurses use the solution every evening to look for care gaps in patients scheduled for visits the following day. Knowing in advance which patients are missing tests or need additional support helps the team pre-schedule tests and assign the appropriate care team member to the patient. A patient with extensive psycho-social needs related to their diagnosis, for instance, may be better off seeing a care manager instead of, or in addition to, their physician.
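The workflow described above can be pictured with a toy sketch of care-gap flagging and red/yellow/green prioritization. The field names, lookback windows and scoring rules below are hypothetical and do not represent Phytel’s implementation.

```python
# Toy sketch of care-gap flagging and red/yellow/green prioritization.
# Field names, lookback windows and scoring rules are illustrative only.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List, Optional

@dataclass
class Patient:
    name: str
    last_a1c: Optional[float] = None          # most recent HbA1c value
    last_a1c_date: Optional[date] = None
    last_colonoscopy: Optional[date] = None

def care_gaps(p: Patient, today: date) -> List[str]:
    """Return the overdue care opportunities for one patient."""
    gaps = []
    if p.last_a1c_date is None or today - p.last_a1c_date > timedelta(days=180):
        gaps.append("A1c overdue")
    if p.last_colonoscopy is None or today - p.last_colonoscopy > timedelta(days=3650):
        gaps.append("Colonoscopy overdue")
    return gaps

def priority(p: Patient, today: date) -> str:
    """Crude bucket: poor control or multiple gaps floats the patient to the top."""
    poorly_controlled = p.last_a1c is not None and p.last_a1c >= 9.0
    gaps = care_gaps(p, today)
    if poorly_controlled or len(gaps) >= 2:
        return "red"
    return "yellow" if gaps else "green"

today = date(2013, 10, 1)
patients = [
    Patient("Jones", last_a1c=14.0, last_a1c_date=date(2013, 1, 2)),
    Patient("Smith", last_a1c=6.8, last_a1c_date=date(2013, 8, 1),
            last_colonoscopy=date(2010, 3, 1)),
]

# Worklist sorted so red patients appear first.
rank = {"red": 0, "yellow": 1, "green": 2}
for p in sorted(patients, key=lambda p: rank[priority(p, today)]):
    print(p.name, priority(p, today), care_gaps(p, today))
```

The point of the sketch is the ordering logic: a patient like the Mr. Jones example above, with an A1c of 14 and no recent visit, rises to the top of the worklist even though he is not on today’s schedule.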
HMT: Acknowledging that every patient is physiologically distinct, what are some of the key pieces of information that might help “predict outcomes, measure trends and establish correlations that drive quality care at lower costs”?
Chris Fox, CEO, Avantas
Healthcare today is awash in data. When leveraged properly, this can be a game changer. When it is not, it can be overwhelming, if not crippling. A lot of data is sometimes just that: a lot of data. Deciding what data is useful and then being able to make it “actionable” are challenges for most health systems. Within labor management, there are three metrics we feel every hospital should be monitoring.
1. FTE Leakage: Avantas coined this term more than a decade ago in reference to the hours a staff member has not worked but should have based on their [full-time equivalent] commitment. When a staff member is not working up to his or her FTE, those vacant shifts are filled by a more expensive form of contingency staffing.
2. Incidental Worked Time: This refers to additional time a staff member is on the clock before the start or after the end of their scheduled shift, or during a scheduled meal break. Generally, in nursing, diagnostics or therapies, there is sound clinical justification for 40 percent of incidental worked time. The remaining 60 percent, however, represents a tremendous savings opportunity.
3. Core as Contingency: This is essentially two metrics: staff working extra shifts and staff working overtime. At Avantas, we define “contingency” as any source of staffing that is not a core staff member working within his or her FTE. Staff working above their FTE carries both hard-cost (overtime) and soft-cost ramifications, including fatigue, staff burnout and poor morale. (A rough calculation sketch follows this list.)
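As a rough illustration of the FTE leakage concept, the sketch below computes the leaked hours for one pay period and the premium paid when those hours are backfilled with contingency staff. The pay-period length and hourly rates are hypothetical, not Avantas benchmarks.

```python
# Rough sketch of the FTE leakage metric described above, with hypothetical numbers.
def fte_leakage_hours(fte: float, hours_worked: float, period_hours: float = 80.0) -> float:
    """Hours a staff member should have worked this pay period but did not."""
    committed = fte * period_hours
    return max(committed - hours_worked, 0.0)

def contingency_premium(leakage_hours: float, core_rate: float, agency_rate: float) -> float:
    """Extra cost of backfilling leaked hours with a pricier contingency resource."""
    return leakage_hours * (agency_rate - core_rate)

# A 0.9 FTE nurse who worked 64 of a committed 72 hours leaves 8 hours
# to be filled by agency staff at a premium.
leak = fte_leakage_hours(fte=0.9, hours_worked=64)
print(leak)                                                          # 8.0 hours
print(contingency_premium(leak, core_rate=35.0, agency_rate=65.0))   # $240 premium
```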
Todd Rothenhaus, Chief Medical Information Officer, athenahealth
Personalized medicine and personal health ownership are absolutely essential to more efficient, lower-cost, quality care. In the future, I think we’re going to find that genetic testing is going to be an essential tool in determining medication effectiveness and therapy selection. This is an area that is ripe for automation and supported decision-making. Otherwise, at a macro level, we need to stratify our patient population into those who are completely well, those who are stable with chronic disease and those who are so sick they can’t be helped, and home in on the treatable diseases and point care teams in that direction. We can identify and target programs toward those with treatable conditions and better manage their utilization of services. This is a perfect example of why predictive analytics isn’t nearly as suitable for most health systems as risk stratification is. Our doctors are too busy taking care of patients who are already sick to guess who’s going to get sick tomorrow.
Anil Jain, M.D., FACP, Senior Vice President & Chief Medical Information Officer, Explorys Inc.
Patients are indeed unique individuals, and physiology, environment, genetics and habits certainly vary between them. However, what analyses of very large data sets have uncovered is that, by looking at the information statistically, a prediction model can be developed. Key pieces of information that feed such a model may include patient demographic factors (age, gender), socioeconomic status (median household income, education), geographic factors, clinical diseases (diabetes, high blood pressure, cancer, etc.), habits (tobacco use, alcohol use, etc.), biometrics (height, weight, blood pressure, etc.), family history, surgical history and laboratory results (kidney function, blood count, cholesterol, blood chemistry, etc.). For example, most health systems are attempting to reduce hospital readmissions (admissions to the hospital within 30 days of discharge) to help improve quality while reducing cost. Explorys has developed a heart-failure hospital readmission model that uses more than 100 pieces of information to predict the chance that a patient will be readmitted within 30 days, allowing health systems’ care coordinators to intervene or the patient’s physicians to intensify treatment for the condition.
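As an illustration of how such a model might be built, the sketch below fits a logistic-regression readmission-risk model on a handful of synthetic patients. The features, values and outcomes are invented for illustration only; the actual Explorys model uses more than 100 variables and far larger training sets.

```python
# Minimal sketch of a 30-day heart-failure readmission risk model.
# Features and data are synthetic; a production model uses far more of both.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: age, prior admissions (past year), ejection fraction (%),
# serum creatinine, lives alone (0/1), current smoker (0/1)
X = np.array([
    [83, 3, 25, 2.1, 1, 1],
    [67, 0, 45, 1.0, 0, 0],
    [74, 2, 30, 1.6, 1, 0],
    [59, 1, 50, 0.9, 0, 1],
    [88, 4, 20, 2.4, 1, 0],
    [71, 0, 55, 1.1, 0, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = readmitted within 30 days

model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a newly discharged patient; a high score triggers care-coordinator outreach.
new_patient = np.array([[79, 2, 28, 1.8, 1, 1]])
risk = model.predict_proba(new_patient)[0, 1]
print(f"30-day readmission risk: {risk:.0%}")
```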
Dan Riskin, M.D., CEO, Health Fidelity
Since each patient is different and the population is becoming more, rather than less, complex, it’s not enough to build our analytics systems on a weak claims-data infrastructure. For example, the claims and EHR discrete-data infrastructure will typically represent a complex patient as simply “hypertensive” and “diabetic.” That type of record may have worked in 1993, but it will not work in 2013.
The patient must be represented with his or her full complexity to support meaningful outcome prediction and quality measurement. This means extracting the full clinical content from the EHR, putting it into a data warehouse and leveraging the discrete data elements as well as the narrative data elements. Only in this way can the “hypertensive diabetic” properly be recognized as “83 years old,” “living alone,” “smoking,” “well-controlled hypertension,” “poorly controlled diabetes mellitus” along with the hundred other features that represent clinical and social breadth for that individual patient. With this level of information, we will be able to predict outcomes better, measure trends, measure quality and drive improvement.
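A toy sketch of that idea follows: discrete EHR fields are merged with features pulled out of a narrative note. The keyword matching here merely stands in for real clinical natural-language processing and is not Health Fidelity’s technology; all field names are hypothetical.

```python
# Toy sketch: merge discrete EHR data with features pulled from a narrative note.
# The "NLP" here is a trivial keyword match, standing in for real clinical NLP.
discrete_record = {
    "age": 83,
    "problem_list": ["hypertension", "diabetes mellitus"],
    "last_bp": "128/76",
    "last_a1c": 9.4,
}

note_text = ("83 y/o patient, lives alone, continues to smoke. "
             "BP well controlled on lisinopril; diabetes remains poorly controlled.")

narrative_flags = {
    "lives alone": "lives alone" in note_text.lower(),
    "current smoker": "smoke" in note_text.lower(),
    "poorly controlled diabetes": "poorly controlled" in note_text.lower(),
}

# The fuller representation an analytics warehouse would store for this patient.
patient_features = {**discrete_record,
                    **{k: v for k, v in narrative_flags.items() if v}}
print(patient_features)
```

The design point is simply that the warehouse row for the “hypertensive diabetic” carries the narrative-derived features alongside the discrete ones, so downstream models and quality measures see the whole patient.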
Eric Mueller, Director, Product Management, Lumeris
It is true that every patient is physiologically different, but that does not negate the usefulness of individual patient data for predictive modeling. Biometric information such as blood pressure, BMI, blood sugar and LDL cholesterol levels combined with cost, quality and utilization information can contribute to a complete view of a patient’s healthcare history. Robust information at the patient level can contribute to a more complete view at the population level and ultimately lead to better care at the patient level.
For example, after collecting all the necessary data, advanced analytics and reporting are used to help health plans identify trends, such as high-utilizing patients, and correlate them back to specific conditions. If a health plan notices an increase in asthma-related ER visits, it can use that information to implement a care management plan in which physicians and their care teams complete assessments and care plans for each patient, educate patients about their chronic conditions, stress the need to adhere to symptom response plans and schedule regular checkups. The overall result: asthma-related ER utilization will decrease, and associated hospital admissions and readmissions will decrease as well. Because the health plan identified a trend in its population, it could provide high-quality care for its individual asthmatic patients.
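A small sketch of that kind of trend-spotting follows, assuming a hypothetical claims extract with member, month, visit-type and diagnosis columns; the data and thresholds are invented.

```python
# Small sketch of trend-spotting in a claims extract; columns and data are hypothetical.
import pandas as pd

claims = pd.DataFrame({
    "member_id":  [1, 1, 2, 3, 3, 3, 4],
    "month":      ["2013-07", "2013-08", "2013-06", "2013-06", "2013-07", "2013-08", "2013-08"],
    "visit_type": ["ER", "ER", "office", "ER", "ER", "ER", "ER"],
    "diagnosis":  ["asthma"] * 7,
})

# Monthly ER visits for asthmatic members: a rising count suggests a care-management program.
er_visits = claims[claims["visit_type"] == "ER"]
print(er_visits.groupby("month")["member_id"].count())

# High utilizers: members with two or more asthma-related ER visits.
high_utilizers = er_visits.groupby("member_id").size().loc[lambda s: s >= 2]
print(high_utilizers.index.tolist())
```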
Bonnie Cassidy, Senior Director of HIM Innovation, Nuance
Medicine has never been a “one-size-fits-all” field, but there are key pieces of patient data that can provide clear insight into those populations physicians are treating. Accurately capturing genomic information and data around the most frequent diagnoses and procedures will enable provider organizations to identify and better understand their patient populations’ unique needs and perhaps even the epidemiological factors of a community. This is especially true for larger health organizations that have multiple hospitals or outpatient facilities; different locations may not have the same population and, as a result, face unique challenges.
Once health organizations better understand their patient populations, they can begin to develop strategies that target specific areas of need in their communities. For instance, if clinical data shows a certain population has a high incidence of lung cancer, providers can offer educational opportunities and smoking cessation programs, while payers can offer wellness incentive programs. This integrated strategy is driven by that initial data analysis on lung cancer.
Tony Jones, M.D., CMO, Philips Healthcare’s Patient Care and Clinical Informatics’ Business Unit
Predictive analytics can help drive down costs in a couple of different ways. First, analyzing data for a specific patient population may help providers make faster treatment decisions at the point of care, which could mean fewer tests needed for a diagnosis and, therefore, lower costs.
Second, predictive analytics gives hospitals a much stronger ability to develop preventive and longer-term services customized for their patients. Aggregating retrospective and real-time clinical data can present a picture of the patient population a hospital cares for, and can enable that hospital to design care catered to that population both now and in the future. In this case, money may not be directly saved, but it may be allocated more appropriately.
Healthcare is already a data-rich environment. The challenge is that much of that data (e.g., heart rate, ECG tracings, blood pressure, etc.) is displayed on monitors but not stored for real-time or future analysis. As a result, the ability to detect patterns is diminished through the loss of this valuable data. Capturing and storing this data, then combining it with radiology images, labs and patient history, provides a much richer data set and increases the likelihood of detecting meaningful patterns. This forms the basis of predictive analytics and enables providers to treat minor medical conditions before they become major, expensive ones.
Karen Handmaker, MPP, Vice President, Population Health Strategies, Phytel
Care teams need current and trended information on every patient to act effectively. Key pieces of information include chronic conditions and related lab results, BMI, preventive screenings, medication compliance and recent hospitalizations and ER visits. Using evidence-based protocols and comparative benchmarks, “analytics” can flag trends and results for each patient that help care team members identify patients who need immediate attention and determine what action to take. With access to the right information, practices can make progress on overall quality and cost performance for their populations.
HMT: How can administrators and clinicians alike identify and collect the right data they need on which to base decisions?
Todd Rothenhaus, Chief Medical Information Officer, athenahealth
There is immense criticism of claims-based data analysis. But claims data represents the skeleton of health outcome analysis, and clinical data is only the muscle. Claims illustrate utilization trends and can provide the best picture of how to contain risk across the revenue cycle. Practices need a strong foundation in the traditional revenue cycle. They need to be able to gain some margin so that they can begin to pay for all the initiatives they need to launch under the patient-centered medical home (PCMH) or an accountable care umbrella. There is no better time than now to clean up fee for service, and it’s the first step. The second step is looking at your office and asking yourself, “Am I ready to take on anything new?” In many instances, we’ve got negative productivity in healthcare, and we need to fix that at a micro level. If you can get things financially stable and performing optimally, then you can concentrate on clinical effectiveness.
Anil Jain, M.D., FACP, Senior Vice President & Chief Medical Information Officer, Explorys Inc.
At Explorys, we recommend that administrators and clinicians first identify the key strategic imperatives that will help their patients from an overall quality, cost and patient satisfaction perspective – to align themselves on the most important goals of the health system. The second step is to identify the metrics that will measure progress toward those goals. Fortunately, healthcare leaders have many metrics to choose from that have been nationally vetted and endorsed by organizations such as the National Quality Forum (NQF). Explorys, for example, has built a library of more than 700 metrics that allow administrators and clinicians to assess performance in a variety of areas from wellness to chronic disease, children to adults, office setting to the operating room, etc.
The third step is to identify existing data sources such as electronic health records, billing or claims data, patient satisfaction data, etc. The fourth step is to do a gap assessment to see if any of the metrics require data that is not already being collected or captured in some manner. If there are gaps, projects should be formulated to collect that data either by trained chart abstractors or by providers at the point of care. The fifth step is to perform a baseline calculation of the metrics and validate the results (i.e., ensure that the metrics reflect actual care). The final step is to share the metrics with providers and administrators, instill a culture of transparency and continuous improvement, and develop a plan to address gaps in quality. After implementing the plan, the process runs full circle, starting again at planning.
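The gap-assessment step (step four) can be pictured as a simple set difference between the data elements the chosen metrics require and the elements current sources already supply. The element and source names below are hypothetical.

```python
# Tiny sketch of the "gap assessment" step: which data elements do the chosen
# metrics need that no current source supplies? Element names are hypothetical.
required_by_metrics = {"hba1c", "blood_pressure", "tobacco_status",
                       "patient_satisfaction_score", "ed_visit_flag"}
available_sources = {
    "ehr":    {"hba1c", "blood_pressure", "tobacco_status"},
    "claims": {"ed_visit_flag"},
}
captured = set().union(*available_sources.values())
gaps = required_by_metrics - captured
print(gaps)   # {'patient_satisfaction_score'} -> plan a collection project
```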
Dan Riskin, M.D., CEO, Health Fidelity
The current approach of scaling manual systems and requiring the doctor to enter more and more information discretely through drop-downs and check boxes just won’t work. The only reasonable approach is to use technology to do the tedious work. We should return to a rational and clear patient description directly from the physician and require the technology to figure out what the information means. The technology must work for the clinicians rather than the other way around. Fortunately, vendors are already working on this, and together, they can provide powerful infrastructure to set a health system up for current and future needs.
Eric Mueller, Director, Product Management, Lumeris
Administrators and clinicians are currently collecting the right type of data. Biometric, social history, family history, diagnostic, procedure and medication data at a patient level are collected both in EMRs and, to some extent, through medical and pharmacy claims. The challenge is integrating all of the available data and having the right tools to turn that data into usable information to enhance critical care decisions. The key is not just gathering data, but extracting it and using it in a physician’s daily workflow.
Currently, information is siloed, accessible and understandable only to the business or IT analyst. It’s up to the analyst to know what information is important enough to send on to a health plan or system executive. We need to break this pattern. We need to use the right data in programs that apply advanced analytics to create real-time reports that can be used and understood by anyone in the healthcare system. In an ideal model, once a decision-maker receives a report about a physician’s generic dispense rate, for instance, they can send that exact report on to the physician. When the physician logs into her dashboard the next day, she sees that her generic dispense rate is low and begins correcting it in her daily workflow.
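As a small illustration, the generic dispense rate in that example is simply the share of prescriptions filled as generics; the sketch below computes it and flags a hypothetical below-target value.

```python
# Simple sketch of the generic dispense rate metric a physician might see on a dashboard.
def generic_dispense_rate(generic_fills: int, total_fills: int) -> float:
    """Share of prescriptions filled as generics; total_fills must be > 0."""
    return generic_fills / total_fills

rate = generic_dispense_rate(generic_fills=312, total_fills=480)
print(f"Generic dispense rate: {rate:.0%}")   # 65%
if rate < 0.80:   # hypothetical plan target
    print("Below target: flag for the physician's daily worklist.")
```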
Collecting the right data for the sake of collecting the right data does no good. It’s about collecting the right data, aggregating that data and presenting it in a usable format that can impact healthcare decisions.
Bonnie Cassidy, Senior Director of HIM Innovation, Nuance
Capturing the right data is central to maximizing an analytics program, and there needs to be a balanced collection of demographic, clinical, revenue and compliance information. From a clinical perspective, it is key to capture accurate health information data from electronic health records (EHRs), as well as medical images. This data is central to recording the clinical codes needed to establish an accurate case mix index (CMI) that reflects the true severity of illness of a patient population, which is essential for reporting quality measures. From an administrative perspective, capturing claims and cost information is vital to ensuring financial integrity is maintained.
Tony Jones, M.D., CMO, Philips Healthcare’s Patient Care and Clinical Informatics’ Business Unit
Hospitals are already capturing much (but not all) of the data they need. Unfortunately, that data is not in the same systems or easily shared in a way that allows tools and algorithms to take full advantage of it. The next step is to interconnect the data housed in different systems or to replicate the data in a cloud. Once that is done, it will be much easier to apply algorithms that can take advantage of this vast collection of information and detect patterns to guide more clinically and economically valuable decisions.
Karen Handmaker, MPP, Vice President, Population Health Strategies, Phytel
Administrators and clinicians need different data for their respective roles, but everyone needs complete, accurate and timely information. Ensuring that reports are trusted, reliable and actionable requires organizations to have strong policies and procedures related to the following:
- Existing data capture. Use consistent locations in EMR for structured and scanned data (e.g., lab results, test orders, patient-reported data).
- New data capture. Create new structured fields rather than additional flow sheets for specific measures (e.g., fall risk assessment, Rx in care plan).
- Eliminating free text. Direct teams to use structured fields to collect data formerly entered as free text (e.g., tobacco cessation counseling, follow-up for positive depression screening).
- Data clean-up as part of standard work. Assign staff to regularly review provider attribution, invalid data entries, and proper use of new workflows to enhance reliability.
When practices have confidence in the integrity of their data and resulting reports, then they can move forward with taking action on the information. Administrators need to understand the health of their population and have confidence that they can manage to specified quality measures and savings targets when contracting with payers or considering forming an ACO. They also need to be able to easily generate the reports required under value-based care. Providers need to be able to easily identify those patients who are at risk of expensive, life-threatening acute episodes (such as recently discharged cardiac patients with multiple co-morbidities) so they can quickly intervene to maintain their health. And they need automation capabilities that can scale their limited workforce so they can engage as many high-risk patients as possible in the shortest time.