Data Deluge

As the information age surges ahead and a growing number of clinical applications and upgrades become available, data storage is becoming increasingly challenging for hospitals. In order to maintain a system where data is efficiently managed and readily available, CIOs are finding that they must take serious measures, such as constructing a new data center, creating a data warehouse, or leveraging technologies like virtualization.

If it seems like more data needs to be stored now than ever before, it's because that is precisely the case, says Ken Kirch, a consultant with Weymouth, Mass.-based Beacon Partners.

Kirch identifies three factors as primary drivers in the ever-increasing storage requirements that CIOs must contend with. “We have data-intensive applications like EMRs; we have retention periods; and we also have accessibility requirements that it must be available,” he explains. “There's a perception that all the data must be available 99.999 percent of the time. So you have this huge amount of data that's being gathered that has to be available 24 hours a day, seven days a week, 365 days a year.”
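
To put that expectation in perspective, the downtime budget behind each additional nine is simple arithmetic. The short Python sketch below is purely illustrative; it computes how many minutes of downtime per year each availability target leaves.

```python
# Rough downtime budget per year for common availability targets,
# to put the "99.999 percent" expectation in perspective.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

targets = {"99.9%": 0.999, "99.99%": 0.9999, "99.999%": 0.99999}
for label, availability in targets.items():
    downtime = MINUTES_PER_YEAR * (1 - availability)
    print(f"{label} availability allows roughly {downtime:.1f} minutes of downtime per year")
```

At 99.999 percent, that works out to just over five minutes of unplanned downtime in an entire year, which is why availability shapes storage decisions as much as raw capacity does.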

When it comes to data storage requirements, perhaps the biggest culprits are electronic records, both in terms of the volume of information that must be stored, and the rate at which it must be available for clinicians. So when Chesterfield, Mo.-based Sisters of Mercy made the decision to install Epic's (Verona, Wis.) EMR across the system, the decision to construct a new data center quickly followed.

“We needed more space for storage and we needed high availability to support operations around the clock,” says Mike McCurry, CIO of Sisters of Mercy, pointing out that data needs have evolved significantly as electronic record systems have become more widespread. “It used to be that data was accessed only between 9 and 5, but now that need is always there.”

The new data center, which began construction in April, will serve as the heart of the infrastructure for Sisters of Mercy, a 20-hospital system that must accommodate the needs of more than 28,000 IT customers. In addition to the Epic suite — which McCurry estimates will be live by April of 2010 — the center will also house the M3 enterprise resource planning system from St. Paul, Minn.-based Lawson, along with several other applications, and will facilitate future IT installs. “It will give us capabilities that we didn't have before in terms of storage,” says McCurry.

Pillars of strength

It isn't just sprawling multi-hospital systems that are feeling the pinch. While the size of a network is certainly a consideration, sometimes the type of services offered can be just as much of a determining factor when it comes to data storage needs.

Academic institutions like Penn Medicine — a Philadelphia-based enterprise consisting of the University of Pennsylvania School of Medicine and the three-hospital University of Pennsylvania Health System — constantly feel the strain of storage requirements. That is especially true of organizations that operate using a best-of-breed strategy, or as CIO Mike Restuccia calls it, “pillars of strength,” which means that several different information systems must be housed in one location.

In order to facilitate this, Restuccia's team, led by Penn Health System CTO Brian Wells, is developing a health system data warehouse that will pull information from several clinical and financial applications, including the Epic ambulatory EMR and the billing and accounts receivable application from IDX (which has since been purchased by United Kingdom-based GE Healthcare). On the patient side, the warehouse will incorporate data from Malvern, Pa.-based Siemens' Invision system and the Eclipsys (Atlanta) Sunrise Clinical Manager order entry system, along with the ED and ICU systems.

All of these applications will feed into the warehouse, says Restuccia, where data will be stored in a homogenized and synthesized manner. And there won't be time to rest after that, as another round of applications will start to roll out in the next year or two.

According to Restuccia, Penn is implementing the Eclipsys Sunrise Medication Manager at the Hospital of the University of Pennsylvania (Philadelphia) in preparation for closed-loop medication management, and is deploying Eclipsys' Knowledge-based Charting solution.

[Chart: From the HCI Research Series: Trends in Disaster Preparedness and Recovery Technologies]

“There's heavy emphasis, when I look at the next 12 to 24 months, on the deployment of clinical applications and the provision of access to information,” says Restuccia. “In order to aggregate all that data and be able to utilize it in a consistent manner, the warehouse becomes the logical repository for that.”

The warehouse will be developed in two phases, says Wells. The first will incorporate the core systems while the second will focus on the ED system and ICU monitoring, which are “still important clinical systems, but not with the robustness or the volume through them that most of our pillars have.”

The more formidable challenge, says Restuccia, lies in the integration of the core systems. “Each of those pillars of strength has their own data model and their own data identification characteristics,” he says. Therefore, in order for researchers to obtain valuable information, “they have to spend the time to go into each system and sort of decode each one of those system data vocabularies, and try to make the information useful.”

As a result, the IT staff at Penn is performing the decoding on the front end with this clinical repository and storing the data in a centralized location that researchers will be able to access on-demand. This, says Restuccia, will accelerate the rate of research and help facilitate the process of clinical trials and other key areas of investigation.
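
As a rough sketch of what that front-end decoding can look like, the Python snippet below maps each source system's local codes into one shared vocabulary before a record lands in the repository. The system names, codes, and fields are hypothetical illustrations, not Penn's actual data model.

```python
# Hypothetical sketch: translating records from different source systems
# into one shared vocabulary before loading them into a central repository.
# The system names, local codes, and fields are illustrative only.

# Per-source dictionaries mapping each system's local lab codes
# to a single code set used by the warehouse.
CODE_MAP = {
    "inpatient_system": {"GLU": "glucose", "NA+": "sodium"},
    "order_entry_system": {"LAB-GLU-01": "glucose", "LAB-NA-02": "sodium"},
}

def normalize(record: dict, source: str) -> dict:
    """Translate one source record into the warehouse's common format."""
    shared_code = CODE_MAP[source].get(record["local_code"], "unmapped")
    return {
        "patient_id": record["patient_id"],  # assumes a shared patient identifier
        "concept": shared_code,
        "value": record["value"],
        "source_system": source,
    }

# The same glucose result, coded differently by two systems,
# lands in the repository under a single concept name.
print(normalize({"patient_id": "P001", "local_code": "GLU", "value": 95}, "inpatient_system"))
print(normalize({"patient_id": "P001", "local_code": "LAB-GLU-01", "value": 95}, "order_entry_system"))
```

Doing the translation once, on the way into the warehouse, is what spares each research team from having to relearn every source system's vocabulary.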

Tackling the terabytes

A critical factor in managing data at this scale in academic institutions, says Restuccia, is that research data is often measured in terabytes.

The same is true in many health systems; according to Kirch, a fully digital radiology department in a mid-sized hospital can generate one to two terabytes a year by itself.
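
Combined with long retention periods, even a steady rate like that adds up. The sketch below is a hypothetical projection built on round numbers (two terabytes of new data a year and a seven-year retention requirement), not figures from any particular hospital.

```python
# Hypothetical projection: cumulative storage for a department that
# generates a fixed amount of new data each year and must retain it
# for a fixed number of years. Both numbers are illustrative.
ANNUAL_TB = 2.0        # assumed new data per year, in terabytes
RETENTION_YEARS = 7    # assumed retention requirement

for year in range(1, 11):
    retained = min(year, RETENTION_YEARS) * ANNUAL_TB
    print(f"Year {year}: roughly {retained:.0f} TB must remain accessible")
```

Under those assumptions, the department alone is holding 14 terabytes by year seven, before any research data enters the picture.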

However, while data and storage needs can be quite considerable on the health system side, “they're minimal compared to what you need on the research side,” says Restuccia. “Any one study and statistical gyration just generates more terabytes of data. So the storage requirements for the research organization in particular don't just grow incrementally — they grow almost exponentially. You thought you had your hands full on an eight-foot wave, and little did you know there was a 30-foot wave coming at you.”

And of course, it isn't just about housing the data. Researchers want to leverage data to run statistical inquiries, which generates another level of storage requirements in order to house the results.

“It's almost like success breeds success,” says Restuccia. “Once we get this process in place where we're populating the repository and we're able to provide our researchers with health system data on a fast, consistent and reliable basis, the next thing they want to do is be able to take that data and bolt it onto their departmental-specific databases.”

While the needs on the research side — which can involve data-heavy initiatives such as tissue databanks, proteomics and genomics — clearly differ from those on the health system side, the constant between the two is that data must be stored in a way that it is easily accessed.

The problem is that so many applications generate data that must be stored, and that data must be retained for long periods of time. On top of that, the expectation is that all of the information should be right at the clinicians' fingertips.

This, says Kirch, is where tiering must enter into the equation.

“In the future as hospitals grow, you can't keep all this data ad infinitum,” he says. “People are going to have to make decisions on what we keep online real-time and what can wait, maybe even a day.”

Kirch advises that CIOs and CTOs create a system in which there are different tiers of availability by having the most current data online so it is readily accessible, then as it becomes more dated, moving it incrementally further away from real-time availability.

“CIOs have to start making decisions as to what they keep online real-time,” says Kirch, who feels that this issue will become a high priority as more hospitals adopt full EMRs. The best way to do this, he says, is to “get the business drivers — the clinicians — on board, and have them sign off on what you're going to need online real-time, as opposed to the information you can wait a little bit for. That type of philosophy will then start to even out the growth of data storage requirements.”
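
As a rough illustration of the kind of age-based rule Kirch describes, a tiering policy can be expressed as a simple mapping from the age of a record to a storage tier. The tier names and cutoffs below are hypothetical; in practice, as he suggests, the clinicians would sign off on them.

```python
# Hypothetical age-based tiering rule: the newest data stays on fast,
# always-online storage, while older data moves to progressively
# slower (and cheaper) tiers. Tier names and cutoffs are illustrative.
from datetime import date, timedelta

def storage_tier(record_date: date, today: date) -> str:
    age = today - record_date
    if age <= timedelta(days=365):
        return "tier 1: online, real-time access"
    if age <= timedelta(days=365 * 3):
        return "tier 2: nearline, minutes to retrieve"
    return "tier 3: archive, retrieval may take a day"

today = date(2008, 8, 1)
for record_date in [date(2008, 6, 15), date(2006, 3, 1), date(2001, 9, 30)]:
    print(record_date, "->", storage_tier(record_date, today))
```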

Sidebar

Virtualization to boost server utilization

Building a new data center or creating a warehouse may be the perfect solution for some health systems, but the Cleveland-based MetroHealth System had a different set of needs and priorities. For CIO Vince Miller, the answer lay in virtualization. MetroHealth — an academic health system that includes a major medical center, a rehabilitation hospital, two long-term care/skilled nursing centers, an outpatient surgery center, and a network of community-based healthcare centers — was able to reduce its number of physical servers, lower hardware costs, and use its hardware more effectively by leveraging virtualization, a technique in which a physical server is partitioned so that it can run multiple applications and operating systems simultaneously.

According to Miller and Alan Greenslade, director of IT infrastructure at MetroHealth, the organization has been able to collapse 90 physical servers onto four devices thanks to its virtualized environment. Of the 90 boxes, says Greenslade, 75 are replicated to the business continuity site and can be brought online at other sites, offering a new capability for the system.
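
Taken at face value, those counts imply a substantial consolidation ratio. The quick arithmetic below simply restates the figures Greenslade cites; it is an illustration, not MetroHealth's own analysis.

```python
# Back-of-the-envelope restatement of the consolidation figures quoted above:
# 90 virtualized servers running on 4 physical hosts, with 75 of them
# replicated to the business continuity site.
virtual_machines = 90
physical_hosts = 4
replicated_vms = 75

print(f"Average consolidation ratio: about {virtual_machines / physical_hosts:.1f} VMs per host")
print(f"Share of workloads replicated for continuity: {replicated_vms / virtual_machines:.0%}")
```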

“We've seen savings in space, power and cooling, and in network ports,” says Greenslade. “On our business continuity site, we would not have the physical space, or the power, or the air conditioning to have those 75 machines replicated over. So we're seeing savings on both ends of the spectrum.”

Through storage replication, MetroHealth is able to recover servers in its alternate data center without the additional cost of hardware.

“There were a couple things that made me start looking at virtualization in particular,” says Miller. “One was the amount of server growth we were experiencing, and the other was the ability to keep current on that hardware replacement.”

Before virtualization came into the picture, each application at MetroHealth required an individual server. With every new application that was implemented, another server needed to be added, according to Miller. “This was an opportunity to lessen the footprint within the data center itself for disaster recovery planning or business continuity, and the redundancy that was built into it. As the servers grow, we also plan for tech refresh on the OR hardware, and we saw this as an opportunity not to have to replace as many servers. There's a benefit of cost savings to the organization, and obviously to IS.”

Miller finds that he is constantly adding to the virtualization system. As servers come up for hardware replacement, they are evaluated as to whether they are candidates for virtualization. If they meet the qualifications that have been established, they are virtualized.

There are, as with any technology, some concerns, primarily in the areas of server growth and security. Miller and his staff are vigilant about keeping a close eye on server growth and ensuring that they don't “mushroom the server farm” simply because they now have increased capabilities with virtualization.

In terms of security, Miller's team is “very careful of where we deploy virtualization and how we deploy it. It's just something we're constantly paying attention to,” he says.

It's one of those technologies, says Miller, where the benefits clearly outweigh the risks, particularly in terms of the cost savings that MetroHealth has been able to realize. It has helped him to decrease the amount of capital needed for technology refreshes that cover everything from PCs to network electronics, and as any CIO can attest, every little bit counts when it comes to funding.

“By doing virtualization, I'm able to save money by not having to replace single servers. I can put them in clusters,” says Miller. “So what happens is every year as you go through capital, you keep needing less and less and less. This is a way to get the biggest bang for the buck. It's really going to help us tremendously this year.” — K.H.

Healthcare Informatics 2008 August;25(8):27-30
