By the time this article is published, odds are another security incident will have been announced, exposing millions of records to the eyes of clever thieves. Even with HIPAA fines and the public shame that comes from being a media headline statistic, it seems there is no end in sight to the waves of data breaches that are sweeping away a patient’s right to have their medical information secured.
In this Q&A, data security expert David Finn, Health IT Officer, Symantec, explores what healthcare organizations – and individuals – can do to improve privacy and keep valuable patient information out of the hands of those who would use it for nefarious ends.
Editor’s Note: The following interview has been edited for the purpose of clarity and concision.
When we talk data security, that encompasses everything from the storage of information to the channels of communication between devices. Is there a blanket privacy and security solution that can cover everything?
One of my favorite sayings is, “A fool with a tool is still a fool.” Particularly in healthcare, the CEOs, the CFOs, the operating officers, they will want to buy a technology and turn it on and think that they’ve solved their security problem. But these are big, complicated systems – not unlike an EMR – and if you think you can put these in without adding staff, getting some consultants, and understanding workflows and changing processes, you will be about as successful as you would if you put in Epic or Cerner without changing any of your practices. It just won’t happen. It might work for a little while, and then the people who put it in change jobs and move on, and no one knows how to use this tool again.
Another thing I like to say is that robotic surgery is not actually surgery being performed by robots on patients. A medication cabinet isn’t actually dispensing drugs directly to patients, either – you have to have smart people using these very complicated tools in order to get the outcome you’re trying to achieve.
You have to have the tools. Don’t get me wrong. I’m not saying don’t use the tools, but if you aren’t using the tools correctly, you aren’t doing yourself any favors. And the worst thing you can do in today’s world is create a false sense of security, because just having a tool isn’t going to protect you.
You strike an important chord, I think. Security is ultimately a people problem. Computers don’t click on phishing emails, and no tablet has ever social engineered its way into a secure area. And so, you have to have the tools to know when stuff is going wrong, but you have to really focus on the people – getting them educated and helping them understand the value of the information they’re trying to protect and why it’s important to protect it.
Some of these scams and techniques used by the bad guys are getting really smart and clever, though. How do you ensure that staff at a provider location is educated on the latest methods?
Well, and that’s one of those things – it’s an ongoing process. A lot of healthcare organizations have security and privacy training they do annually, and you get 45 minutes or an hour – and between the first year you take that security training and the second, the whole threat landscape has changed. I’m not saying it takes dozens of hours a year, but you have to keep reminding people when new things happen – and they have to be informed about them.
When I was a CIO, I had two requirements for training: It had to be personal, and it had to be entertaining. Everything you teach employees to protect the business applies to them in their personal lives, too.
I read one of your articles where you were talking about how you encrypt your device.1
That’s right. I do.
I wish you were the norm for the American people. Most people don’t understand why that’s important. And what’s even worse, they don’t know what’s happening on their mobile devices every day. Most people have no idea that when they post a picture on Facebook, it’s geotagged. Everyone now knows you’re on vacation, and your house is unprotected. It won’t be very long before the bad guys figure out how to use that, if they’re not already. And so, yeah, it’s about protecting the business and the EPHI (electronic protected health information), but that really applies to people as individuals, too, and we don’t always make that connection. And then, let’s get real, security and privacy training can be very boring. You can talk for hours about HIPAA and fines and all that stuff – but you have to make this entertaining. It doesn’t have to last hours, but it has to be to the point and help them understand.
Is this one of those things where individuals and providers may not get the point until after they’ve been a victim of a data breach? For example, I didn’t start encrypting my phone until after I had it stolen once – then the importance of security and privacy became clear.
(laughs) Yes, that is very impactful. And that’s exactly what we’re seeing in healthcare, unfortunately. A lot of organizations don’t address the problem until they’ve had a breach, and 40 million records later they’re dealing with it. I think part of our job at Symantec, and you in the media, is to help educate the non-IT executives on why this is important.
Healthcare is just now moving away from paper. So, when we talk about data security, is that something that is decades away? After all, it took so long just to get providers to use electronic records – and in some cases they still aren’t.
It’s funny, I did a presentation recently in Chicago, and I was going back through privacy and security. I started with Hippocrates and the Hippocratic Oath, which is about 400 B.C. – he talked about the sacred trust between a physician and a patient, that you should never speak abroad of anything known about them.
And so I said, “When is the next time we get any rule in healthcare about protecting information?” And the answer was 2003, when the Privacy Rule went into effect in the United States (laughs)! We went from 400 B.C. to 2003 without any changes in our approach to privacy of healthcare records, and now that we’re all digital, medicine is just trying to catch up with it.
So, as a security vendor how do you overcome that challenge? Healthcare and tech don’t always mesh – is education your biggest obstacle when it comes to selling someone a security system?
Exactly. Getting them to understand what they need to do in healthcare is always a challenge. There are a lot of people, for example, still resisting mobility. Well, it’s a requirement today. Physicians are mobile, you’ve got patient engagement under Meaningful Use 2 – you cannot keep mobility out, you have to embrace it. Yes, it’s probably a new technology to your organization, and these probably aren’t even devices you own – these are devices that are going to be connecting to your resources. So, you need new strategies.
We have to change the way we think about delivering healthcare, and that means we have to change the way we think about security. There’s that component of convincing providers that healthcare needs new technology, and then there’s the component – and it is an educational issue as well – that with these new technologies, while they’re cool, the easier they get for the end user, the more complicated they get for IT to manage and control.
We’ve talked a lot about what providers can do in terms of protecting data, but is there anything that patients can do to keep their personal information secure, apart from not seeking healthcare services?
Yeah, and I think you touched on this in your article – people need to be sensitive about what they’re doing. In our Internet security threat report that we issued in April of this year, 2 for the 12-month period of January through December 2014, we looked at mobile health applications. Some of those mobile health applications sent – in clear text – passwords and your patient data. Some of them reported to up to 15 separate domains – now, why would a healthcare app have to point to 15 separate domains? They’re collecting that data!
And most users don’t think about this. So, there’s an education component – and yeah, you’d expect the app to send data to the cloud, and maybe you’ve chosen to send it to a personal health record, but that still doesn’t account for the 13 other places it’s sending data. In some cases, the app is sending data to adware sites – and those are the things patients, in fact any consumer of digital technology, need to look out for.
Will businesses catch up in the same way that consumers are beginning to? We’re starting to see market demand on the consumer side for devices with native encryption and the like.
I think they will. Particularly in healthcare, privacy and security is going to become a differentiator. If you have the choice between two hospitals – and you clearly do – and one just breached 40 million records while the one down the street has never been in a headline, which one are you going to go to?
We’ve seen hospitals that compete with breached organizations use that information in media ads. And I’m not saying that’s a good thing either, because pretty much anyone can get breached in this day and age, but I think it really is going to be a differentiator. People are going to start thinking about those track records and their information and who can do the best job of protecting it.
One final question: is there any excuse for all of these data breaches we’re seeing involving healthcare organizations?
I have a couple of thoughts about that. Of course, I’m reminded again of your article – if you went from 2014 back to 2009, what you’d see on the HHS “wall of shame” is that about 66 percent of the breaches were lost or stolen devices. Now, give me a break! Encryption would have stopped all of those. Laptops, mobile devices – if they’d have been encrypted, there wouldn’t have been a breach.
The other thing is I think healthcare asks the wrong questions. When I go into a provider – and I’m talking to someone outside of IT or above IT – the question is usually, “How do we keep from becoming the next CHS or the next Anthem or BlueCross?” And I understand the fear and motivation behind that question, but if all you want to do is stay out of the headlines, you’re asking the wrong question. The real question should be – and this applies even to individuals – “Given the threats we face on a daily basis, how do I make good, rational business decisions and clinical decisions that make sense and drive the business without exposing more risk?” Healthcare is very risk averse, and the problem in today’s world is you can’t eliminate all the risk – you have to make smart decisions.
Google Glass: Seeing too much of your patients?
Editor’s Note: The following interview has been edited for the purpose of clarity and concision.
Let’s start by talking about the applications Google Glass has in healthcare. How are they being used?
We at Accellion actually built a reference application for Google Glass for an ER doctor. Basically, it shows how you can hook a secure content platform up to an application on the glasses so that a doctor can scan a wristband barcode and pull up a patient’s vital information, and then also be notified when results are available, such as MRI results.
The upcoming version of Google Glass is more targeted to the enterprise sector, and it also offers tremendous opportunities for healthcare companies to deliver applications to doctors and healthcare workers that let them provide levels of care that you couldn’t deliver without this technology. So, it’s very exciting, but there are concerns about security.
It seems as if its capabilities are similar to any mobile device. What are the additional privacy concerns that Google Glass brings to the table?
You would be quite aware if a TV crew was walking around your building – somebody would know something was up. But if somebody was walking around recording with a pair of glasses on, you might not realize what was going on. And so, the security concerns are, first of all, do you have in place a policy about when and where and how people can record video or capture pictures wearing wearable devices? So this would be like an extension of the BYOD (bring-your-own-device) policy to a WYOD (wear-your-own-device) policy.
And the next thing is that there is another level of concern with wearables, and that is they can capture a tremendous amount of information. And so, even application developers need to think about, “Just because I could capture this data, should I?” You really shouldn’t be capturing data that is not central to the application. For example, in a healthcare situation, let’s say it was something to do with a flu epidemic, as part of this wearable device, you may want to record geolocation because that’s sort of relevant to flu outbreaks. But if you were actually doing a healthcare application that had something to do with, say, a rash on someone’s arm – well maybe you shouldn’t be capturing geolocation for that.
Is this even something physicians will want to use? Will it actually help them improve the care they provide?
When you see our Google Glass ER app, you’ll get it. If you’re an emergency room doctor wearing Google Glass, it’s giving you information about the patient while keeping your two hands free – that’s an example of real value add, and it’s not just for the doctor; it’s for the patient, too. I think that’s what’s so interesting about these wearable devices; they have real potential for transformative applications – particularly where there is a need for hands-free work, and in healthcare there is that need.
I think it’s an exciting time. Think about it – fighter pilots are the ones who always have these heads-up displays. Now, it’s coming down to people who work in hospitals or work out in the field.
Are Google Glass and wearables something providers should be hesitant to adopt?
I think they should think this through before they do it. Here’s the thing: with smartphones and tablets, I think enterprise IT was caught flat-footed. The devices were in their organizations before they had any chance to think it through. With wearables, I think there’s an opportunity for organizations to get ahead – and the lessons learned from securing smartphone and tablet use apply to wearables. It’s not that I think people should be hesitant about it; I think they should go in from the start thinking about the security aspects as much as the benefits.