With the intention of providing a wider spectrum of content and ideas, we have started an online series entitled "HMT Conference Call." Within these special multimedia features, you will have the opportunity to listen in on the conversations of prominent healthcare professionals as they analyze and debate the most important topics facing our industry today. Audio selections of these conversations will be made available on the HMT website.
For our first HMT Conference Call, I approached Mac McMillan, Chair of the HIMSS Privacy and Security Policy Task Force and CEO, CynergisTek, Inc., and asked him to select two of his colleagues to discuss, in depth, issues related to compliance monitoring. McMillan's two guests, Adam Greene from the law firm Davis Wright Tremaine and Sharon Finney, Corporate Data Security Officer for the Adventist healthcare system, were gracious enough to allow me to record their conversation regarding risk analysis and risk assessment.
In June, I will post audio files of their call in the "Online Only Features" section of the HMT website. Subscribers to the HMT daily newsletter will be notified when they can hear McMillan and his colleagues' lengthy conversation.
Here is a transcript of McMillan’s introduction to the conference call:
Mac McMillan, Chair of the HIMSS Privacy and Security Policy Task Force and CEO, CynergisTek, Inc.
McMillan: Today, I reached out to two folks to discuss the continuing challenges we have in the healthcare industry around risk analysis and risk assessment.
The first person on the line is Adam Greene from Davis Wright Tremaine. Adam has a background from OCR [Office for Civil Rights], and he has been working for quite some time now in the private sector as part of Davis Wright Tremaine, advising healthcare clients.
The second person on the line is Sharon Finney, the Corporate Data Security Officer for the Adventist healthcare system. She has a long history of service in the healthcare industry as an information security professional, and also sits on various boards and advises other folks on corporate data security.
The reason I reached out to these two folks is because both of them have a lot of experience, and they're both very knowledgeable about what's required as well as what folks are doing in the field. Our discussion today will be around what we need to do to really get folks to where they need to be, because we still have a lot of healthcare organizations that are not conducting risk assessments frequently enough, and they are not conducting them thoroughly enough. We're still seeing evidence of that. In fact, we saw evidence of that in just the most recent OCR pronouncement with respect to the penalty they levied this year. Quite frankly, when you take a hard look at the new tool OCR has published for smaller organizations from a real security practitioner's perspective, I'm not sure that OCR and others in the government have helped us with it.
I would like to talk about all of that, and talk about where the issues are, and talk about what we think really needs to happen in order to get this industry where it needs to be, because this is such an important thing as part of a security program.
So, Adam, I would like to start with you.
Adam Greene, Partner, Davis Wright Tremaine
Greene: Thanks, Mac.
Risk analysis is really hard. That's one of the big challenges here. Every client would like a simple checklist that they can go through, check the boxes and feel like they're done. Risk analysis, the way OCR seems to view it, really requires some deep thought about the particular risks to your organization. That's going to differ for different organizations. If you're in California, an earthquake is going to be a much higher risk than if you're in Maine, for example. There's not a one-size-fits-all solution, and I think entities sometimes have a really hard time grappling with that notion. That's one of the challenges that we've seen. The more sophisticated covered entities understand that this has been a big focus of OCR of late; they've gotten the message. They still don't necessarily know what the right resources are. What are the right tools? Who should they be using? What should they be doing? But they at least have been hearing that risk analysis is really important. The Meaningful Use requirement for a risk analysis has brought the healthcare industry a long way on that front. There are still a number of entities that haven't gotten the message and won't get the message until, unfortunately, they are investigated by OCR and told they need to have a risk analysis.
McMillan: You bring up an interesting point with respect to the complexity of it.
Sharon, you and I have been on the other side of this coin in terms of being involved in information security for years, and I know I've done literally hundreds of risk analyses over my 30-plus-year career, and I personally don't think that risk analysis is that difficult. This is one of the areas where Adam and I might disagree, and I think part of it is because I was taught very early on that risk analysis is a process. If you learn the process, you can conduct a risk assessment of almost anything, whether it's an organization, a program, a system, an operation, you name it. It's a fairly straightforward process: identifying what it is you are trying to assess the risk against, identifying the threats, identifying your vulnerabilities, identifying what control measures you currently have in place, analyzing the likelihood of those threats taking advantage of or exploiting the vulnerabilities you have identified, identifying what the impact would be if any of those situations were to occur, and determining what you need to do to mitigate that risk.
Is it too simple for me, Sharon, because I’ve done it for so long? Are we really making it harder for people to understand than that?
Sharon Finney, Corporate Data Security Officer, Adventist healthcare system
Finney: Well, the answer is both "yes" and "no." You're working in an industry that has been intimately familiar with risk from a patient-care perspective for many, many years. We've improved the quality of care, and we've isolated the processes, procedures and things that we needed to do to keep patients from falling. We've isolated those risks, and we've been very good as an industry at really homing in on patient care and how we're caring for those patients. The problem in translating that is that everybody – the government, healthcare, everybody – thought the patient-care risk-assessment process was going to transfer seamlessly into the world of security, privacy and technology, and it didn't. What we encountered in this world of security risk assessment was that we weren't talking the same language. We had to talk to clinicians and physicians and business people and leadership about all these different aspects of risk management in a security world, so we had to start from a basis of talking the same language. Not being able to talk that same language inhibited us as security professionals for a long time. Because technology is integrated into the risk-assessment process, it was difficult, and it does make it much more complicated. I don't think the process itself of doing a risk assessment or risk analysis is any more complicated in one environment than another, but I think all the components that feed it – and feed a good, solid risk assessment that identifies your vulnerabilities, depending on the environment – make it more complex and harder to understand.
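Editor's note: To make the process McMillan outlines concrete (identify the asset, the threats, the vulnerabilities, and existing controls, then weigh likelihood against impact to decide what to mitigate), here is a minimal illustrative sketch. It is not from the call and not any particular organization's methodology; the asset names, rating scales and threshold below are assumptions for illustration only.

```python
# Illustrative sketch of the risk-analysis steps McMillan describes:
# identify assets, threats, vulnerabilities and existing controls,
# then rate likelihood and impact to decide what needs mitigation.
# All names, scales and thresholds here are hypothetical.

from dataclasses import dataclass, field

@dataclass
class RiskItem:
    asset: str            # what you are assessing the risk against
    threat: str           # who or what could cause harm
    vulnerability: str    # weakness the threat could exploit
    controls: list = field(default_factory=list)  # measures already in place
    likelihood: int = 1   # 1 (rare) .. 5 (almost certain)
    impact: int = 1       # 1 (negligible) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

def needs_mitigation(item: RiskItem, threshold: int = 9) -> bool:
    """Flag items whose risk score meets or exceeds an (assumed) threshold."""
    return item.score >= threshold

# Hypothetical example entries
register = [
    RiskItem("EHR database", "external attacker", "unpatched server",
             controls=["firewall"], likelihood=4, impact=5),
    RiskItem("backup tapes", "loss in transit", "unencrypted media",
             controls=[], likelihood=2, impact=4),
]

for item in sorted(register, key=lambda r: r.score, reverse=True):
    flag = "MITIGATE" if needs_mitigation(item) else "accept/monitor"
    print(f"{item.asset}: {item.threat} via {item.vulnerability} "
          f"-> score {item.score} ({flag})")
```

In practice, the rating scales and threshold would come from an organization's own risk methodology (for example, NIST SP 800-30-style likelihood and impact ratings), not from this sketch.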