I’m personally very concerned with privacy and security. It’s this facet of HIT that commands my attention the most, and as such I took every opportunity I could at HIMSS16 to pick the brains of security experts from across the space. How do we keep patient data private and secure from prying eyes? Do stolen medical records give the bad guys a window into our lives? What can be done to halt data breaches and stop identity theft?
The answers I received during those chats seemed to converge on one overriding piece of advice: Security starts with you, and it’s the job of each and every consumer to protect their own privacy.
With a heated battle raging in the courts between Apple and the FBI over unlocking iPhone data, I was happy to hear that most security experts – or at least those I spoke with – seemed to look past the emotions surrounding the issue and delve deep into the logistics of what weakened encryption means for everyone: not just the purported terrorists, but me, you, and every person who values their privacy.
The media narrative has been simplified into a story in which the FBI wishes to access this one shooter’s phone, and Apple refuses on the grounds that doing so would open a “Pandora’s box” of weak encryption, where soon every overreaching government agency, oppressive regime, and two-bit criminal on the planet could exploit the workaround for their own aims.
It may sound crazy, but Apple isn’t wrong. And believe it or not, the dispute over the San Bernardino shooter’s phone may also impact security at healthcare practices all over the globe.
With more and more institutions utilizing smartphones and tablets to deliver care, a mobile device with a built-in backdoor – even one made (theoretically) to be used only by the authorities – is, by definition, insecure. Encryption, often touted as the last effective line of defense for protecting patient records, is meaningless if a device contains malware or a doorway that can be accessed by someone other than the user.
In an era where no secret seems safe, to suggest that only the “good guys” would hold this backdoor is to suggest that every single person who works for Apple and the FBI is a good, ethical person who will never overstep their bounds. Even if I could accept such a premise, secrets always find a way of leaking, especially when those secrets can be leveraged to make a lot of money.
David Finn, Health IT Officer at Symantec, has said in HMT on more than one occasion that data security is a “people problem”: Users are careless with devices or unwittingly download malware, and it’s ultimately human error that causes many security incidents. In my opinion, intentionally designing a device with a security flaw will only amplify this problem. Training may address the “people problem,” but teaching patients and clinicians to encrypt their devices means very little if the standard encryption methods can’t actually keep invaders out.
With more and more data changing hands via health apps, telemedicine, and BYOD at the point of care, the security of an iPhone is a whole lot bigger than one criminal case. As long as the devices doctors, nurses, and patients use are insecure, we’re inviting more identity theft and cyberattacks, and the results could be catastrophic. It’s my hope that Apple wins this battle – and if it doesn’t, I hope many of you will join me in seeking third-party encryption alternatives.
As always, I want to thank you for reading our magazine, and I welcome your feedback at [email protected].