Over the past week, several reporters following up on a recent breach of 9 million patient records asked me essentially the same question: "What are the barriers that stop healthcare organizations from encrypting their devices?" One of the resulting stories, by Marianne McGee, has been posted at HealthCareInfosecurity. During my work with a wide range of small to large organizations, across a wide range of industries, I've found some common reasons why encryption is not implemented. Here are the top four I've run across.
1. Lack of executive support
This has been a problem since organizations started using encryption. Decision-makers and executives withhold their support for encryption for a wide range of reasons.
- Many executives don't want their data encrypted because they view encryption as too complicated to use.
- Many think encryption will slow them, and/or their systems, down too much. Some assume encryption is still as kludgy as it was 5, 10, 15, or even 20 years ago, when they first encountered it and had very bad experiences. Those bad times with security technologies stick with them much longer than the good times.
- Many executives think they already have the encryption they need if they are using SSL on their web sites. They simply don't understand that such encryption protects data only in transit; it does nothing to keep the data encrypted once it lands in the vast number of storage locations.
- And many execs simply don't want to pay for encryption unless they are absolutely required to implement it.
2. Lack of resources/funding
Encryption costs money. Not nearly as much as it did even just a few years ago, but generally, there will be a cost associated with implementing encryption everywhere you truly need it. Organizations need to address this by figuring out why the funding isn’t there. Here are some common reasons:
- The last bullet in section 1 above is one of the primary reasons there is no funding: the execs and directors simply don't understand the importance, so they put the kibosh on any funds.
- I've also seen budget proposals that omitted encryption through simple oversight. It really is amazing how often it gets forgotten while firewalls, DLP, anti-malware, etc. make the list. Encryption gets the short end of the stick almost as often as training and awareness do (and more often than the other technologies mentioned).
- I've seen many companies that originally budgeted for encryption, only to have that funding usurped by some other new project that is sexier and more glitzy. Yes, encryption is far too often the jilted technology spouse, left for a more exciting new technology, often one not even related to security protection. Oh, Big Data Analytics, you are an extremely tempting option for many!
3. Lack of understanding of what is required
Okay, here is where I want to put into context a quote from the article mentioned at the beginning of this post. First I'll give a few real-world examples, and then I'll be better able to explain.
Just a couple of weeks ago, I was helping one of my clients create their HIPAA security and privacy policies, procedures, and work plans. They are a smallish to mid-sized startup technology company that is a business associate (BA) for what they hope will be many healthcare covered entities (CEs). I cross-referenced each HIPAA/HITECH policy not only to the corresponding location it supports within the regulatory text, but also to the various NIST and ISO/IEC 27002 standards it supports. When the client saw this, they said, "We don't want to comply with anything beyond what HIPAA requires; we don't want to do more than necessary." So, I explained that those other standards require basically the same types of activities, just stated a bit differently, or in more depth on the topic.
They then saw the encryption policies and procedures, and said, "We know that is not required; it is optional. HIPAA does not explicitly state that protected health information (PHI) must be encrypted. Let's take it out." So, I also explained that encryption is "addressable" under HIPAA, which means something completely different from "optional." After a long discussion, they finally realized (I think) that all the policies and procedures I had written for them actually were necessary to meet HIPAA compliance for their specific business activities. They originally THOUGHT they knew what they needed, and on their own would have done things differently, such as leaving out encryption. Their lack of understanding would have relegated encryption to the trash can.
The Omnibus Rule, as worded, complicates the encryption requirement for some CEs and BAs. It allows PHI to be sent unencrypted to individuals at their request, provided the BA or CE "have advised the individual of the risk, and the individual still prefers the unencrypted email." Another client told me they know their customers are fine with receiving unencrypted email, so they'll just go ahead and send PHI in clear text. Whoa; that is not what Omnibus states! Do you have documentation from each individual to substantiate this? No. Right there you are making a bad business compliance decision based on a lack of understanding.
As another example, a few years ago the IT admin at a healthcare provider told me they had encryption "fully implemented" to meet HIPAA compliance. However, upon further questioning and review, I discovered they had encrypted only the transmission of PHI from their patient web portal, using HTTPS. None of the storage areas, including the many laptops being used by mobile workers at the time, were encrypted. The IT folks said they wanted to do what it took to meet HIPAA compliance. They thought that having encryption implemented somewhere, where they believed it was most important, was enough.
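The gap in that example is the difference between encryption in transit and encryption at rest. HTTPS protects PHI only while it moves over the network; a record written to a laptop's disk needs its own protection. A minimal sketch of file-level at-rest encryption, assuming the third-party Python `cryptography` package (any vetted symmetric encryption library, or full-disk encryption, would serve the same purpose):

```python
# Sketch: HTTPS/TLS covers the network hop only. Data written to disk
# needs separate at-rest encryption, illustrated here with Fernet
# (authenticated symmetric encryption) from the "cryptography" package.
from cryptography.fernet import Fernet

# In practice the key lives in a key vault or HSM, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"patient: Jane Doe, MRN 0000000"
ciphertext = cipher.encrypt(record)   # what should actually land on the laptop's disk

# Only a holder of the key can recover the plaintext.
assert cipher.decrypt(ciphertext) == record
```

A stolen laptop whose files look like `ciphertext` above is a nuisance; one whose files look like `record` is a reportable breach.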
And a final example. A few years ago a small clinic, located in a high-crime neighborhood of a large city, had its server, containing all its patient information, stolen. The clinic was in a stand-alone building, and the server sat in an interior room with thick walls, a heavy door, and no windows. One weekend, with both the room's door and the outer doors locked, criminals broke through the outer doors, then through the interior door, and stole the server and all the other computing devices. The agency investigating the incident asked if the data on the server was encrypted. The clinic replied that it was not; they thought being behind two locked doors, with the interior room having no other entry point, provided sufficient protection for the patient data. The investigator stated that, given the crime risk of the area, they should have encrypted the data. The IT manager said he hadn't even thought to consider the neighborhood as a risk.
I've found the IT folks I've worked with over the years (and there have been a lot; I started in IT and continue to work with many IT folks who are my clients) to be overwhelmingly knowledgeable about their specialty areas. However, I've also found that a large portion, while they know their IT systems and applications inside and out, have made incorrect assumptions about what is necessary for effective information security controls, and hold erroneous beliefs about what is necessary to meet the long list of legal compliance requirements for security and privacy. Often they know some information security concepts, but not the full domain of security possibilities (such as those found in ISO/IEC 27001 and ISO/IEC 27002), or how to do a comprehensive information security risk assessment. As in any profession, a little bit of security and privacy knowledge can be a dangerous thing that leads to bad decisions, such as deciding encryption is unnecessary in areas they don't recognize as high risk.
4. Increasing technology complexity, BYOD and mobility
Think about all the complexity that exists within most organizations: web servers, file servers, desktop computers, smartphones and a wide range of other mobile computing devices, mobile storage devices, employee-owned devices used for business activities, stationary data storage, cloud services of all kinds, and the list could go on. Where is all the data? Where are all the risks to the data? That is a huge topic best tackled in another post (or series of posts). This complexity often results in hit-or-miss implementation of encryption. There are often compatibility problems between encryption solutions and all these diverse computing devices, and sometimes there is simply no encryption solution for certain types of devices. It is not surprising that sensitive data is not encrypted everywhere it is at risk. However, that is not a good excuse for failing to be more proactive in identifying and managing those risks.
Bottom line for organizations of all sizes…
Encryption is an effective tool that organizations of all types and sizes, in all industries, can use to help protect sensitive information such as PHI and all other types of personal information. To determine where to use it, first identify where your sensitive data is collected, stored, and processed, and then identify the risks throughout the entire lifecycle of that data. Implement encryption, in transit and in storage, wherever the level of risk and the legal compliance requirements are sufficient to justify it to your business leaders and decision-makers.
This post was written as part of the IBM for Midsize Business program, which provides midsize businesses with the tools, expertise and solutions they need to become engines of a smarter planet. I’ve been compensated to contribute to this program, but the opinions expressed in this post are my own and don’t necessarily represent IBM’s positions, strategies or opinions.