The HIPAA Security Rule’s most significant flaw was on display recently. Hospice of Northern Idaho (HONI) has settled with the federal government for $50,000 to close out the case of a stolen unencrypted laptop that held the electronic protected health information of 441 patients. Media attention focused on the fact that this was the first “significant” fine for a breach of patient data involving fewer than 500 patients. Describing the settlement, OCR Director Leon Rodriguez said, “This action sends a strong message to the healthcare industry that, regardless of size, covered entities must take action and will be held accountable for safeguarding their patients’ health information.”
Well, yes, but…
Size matters. Consider that the actual HIPAA Security Rule explicitly creates the loophole of “reasonable and appropriate”:
(1) When a standard adopted in § 164.308, § 164.310, § 164.312, § 164.314, or § 164.316 includes addressable implementation specifications, a covered entity must—
(i) Assess whether each implementation specification is a reasonable and appropriate safeguard in its environment, when analyzed with reference to the likely contribution to protecting the entity’s electronic protected health information; and
(ii) As applicable to the entity—
(A) Implement the implementation specification if reasonable and appropriate; or
(B) If implementing the implementation specification is not reasonable and appropriate—
(1) Document why it would not be reasonable and appropriate to implement the implementation specification; and
(2) Implement an equivalent alternative measure if reasonable and appropriate.
So, a covered entity must unless it mustn’t. And what determines if it doesn’t have to? “Reasonable and appropriate.” Surely, the argument goes, a doctor’s office with a staff of 4 cannot reasonably be expected to implement the same level of safeguards as a large medical group. A 250-bed hospital cannot be expected to implement the same safeguards as a multi-thousand-bed, multi-state hospital system.
What about Hospice of Northern Idaho? They have a staff of about 100. Compare that to Visiting Nurse Service of New York with over 18,000 employees (almost 3,000 nurses alone). What’s reasonable and appropriate for these two very different sized entities cannot be the same.
In fact, a covered entity that is too small to encrypt its laptops is probably too small to have anyone who knows that’s an option or, at least, have anyone who is qualified to document why it is not reasonable and appropriate to do so.
Put another way, we now know, thanks to this $50,000 fine, that EVERYONE is expected to encrypt laptops*. At least everyone with 100 employees. A firm of this size would have spent a good deal less than $50,000 to encrypt its laptops. What about the doctor’s office with the 4 employees? Does her laptop need to be encrypted? Does she need to find an encrypted thumb drive for the 25 records she wants to take on a trip with her, or can she use the drive from Staples that’s under $15? And now that a 100-employee hospice service must encrypt its laptops, how does that raise the bar on the larger organizations?
(*Encryption is problematic in HIPAA: it is an addressable standard, which means it is optional so long as you can justify why you chose not to encrypt—but I’ll discuss that in its own blog.)
I don’t actually think there is much disagreement that the spend to protect a laptop that will definitely contain sensitive patient information is a reasonable expense (let’s say an average of seventy-five to two hundred dollars a year per machine). But that’s easy for someone who is an information security professional to say.
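To put that spend next to the fine, here is a back-of-the-envelope sketch. The per-machine cost range and the $50,000 figure come from this post; the laptop count is an assumption I’ve invented for a roughly 100-employee organization like HONI.

```python
# Back-of-the-envelope: annual full-disk-encryption spend vs. the HONI fine.
# The per-machine cost range ($75-$200/year) and the $50,000 settlement are
# from the post; the laptop count is a hypothetical assumption.
laptops = 50                    # assumed: roughly half the staff carry laptops
cost_low, cost_high = 75, 200   # annual per-machine cost range from the post
fine = 50_000                   # the HONI settlement amount

annual_low = laptops * cost_low     # cheapest plausible annual spend
annual_high = laptops * cost_high   # most expensive plausible annual spend

print(f"Annual encryption spend: ${annual_low:,}-${annual_high:,}")
print(f"Years of coverage one fine would buy: "
      f"{fine // annual_high}-{fine // annual_low}")
```

Even at the high end, one fine would have paid for years of encryption across the whole fleet, which is the heart of the “reasonable expense” argument above.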
What the world of HIPAA needs is something akin to the way PCI DSS approaches risk. The Payment Card Industry Data Security Standards divide merchants into two broad categories: those that process more than six million card transactions a year and those that process fewer. Those in the first category must pay to undergo assessments of their data protection controls by qualified third parties. For the rest, there are 4 levels of self-assessment, one of which will be required. Each level defines which Self-Assessment Questionnaire (SAQ) must be completed and, therefore, what mitigating controls are “reasonable and appropriate” for the entity. The 4 levels are:
SAQ A: Card-not-present (e-commerce or mail/telephone-order) merchants, all cardholder data functions outsourced. This would never apply to face-to-face merchants.
SAQ B: Imprint-only merchants with no cardholder data storage, or stand-alone dial-up terminal merchants with no cardholder data storage.
SAQ C: Merchants with payment application systems connected to the Internet, no cardholder data storage.
SAQ D: All other merchants not included in descriptions for SAQ A, B, or C, and all service providers defined by a payment brand as eligible to complete an SAQ.
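The selection logic in that list can be sketched as a simple decision function. This is a simplification that mirrors the prose descriptions above, not the payment brands’ official eligibility worksheets, which have more conditions than shown here.

```python
def select_saq(card_not_present_only: bool,
               all_functions_outsourced: bool,
               stores_cardholder_data: bool,
               imprint_or_dialup_only: bool,
               app_connected_to_internet: bool) -> str:
    """Map a merchant profile to one of the four SAQ levels described above.

    A sketch of the prose list only -- the real eligibility criteria
    are defined by the payment brands and are more detailed.
    """
    if (card_not_present_only and all_functions_outsourced
            and not stores_cardholder_data):
        return "SAQ A"
    if imprint_or_dialup_only and not stores_cardholder_data:
        return "SAQ B"
    if app_connected_to_internet and not stores_cardholder_data:
        return "SAQ C"
    return "SAQ D"  # everyone else, including eligible service providers

# A mail-order shop that outsources all cardholder data functions:
print(select_saq(True, True, False, False, False))   # SAQ A
# A storefront whose payment application is connected to the Internet:
print(select_saq(False, False, False, False, True))  # SAQ C
```

The point of the structure, and what HIPAA lacks, is that a merchant’s obligations fall out of a few objective facts about its operation rather than a self-judged notion of what is “reasonable and appropriate.”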
Could this approach work for the healthcare industry? Of course. Standards could be set based on the annual number of patients seen and the degree of IT infrastructure maintained by a covered entity.
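Such a tiering rule might look something like the sketch below. To be clear, everything here is hypothetical: the patient-count thresholds, the tier names, and the infrastructure test are invented for illustration and are not drawn from any regulation or proposal.

```python
def hipaa_tier(annual_patients: int, hosts_own_ehr: bool) -> str:
    """Hypothetical HIPAA tiering by patient volume and IT infrastructure,
    analogous to the PCI merchant levels. All thresholds and tier names
    are invented for illustration only."""
    if annual_patients > 100_000 or (hosts_own_ehr and annual_patients > 10_000):
        return "full third-party assessment"
    if hosts_own_ehr:
        return "detailed self-assessment"
    return "basic self-assessment"

# A small hospice that outsources its records system:
print(hipaa_tier(441, hosts_own_ehr=False))  # basic self-assessment
```

As with the SAQ levels, the obligations would follow mechanically from a covered entity’s size and infrastructure rather than from its own judgment of what is “reasonable and appropriate.”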
But even more likely is what we see taking place today. The healthcare information technology industry is dividing care providers much as the merchant card industry already has: smaller players are hosting less and less of their own infrastructure, paying someone else to process their data and manage their information security compliance risk. So we can expect tiers of data protection expectations to emerge between those that merely collect and transmit patient information and those that store it.
The last question is this: is it scarier to have large amounts of electronic protected health information in centralized data centers run by professionals, or to have it segmented across thousands of practices trying to protect their patients’ records without the benefit of much IT support? If you think the centralization of data is riskier, at least consider that the newly issued HIPAA Omnibus rule from the Feds finally holds these data processors (a.k.a. Business Associates in HIPAA-speak) as accountable as the care providers. Just in time.
David, in response to your comment “What the world of HIPAA needs is something akin to the way PCI DSS approaches risk”: are you just trolling, or are you really that far out of the loop? Wake up and smell the coffee. The HITRUST CSF organizational framework, with both self-validation and full certification, has been around for years and is IMHO the best framework around for nearly any organization to follow. Most people “in the know” like it even better than the ISO 27000 series.
Kris,
Thanks for the spirited response (and the virtual cup of coffee; it’s been a long time since I sat at a cafe on Pearl Street in Boulder and sipped a cup).
The Payment Card Industry Data Security Standards were, and are, sponsored by the card brands themselves. While I am a huge fan of the HITRUST Common Security Framework, and while I know its development was sponsored by some heavy hitters in the industry, it still has not reached the point that PCI has in terms of industry acceptance. (None of the HITRUST sponsors have the influence in health care that Visa has in credit cards.)
There is also method in my reference (I don’t mean it to be trolling). I am beginning to tease out all the differences between card data and health data. This leads to thinking about differences between credit bureau data and the data that is making its way into state health data exchanges. That’s where I want to end up.
So when we think about standards, consider how the handling of card data is regulated, in part, by the Fair Credit Reporting Act, which does not get specific about security, only about use. The handling of health care data, by contrast, is governed by HIPAA, which has a whole section on security (even if it is somewhat general).
Anyway, this is the direction I am trying to begin to explore with this post. Thanks again for the comment.