Last week there was a report of a "health data breach" at Indiana University School of Medicine, hot on the heels of the "medical privacy breach" the week before at Stanford Hospital in Palo Alto, California. In the Stanford breach, a commercial website was found to contain data relating to 20,000 emergency room patients including "names, diagnosis codes, account numbers, admission and discharge dates, and billing charges for patients seen at Stanford Hospital’s emergency room during a six-month period in 2009." (New York Times)
The Indiana breach involved an unencrypted laptop from the department of surgery at the Indiana University School of Medicine. This laptop was "apparently" stolen from a physician's car in August according to the report in Health Data Management. The laptop contained health information related to more than 3,000 people, including name, age, gender, and diagnosis. In addition, for some 178 patients, the records included Social Security numbers.
While both incidents are regrettable and should never have happened, they differ in several important respects. For a start, the Stanford data was published online, and stayed online, for nearly a year. That is serious exposure. Even though criminal intent does not appear to be a factor in the data showing up online, there is no way to know the intent of people who may have viewed or downloaded the data while it was exposed. The Indiana data has not, as far as we know, been published, and it is quite possible that access to the data was not the motive for the theft (a fancy laptop sitting in a fancy car sounds like the ideal target of opportunity for someone looking to generate some quick cash by selling the hardware).
The one "good" thing that both incidents have in common is the potential to educate individuals and organizations about information security and data privacy. The Stanford case, as detailed by Kevin Sack in the excellent New York Times coverage cited earlier, highlights the importance of outside contractor security and speaks to a well-established cybersecurity best practice: Any organization that uses outside contractors needs to make sure that those contractors adhere to the same standards of information security as the organization itself.
In this case, Stanford Hospital transferred patient data to a billing contractor that apparently failed to protect the data adequately: it showed up online in a spreadsheet on a homework assistance website called Student of Fortune, used as sample data in an example of how to produce bar graphs. This breach is bad news for the contractor, but also for Stanford Hospital, even though the hospital spokesperson is quoted in the New York Times as saying: “there is no employee from Stanford Hospital who has done anything impermissible.”
In my opinion, that claim would not hold up if the hospital did not routinely follow best practices and obtain written assurances from its contractors that they have specific, well-documented policies and procedures in place to prevent exposure of personally identifiable information. The hospital would also need to show that it had been diligent in verifying those assurances and auditing those policies and procedures.
As for the Indiana incident, the lessons are perhaps more straightforward: don't leave your laptop in your car, and don't store sensitive data in an unencrypted state on a laptop (an encrypted thumb drive can be a practical alternative form of secure storage). Reports of the incident state that the laptop was password-protected, but a system access password alone does not prevent a person from getting to data on the hard drive. Although the HIPAA Security Rule does not require patient data on a hard drive to be encrypted, there are compelling reasons to use encryption, not least of which is avoiding the embarrassing and costly exposure of patient data.
Furthermore, the Health Information Technology for Economic and Clinical Health (HITECH) Act of 2009, which introduced mandatory notification of patients in the event that their records are exposed by a security breach, specifically exempts encrypted health data from these notification requirements. In other words, encrypted health information is not considered, under HIPAA, to be at risk if it falls into the wrong hands. (If you handle medical data, the American Medical Association has a very useful document on encryption here.)
Hopefully, both hospitals are wiser now, and other hospitals have learned from these incidents. If you fail to exercise due care with medical data shared with contractors, or to encrypt such data when it is stored on laptops, the consequences can be damaging to patients, to hospitals, and to society in general. After all, security failures like these undermine the potential of information systems to deliver benefits such as reduced healthcare costs and increased productivity.