Privacy of Data is an Ongoing Concern
This week, just outside of Toronto, an unencrypted USB key was lost that contained the names, government-issued ID numbers and personal health information of more than 80,000 patients who visited a local H1N1 vaccine clinic.
Here’s the story from CBC News:
Ont. privacy commissioner orders ‘strong encryption’ of health records
In December the Durham health authority, which is responsible for a large area east of Toronto, announced it had lost the medical records of thousands of people after a nurse misplaced a USB key at Durham region’s headquarters in Whitby, Ont. The information on the USB key, also known as a memory stick, was not encrypted. The device contained data collected from more than 83,000 patients during H1N1 flu vaccination clinics in the region between Oct. 23 and Dec. 15.
On Thursday, Ontario privacy commissioner Ann Cavoukian said Durham must ensure the safety of patient records and ordered it “to immediately implement procedures to ensure that any personal health information stored on any mobile devices [laptops, memory sticks, etc] is strongly encrypted.” Cavoukian made clear in her report that she expects every health authority in the province — not just Durham — to follow suit.
As Cavoukian notes, personal health information must be kept confidential. This includes (although Cavoukian doesn’t explicitly say so) personal health information that is part of research data. The safeguard she is urging all health authorities to implement (encryption) is something that ethics review boards should be urging researchers to use to protect the confidentiality of research data. And this, of course, applies not just to personal health information, but to any research data about which a promise has been made to maintain confidentiality. We have become much more sensitive to the careful protection of health information (although this story indicates otherwise!), but there is a great deal of research that has nothing to do with health in which careful attention must be paid to protecting confidential data and identities, as promised in many processes of consent.
Simply advising researchers to encrypt electronic data isn’t enough. Granted, it’s better than just protecting the data with a password. But there is more that researchers who use electronic data must think about.
There are two questions related to electronic data storage and security that ethics review boards must ask researchers to think about, if they haven’t already.
The first question is: where are the data being stored? The best-case scenario is always to store data locally, i.e. on secure servers that never require the data to be “moved” anywhere. Many hospitals and academic centres now have these, and issue staff passwords and “space” on the servers so that data can be stored locally. Once you transport data electronically, even just to a non-local server, you increase the risk of the data being lost, manipulated, leaked or corrupted. This is particularly worth keeping in mind when research projects involve electronic surveys. Many electronic survey tools store data remotely, even outside of the country where the research is being conducted. Case in point: Survey Monkey. Survey Monkey has long been the choice of many researchers who want to collect information easily using an electronic survey. However, the fact that its data are stored on servers within the USA, and are therefore subject to the USA PATRIOT Act, means that many non-USA researchers are now choosing Canadian-based, local survey tools.
The second question is: how are the data being stored? If the data are stored on a local server accessible only to the researcher and research team via a password-protected desktop computer in a securely locked office, that’s a very good start. The data don’t have to be “transported” anywhere, so they are difficult to lose and reasonably inaccessible to others. If, however, the data are stored on any kind of portable device such as a laptop or memory key — as data often are — then the data must be encrypted.
It’s now very easy to buy a USB key that encrypts the data stored on it. If you happen to lose it, the data are virtually useless to others. Such keys are much cheaper than they used to be and hold large amounts of data.
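For researchers curious about what this protection looks like in software rather than in hardware, here is a minimal sketch of file-level encryption using the third-party Python `cryptography` package (the record shown is invented for illustration; a hardware-encrypted USB key or full-disk encryption achieves the same protection without any programming):

```python
# Sketch: encrypting a record before it travels on a portable device.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# The key is stored separately from the device carrying the data,
# e.g. on the secure local server -- never on the USB key itself.
key = Fernet.generate_key()
fernet = Fernet(key)

# A hypothetical record, invented for this example.
record = b"patient_id=83001, vaccine=H1N1, date=2009-11-02"
token = fernet.encrypt(record)  # only this ciphertext goes on the device

# Without the key, the token is unreadable noise to whoever finds it.
assert token != record
# With the key, the research team recovers the data exactly.
assert fernet.decrypt(token) == record
```

The essential design point is that the key never travels with the device: kept on the secure local server, it turns a lost memory stick from a privacy breach into an inconvenience.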
One final strategy is to always require that researchers store identifiable information separately from other kinds of data. Coding lists, consent forms and raw data should always be stored securely and separately.
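As a rough sketch of what that separation looks like in practice (the participant names and responses below are invented), a coding list maps random codes to identities and is kept securely apart from the de-identified data set that the analysis actually touches:

```python
# Sketch: separating identities from research data with a coding list.
# All names and responses here are invented for illustration.
import secrets

raw = [
    {"name": "Participant A", "response": "agree"},
    {"name": "Participant B", "response": "disagree"},
]

coding_list = {}   # stored securely and separately from the data set
deidentified = []  # the only file the analysis ever uses

for record in raw:
    # A random code, deliberately not derived from the name.
    code = "P-" + secrets.token_hex(4)
    coding_list[code] = record["name"]
    deidentified.append({"id": code, "response": record["response"]})

# The de-identified file carries no names...
assert all("name" not in r for r in deidentified)
# ...but the separately stored coding list can re-link them if needed.
assert sorted(coding_list.values()) == ["Participant A", "Participant B"]
```

If the de-identified file is lost, no identities are exposed; a breach would require losing both files, stored in different places, at once.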
Hopefully none of these safeguards is news to most researchers and ethics review board members. They are, for the most part, not burdensome or time-consuming for the researcher. Clearly, though, as the above story demonstrates, many are not paying attention to these easy-to-implement safeguards, resulting in deleterious effects: loss of trust, broken promises of confidentiality, and the potential for significant harm.