Monday, April 26, 2010

Malinformed Consent will backfire on health research

Patients are deeply connected to their bodies, and when their bodies are measured the result is health information; so it should be no surprise that patients feel they SHOULD be just as well connected to where that health information goes. The feeling that their health information is going places they don't know about can and should make anyone uncomfortable.

A wonderful article in the NY Times - ‘Informed Consent’ and the Ethics of DNA Research - does a tremendous job of explaining various cases where a patient found out LATER that their data was used in ways that they didn't understand or authorize. This re-purposing of the information was clearly not the patient's idea. In many of the cases the patients said that they WOULD HAVE allowed the research to be done, but were outraged that they were never asked.

There are also clear indications in the article of a huge gap between how researchers communicate and how patients understand. This is not to put down patients' intelligence; I actually see this as a total failure of the book-smart researcher. Possibly our higher-education system needs to do a better job of exposing students to simple facts of humanity. Researchers tend to focus so much on the outcome that they don't recognize the human part of their source data. This is the focus of the 'informed' part of consent. Communicating with patients is critical to acceptance.

There is a small paragraph on the hugely important case of Henrietta Lacks from the 1950s. A new book by Rebecca Skloot is out on the life of Henrietta, which has prompted some of this reflection. I encourage everyone to do even the smallest research on Henrietta. We all have so many reasons to thank her, but she never knew. In her case it wasn't just her health information that was re-purposed, it was her actual body cells, which turned out to be amazing.

I have written before that technical methods of De-Identification are highly contextual. I believe that we can appropriately de-identify data for a specific purpose. But the fact that data has been de-identified for one purpose does not mean it is appropriately de-identified for another purpose. I can't tell which of the abuses noted in the NY Times article tried to de-identify, but clearly the method used failed. It is clear that in the case of the Havasupai Indians, the knowledge was exposed by someone involved in the research. The human factor is very hard to control with de-identification technology.
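To make that "contextual" point concrete, here is a minimal, hypothetical sketch. The field names and the two purpose profiles are my own invention for illustration (not from the article or any standard); the only point is that which fields must be suppressed depends entirely on the purpose the data is released for, so a profile that is adequate for one purpose can still leak identifying detail under another.

```python
# Hypothetical illustration: de-identification is judged relative to a purpose.
# The field names and purpose profiles below are invented for this sketch.

RECORD = {
    "name": "Jane Doe",
    "zip": "86435",            # small community: a quasi-identifier on its own
    "birth_year": 1952,
    "tribe": "Tribe A",        # group membership can re-identify a whole community
    "diagnosis": "type 2 diabetes",
    "genome_sample_id": "S-0042",
}

# Fields that must be suppressed for each (hypothetical) release purpose.
PURPOSE_PROFILES = {
    # Aggregate disease statistics: direct identifiers go, coarse data stays.
    "disease_statistics": {"name", "genome_sample_id"},
    # Data shared outside the original study for broader genetic research:
    # group identity and location must also go, or the community is exposed.
    "external_genetic_research": {"name", "genome_sample_id", "tribe", "zip"},
}

def deidentify(record: dict, purpose: str) -> dict:
    """Return a copy of the record with the fields this purpose requires removed."""
    suppress = PURPOSE_PROFILES[purpose]
    return {k: v for k, v in record.items() if k not in suppress}

if __name__ == "__main__":
    # The same source record yields two different "de-identified" views.
    print(deidentify(RECORD, "disease_statistics"))
    print(deidentify(RECORD, "external_genetic_research"))
```

The sketch is only meant to show that "de-identified" is a verdict about a record plus a purpose, not about the record alone; it says nothing about the human factor, which no suppression profile can control.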

Any single security or privacy tool is only a tool. It can be used wrongly or carelessly. Utilizing multiple tools together helps greatly, but can still fail. And before technology is applied, policies must first be thought through carefully and written down. I really hate that everything must wait on policies, but without policies we cannot understand the context or know when there has been a loss of control. I very much want healthcare researchers to do their job. I do not want to suffer from failing health as my ancestors have. But that does not mean that I want to suffer because of health information exposure. I am going to get my genes sequenced, and would love to allow researchers to play with the information. I want, truly want, to believe that nothing bad will happen.
