The Quantified Life and Data Ethics: Thinking beyond Privacy

I’ve read a few articles that have festered in my mind. Today, we’ll take a look at one.

Headline: “Who owns your medical data? Most likely, not you.” Click through to the article.

It’s a good bet that the fine print of the consent form you signed before your latest test or operation said that all the data or tissue samples belong to the doctor or institution performing it. They can study it, sell it, or do whatever they want with it, without notifying or compensating you, although they must make a good-faith effort to depersonalize the data so you remain anonymous.

I’ve signed a number of those forms myself, and I’ll admit I’ve never thought about it until recently, when I read that Memorial Sloan Kettering Cancer Center had been criticized for licensing patients’ personal data to a for-profit artificial intelligence start-up in which the hospital holds an equity stake.

Go read.

Now, I’m not a researcher or scientist. I’m a data analyst who previously worked in healthcare as an interpreter and educator in language services. A good portion of my career has been dedicated to ensuring accurate communication and supporting informed consent. Today, I try to ensure people can make sense of the data they have. I’m also endlessly fascinated by medical research and the ways it continues to make lives easier.

As practitioners in data, we see the best of our work. We see where we can help, make an impact, and contribute to good. The challenge is confronting our inner Dr. Frankenstein: recognizing where what we see as good may be creating monsters for others down the road. It requires thinking beyond the binary of good versus bad and realizing that all actions carry inherent risk; we are constantly balancing risks against benefits and making trade-offs, for ourselves and for others.

Here’s the rub: when I work with data, the greatest concern I run into is privacy. We work hard to de-identify information. And yet, even with all this focus, it’s likely that some of us can be identified through DNA databases without ever being tested, that our phones sometimes track us even when we’ve set permissions telling them not to, and that we don’t even need profiles to be tracked. Yet privacy is only one part of this discussion.
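To make “de-identify” a bit more concrete, here is a minimal sketch of the kind of pass a data practitioner might run before sharing records: drop direct identifiers and replace the patient ID with a salted hash. The field names, the salt handling, and the function names are all hypothetical, and real de-identification (HIPAA Safe Harbor rules, k-anonymity, and so on) goes much further than this.

```python
import hashlib

# Hypothetical, minimal de-identification pass before sharing records.
# Real pipelines also handle dates, ZIP codes, free text, rare values, etc.

DIRECT_IDENTIFIERS = {"name", "email", "phone", "street_address"}  # fields dropped outright
SECRET_SALT = "keep-this-secret-and-out-of-the-shared-dataset"     # assumed salt, not a real key

def pseudonymize(patient_id: str) -> str:
    """Replace a patient ID with a salted hash so records stay linkable without exposing the ID."""
    return hashlib.sha256((SECRET_SALT + patient_id).encode()).hexdigest()[:16]

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and pseudonymize the patient ID."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    if "patient_id" in cleaned:
        cleaned["patient_id"] = pseudonymize(cleaned["patient_id"])
    return cleaned

record = {"patient_id": "12345", "name": "Jane Doe", "email": "jane@example.com",
          "diagnosis_code": "C53.9", "age": 31}
print(deidentify(record))  # identifiers gone, patient ID replaced with a pseudonym
```

Even a step like this is pseudonymization, not true anonymity; as the examples above suggest, determined linkage through DNA databases or location traces can still undo it.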

You see, privacy falls under ethics. But, ethics don’t stop at privacy. In fact, there’s more we’re not discussing, like:

  • Harm prevention
  • Authority and decision-making
  • Sanctity of life
  • Rights to property and body
  • Trust
  • Fairness

If I use these values and look again at issues with DNA databases, tracking permissions, and shadow profiles, I can see there’s more to them than just privacy. Ethics allow us to specify where issues lie and work to correct them, or even avoid potential pitfalls when we’re proactive. We codify them, stating to other professionals in our work and consumers of our services that these values have merit.

In a formal Code of Ethics, they may look something like this.

  • Beneficence – do no harm
  • Respect of property – do not steal
  • Autonomy – support self-determination
  • Justice – be fair
  • Confidentiality – be trustworthy

In our work, we may think of some of these ideas as rules or laws. Others may feel more hidden or deceptively simple, depending on our vantage point. Some values we may even write off as not relevant to our work. That is, until we force ourselves to slow down and ask more questions. It’s also rare to have an ethical dilemma that hits on just one value…in fact, it’s the collision of these values that typically causes the conundrum in the first place…

Such as, who owns data if it’s anonymized?

Is it fair for corporations to profit off it when the individuals supplying it don’t?

…when people have no choice in opting in or out?

…when it’s collected with no clear plan or vision?

What if it’s for a cure?

…that is tethered to copyright?

…that gets purchased (in aggregate) to make insurance decisions?

…that affects more than just one generation? Or country?

The infamous case of Henrietta Lacks allows us to explore ethics that extend beyond privacy. In 1951, she went in for treatment of a particularly aggressive form of cervical cancer. Doctors took additional samples solely for research without telling her. Henrietta died that same year. Her cells from those tumors live on, a seemingly immortal line scientists may recognize as HeLa. While her cells were anonymous for many years, much of this story started to unfold in the 1970s. There’s not only a book about this, but a movie as well, if you want to learn more.

A part of this discussion is consent, which touches on autonomy, property, justice, and beneficence. Informed consent in the US has a long and complicated history. Most commonly, those with the least power have also had the least opportunity to consent. Consider the Tuskegee syphilis experiments, or the several women who have sued over similar issues as far back as the early 1900s.

Other parts may include justice and beneficence. Researchers gained much from Henrietta’s cells, but her family still struggles to this day. Ethics force us to look at this. Is it compensation that’s a challenge? What about recognition, legacy, or even acknowledgement? How does power play into this?

Moving back to the ownership of health data, we have lots of ethical questions to consider:

  • Autonomy – who has the authority to make what choices? How data literate does a person need to be to have informed consent?
  • Property – once cells or biological matter are removed from the body, who owns them? What if it’s descriptive data only?
  • Justice – Is having consent buried in legal forms fair? Can parts be opted out of while treatment is still administered, similar to redlining a contract? Can this information be used later against that patient or later generations? Does one party benefit more than another? If it’s more for the greater good, is this opt in or opt out?
  • Confidentiality – are we building trust and being clear in what is being done? Are we anonymizing appropriately?
  • Beneficence – how does this benefit the patient first?

And yes…

  • Privacy – how am I protecting this information? Am I collecting and keeping only what I need?

Some items we consider privacy are actually matters of confidentiality. Trust is imperative, particularly in healthcare. Patients who mistrust their providers may miss appointments, fail to disclose necessary information, or not adhere to treatment. Justice also enters this equation. The people most affected typically have the least recourse: individuals with severe health conditions who may face steep treatment costs and life-altering decisions, or people who don’t fully understand what their data is and how it will affect them long term.

Data is evolving from a once disposable artifact to the lifeblood that shapes how we experience reality. It’s collected widely and shared vastly. The data we provide today can shape lives beyond our own, as we’ve seen with Henrietta Lacks and as we’re starting to see in the array of headlines around this issue. It’s not enough to view data solely through the lens of privacy. Other ethical issues are at play, and they have long-term effects.

Go throughout your day. How many data points are you generating? How many of them can you control? If you wanted to opt out, what would you have to do? Now, you’re an analyst and probably fairly happy to give up most of this information. Ask others how they feel about this and why. Beyond matters of privacy, what other values might come into play?

We need to start checking our work and considering the effects beyond our snapshot view. We’re capable of so much, but things like cells and data take on lives of their own. The data we collect today will live longer than the time we take to analyze it, and it will reside in a database longer than we will typically stay with an employer. While the literal data may not change, our uses of it will, and it will continue to shape our realities.

What blind spots do we have with our creations?


Side note: these images come from my latest slide deck on Data Ethics. Contact me if you’re interested in learning more.