Health or privacy, the false dilemma

Over the last year, we have been confronted more often than usual with the following problem: if we had to choose between access to better healthcare services and more privacy over our personal data, which one would we pick?

For over a year now, technology companies have been racing to produce mobile applications that trace infection risk and tell us whether we’ve been in contact with someone potentially carrying the COVID-19 virus.

The downside is that, in this rush, data privacy often came second. The upside is that we’ve had the opportunity to observe how people actually react to surveillance phone apps, and what motivates them to use such applications or keeps them from doing so.


Health or privacy? The recent lesson from COVID-19 tracing apps

I admit there were times over the past year when I felt I had to choose between health (mine and my family’s) and many other things I enjoyed: going out and meeting friends are the first examples that come to mind.

I voluntarily installed a COVID-19 tracing app on my phone but, for some reason, I never gave it access to my location data. More than once, I wondered whether the trade-off was worth it.

As it turns out, the right answer is not so intuitive. If the danger is imminent, we tend to put health first, with the idea that privacy can be dealt with later. If the problem is not seen as life-threatening, people value their privacy as much as their health.

While researching this article, I came across a story by National Public Radio in which three people in different countries (France, Israel and China) shared their personal experiences of how their governments implemented contact tracing apps.

[Image: privacy vs health trade-off]

  • In China, while not overtly mandatory, using a COVID-19 tracing app was a precondition for going to work and accessing basic facilities such as public transport or shopping malls, leaving people with no choice but to accept the privacy limitations.
  • In Israel, a similar scheme was at first enforced through Israel’s security services (the domestic spy agency that usually tracked Palestinian suspects), which raised many eyebrows and sparked a major privacy debate. In the end, lawmakers considered it too intrusive and ruled it out.
  • The experience in France was a bit different because, in Europe, the discussion on COVID-19 tracing apps was organized from the start around the issues of privacy and anonymity. The apps eventually released there operated over Bluetooth, tracing proximity rather than identity or location. Still, adoption rates were more than disappointing, but we’ll get into that a bit later.

While nobody disputes the importance of having enough “big” data to analyze and reach sound decisions in controlling the spread of an epidemic, seeing firsthand how some governments chose to implement contact tracing apps and other control methods is troubling.

The question that surfaces is: why do we have to choose between saving our lives and saving our privacy? Why can’t we have both?

Reconciling human rights

The short answer is that we should be able to have both. Fundamental rights should not come into conflict with each other. This is the moment in history when we need to put ourselves in a different position and stop thinking in terms of either/or.

Of course, if you give people a choice between health and privacy, especially when a global pandemic is on the rise, they will almost always choose health. Anybody would choose to live first, because privacy doesn’t help you if you’re dead.

But the underlying concern behind the dilemma is really this: can we trust other people to hold our data? If we look back at history, we already do.

Take, for instance, legal professional privilege in common law jurisdictions, which protects all communications between a professional legal adviser and his or her clients from being disclosed without the client’s permission. Its earliest recorded instance dates from 1577 (note that the privilege belongs to the client, not the lawyer).

The privacy problem escalated in the healthcare system in the early 1990s, as more and more information was stored in electronic databases. One huge benefit of storing patient data electronically is portability, but with it comes the risk of data ending up in unauthorized hands, either by intent or by mistake.

The beginning of the 21st century paints an even more complex picture, with big data, algorithms and AI. But, according to historian Yuval Noah Harari, even though we cannot avoid big data and AI, we should not make people choose between their basic rights. And Alessandra Pierucci, Chair of the Council of Europe’s data protection “Convention 108” committee, seems to agree.

Is blockchain technology a solution?

Documenting the PharmaLedger project feels like being involved first-hand in solving the privacy-health dilemma. We resonate with both people’s privacy concerns and the need for enough data to offer better healthcare services.

While following the worldwide debate about COVID-19 tracing apps, I realized that if there is one candidate that could take a shot at reconciling these two basic human rights, it is blockchain technology.

Sure, we must collect more data. We must use AI, machine learning and algorithms. But we must do it with a clear purpose that the patient is on board with. We must have the transparency to prove at any time that we respect the rules. And we must avoid concentrating too much data in just one place.

It all comes down to these three principles:

  1. Clear purpose – Whenever we allow our personal data to be collected, it should be used only for the intended purpose, and not to manipulate or control us or to benefit third parties.
  2. Transparency – If we need to give access to more personal data, not just health data but, as with the coronavirus tracking apps, geolocation, who we meet, for how long and so on, then we should also have increased transparency about how this data is used, who is using it, and for how long.
  3. Decentralization – A system with clear governance rules that no single party can override.

But not all of these issues can be addressed solely through the intrinsic features of blockchain networks. Technology is just a tool; how the tool is used is up to the human factor. Let’s have a look at PharmaLedger’s blockchain-based solution.

The Personalized Medicine use case in PharmaLedger

PharmaLedger’s Personalized Medicine use case aims to reconcile these two fundamental rights, health and privacy. The value proposition at the core of the Personalized Medicine use case is:

Build an environment of trust through (1) transparency and (2) decentralization that is (3) patient-centered – meaning that data is only used to improve diagnosis, prevention and personalized treatments.

Here are the main patient rights at the center of PharmaLedger’s Personalized Medicine use case (a small illustrative sketch follows the list):

  • You are the rightful owner of your own data, and the only one who can give access to it
  • You have the freedom to save, share and access your own health data
  • You know how and when your data is being used
  • You can change your mind – you can revoke access to personal data at any time
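To make these rights a bit more concrete, here is a minimal, purely illustrative sketch of a patient-controlled consent registry. The names (ConsentRegistry, grant, revoke) are hypothetical and are not taken from PharmaLedger’s code; the point is only that access grants, audit records and revocation can be modelled explicitly.

```python
# Hypothetical sketch of patient-controlled access grants (not PharmaLedger code).
from datetime import datetime, timezone

class ConsentRegistry:
    """Tracks who may access a patient's data, and logs every use."""
    def __init__(self, patient_id: str):
        self.patient_id = patient_id
        self.granted: set[str] = set()                    # parties the patient has authorized
        self.audit_log: list[tuple[str, str, str]] = []   # (timestamp, party, action)

    def grant(self, party: str) -> None:
        self.granted.add(party)
        self._log(party, "granted")

    def revoke(self, party: str) -> None:
        self.granted.discard(party)                       # the patient can change their mind at any time
        self._log(party, "revoked")

    def access(self, party: str) -> bool:
        allowed = party in self.granted
        self._log(party, "access allowed" if allowed else "access denied")
        return allowed                                    # data is released only if consent is on record

    def _log(self, party: str, action: str) -> None:
        self.audit_log.append((datetime.now(timezone.utc).isoformat(), party, action))

# Example: grant a (hypothetical) research team access, then revoke it.
registry = ConsentRegistry("patient-42")
registry.grant("diabetes-research-team")
assert registry.access("diabetes-research-team") is True
registry.revoke("diabetes-research-team")
assert registry.access("diabetes-research-team") is False
```

In a real deployment, the registry itself would need to be tamper-evident and auditable by the patient, which is where the blockchain anchoring described below comes in.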

How will blockchain help? And why is a regular database not good enough?

If we look at how data is collected right now, we realize that the largest share of it is in the hands of a few big companies. There is no transparency over how the data is used, and it is out of our control.

In PharmaLedger, we are using a protocol called Open DSU, which was developed in another project we collaborated on, PrivateSky.

DSU stands for Data Sharing Unit. A DSU lives off-chain and may have some data processing capabilities of its own (in which case it is called a self-sovereign app). Keeping data off-chain is crucial from a data-security point of view: data stored in DSUs is encrypted, and only then is it anchored to the blockchain. The next picture shows how private data from off-chain DSUs is anchored to blockchains:

[Diagram: DSU (Data Sharing Units) & anchoring]
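To give a rough sense of the pattern the diagram describes, here is a minimal conceptual sketch, not the actual Open DSU API: the patient’s data is encrypted off-chain, and only a fingerprint (hash) of the encrypted content is anchored on the blockchain, providing tamper-evidence without ever exposing the data itself.

```python
# Conceptual sketch of off-chain DSU storage with on-chain anchoring.
# Illustrative only: the names follow the article, not the Open DSU API.
import hashlib
from cryptography.fernet import Fernet  # symmetric encryption (pip install cryptography)

class DSU:
    """Off-chain Data Sharing Unit holding encrypted patient data."""
    def __init__(self) -> None:
        self.key = Fernet.generate_key()   # the key stays in the patient's wallet
        self._ciphertext = b""

    def write(self, plaintext: bytes) -> None:
        self._ciphertext = Fernet(self.key).encrypt(plaintext)

    def read(self, key: bytes) -> bytes:
        return Fernet(key).decrypt(self._ciphertext)

    def anchor(self) -> str:
        """Fingerprint of the encrypted content; only this goes on-chain."""
        return hashlib.sha256(self._ciphertext).hexdigest()

# Hypothetical usage: the ledger stores only anchors, never the data itself.
ledger = []                                # stand-in for an on-chain anchoring service
dsu = DSU()
dsu.write(b"blood pressure: 120/80")
ledger.append(dsu.anchor())                # tamper-evidence without exposing data
assert dsu.read(dsu.key) == b"blood pressure: 120/80"
```

Only the holder of the key, the patient, can decrypt the data; anyone can verify that the anchored fingerprint still matches the encrypted content.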

This means that patients would be in control of their confidential data via a digital wallet on their smartphones. It is a bit like Revolut, only instead of money, you manage the keys to your private data.

Of course, the tech behind DSUs and self-sovereign apps is much more complex, but I wanted to use this opportunity to bring up an important point: we need to educate ourselves when it comes to data. What is it? Why is it important? What is it worth? And who should own it?

The fact is that we are only beginning to become aware of how important our private data is. Thanks to the latest technological developments, data has become a new asset class. It is subject to ownership and monetization. At worst, if it falls into the wrong hands, it can be used against our best interests.

After the experience of the past year, we learned that we can’t rely on governments to secure our data. For example, what happens when I willingly give my personal data to a consumer app? Does current legislation cover my privacy rights? The truth is that, at present, our private medical data may be at risk if transferred to consumer apps.

Data should be self-sovereign. PharmaLedger is a platform that will enforce data self-sovereignty and also lower the cost of entry for any scientist or researcher who can make good use of that data to bring to market an app or product that improves patients’ health. That is the democratization of data.

With a platform like PharmaLedger, any consumer app built on top of it will follow the platform’s governance rules, including the way data privacy is enforced. Enabling this type of ownership and providing information that is relevant, correct and easy to understand is essential to managing fear and increasing trust in data sharing.

High hopes and reasonable expectations

When we speak of new technologies, we must keep in mind reasonable timeframes for people to adopt them.

Even if a new technology is helpful, people still need time to make the necessary mind shift to change their behavior. Sometimes, these changes take years, even decades.

For example, we happily give out our geolocation to avoid traffic jams. But when it comes to giving out the same data to reduce the spread of a possibly deadly virus, we are skeptical. Why is that?

Perhaps it is because, when we use Google Maps, we get an immediate gain. We can see which streets other people are flocking to, so we can avoid taking the same routes. But we don’t necessarily realize that others are seeing what we are doing too, that other people are using our data. We are just trying to avoid the anxiety, frustration and anger of being caught in a massive traffic jam, and who could blame us?

The problem is that, beyond this immediate gain, we have zero control over the information we give away. Once ownership is lost, it is lost forever. It all circles back to education. Today we may pay little attention to the fine-print agreements we tick, but that may change in the future as we learn more about data ownership.

Takeaway thoughts

I hope this helps clarify the main pain points when it comes to trading privacy for health, and whether we should do it at all. If I had to boil it down to just three things that make the most sense to me, here is my list:

  • We may accept more intrusive technology out of fear in extraordinary situations. But this behavior is not long-lasting.
  • Even in extraordinary situations, we shouldn’t have to choose between our fundamental rights.
  • Decentralization, transparency and a user-centric approach create the layer of trust that we can build upon.

PharmaLedger is just one example of data ownership done right. As we continue the learning process, we will be more inclined to choose solutions that switch the privacy-for-benefits trade-off to our advantage.

What would you choose?

What would you do for better healthcare services? Would you give up your private data? Or would you install a digital wallet on your smartphone to keep your private data safe, just as you do with your other assets? Please let us know in the comments section below.