That data has immense value in today’s society is now common knowledge, and as technologies that capture and analyze data proliferate, so do the abilities of businesses and states to collect and exploit it. It is therefore fundamental that we ask ourselves whether our data is really private, and whether solutions exist that give us access to, and complete control over, our own data.

In this “data gold rush”, laws have struggled to keep up, and it has too often been left to private firms to decide what is ethical and what is not, and to act accordingly. This has led to several cases of personal data being misused.


Cases of misuse of personal data by private entities

In 2018, it was revealed that Cambridge Analytica, a UK political consulting firm, had acquired and used personal data from Facebook users that was originally collected by a third-party app. In total, Cambridge Analytica misused the data of nearly 87 million Facebook users, many of whom had never given explicit permission for the company to use, or even access, their information. The public only learned of the scandal when several employees came forward as whistleblowers.

More recently, it became public that authorities in the US states that have banned abortion are tracking and prosecuting women who end their pregnancies by accessing the location data from their phones. Figures from Google show that the company received 5,764 “geofence” warrants between 2018 and 2020 from police in the 10 states that have banned abortion. These warrants demand GPS data showing which mobile devices were present in a specified area during a particular time period. Google does not specify what alleged crimes these warrants concerned, and no known cases of geofence warrants being used to prosecute abortions have yet come to light; but the data shows that the use of these warrants is widespread1.


Medical data privacy in the US and China

It is not only personal and location data that is subject to harvesting and trading: medical data is too. Medical records are among the most sensitive information a person owns, so their privacy and accessibility raise many concerns. In this context, different world powers are adopting very different solutions:

  • The United States

In the United States, the Health Insurance Portability and Accountability Act (HIPAA) defines who is allowed to see patients’ medical records2. In some cases, patients need to give explicit permission before their records can be accessed. However, such permission is not always required; covered entities such as doctors, healthcare facilities, the government and even some buyers are allowed to access U.S. medical data under specific guidelines. Moreover, medical records in aggregated form – databases that include many anonymized data attributes – become accessible to a range of different entities, as organizations gain the right to share or sell them.

To summarize, patients’ medical records in the US are protected and private, yet they can still be legally accessed and purchased by various people or groups, even without patient permission. In none of these cases do patients have a say in how their data is used, or a way to tangibly benefit from this immense market.

  • China

China understood early that big data in health and medicine is a strategic national resource whose development could improve healthcare, and it has subsequently promoted the use of big data in medicine heavily. A campaign to actively collect and study citizens’ medical data without their consent began in 2016 in four cities chosen as pilot sites, and many more centres have been built since then. However, there is no specific law or guidance on data privacy in China3.


Is there an alternative?

The world seems to be evolving in a direction in which it is up to states to regulate how big private businesses can use people’s data – an equation in which the people, the actual owners of that data, are almost always an absent party.

We at Data Lake have created a consent-based infrastructure to unlock the exchange of medical data, empowering and rewarding individuals while securing access to data for medical research.

With Data Lake, citizens can simply sign a transaction on the Polygon blockchain to take control of their medical data in a safe and transparent way. Once consent is given, Data Lake sources the patient’s medical data and anonymizes it. When vetted Data Buyers doing legitimate research and development request a specific data set, the patient’s data is contributed within large anonymous data sets, and the patient is rewarded in $LAKE tokens for doing so.

$LAKE is the currency of medical data flow: once Data Lake registers a data purchase from verified researchers or private healthcare companies, the value of the purchase is recorded on the blockchain. From this transaction, an amount varying between 40% and 97% of the purchase value is converted into $LAKE tokens; these tokens are distributed among data donors, consent collectors, trust entities and the Data Lake treasury, as well as the token’s liquidity pool.
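To make the flow above concrete, here is a minimal sketch of the distribution step in Python. The 40–97% conversion range comes from this article; the exact split among recipients is not specified, so the share weights below are purely illustrative assumptions, not Data Lake’s actual parameters.

```python
# Hypothetical sketch of the $LAKE distribution step described above.
# The 40-97% conversion range is stated in the article; the per-recipient
# shares below are illustrative assumptions only.

def distribute_purchase(purchase_value: float, conversion_rate: float) -> dict:
    """Convert a data-purchase value into $LAKE and split it among recipients."""
    assert 0.40 <= conversion_rate <= 0.97, "article states a 40-97% range"
    converted = purchase_value * conversion_rate
    # Illustrative shares summing to 1.0 -- the real weights are not public here.
    shares = {
        "data_donors": 0.60,
        "consent_collectors": 0.15,
        "trust_entities": 0.10,
        "treasury": 0.10,
        "liquidity_pool": 0.05,
    }
    return {recipient: converted * share for recipient, share in shares.items()}

payouts = distribute_purchase(purchase_value=10_000, conversion_rate=0.80)
print(payouts["data_donors"])  # 10_000 * 0.80 * 0.60 = 4800.0
```

Whatever the real weights are, the key property sketched here is that the distributed amounts always sum to the converted portion of the purchase, with the remainder staying outside the token flow.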

True to the principles of self-determination of data, our system allows donors to revoke their consent at any time. With a “negative” consent status, donors’ data will instantly be excluded from any future datasets.
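The revocation rule above can be sketched as a simple filter at dataset-build time. The record fields and function names here are illustrative assumptions, not Data Lake’s actual API; the point is only that a donor whose consent status is “negative” never enters a future dataset.

```python
# Minimal sketch of the consent check described above: donors with a
# "negative" consent status are excluded from any future dataset build.
# Field and function names are illustrative, not Data Lake's actual API.

from dataclasses import dataclass

@dataclass
class DonorRecord:
    donor_id: str
    consent: str  # "positive" or "negative", as recorded at consent time
    anonymized_data: dict

def build_dataset(records: list[DonorRecord]) -> list[dict]:
    """Assemble a dataset from currently-consenting donors only."""
    return [r.anonymized_data for r in records if r.consent == "positive"]

records = [
    DonorRecord("a1", "positive", {"age_band": "40-49"}),
    DonorRecord("b2", "negative", {"age_band": "30-39"}),  # revoked: excluded
]
print(len(build_dataset(records)))  # 1
```

Because the check runs every time a dataset is assembled, revoking consent takes effect immediately for all future datasets, while datasets already delivered are unaffected.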


Our mission is to unlock medical data for the first time in a way that is democratic and that provides solid fundamentals for medical research to prosper. This will mean faster diagnoses, better treatments and medications, and ultimately improved patient outcomes and hope for millions. Visit to become a Data Donor, and follow our journey through the democratization of medical data at






