Since the start of the pandemic, an army of companies – from the Big Tech players down to start-ups most of us haven’t heard of yet – has rushed to be part of the Covid-19 response. This is the transformative ‘disruption’ to the healthcare sector they’ve been betting on – it just arrived a little differently than expected. But while they look set to cash in during the crisis, it could come at a terrible cost to our privacy.
By Tanya O’Carroll, Director of Amnesty Tech
The world of biotech was already blossoming before murmurings of an untreatable mystery virus started to circulate at the end of last year. There are now countless companies offering everything from personalized fitness plans based on your genes to dietary advice based on the bacteria in your gut. These companies and the VC firms that back them believe there is a huge, untapped market for selling AI-driven insights based on health data. Big Tech also wants in.
If these dynamics were already in motion, they have been on a fast track since the onset of the coronavirus pandemic. In March, Prime Minister Boris Johnson held a meeting with healthcare start-ups, big tech firms, and major healthcare players to discuss how they could help tackle the coronavirus pandemic. The government then quietly granted access to millions of UK health data records to Amazon, Microsoft, and Google – plus controversial data-mining firm Palantir – to build a Covid-19 Datastore, aggregating data from multiple sources, including testing data. At the other end of the spectrum, start-ups such as EverlyWell, Let’s Get Checked and CircleDNA, which sell home testing kits for things like genetics and blood diagnostics, have rushed to market new Covid-19 testing kits.
But while tech companies are renowned for moving fast, they also famously break things. One of the things that could be left broken is our right to privacy if bio-invasive surveillance creeps in as a new norm.
Covid-19 presents an unprecedented opportunity for tech companies to get their hands on health data. The fact that health data is governed by strict data privacy regimes in Europe and the US has long been a frustration for those who want to cash in on the sector, as it takes gargantuan datasets to train the kinds of AI models that can be monetized. One way private ventures can gain access to health data is by partnering with governments. Even if they cannot walk away with direct patient records, they can walk away with the lucrative AI models built from those records, which is the key that unlocks the value of the data. This helps explain why Palantir, a data-mining firm whose contracts frequently run to millions of dollars, has agreed to assist the government’s Covid-19 response for the cost of just £1.
Another way that companies can access data is by building up their own private vaults of health data directly from consumers. Google knew this when it moved to acquire Fitbit for $2.1 billion at the end of 2019, and Fitbit’s CEO also knew it when he said, “ultimately Fitbit is going to be about the data.”
This has been the model used by the home diagnostics market. Because they go ‘direct to consumer’, companies can solicit consent from consumers to use their genetic and other intimate health data for research, giving them very free rein in what they can then do with that data. The company 23andMe, which maps people’s ancestry based on their DNA, controversially made millions of dollars selling the behavioural, health, and genetic insights it had accumulated from its huge customer base to Big Pharma and biotechnology companies.
The problem is that once health data is on the market, it can be used in all kinds of ways that could never have been understood or predicted when someone ticked a ‘consent’ box. Advertisers, including pharmaceutical companies, can use AI models trained on genetics to target people who flag as higher risk for specific health conditions – despite the science behind such ‘predictions’ being shaky at best. Meanwhile, insurance companies may use the insights generated by big data to determine who gets coverage and at what price. And privately-held genetic data has already been used by law enforcement without the awareness or consent of those whose data was shared.
These are the mechanics of surveillance capitalism at work – the fact that the business model underpinning most digital services has been designed from the get-go to mine, profile and influence people at scale by capturing our data – and attention – and selling it to others. Last year, Amnesty International warned that this “surveillance-based business model” poses an unprecedented threat to human rights by forcing people to make a Faustian bargain: give up their intimate data – and their rights – in order to access the benefits of the modern world. This is simply not a legitimate choice and invalidates the so-called ‘consent’ that many companies rely on to justify their invasive data practices.
As companies scramble to create an even more intimate marketplace of our data – one that trades in insights about our biological selves – you don’t need to strain hard to imagine the endgame. Start-ups like China-based iCarbonX, dubbed “the next Google in BioTech”, have painted that vision for us. iCarbonX reportedly wants to capture more data about your body than has ever before been possible – combining genetic sequencing, data from frequent blood tests, microbiome insights and physical data from both wearable fitness devices and products like their smart mirror, which, according to their CEO, aims to produce “an exact 3-D figure of you: the fat, the muscle—your entire body shape, plus facial recognition, and what’s going on with your skin”. The main product currently advertised on the company’s website? A Covid-19 testing kit.
While Covid-19 didn’t create this problem, it looks set to accelerate it. It is now clear that continuous, high levels of population-scale testing will be necessary if we are to co-exist with the virus outside of lockdown. This means there is a major new market for testing. But people need to be able to trust that when they take a Covid-19 test – be it provided through the NHS, their employer or a private company – their data is protected and won’t be used for any other purpose. Data protection might not feel like a priority in a crisis, but we will be living with the consequences long after the pandemic is over if we fail to draw a protective line around our health data now.
This article was first published by Newsweek.