An ounce of prevention is worth a pound of cure. Rarely have we felt this so acutely as during the COVID-19 crisis. Thus we stand on the precipice of solidifying a panopticon.
Brace yourselves. We are learning the implications of our hyper-connected, technologically-advanced, and planet-spanning super-society at an exponentially accelerating rate. And yet such understanding arrives with troublesome latency.
Consider healthcare. Roughly eighty (!) years later we are still navigating the insurance-related consequences of WWII-era wage-fixing. The long tail of such macro-level decisions argues for making them with great care and re-examining them rigorously and regularly.
On initial examination the approach seems well intentioned and thoughtfully reasoned:
1. The protocol encodes your identity as a series of rotating, random, anonymous keys
2. The system keeps these keys local to your phone
3. Key generation and exchange are opt-in for end users
4. Keys are exchanged between nearby Bluetooth LE devices
5. The protocol captures only key-to-key contact, not location data
6. An infectee can report their status by uploading their relevant anonymous keys
7. Performing (6) requires a confirmation code from a healthcare provider
8. Users of the system download the keys of infectees to ascertain their exposure
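The flow above can be sketched in Python. This is an illustrative toy, not the actual protocol: the real design derives rolling proximity identifiers cryptographically from daily keys, and the confirmation check here is a placeholder.

```python
import os

class Phone:
    """Toy model of the decentralized contact-tracing flow described above."""

    def __init__(self):
        self.my_keys = []        # rotating anonymous keys; stay on-device
        self.heard_keys = set()  # keys heard from nearby devices over BLE

    def rotate_key(self):
        # A fresh random, anonymous identity key (size chosen arbitrarily).
        self.my_keys.append(os.urandom(16))

    def exchange(self, other):
        # Nearby devices swap current keys; only fact-of-contact is stored,
        # never location.
        self.heard_keys.add(other.my_keys[-1])
        other.heard_keys.add(self.my_keys[-1])

    def report_infection(self, server, confirmation_code):
        # Upload my keys only with a healthcare provider's confirmation.
        if confirmation_code != "PROVIDER-SIGNED":  # placeholder check
            return False
        server.update(self.my_keys)
        return True

    def check_exposure(self, server):
        # Download reported keys and intersect with the keys I heard.
        return bool(self.heard_keys & server)
```

In a toy run, two phones that exchanged keys register exposure once either one reports with a valid code, while a phone that never met the infectee sees nothing.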
There is a lot to like about this. Rotating anonymous keys make unmasking identities via pattern-of-life analysis difficult, the healthcare provider's certification thwarts spammers, local storage of telemetry and contact links keeps third parties away from non-pertinent information, and recording only the fact of contact, with no geospatial data, limits the system's utility to its intended purpose.
What could be so bad about stepping onto this seemingly high-friction slope? To answer that let’s zoom out a bit.
One of the great tragedies of the past decade took the form of our collective myopia in the wake of the Snowden leaks. Reasonable people can disagree on matters surrounding intelligence operations, civil liberties, and whistleblowers. Where we lost control of the narrative was in thinking too small.
In 2007 news agencies began reporting on divorce court proceedings buttressed by the E-ZPass toll collection system whose deployments began in 1993. This offered a glimpse of the world to come in which we would casually and pervasively trade anonymity for convenience and cash-back, but few people could have guessed the ultimate scope and scale. Apple had publicly released the iPhone just that year, AWS launched the previous year, only three years earlier Facebook launched and Google IPO’d, and meanwhile credit card providers were sitting on a rich vein of ore whose value they would take a few more years to fully realize.
Then in 2013, just as these private tech goliaths really started coming into their own, Snowden swung the spotlight onto government intelligence agencies. Three years later, in case there was any chance of re-focusing this conversation, Trump stole the stage. And only a few more years later, just as GDPR was beginning to enter the planetary consciousness by spamming us with cookie acknowledgement banners, COVID-19 comes roaring onto the scene and seems poised to further cement our dependence on Big Tech for nearly everything from ordering essential goods to maintaining relationships afar to tracing infected individuals.
We have been giving up an assortment of liberties piecemeal over a long time and now all the pieces are about to click together.
Back To The Present
On some level the Apple/Google partnership for contact tracing seems like a noble idea and a reasonable thing to try. In reality it will likely prove ineffective, and governments will be tempted to leverage more invasive means. Consideration of those means will raise public awareness of their existence, a potential headache for Big Tech, and actual usage at scale would carry huge risks of abuse, a reality that ought to be blindingly obvious. The purveyors of the world's biggest sensor networks have good reason at least to try to short-circuit the proceedings.
Let’s take a moment to consider goals we may pursue in our current crisis:
- Alert potential infectees of their risk status
- Locate potential infectees, test them, and quarantine them
- Enforce isolation of known infectees
The Apple/Google partnership, like a handful of similar recent proposals, supports only the first use case. Furthermore, the opt-in approach means both parties to a contact must have activated the technique, making a recorded key exchange unlikely; the messiness inherent in inferring a meaningful contact from BLE signals will produce a high error rate; and transmission routes that do not require human proximity, such as contaminated surfaces, will go unrecorded entirely. These three problems in concert seem likely to render the data set ineffectual, providing us only a poor solution to part of the problem.
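The opt-in arithmetic alone is sobering: a contact is captured only if both parties run the system, so coverage scales roughly with the square of the adoption rate. A back-of-the-envelope sketch, where the detection-accuracy figure is a made-up placeholder rather than a measurement:

```python
def capture_rate(adoption: float, detection_accuracy: float) -> float:
    # A contact is recorded only if BOTH phones opted in (adoption ** 2)
    # and BLE ranging correctly inferred a meaningful contact.
    return adoption ** 2 * detection_accuracy

for adoption in (0.2, 0.4, 0.6):
    print(f"{adoption:.0%} adoption -> "
          f"{capture_rate(adoption, 0.8):.0%} of contacts captured")
```

Even at 60% adoption and a generous 80% detection accuracy, fewer than a third of true contacts would be captured.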
Let’s assume for a moment, though, that there is an adequate deployment rate, the tech is pretty good, human proximity dominates as a transmission vector, and consequently a reasonable signal exists. How long until government and health officials are clamoring to extend the tech to the other use cases? And how much more comfortable would we be with this next increment of invasiveness having already gotten part way there? We cannot readily implement the other two use cases without much bigger privacy implications. And thus over time we may be tempted to employ an all-source intelligence approach as South Korea has, an automated reporting system as China has, or an electronic fence as Taiwan has.
Stumbling Into A Dark Future
We could totally do it. It wouldn't even be that hard. Smartphones from Apple and Google carry high-resolution GPS sensors to enable app functionality; a growing density of cell towers means improving location accuracy even without GPS; carriers like Verizon and AT&T won't let your phone communicate via their towers without identifiers sufficient to bill you; Facebook has assembled a 2.5-billion-person training data set for facial recognition; CCTV cameras are cheap and popping up everywhere; credit card usage is pervasive; and compute at massive scale is available to anyone with a credit card to plug into the cloud.
Three properties are worth noting about the history of the component technology. Firstly, it has emerged gradually, preventing us from properly appreciating the liberties we have been surrendering. Secondly, in many cases companies have created the tech for relatively benign purposes while providing consumers delightful benefits often at little or no monetary cost, making us willing participants. Thirdly, the development of the technology and storage of the resultant data has occurred in a highly fragmentary fashion, yielding a system with built-in integration challenges and inherent checks and balances.
But now we find ourselves in circumstances where it is incredibly tempting to centralize vast amounts of raw data in the hands of governments. And later, when COVID-19 is a painful but distant memory, the inheritors of such stupefyingly powerful technology may well weaponize it in ways we will lament. Make no mistake — we are poised to execute on some of the most impactful decisions in human history with repercussions that could last centuries.
Fielding high-tech solutions on a short fuse will likely either prove ineffectual because they don't go far enough or cause disastrous long-term collateral damage because they do. In the short term we should employ a variety of low-tech means, in the medium term we should field new technologies that deliver more of the safety benefits while minimizing privacy risks, and in the long term we should take a more holistic look at our nascent suite of technologies and the relationship we want to have with it.
On the prevention front we need to encourage and enforce risk-reducing behavior by standardizing practices exemplified by the grocery stores that are laying out tape markers for distancing purposes. For detection we ought not just be improving the accessibility of testing but also demonstrating some adaptability and elasticity by leveraging the myriad recently unemployed folks to perform the old-fashioned meatware-based contact chaining services that health departments have been perfecting for decades. And, obviously, on the response front we need to provide better access to personal protective equipment, ramp up hospital capacity to meet demand, make testing procedures readily accessible, and flood the world with hand sanitization stations.
I would wager that there is a bunch of tech on the horizon that could tip the equation in our favor. How about ultra-wideband technology, which compared to WiFi and Bluetooth offers relatively low power consumption and superior range resolution, already inhabits new Apple devices, and will soon find its way into Android devices? Maybe in addition to higher-confidence "contact" events I could also allow-list my socially non-distant brethren and then have my phone buzz if I carelessly wander too close to others?
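That buzz-on-approach idea is easy to express. The sketch below invents its inputs for illustration; real UWB distance measurements would come from a platform framework such as Apple's Nearby Interaction, not from code like this.

```python
ALERT_DISTANCE_M = 2.0  # hypothetical social-distancing radius, in meters

def should_buzz(device_id: str, distance_m: float, allow_list: set) -> bool:
    """Buzz only when a device outside my allow-list ranges too close."""
    return device_id not in allow_list and distance_m < ALERT_DISTANCE_M

# Hypothetical household allow-list: my socially non-distant brethren.
household = {"sibling-phone"}
print(should_buzz("stranger-phone", 1.2, household))  # stranger too close
print(should_buzz("sibling-phone", 0.5, household))   # allow-listed, no buzz
```

The design choice worth noting is that the allow-list and ranging stay on-device, consistent with the privacy posture of the protocol discussed above.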
We have talked a lot about the societies we would like to inhabit. Meanwhile, armies of scientists and engineers kept on inventing things for a combination of love and money, and businesses kept on exploiting those technologies in ways that are entirely rational from their local standpoint. Now we are courting a form of environmental disaster as we find ourselves surrounded by a dangerous amount of incredibly complex technology. It is not entirely unlike the beginning of the Industrial Age, when a renaissance in Chemistry resulted in a catastrophe for our water, soil, and air, or when a renaissance in Physics delivered us into the Nuclear Age. The Information Age is at least as dangerous, but in ways less screamingly obvious. Let us tread carefully and thoughtfully.