Is Apple and Google's Covid-19 Contact Tracing a Privacy Risk?

When Google and Apple announced last week that the two companies are building changes into Android and iOS to enable Bluetooth-based Covid-19 contact tracing, they touched off an immediate firestorm of criticism. The notion of a Silicon Valley scheme to monitor yet another metric of our lives raised immediate questions about the system's practicality and its privacy. Now it's time to seek answers.

Apple and Google say that starting next month they'll add new features to their mobile operating systems that make it possible for certain approved apps, run by government health agencies, to use Bluetooth radios to track physical proximity between phones. If someone later receives a positive Covid-19 diagnosis, they can report it through the app, and any users who have been in recent contact will receive a notification. The system is Bluetooth-only and fully opt-in; it collects no location data from users, and no data at all from anyone without a positive Covid-19 diagnosis. Apple and Google chose perhaps the most privacy-friendly of the many different schemes that could allow automated smartphone contact tracing.
But that doesn't necessarily mean it's private enough, or practical. Security- and privacy-focused technologists have pointed to a long list of potential flaws in Apple and Google's system, including techniques that could reveal the identities of Covid-19-positive users or help advertisers track them, false positives from trolls, mistaken self-diagnoses, and faulty signals between phones.

Those problems are real, but some have solutions. WIRED spoke to cryptographers and security experts about the potential pitfalls of Bluetooth contact tracing, and then posed those issues to a few of the technologists helping to build the contact-tracing systems at Apple, Google, and the TCN Coalition, a consortium of more than a dozen groups focused on Bluetooth-based contact tracing, including Covid Watch, Co-Epi, and Novid.

The result is a complicated picture: an unproven system whose imperfections could drive users away or even result in unintended privacy violations. And yet it may preserve privacy in the ways that matter most, while serving as a significant tool to help countries around the world prevent new outbreaks.

The criticisms of the Bluetooth-based system outlined below don't encompass some of the larger sociological and political issues surrounding smartphone contact tracing. Any effective contact tracing will require Covid-19 testing to ramp up far past current levels. Diagnosed or exposed individuals need the economic freedom and space to self-quarantine. And many low-income or older people—those who appear to be most at risk—are less likely to have smartphones. Instead, we'll examine the more immediate question of potential technical vulnerabilities in the system.

Can It Be Used to Track People?

The likeliest concern for anyone taking part in a contact-tracing system is whether they're signing up for more surveillance. Bluetooth-based contact tracing is perhaps the least surveillance-friendly option, but its protections aren't perfect.

To understand those flaws, first a refresher on how Google and Apple's scheme—and the similar one proposed by the TCN Coalition—will work. Contact-tracing apps will constantly broadcast unique, rotating Bluetooth codes that are derived from a cryptographic key that changes once each day. At the same time, they'll constantly monitor the phones around them, recording the codes of any other phones they encounter within a certain range for a certain span of time—say, within six feet for 10 minutes. (Both numbers are "tunable" based on new data about how Covid-19 infections occur.) When a user reports a positive Covid-19 diagnosis, their app uploads the cryptographic keys that were used to generate their codes over the last two weeks to a server. Everyone else's app then downloads those daily keys and uses them to recreate the unique rotating codes each one generated. If it finds a match with one of its stored codes, the app will notify that person that they may have been exposed, and will then show them information about self-quarantining or getting tested.
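The daily-key-to-rotating-code flow described above can be sketched roughly as follows. To be clear, the key size, derivation function, and rotation schedule here are illustrative stand-ins, not the published Apple and Google specification:

```python
import hashlib
import hmac
import os

# Illustrative parameters: the real specification defines its own
# key sizes, derivation functions, and rotation schedule.
ROTATION_PERIOD_MIN = 15                     # codes rotate every ~10-15 minutes
CODES_PER_DAY = 24 * 60 // ROTATION_PERIOD_MIN

def daily_key() -> bytes:
    """A fresh random key, generated on the phone once per day."""
    return os.urandom(16)

def rotating_code(day_key: bytes, interval: int) -> bytes:
    """Derive the short-lived broadcast code for one time interval.
    (HMAC-SHA256 here is a stand-in for the spec's derivation.)"""
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

def codes_for_day(day_key: bytes) -> list[bytes]:
    """All the codes a phone would broadcast over one day."""
    return [rotating_code(day_key, i) for i in range(CODES_PER_DAY)]

def exposed(heard: set[bytes], published_keys: list[bytes]) -> bool:
    """After a diagnosed user's daily keys are published, every other
    phone re-derives that user's codes and checks them against the
    codes it overheard. A match means possible exposure."""
    return any(code in heard
               for key in published_keys
               for code in codes_for_day(key))
```

Note that the server only ever sees daily keys from users who report a diagnosis; the matching against overheard codes happens entirely on each person's own phone.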
The system has every phone constantly broadcasting Bluetooth codes, but it limits any snoop's ability to eavesdrop on those codes and track a person's movements by switching up the numbers every 10 or 15 minutes. Even so, Ashkan Soltani, former chief technologist for the Federal Trade Commission, has pointed out that a so-called "correlation attack" could still allow some forms of tracking.

To demonstrate the problem, Soltani imagines a nosy neighbor setting up a camera outside their window and recording the face of everyone who walks by. The same neighbor also "roots" their phone so they can see all the contact-tracing Bluetooth signals it picks up from other users. When one of those passersby later reports that they're Covid-19 positive, the snoop's app will receive all of their keys from the contact-tracing server, and the snoop will be able to match up the codes the user broadcast at the moment they passed the camera, identifying a stranger as Covid-19 positive. They might go as far as posting the picture of that infected person on Nextdoor to warn neighbors to watch out for them.
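Soltani's scenario can be made concrete with a short sketch. Everything here is a stand-in (the code-derivation function is hypothetical, not the actual Apple and Google algorithm), but it shows how publishing a diagnosed user's daily keys lets a snoop link overheard codes back to a camera frame:

```python
import hashlib
import hmac

CODES_PER_DAY = 96  # one code per ~15-minute interval (illustrative)

def rotating_code(day_key: bytes, interval: int) -> bytes:
    """Stand-in derivation: daily key plus time interval yields the
    short-lived broadcast code."""
    return hmac.new(day_key, interval.to_bytes(4, "big"),
                    hashlib.sha256).digest()[:16]

def correlate(published_keys: list[bytes],
              sniffed_log: list[tuple[int, bytes, str]]) -> list[str]:
    """The snoop keeps a log of (interval, overheard code, camera frame)
    triples. Once a diagnosed user's daily keys are published, the snoop
    re-derives that user's codes and returns every camera frame whose
    overheard code matches, linking a face to a positive diagnosis."""
    hits = []
    for key in published_keys:
        derived = {rotating_code(key, i) for i in range(CODES_PER_DAY)}
        for _interval, code, frame in sniffed_log:
            if code in derived:
                hits.append(frame)
    return hits
```

The attack works because anyone, not just legitimate contact-tracing apps, can download the published daily keys and re-derive the codes; the mitigation question is whether a snoop can reliably pair a specific overheard code with a specific moment on camera.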