Public health experts rushed to create contact tracing apps in countries all over the world this spring. The apps serve an important purpose: determining who might have been exposed to the novel coronavirus so they can be tested and isolated. But the risks were clear too. Contact tracing apps have the power to amass personal data that reveals your movements, activities and relationships.
The potential harm from contact tracing apps came into focus at Defcon, an annual gathering of hackers that's taking place online this week. Two presentations focused on the privacy failings of contact tracing apps. The verdict is clear: The apps have a tendency to collect information they don't need.
This data-hungry mindset isn't how governments should approach contact tracing apps, said Eivind Arvesen, a security researcher from Norway who presented at Defcon on Friday. Instead, they should be asking themselves, "How little data can I get away with to try to solve this concrete issue, and no more?"
Arvesen presented on Norway's now-defunct contact tracing app, which he helped review as part of a government-funded third-party audit. Another presentation, on Saturday, will focus on the permissions asked for by contact tracing apps, as well as COVID-19 symptom tracking and information apps.
Human contact tracers usually hunt down the known contacts of someone who tests positive for a contagious disease like COVID-19. Apps seek to fill in the blanks, capturing exposures a patient couldn't name, like encounters with strangers. When two strangers stand near each other, for example, the apps record that contact in case either of them tests positive in the days that follow. For the apps to be effective, a high percentage of the population has to use them.
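In the decentralized designs favored by privacy experts, the matching step works roughly like the sketch below. This is a simplified illustration, not any country's actual implementation: each phone keeps a local log of the anonymous identifiers it has observed nearby, the health authority publishes the identifiers belonging to confirmed cases, and each phone checks for overlap on the device itself.

```python
# Simplified sketch of decentralized exposure matching.
# Identifier formats and the publication mechanism are illustrative only.

def check_exposure(observed_ids, published_positive_ids):
    """Return the identifiers this phone saw that belong to confirmed cases.

    observed_ids: anonymous IDs this phone recorded from nearby devices
    published_positive_ids: IDs released by the health authority
    """
    return sorted(set(observed_ids) & set(published_positive_ids))

# IDs this phone recorded from nearby phones over the past week:
seen = ["id_19f3", "id_77ab", "id_c042"]

# IDs the health authority later publishes for users who tested positive:
positive = ["id_77ab", "id_e901"]

print(check_exposure(seen, positive))  # ['id_77ab']
```

Because the comparison happens on the phone, no central server ever learns who met whom, only which identifiers belong to confirmed cases.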
As soon as public health agencies turned to apps to augment the contact tracing process, privacy experts warned of risks. Governments should be transparent about the data they take from phones, avoid collecting unnecessary data, and also plan to end the collection and delete the data when the pandemic passes. Universities, including MIT, and tech companies, like Apple and Google, jumped to create privacy-respecting software that governments could use in their apps.
Norway's contact tracing app
Arvesen said Norway's app collected location data and assigned each user a single, unchanging identification code, creating a permanent and thorough record of their movements stored centrally on a server. That might sound ideal for contact tracers, but privacy experts say collecting location data is unnecessary and should be avoided. Where two people were when they met doesn't matter. All that counts is that they met.
It also isn't necessary to give each user a single, unchanging identifier. Other apps have found ways to avoid this, with some protocols changing the user's identifier as often as once a minute. That approach makes it much harder for someone to abuse the data to track an individual's movements.
Finally, some apps store the data locally on the user's phone and access it only if that person tests positive and agrees to share the data.
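One common way to rotate identifiers without losing the ability to match later, sketched hypothetically below along the lines of decentralized protocols such as DP-3T, is to derive each short-lived ID from a secret daily key held on the device. The phone broadcasts only the rotating IDs; if the user tests positive and consents, they share the daily key, and other phones can regenerate those same IDs locally to check for past contact.

```python
import hmac, hashlib, secrets

def ephemeral_id(daily_key: bytes, interval: int) -> str:
    """Derive a short rotating identifier from a secret daily key.

    The ID changes every time interval (e.g., once a minute), so an
    observer can't link broadcasts together -- but anyone given the
    daily key can regenerate the same IDs to check for past contact.
    """
    mac = hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.hexdigest()[:16]

# Each phone generates a fresh secret key per day, kept on the device.
key = secrets.token_bytes(32)

# The broadcast ID rotates every interval...
assert ephemeral_id(key, 0) != ephemeral_id(key, 1)

# ...but is reproducible from the key, enabling consensual matching.
assert ephemeral_id(key, 0) == ephemeral_id(key, 0)
```

The design choice is that the daily key never leaves the phone unless the user explicitly shares it, which is what keeps matching consensual.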
As Arvesen and his fellow reviewers prepared their report on Norway's app, regulators from the country's Data Protection Authority signaled they were also concerned. Then, the country shut down the app.
Apps around the world take location data
Arvesen said he found the app to be worse on privacy than other contact tracing apps in Europe. But data-hungry apps exist elsewhere in the world. The creators of COVID-19 App Tracker, who are presenting their findings on Saturday, automatically scanned 136 apps from countries around the world and found that most of them ask for permissions they don't need.
Of the apps scanned, three-quarters asked for location data, said Megan DeBlois, a co-creator of the website. Some of the apps simply help users keep track of their symptoms and have no reason to ask for location data.
DeBlois teamed up with her brother and their respective partners to create the app tracker, and all are volunteers. The goal of the project is to capture information about every governmental COVID app on the Google Play store and make it publicly available.
Permissions are only part of the picture. To really understand how an app behaves, researchers have to look at the data it sends and receives when it's in use. Security auditors like Arvesen can do that on behalf of governments.
DeBlois said she'd like to see more transparency about the data used in contact tracing apps. Ideally, governments would make the code open-source, making it easy for privacy researchers to analyze it and flag any problems for the general public.
One possible reason governments haven't done this is the speed with which they've had to create the apps. The rush could have prompted governments to set aside security reviews that would normally take place before software gets deployed to users. In that case, open-sourcing the code would make it easy for bad actors to find obvious flaws and exploit them.
Without the reviews, DeBlois and Arvesen both said, users can't trust that the government is taking only the data it needs, and keeping it safe.
"We want people to look at the code," DeBlois said. "You can verify it through the code, build that trust."