Contact-tracing apps are designed to complement the manual data collection that healthcare experts carry out when a patient tests positive for a highly contagious disease. This process is normally conducted via a questionnaire that identifies the people the patient has been in direct contact with recently. The basic principle of tracing apps is to notify those in the patient's environment who may be at risk of infection as quickly as possible, so that suitable measures can be taken, such as isolation or administering a test.
Compared to manual processes, applications do not require patients to remember everyone they have been in contact with. To achieve this, the application must register when two users come within a distance that represents a risk of infection, and detect when someone a user has recently been in contact with tests positive for the disease. It must also allow the individual identified as a carrier to notify everyone he or she has been in contact with. This entire chain of information must work while safeguarding the anonymity of all parties involved, and this is where the challenge lies.
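The contact-registration requirement above can be sketched as a local, on-device log. Note that the two-metre risk distance, the 14-day retention window, and the class and method names are all illustrative assumptions, not taken from any real app:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

RISK_DISTANCE_M = 2.0   # assumed proximity threshold for a risky contact
RETENTION_DAYS = 14     # assumed retention window for stored contacts

@dataclass
class ContactLog:
    """Local, on-device record of close encounters (illustrative only)."""
    entries: list = field(default_factory=list)  # (contact_id, timestamp)

    def register(self, contact_id: str, distance_m: float, when: datetime) -> None:
        # Only log encounters close enough to pose an infection risk.
        if distance_m <= RISK_DISTANCE_M:
            self.entries.append((contact_id, when))

    def recent_contacts(self, now: datetime) -> set:
        # Contacts to notify if this user later tests positive.
        cutoff = now - timedelta(days=RETENTION_DAYS)
        return {cid for cid, t in self.entries if t >= cutoff}
```

Keeping the log entirely on the device, as here, is what distinguishes the decentralized approaches discussed below from a central registry of encounters.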
Privacy or security?
Although reactions vary by society, as a general rule the initial response is rejection when people are told the authorities want them to install an application that will report citizens' movements. There are many reasons for this: doubts about whether the app really does what it claims, mistrust about correct data use and processing, fear that negligence could lead to a data leak from the central servers, and so on. This makes it difficult for an application launched by a country's healthcare authorities to reach a 60% adoption rate, the figure required for it to be fully effective according to epidemiological studies by the University of Oxford. The tracker of contact-tracing applications compiled by the Massachusetts Institute of Technology shows that most governments are well aware of this problem and have taken measures to make their proposals attractive from a privacy standpoint. Of that long list, only 10 use device location data; the rest rely on Bluetooth technology, which removes information about individuals' locations from the equation and puts the focus on short-distance contacts. Furthermore, to alleviate end users' distrust, many proposals have adopted a transparency model, making the software's source code public. Unfortunately, in most cases this transparency covers only the app installed on the device: the processing and operation that take place on the server side remain opaque, so users must trust that their data will be destroyed once they are no longer relevant.
Two very similar proposals have emerged in response to this problem, focused on collecting only person-to-person contacts and avoiding sending this information to central storage unless absolutely necessary. One of the first solutions adopted in some governments' developments was DP-3T (Decentralized Privacy-Preserving Proximity Tracing). Its technical specification was designed by a consortium of technologists, legal experts, engineers, and epidemiologists whose idea was to create a decentralized system that prioritizes user anonymity and gives users control over the information they share.
For their part, and taking inspiration from DP-3T, Apple and Google have joined forces to offer developers a unified API that lets the services of these two giants be used to develop contact-tracing applications easily. Both companies have announced their intention to implement what is known as the Exposure Notification API in future updates of their respective operating systems, and have shared some details, such as the possibility of enabling and disabling it by region, according to each country's needs and the security and privacy requirements demanded by governments. The basic principle of both solutions, which have already been adopted in several contact-tracing apps, is similar. Using Bluetooth, the application regularly emits random identifiers while also registering those transmitted by nearby devices. All information about the identifiers emitted and received is stored locally on the device. When a user of the app tests positive for the disease, the pertinent healthcare authority provides the individual with a single-use code (to prevent a malicious user from sending false positives) that can be used to upload to a remote server the identifiers emitted in recent days, which are then registered in a public database. To verify whether the user has been in contact with a confirmed case, the app periodically queries the database of identifiers published by diagnosed users and checks them against recently received codes. If there are several consecutive matches, equivalent to an exposure time greater than 15 minutes (the threshold established by the World Health Organization), the software alerts the user so that he or she can take the appropriate measures.
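The matching step just described can be sketched in a few lines of Python. The 15-minute threshold comes from the text; the function name, the data shapes, and the idea of measuring an unbroken run of matching beacons are illustrative assumptions, not the real Exposure Notification API:

```python
from datetime import datetime, timedelta

EXPOSURE_THRESHOLD = timedelta(minutes=15)  # WHO exposure-time threshold

def exposure_detected(received, published_positive_ids):
    """received: chronologically ordered (identifier, timestamp) pairs
    heard over Bluetooth; published_positive_ids: set of identifiers
    uploaded by diagnosed users. Returns True when an unbroken run of
    matching identifiers spans 15 minutes or more. Illustrative sketch."""
    run_start = None
    for ident, ts in received:
        if ident in published_positive_ids:
            if run_start is None:
                run_start = ts           # first beacon of a matching run
            if ts - run_start >= EXPOSURE_THRESHOLD:
                return True              # consecutive matches cover 15+ min
        else:
            run_start = None             # run broken: reset
    return False
```

Because the check runs entirely against a locally stored beacon history and a public list of identifiers, neither the server nor other users ever learn who was near whom.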
A matter of nuances
Privacy, just like security, is never absolute. From the moment an app starts to log movements, information is created that puts our privacy at risk. The main threat to anonymized systems is the so-called linkage attack, which lets an adversary associate a series of anonymous data points with a concrete person, provided the adversary obtains enough information to correlate them with other sources. For example, even if the data never leave the device, an adversary with access to the records of two users can confirm that those two people were physically close at a specific time and, if the location of one of them is known, deduce that the other person was in the same place. This risk grows even larger with centralized databases, particularly those that contain historical location data. If they are not suitably protected, at both the infrastructure and legislative levels, they could become a serious risk to users' freedom and privacy. Finally, we should not forget that these systems handle highly sensitive information, making it imperative to ensure, as far as possible, that the entire system is free from vulnerabilities, including the device on which the application runs.
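The linkage attack described above can be illustrated with a toy join between two data sources. All pseudonyms, names, and places below are invented for the example:

```python
def link(contact_events, known_locations):
    """Join anonymized contact events with an adversary's auxiliary
    knowledge to place the *other* pseudonym at a known location.
    contact_events: (pseudonym_a, pseudonym_b, timestamp) tuples;
    known_locations: {(pseudonym, timestamp): (real_name, place)}."""
    deductions = []
    for a, b, ts in contact_events:
        if (a, ts) in known_locations:
            person, place = known_locations[(a, ts)]
            # Proximity implies b was wherever `person` was at that time.
            deductions.append((b, place, ts))
    return deductions

# Dataset A: anonymous proximity records, as a tracing app might store them.
events = [("user_7f3a", "user_c21d", "2020-05-01T10:05")]
# Dataset B: the adversary already knows where one pseudonym's owner was.
aux = {("user_7f3a", "2020-05-01T10:05"): ("Alice", "Central Station")}
```

The attack needs no cryptographic break: simply holding both datasets is enough to place the second pseudonym at a concrete location, which is why minimizing what leaves the device matters so much.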
As a closing reflection, it is noteworthy that, with a few exceptions, many governments are making a real effort to balance the right to privacy against prevention measures, which suggests that we are moving in the right direction.