Balancing Public Health Benefits and Privacy Concerns in COVID-19 Mobile Surveillance Tools
Public health officials worldwide are struggling to manage the lethal coronavirus disease 2019 (COVID-19) pandemic. As part of the response, governments, technology companies, and research organizations are leveraging emerging data-collection and data-analysis capabilities to understand the disease and to model and track its spread through communities. Facilitated by a trove of technology-based data sources—in particular, the data generated from the widespread use of mobile phones—these public health surveillance programs could prove especially valuable for preventing successive waves of infections as quarantine orders are relaxed and economies reopen.
Dozens of countries, including the United States, have been using mobile phone tools and data sources for COVID-19 surveillance activities, such as tracking infections and community spread, identifying populated areas at risk, and enforcing quarantine orders. These tools can augment traditional epidemiological interventions, such as contact tracing, with technology-based data collection (e.g., automated signaling and record-keeping on mobile phone apps). As the response progresses, other beneficial technologies could include tools that authenticate those with low risk of contagion or that build community trust as stay-at-home orders are lifted.
However, the potential benefits of COVID-19 mobile phone–enhanced public health (“mobile”) surveillance programs are accompanied by the potential for harm. The collection of sensitive data, including personal health, location, and contact data, poses significant risks to citizens. People whose personal information is being collected might worry about who will receive the data, how those recipients might use the data, how the data might be shared with other entities, and what measures will be taken to safeguard the data from theft or abuse.
The risk of privacy violations can also impact government accountability and public trust. The possibility that one’s privacy will be violated by government officials or technology companies might dissuade citizens from getting tested for COVID-19, downloading public health–oriented mobile phone apps, or sharing symptom or location data. More broadly, real or perceived privacy violations might discourage citizens from believing government messaging or complying with government orders regarding COVID-19.
As U.S. public health agencies consider COVID-19-related mobile surveillance programs, they will need to address privacy concerns to encourage broad uptake and protect against privacy harms. Otherwise, COVID-19 mobile surveillance programs likely will be ineffective and the data collected unrepresentative of the situation on the ground.
Developing a Privacy Scorecard for COVID-19 Mobile Surveillance Programs
To help public health officials understand and evaluate the privacy implications of mobile surveillance programs, RAND Corporation researchers developed a concise, standardized, and transparent privacy scorecard. Conciseness is important because privacy policies for data collection and use are often lengthy and written in complex legal jargon that prevents a typical user from reading and understanding them. The RAND team wanted a standardized approach because there are many types of mobile surveillance programs that can be used to monitor COVID-19. Public health agencies will need to be able to compare not only the efficacy and usability of such programs but also the privacy protections included in different programs to make good decisions regarding intervention selection. Finally, transparency is critical to building trust with potential users of mobile surveillance program tools.
The research team analyzed documents from diverse sources—including advocacy groups, technology companies, government officials and members of Congress, and key laws (such as the European Union’s General Data Protection Regulation, the Health Insurance Portability and Accountability Act, and the California Consumer Privacy Act)—to develop a set of criteria that apply to the mobile surveillance programs being used in COVID-19 response. “Scores” are assigned based on the extent to which the program satisfies the criteria (as determined by an objective, fact-based evaluative question). The scoring options are (1) fully satisfied, (2) partly satisfied, (3) not satisfied, (4) unclear, or (5) not applicable (N/A).
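The criteria-question-score structure described above can be sketched as a simple data model. The following Python sketch is illustrative only: the class names, field layout, and summary function are assumptions for exposition, not part of the RAND tool.

```python
from enum import Enum

class Score(Enum):
    """The five scoring options defined in the RAND privacy scorecard."""
    FULLY = "fully satisfied"
    PARTLY = "partly satisfied"
    NOT_SATISFIED = "not satisfied"
    UNCLEAR = "unclear"
    NA = "not applicable"

# Each criterion pairs a category with an objective, fact-based question.
# Only two of the 20 criteria are shown here.
CRITERIA = {
    ("Transparency", "Open source"):
        "Is the program software code open source?",
    ("Purpose", "Narrow scope"):
        "Does the program relate exclusively to the COVID-19 response?",
}

def summarize(assessment: dict) -> dict:
    """Count how many criteria received each scoring option."""
    counts = {option: 0 for option in Score}
    for score in assessment.values():
        counts[score] += 1
    return counts

# Example: a partial assessment of a hypothetical program.
assessment = {
    ("Transparency", "Open source"): Score.FULLY,
    ("Purpose", "Narrow scope"): Score.PARTLY,
}
print(summarize(assessment)[Score.FULLY])  # 1
```

Tallying scores this way mirrors how the report summarizes programs (e.g., "fully met 16 of the 20 scorecard criteria"), which makes side-by-side comparisons straightforward.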
To demonstrate how the scorecard can be used to assess and compare mobile surveillance program tools, the research team scored 40 such programs in 20 countries (including the United States). They found that there is considerable variance across programs, even when those programs are focused on a similar activity (e.g., symptom tracking, contact tracing). For example, Australia’s COVIDSafe contact tracing program fully met 16 of the 20 scorecard criteria and partially met two other criteria. By contrast, South Korea’s contact tracing program fully or partially met only six criteria and did not meet nine; the remaining five criteria were either unclear or not applicable. (Completed scorecards for all assessed programs are available in Appendix B of the full report.) The research team did not evaluate the level of penetration or the efficacy of the programs they scored.
Using the scorecard to compare two contact tracing programs—Australia’s COVIDSafe and South Korea’s location-based text alerts—reveals considerable variation in privacy protections between the two programs.
RAND-Developed Privacy Scorecard Criteria and Questions

| Category | Criterion | Evaluative Question |
| --- | --- | --- |
| Transparency | Policies | Does the program provide answers to all the privacy questions that were identified? |
| | Public audit | Are the data collected by the program auditable by the public or an independent third party? |
| | Open source | Is the program software code open source? |
| | Disclosure of data collected | Are users explicitly told what type(s) of data (e.g., GPS, Bluetooth) are collected? |
| | User-specific data visibility | Can users view and correct the data that pertain to them? |
| Purpose | Narrow scope | Does the program relate exclusively to the COVID-19 public health response? |
| | Secondary use prohibition | Does the program prohibit secondary uses (e.g., making data available for sale or providing them to other entities or companies)? |
| | Law enforcement firewall | Are the data available only to public health officials and not to law enforcement? |
| | Data minimization | Does the app collect only the minimum amount of information necessary to achieve the stated purpose? (For instance, does it collect information about users who have not opted in, or specific details, such as timestamps, if only a general date is necessary?) |
| Anonymity | Real identities obscured | Does the program anonymize the real identities of the users? |
| | Reidentification prohibition | Does the program prohibit the reidentification of anonymized users? |
| Informed Consent | Voluntary | Can users opt out of the program without punitive consequences, such as being denied access to certain services or goods? |
| | Specificity | Do users give consent for the data to be used for the program’s specific purpose? |
| | Revocable | Can consent be withdrawn (for example, by deleting the app)? |
| | Data deletion | Does the user have the right to delete data that are collected? |
| Temporal Limitations | Sunset clause | Is there a predetermined date when the program will end? |
| | Data time limits | Are there limits to how long specific data are collected, processed, and stored? |
| Data Management | Encryption | Are the data that are collected encrypted? |
| | Local storage | Will data be stored and processed entirely on the user’s mobile device? |
| | Policies | Are there clear policies about data management and cybersecurity practices? |
How Federal, State, and Local Officials Can Use the Scorecard Approach to Protect Privacy and Support Public Health Goals
As U.S. public health agencies continue to develop, assess, and promote mobile surveillance programs, there are several actions that federal, state, and local officials can take to strengthen privacy protections for mobile surveillance program users.
Recommendations for the Federal Government
- The federal government could support state and local agencies’ efforts to implement mobile surveillance programs by creating a registry of such programs that includes scorecard-based information about privacy protections for each. Agencies could use this repository of information to support selection of programs for use in their respective jurisdictions and to coordinate approaches.
- The robust technology sector in the United States likely will continue to drive the development of additional approaches to mobile surveillance programs. The federal government can play a strong role in coordinating stakeholders across technology, public health, and privacy communities to ensure that such programs serve their stated public health goals and incorporate strong privacy protections, including those identified in the privacy scorecard.
- To further ensure that mobile surveillance programs have a narrow scope focused on public health goals, the federal government should clarify the authority or authorities under which federal agencies may use the data that are collected.
- More broadly, there is an opportunity for the federal government to promote a national culture of consumer data privacy to both build trust with the public and prevent the abuse of mobile data. In particular, a push by the executive branch and Congress for a national data privacy law could guard against the charge that mobile public health surveillance programs are a means to expand unchecked government power. A federal law could help address concerns over private-sector collection and use of sensitive health and behavioral data and provide a cohesive approach to regulating and clarifying companies’ data-sharing practices. These efforts can safeguard both current and future public health efforts and the data economy.
Recommendations for State and Local Governments
- To clarify expectations for mobile surveillance program tool developers and end users, states should implement a scorecard-based approach to evaluating privacy protections included in such programs. (See example below.) Although some privacy criteria might require trade-offs between usability and public health objectives, many of the criteria can be achieved with little to no impact on the effectiveness of these programs. For example, promoting transparency through public audits and open-source platforms, or building in time limits through sunset clauses and data-retention limits, will not hinder the use of mobile technology but will help protect privacy and might be helpful in encouraging stakeholder and user buy-in.
- In developing their strategies, state and local governments should consult with community stakeholders to ensure that programs are meeting local needs while being sensitive to privacy and equity risks. It will be particularly important that those disproportionately affected by the virus and those historically subject to extensive government surveillance have a central seat at the table in these consultations.
An example of a scorecard-based approach to evaluating privacy protections included in COVID-19 mobile phone–enhanced surveillance programs
Google and Apple jointly developed an interoperable protocol (or API) that enables Bluetooth low-energy beaconing to notify a user of a potential exposure resulting from prolonged proximity to another user who is infected. The protocol is designed to protect anonymity by enabling public health authorities to develop applications that use the Bluetooth chips to send and receive randomly assigned and changing identifiers. If a user is determined to be infected, the user can choose to have the device identifiers included on a list of individuals who have tested positive for COVID-19. Periodically, user devices check this identifier list and, if one is found to have been received by the user’s device, the user is notified of a potential exposure. The protocol is now available for app development and will be incorporated directly into mobile operating systems.
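The matching flow described above can be illustrated with a toy sketch. This is not the actual Exposure Notification cryptography, which uses specified key-derivation and rolling-proximity-identifier functions; deriving identifiers from a daily key with a plain hash here is a simplifying assumption for exposition.

```python
import hashlib
import os

def rolling_identifiers(daily_key: bytes, intervals: int = 144) -> list:
    """Derive a day's worth of rotating Bluetooth identifiers from a
    per-day key (144 intervals of ~10 minutes). Illustrative only."""
    return [
        hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
        for i in range(intervals)
    ]

def check_exposure(observed: set, positive_daily_keys: list) -> bool:
    """Re-derive identifiers from the published keys of confirmed-positive
    users and intersect them with beacons this device actually heard.
    Real identities never leave the devices; only random keys are shared."""
    for key in positive_daily_keys:
        if observed & set(rolling_identifiers(key)):
            return True
    return False

# A device that heard one rotating identifier from an infected user's day
infected_key = os.urandom(16)
heard = {rolling_identifiers(infected_key)[5]}

print(check_exposure(heard, [infected_key]))    # True
print(check_exposure(heard, [os.urandom(16)]))  # False (almost surely)
```

Note how the design keeps matching on-device: the server only distributes keys of users who tested positive, and each phone decides locally whether it was exposed, which is why the scorecard below rates the protocol highly on anonymity.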
| Category | Criterion | Score | Explanation |
| --- | --- | --- | --- |
| Transparency | Policies | Partially | A significant amount of detail about the protocol has been provided, but the status of some of the criteria will depend on the particular applications that are built. |
| | Public audit | Unclear | This depends on how apps using the protocol are developed and implemented. There is no clear approach to public audits at this stage. To get access to the API, apps have to “meet specific criteria around privacy, security, and data control”;^a however, there is no specific mention of audits. |
| | Open source | Fully | The code is open source and available. |
| | Disclosure of data collected | Fully | Although the data collected by each application will to some degree be determined by each public health authority, the API itself uses only Bluetooth beaconing.^b |
| | User-specific data visibility | Unclear | It is not clear whether users can access the data that are collected or transmitted. |
| Purpose | Narrow scope | Fully | The protocol is used exclusively for contact tracing for COVID-19.^c |
| | Secondary use prohibition | Fully | Apple and Google state that the protocol will not be available for marketing.^d |
| | Law enforcement firewall | Fully | Only official government public health authorities will have access to the API, and it cannot be used for any purpose other than COVID-19 response.^e |
| | Data minimization | Fully | Bluetooth beaconing data are collected only for users who have been confirmed to be infected with COVID-19. User location cannot be tracked, and apps that track location will not have access to the API. |
| Anonymity | Real identities obscured | Fully | The protocol is designed to maintain anonymity by using randomly assigned, changing Bluetooth beacons that can be resolved to a user’s phone only with a user-specific key. |
| | Reidentification prohibition | Fully | The design of the protocol, including decentralization and the use of rotating anonymous identifiers, makes reidentification extremely challenging. |
| Informed Consent | Voluntary | Fully | Google and Apple provide specific goals of the approach, and this is an opt-in system.^f |
| | Specificity | Fully | Google and Apple emphasize that this will be an opt-in system for contact tracing.^f |
| | Revocable | Fully | Google and Apple emphasize that this will be an opt-in system for contact tracing.^f |
| | Data deletion | Fully | Users have the option to delete all tracing keys collected by the API. |
| Temporal Limitations | Sunset clause | Not satisfied | Apple and Google can choose to shut down the API unilaterally on a regional basis. Once it is incorporated into operating system updates, the API may stay on phones indefinitely. |
| | Data time limits | Not satisfied | Time limits for data retention will depend on how applications using the protocol are developed and implemented. |
| Data Management | Encryption | Fully | Bluetooth beaconing data are encrypted on a user’s phone. |
| | Local storage | Partially | The list of Bluetooth beacons stays on the phone; however, a central server is used to collect infected persons’ keys and broadcast them to other users. |
| | Policies | Partially | Only some cybersecurity details have been released. Specifics will depend on how the app is developed and implemented. |
^a Apple and Google, Exposure Notification: Frequently Asked Questions, version 1.0, April 2020.
^b Apple and Google, Contact Tracing: Bluetooth Specification, v1.1, April 2020.
^c Apple and Google, “Privacy-Safe Contact Tracing Using Bluetooth Low Energy,” webpage, undated.
^d Casey Newton, “Apple and Google Have a Clever Way of Encouraging People to Install Contact-Tracing Apps for COVID-19,” The Verge, April 14, 2020.
^e Darrell Etherington, “Apple and Google Release Sample Code, UI and Detailed Policies for COVID-19 Exposure-Notification Apps,” TechCrunch, May 4, 2020.
^f Zack Whittaker and Darrell Etherington, “Q&A: Apple and Google Discuss Their Coronavirus Tracing Efforts,” TechCrunch, April 13, 2020.
With the scorecard, end users of mobile surveillance program tools can see which privacy protections specific programs offer and explanations for how the program did or did not meet the identified criteria. In cases where there are privacy trade-offs—for example, if the collection of real identities is necessary or data need to be managed centrally rather than on users’ devices—public health officials can explain the reasons for not meeting specific criteria. Transparency about such trade-offs, including a justification based on public health needs, can help build user trust.