Mobile Location Data and Covid-19: Q&A - Human Rights Watch

Posted: 13 May 2020 12:00 AM PDT

Introduction

Governments and the private sector are increasingly relying on data-driven technologies to help contain the novel coronavirus, Covid-19. While some see technological solutions as a critical tool for contact tracing, quarantine enforcement, tracking the spread of the virus, and allocating medical resources, these practices raise significant human rights concerns. Human Rights Watch is particularly concerned about proposals for the use of mobile location data in the Covid-19 response because the data usually contains sensitive and revealing insights about people's identity, location, behavior, associations, and activities.

Mobile location data programs to combat Covid-19 may not be scientifically necessary and could lead to human rights abuses if they are not equipped with effective safeguards to protect privacy. The long history of emergency measures, such as surveillance measures put in place to counter terrorism, shows that they often go too far, fail to have their desired effect, and, once approved, often outlast their justification. 

This Q&A explains the different ways that governments are using mobile location data to respond to Covid-19, the human rights concerns associated with these measures, and human rights standards that should be applied when using such data. It includes illustrative cases, recommendations, and guidelines to help evaluate the human rights risks posed by the use of mobile location data.

What Is Mobile Location Data and How Is It Being Used to Respond to Covid-19?

How is mobile location data being used to respond to Covid-19?

We define "mobile location data" as geolocation and proximity information from mobile phones and other devices. Governments view mobile location data as a key component of measures to contain the spread of Covid-19. They are presenting individualized tracking as a reliable way to track the movement of people who are infected and identify individuals with whom they came into contact during the period in which they are contagious. Individualized tracking can also be used to ascertain whether people are complying with social distancing and quarantine measures. Analysis of aggregate location data, on the other hand, might provide insight into the effectiveness of social distancing measures, model the potential for transmission, and identify potential "hot spots" of transmission. Examples of how governments are using technology to respond to Covid-19 include:

  • Contact tracing: Contact tracing is the process of identifying individuals who may have come into contact with an infected person. Its goal is to interrupt transmission by rapidly identifying individuals who have been in close contact with someone who is infected, defined by the United States Centers for Disease Control and Prevention (CDC) as being within 6 feet of someone for approximately 10 or more minutes. The idea is to encourage such individuals to isolate themselves from others and seek testing and treatment. Because the coronavirus is primarily transmitted through person-to-person contact via respiratory droplets when an infected person coughs, sneezes, or talks, mobile location data has been proposed as a helpful method to identify potentially exposed individuals.
  • Enforcing quarantine and social distancing orders: Governments are imposing quarantines and other restrictions on movement, including broad lockdowns, closures of business, public spaces, and institutions, orders for the isolation of individuals infected, and requests for voluntary social distancing. Governments are using mobile location data to monitor compliance with these restrictions, for example, by encouraging or compelling people to install an app that uses location data to identify people who violate these restrictions.
  • Big data analytics: Companies and governments are also examining location data in aggregate form to better understand general patterns of people's movements and behaviors and how these have changed over time. Such analysis aims to forecast how the virus might spread, assess the effectiveness of public health interventions such as social distancing measures, and identify ways to better allocate testing and medical resources.
  • Hot spot mapping: Hot spot mapping is a type of big data analysis that involves the use of location data to piece together the movement or location history of individuals who have tested positive in order to send out public health warnings about particular locations, or close down or disinfect particular locations.

How does mobile location tracking work?

Mobile location data comes from a variety of sources, including cellphone towers, Global Positioning System (GPS) signals, and Bluetooth beacons.

  • Cell site location information: Mobile phones connect their users to telecommunications and internet networks through cell towers. As a mobile phone moves with its user, the phone pings nearby cell towers (or "cell sites"). This process generates location information stored by the telecommunication operators ("Telcos") about the cell towers to which the phone has sent a signal. With proximity information from multiple cell towers, a technique called "triangulation" is used to estimate the location of a cell phone with greater precision. Governments can compel Telcos to provide that mobile location information to track someone's real time or past movement.
  • Global positioning system (GPS): A mobile phone's GPS capabilities allow it to track its location to within 5 to 10 feet (1.5 to 3 meters). Many smartphone apps (including maps, social media, games, shopping, and utility apps) log this location data, which can then be obtained by governments and data brokers. Data brokers are entities – some well-known, others less so – that collect information about potential consumers then sell that data (or analytic scores, or classifications made based on that data) to other data brokers, companies, and/or individuals. There has been a proliferation of apps for contact tracing and quarantine enforcement that rely on GPS data to track people's movements. Additionally, anonymized GPS data (i.e. stripping data of personally identifiable information) can be used to track patterns of movement of populations in the past and in real time.
  • Bluetooth beacons: Bluetooth is a wireless, low-power, short-distance set of protocols used primarily to connect devices directly to each other in order to transfer data. Bluetooth can only communicate with devices that are nearby (approximately 33 feet or 10 meters). Bluetooth signals have been proposed as a method of contact tracing by identifying a phone's proximity to other devices with a relatively high level of accuracy, using a specialized app. Unlike cell tower or GPS data, which track actual location, Bluetooth tracks interactions. Therefore, it is best understood as an interaction tracking tool.
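The triangulation technique mentioned above can be sketched with a toy example: given three tower positions and estimated distances to the phone, the circle equations can be linearized and solved for the phone's position. The tower coordinates and distances below are invented for illustration, not real network data.

```python
# Hypothetical tower positions (x, y in meters) and estimated
# distances from each tower to the phone.
towers = [(0.0, 0.0), (1000.0, 0.0), (0.0, 1000.0)]
distances = [707.1, 707.1, 707.1]  # consistent with a phone near (500, 500)

def trilaterate(towers, distances):
    """Estimate position by subtracting the first circle equation
    from the others, leaving a 2x2 linear system."""
    (x1, y1), d1 = towers[0], distances[0]
    a, b, c = [], [], []
    for (xi, yi), di in zip(towers[1:], distances[1:]):
        a.append(2 * (xi - x1))
        b.append(2 * (yi - y1))
        c.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    det = a[0] * b[1] - a[1] * b[0]  # solve with Cramer's rule
    x = (c[0] * b[1] - c[1] * b[0]) / det
    y = (a[0] * c[1] - a[1] * c[0]) / det
    return x, y

x, y = trilaterate(towers, distances)
print(round(x), round(y))  # roughly 500 500
```

In practice the distance estimates come from signal timing and strength and carry large errors, which is why cell-site location is far coarser than GPS.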

What Are the Applicable Human Rights Standards?

Even in times of emergency, when states restrict human rights for public health reasons, international human rights law says that measures taken that limit people's rights and freedoms must be lawful, necessary, and proportionate. States of emergency need to be limited in duration and any curtailment of rights needs to take into consideration the disproportionate impact on specific populations or marginalized groups.

These rules apply to efforts to track and manage Covid-19 using mobile location data. The collection and analysis of such data could reveal users' identities, movements, and associations in a manner that interferes with the right to privacy. Article 17 of the International Covenant on Civil and Political Rights (ICCPR), which is derived from Article 12 of the Universal Declaration of Human Rights (UDHR), establishes "the protection of the law" against "arbitrary or unlawful interference" with an individual's "privacy, family, home, or correspondence." The United Nations Human Rights Committee has found that restrictions on the right to privacy must take place only "in cases envisaged by the law." Restrictions must also be "proportionate to the end sought, and ... necessary in the circumstances of any given case."

Human Rights Watch and over 100 other human rights organizations have urged governments to respect privacy and human rights when using digital technologies to contain the pandemic. At a minimum, technology-assisted measures should:

  • Be lawful, necessary, proportionate, transparent, and justified by legitimate public health objectives
  • Be time-bound and only continue for as long as necessary to address the pandemic
  • Be limited in scope and purpose, used only for the purposes of responding to the pandemic
  • Ensure sufficient security of any personal data that is collected
  • Mitigate any risk of enabling discrimination or other rights abuses against marginalized populations
  • Be transparent about any data-sharing agreements with other public or private sector entities
  • Incorporate protections and safeguards against abusive surveillance and give people access to effective remedies
  • Provide for free, active, and meaningful participation of relevant stakeholders in data collection efforts

How Are Governments Using Mobile Location Data to Respond to Covid-19?

Governments are increasingly using mobile location data to respond to the spread of Covid-19 for understandable public health reasons since the virus is a highly communicable disease. Privacy International, a London-based organization working to promote the right to privacy worldwide, is maintaining a tracker of responses that governments, tech companies, and international agencies are using to help contain the spread of Covid-19. Below are some examples.

Contact Tracing Using Data Provided by Telecommunications Providers

Governments are accessing data from Telcos in contact tracing efforts. In Israel, an emergency regulation approved by the government on March 17 authorized Shin Bet, Israel's internal security service, to receive, collect, and process "technological data," including location data, from Telcos without user consent to predict which citizens have been exposed to the virus. Under the program, the health ministry sends alerts to people's phones ordering them to self-quarantine. The cabinet circumvented the parliament in approving the emergency regulation. Israel's supreme court later ruled that the government needed to pass a law authorizing such surveillance in a manner that "fulfills the principles of privacy protection" or else the program would be halted. The health ministry on March 23 also released a voluntary app, ostensibly to back up Shin Bet efforts, to inform people if they have come in contact with an infected person.

In Armenia, the parliament on March 31 passed amendments giving the authorities very broad surveillance powers, which require Telcos to hand over the phone records for all of their customers, including phone numbers and the location, time, and date of their calls and text messages. The authorities can use that data to identify individuals who are infected and should be isolated or close contacts who should self-quarantine, or to monitor individuals in isolation or quarantine.

In Russia, the prime minister on March 20 ordered the communications ministry to design a national system to track people who have been in contact with coronavirus patients, using location data provided by individuals' mobile phone provider. On April 1, the communications ministry confirmed it had designed the system. The communications ministry has demanded that regional authorities provide lists of mobile phone numbers of people infected with coronavirus, as well as the phone numbers of citizens who are quarantined at home either because they had traveled abroad or had contact with infected people.

In Ecuador, on March 16, the president issued an emergency decree authorizing the government to use data from satellite and mobile telephone platforms to monitor people who tested positive for the virus, those who have been in close contact with someone who tested positive, those who have symptoms, and those subjected to mandatory isolation for having entered the country from abroad.

Bluetooth Contact Tracing

In Singapore, the government on March 20 launched TraceTogether, a Bluetooth-based contact tracing app, to supplement its human contact tracing efforts. When a person is contacted, they are required by law to assist the health ministry in accurately mapping out their movements and interactions to minimize the risk of widespread infection. Data logs are stored on phones in encrypted form, using "cryptographically generated temporary IDs". However, when a TraceTogether user is a confirmed Covid-19 case and agrees to upload the data log in the app to the health ministry, the health ministry will decrypt the temporary IDs in the user's app and obtain a list of phone numbers from the uploaded data log.

The European Commission on April 8 adopted a recommendation to pursue a pan-European coordinated approach for the use of mobile applications for contact tracing, among other purposes. The common approach will be guided by privacy and data protection principles, including data minimization and appropriate safeguards such as pseudonymization, aggregation, encryption, and decentralization. It will also be voluntary, with a preference for Bluetooth-based proximity tracing. Further guidance is due to be adopted on the data protection and privacy implications of the use of mobile applications. The European Parliament on April 17 adopted a resolution reinforcing the commission's recommendation, demanding full transparency so that people can verify the underlying protocol for security and privacy of such apps. In the meantime, a number of European Union countries, including France, Germany, and the Netherlands, are in the process of selecting contact tracing apps.

In Norway, the National Institute of Public Health on April 16 launched a voluntary, self-reporting app that will monitor users' movements and then ask people to go into quarantine if they have been exposed to someone who tested positive for the coronavirus. When a user is confirmed as having coronavirus, the app will then retrieve their location data and send a text message to every other user who has been within 2 meters of that person for more than 15 minutes, instructing them to go into quarantine.

Mobile Apps to Enforce Quarantine and Social Distancing Orders

Authorities in cities and provinces across China are using the app Health Code, which was developed by private companies, to make decisions about whom to quarantine and for how long. The app assigns each of its approximately 700 million users one of three colors: green enables unrestricted movement, yellow requires 7 days of quarantine, and red requires 14 days of quarantine. To enter buildings, go to the supermarket, use public transport, and move around their neighborhood, people must scan a QR code at a manned checkpoint. However, the rules behind color assignments are kept secret, making it difficult for individuals to understand why they were assigned a particular color, or what circumstances might trigger a change of color. The app also collects users' location data and shares it with the police. Users have complained that the app's decisions are arbitrary and difficult to appeal; some of them have been confined to their homes for indefinite periods even after serving the quarantine period mandated by the app.

In Turkey, the health minister declared on April 7 that it is mandatory for people infected with Covid-19 to download an app called "Life fits inside the house" as part of the "Pandemic Isolation Tracking Project." The app follows the movement of people instructed to self-isolate, and if they leave their homes, they receive a warning via SMS and are contacted instantly through automatic call technology and told to return to isolation. Under the program, those who fail to comply with the warning and continue to violate the quarantine are reported to relevant law enforcement and face administrative measures and sanctions, which can include jail time ranging from two months to a year in accordance with Article 195 of Turkish Penal Code. Human Rights Watch has not yet investigated how widespread the use of the app is in practice and whether the Turkish authorities have made efforts to enforce its use.

In Moscow, the city government in April launched an app to track the movement of coronavirus patients. The app is mandatory for all patients who have been ordered to stay at home. It requests access to the user's calls, location, camera, storage, network information, sensors, and other data to ensure people do not leave their home while contagious. This app is in addition to the installation of one of the world's biggest surveillance camera systems equipped with facial recognition technology to ensure that everyone placed under self-quarantine stays off the streets. On April 15, Moscow also introduced a digital permit system for non-essential travel, both on public transport and private vehicles.

Big Data Analytics

In the EU, eight major Telcos have agreed to share anonymized metadata with the European Commission for modelling and predicting the propagation of the coronavirus. An official from the commission said the data will be aggregated and anonymized and that the commission will delete it when the pandemic is over. Still, the European Data Protection Supervisor warned about the possibility of such measures becoming permanent.

In the US, mobile advertising companies, which gather the location data of mobile and internet users to target and sell ads, are reportedly supplying analyses of people's locations and movements to the CDC and certain state and local governments. In the context of Covid-19, this data sharing arrangement is apparently designed to help the authorities better understand how infections spread and refine public health responses. Much of this arrangement, including how data is collected, shared, anonymized, and analyzed, is unknown. It has also been reported that the federal government is building a national coronavirus surveillance system to monitor and forecast rates of infection and hospitalization across the country. It is unclear whether this project is linked to the CDC's partnership with the mobile advertising industry.

In South Korea, in addition to using cell phone location data, CCTV cameras, and tracking of debit, ATM, and credit cards to identify people infected with coronavirus, the authorities created a publicly available map using aggregate data of infected individuals to allow other people to check whether they may have crossed paths with someone infected with the virus. The platform was officially launched on March 26. Health authorities also send out cell phone notifications containing very detailed information on confirmed cases, including the age, gender, and daily routes infected people took 48 hours before being quarantined. The purpose of the disclosures is to enable potential untraceable contacts (for example, strangers who were in the same restaurant as the confirmed case at the same time) to recognize and prepare for possible infection.

In Ecuador, the president on April 6 announced the SOS Covid tool, which works with information obtained from the emergency service, the ministry of telecommunications, the ministry of health, mobile-service providers, and the Salud EC App (see below) to monitor whether the quarantine is being observed, detect cases, carry out massive tests, and identify areas of risk due to crowding.

Self-Reporting Initiatives

Governments are also launching initiatives to report coronavirus cases and direct people toward medical resources that rely on location data. For example, Ethiopia's Information Network Security Agency launched a Covid-19 monitoring platform on March 23 to update the public on the number of Covid-19 cases in the country and to provide information such as directions to the nearest pharmacies, hospitals, and police stations. People who develop symptoms or were in contact with people with confirmed cases can also give information to the ministry of health via the platform. The system also enables users to report illegal or unauthorized activities, such as large public gatherings, as well as people suspected of being ill based on subjective assessments of others' symptoms. This is concerning, especially at a time when there are reports of harassment and discrimination against foreigners and healthcare sector workers as cases of Covid-19 rise in Ethiopia.

Ecuador announced on March 25 the development of Salud EC App, an application that stores the name, year of birth, ID number, and geolocated address of its users. Through this voluntary app, users can report their symptoms related to Covid-19. The app then provides the user with the online resources created by the government for the health emergency.

How Can Mobile Location Tracking Interfere with the Right to Privacy?

The privacy risks of mobile location tracking are significant and well-established. Mobile location information can contain sensitive and revealing insights about a person's identity, location, behavior, associations, and activities. The use of mobile phone network data creates granular, real-time targeting opportunities, which governments can use to aggressively enforce quarantines, discriminate against populations, or crack down on them for other reasons. And in the hands of abusive governments that have already adopted intrusive surveillance practices, this can serve to enhance repression.

The mobile phone tracking programs described above raise concern that governments are collecting, using, and retaining data beyond what is necessary for legitimate and targeted disease surveillance measures. The lack of transparency regarding many Covid-19 tracking initiatives, such as those in Ecuador and Ethiopia, prevents the public from assessing whether there are meaningful limits on the types of personal information that will be collected, used, aggregated, and retained, or whether tracking and data collection will end once the pandemic is contained. This is particularly troubling in countries like China, Ethiopia, and Russia, which have a record of pervasive surveillance.

Other concerns include: restricting people's movements based on arbitrary and opaque apps, as is the case in China; the lack of consent to data being used, as is the case in Armenia, Israel, and South Korea; and the combination of mobile location data with other types of data, such as facial recognition, as is the case in Moscow. Almost all of the initiatives using location data to respond to Covid-19 involve placing large collections of data in the hands of governments, many of which have histories of repression and discrimination against already marginalized communities, including religious minorities and political dissidents.

Excessive interference with location privacy is a gateway to undue restrictions on other rights. For example, in Israel, the Shin Bet reportedly restricted people's movements in error when they ordered into quarantine people who turned out to be negative for the coronavirus, including a woman who was ordered to self-quarantine after waving to a homebound coronavirus victim from the street. Information sharing with law enforcement may also have a chilling effect on access to health care. In the US, local governments are collecting the addresses of people who test positive for the coronavirus and sharing the lists with police and first responders, which some public health experts say could make people reluctant to seek medical care or get tested for Covid-19 because of a fear of profiling by law enforcement. Finally, publicizing revealing details about people's movements and behaviors can stoke fear, panic, and discrimination. In South Korea, the government has sent "safety guidance texts" that notify the public about places that infected people have visited. Owners of affected shops and restaurants told The Guardian that these alerts are chasing customers away and may put them out of business even after they disinfected the premises.

If data is anonymized, what is the harm?

Anonymization (i.e. stripping mobile location data of personally identifiable information) has been proposed as a safeguard, but it is well-established that anonymized data can be combined with private and publicly held data to re-identify individuals. To prevent this, governments would need to provide clear rules to prohibit combining anonymized data with other personal data. This became a major issue in South Korea, where the level of personal information sent out on public health text message alerts based on the location history of known infected individuals led to the doxing of individuals. There are reports that some people suspected of testing positive for Covid-19 based on information sent out by public health alerts experienced hate speech or harassment. In some cases, the texts fueled social stigma leading to speculation about extra-marital affairs. The National Human Rights Commission of South Korea criticized authorities for providing more information than is necessary to stop the spread of disease, leading to a violation of privacy and human rights of an infected person, including "secondary damages as patients become the target of criticism, taunts, and hatred online." It recommended only sharing location and times when infected people visited places, rather than providing the travel history of each individual.
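The re-identification risk described above can be made concrete with a toy example: an "anonymized" location trace, joined against a small amount of auxiliary knowledge about a person, is often enough to single them out. All of the tokens, places, and times below are invented for illustration.

```python
# "Anonymized" traces: user identifiers replaced by random tokens,
# but the location points themselves are left untouched.
anon_traces = {
    "tok_91f2": [("home_a", "23:00"), ("office_x", "09:10")],
    "tok_4c7d": [("home_b", "22:40"), ("office_y", "08:55")],
    "tok_e001": [("home_c", "23:30"), ("office_x", "09:05")],
}

# Auxiliary knowledge about a target person (e.g. from public
# records or social media): where they live and where they work.
target_places = {"home_a", "office_x"}

def reidentify(traces, known_places):
    """Return the tokens whose trace visits every known place."""
    return [tok for tok, pts in traces.items()
            if known_places <= {place for place, _ in pts}]

print(reidentify(anon_traces, target_places))  # ['tok_91f2']
```

Only one token matches both places, so removing names from the data did not prevent the target from being identified; research on real mobility datasets shows that a handful of location points is typically enough to single out most individuals.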

Does aggregating data create privacy risks?

Companies and governments are also analyzing large location datasets to forecast disease trends and the effectiveness of public health interventions. An example is the US CDC's reported partnership with the mobile advertising industry. Google has launched "Covid-19 Community Mobility Reports" that map mobility trends over time by country or region across different places like parks, grocery stores, and transit stations. Facebook's Disease Prevention Maps initiative provides its research partners, which include Harvard School of Public Health in the US and National Tsing Hua University in Taiwan, with "co-location maps" that predict travel patterns, "movement range trends" that show whether social distancing and other preventive measures are working, and a "social connectedness index" that attempts to infer disease spread from "friendships across states and countries."

Google and Facebook say that their initiatives are based on anonymized and aggregated location data that provide high-level insights into people's movements and behaviors, rather than detailed location histories that are prone to re-identification. In theory, data aggregation poses fewer risks to privacy. However, companies and governments that perform such aggregation must disclose sufficient information about the protocols and procedures used to aggregate data that enable independent and external researchers to test if they actually work. Covid-19 tracking initiatives based on aggregated data should also disclose how they draw conclusions from this data, how this data is used to inform public health interventions, and the limits and risks associated with such analysis.
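One common safeguard in aggregate reporting of this kind is to suppress any cell whose device count falls below a minimum threshold, so that small groups cannot be singled out. A minimal sketch of the idea, with an illustrative threshold and invented region names (real pipelines add further protections, such as noise injection):

```python
from collections import Counter

# Individual (region, hour) pings; in a real pipeline these would
# be derived from millions of devices.
pings = [
    ("park_north", 9), ("park_north", 9), ("park_north", 9),
    ("park_north", 9), ("park_north", 9),
    ("station_west", 9), ("station_west", 9),
]

MIN_COUNT = 5  # cells with fewer devices than this are withheld

def aggregate(pings, k=MIN_COUNT):
    """Count devices per (region, hour) cell and drop sparse cells."""
    counts = Counter(pings)
    return {cell: n for cell, n in counts.items() if n >= k}

print(aggregate(pings))  # the two-device station cell is suppressed
```

Disclosing the threshold and any other parameters used is precisely the kind of transparency that would let external researchers test whether the aggregation actually protects individuals.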

Does Bluetooth-based proximity tracking protect privacy?

Some companies and researchers have recently announced new efforts to make contact tracing more privacy-protecting using Bluetooth technology. Among the more prominent are the Pan-European Privacy Preserving Proximity Tracing initiative (PEPP-PT), the Decentralized Privacy-Preserving Proximity Tracing (DP-3T), and Apple and Google's Privacy-Preserving Contact Tracing initiative, which consists of an application programming interface (API) that public health agencies can integrate into their own contact tracing apps. The next phase is a system-level contact tracing system that will work across iOS and Android devices on an opt-in basis. In Bluetooth-based proximity tracing, devices that come close to each other share pseudonymized IDs (a string of random numbers that are not tied to a user's identity and change every 10-20 minutes for additional protection). If a user becomes infected with the virus, they can send an alert to all the phones with which they have been in proximity. The broadcast would not identify the infected person, nor would the infected person know the identity of the people who would be notified.
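The rotating pseudonymous IDs can be sketched as follows. This is a deliberately simplified illustration, deriving each broadcast ID from a secret daily key and a time-interval counter with an HMAC; it is not the actual Apple/Google or DP-3T construction, whose specifications differ in detail.

```python
import hashlib
import hmac
import os

# Each device keeps a secret daily key. Broadcast IDs are derived
# from it and a time-interval counter, so they rotate periodically
# and observers cannot link successive broadcasts to one device.
daily_key = os.urandom(16)

def rolling_id(key: bytes, interval: int) -> str:
    """Derive the pseudonymous ID for a given time interval."""
    mac = hmac.new(key, interval.to_bytes(4, "big"), hashlib.sha256)
    return mac.hexdigest()[:16]  # truncated for broadcast

# IDs differ between intervals and reveal nothing about the key.
id_a = rolling_id(daily_key, 41)
id_b = rolling_id(daily_key, 42)
print(id_a != id_b)  # successive broadcasts are unlinkable
```

If a user later tests positive, their device can publish the daily key; other devices re-derive the interval IDs locally and check them against their stored contact logs, so matching can happen on the phone rather than on a central server.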

Bluetooth-based proximity tracing is being promoted as the more accurate and secure option for contact tracing because a device's ability to communicate with another can be a much more accurate proxy for nearness and because systems can be built to decentralize data, meaning that data can be stored locally on the device rather than on a centralized database. 

While promising in some respects, Bluetooth-based proximity tracing is largely untested, and designing these systems involves choices that have implications for privacy and security. For example, Bluetooth-based proximity tracing can rely on centralized databases, or decentralized storage of data on people's phones. While some governments may prefer centralizing data under their authority, this can be problematic if the authority has broad powers to abuse the metadata and is prone to bribery or legal coercion or has failed to take appropriate steps to secure the data from attacks by malicious actors.

Decentralizing data so that it is stored on people's devices is generally seen as a better option from a privacy perspective. However, this approach is not without privacy risks either. A tech-savvy adversary in close proximity to a device could identify the IDs of infected people stored on it, or could set up a device with a stationary camera to capture the IDs of users passing by.

Furthermore, researchers from the Institute for Technology in the Public Interest have warned that technical safeguards may not address abusive implementation of contact tracing technologies. For example, strong encryption and decentralized systems will not protect someone from a government or private entity requiring that they show the results of the app (i.e. whether they are an infectious disease risk) in order to access buildings or transportation.

In India, the official Covid-19 app, Aarogya Setu, was initially voluntary when it was launched in early April, but on April 29 it became mandatory for all government staff through an order by the Department of Personnel and Training and for all employees in the public and private sector through a Ministry of Home Affairs (MHA) May 1 order.

Even if an app is officially voluntary, in practice some businesses are already claiming that they will mandate them as a condition of returning to work. In China, Human Rights Watch has found that local authorities require users to show the Health Code app on their phone in order to hail a ride, use public transport, or enter supermarkets and residential areas. Finally, as experts in technology, law and policy, and epidemiology have noted, Bluetooth-based proximity tracing is vulnerable to trolls and spoofing, which could weaken trust in the system.

Are the privacy risks justified?

Inaccuracies associated with mobile location tracking programs raise questions about whether the restrictions they impose on privacy are necessary to safeguard public health.

A key consideration is whether mobile location technologies can accurately determine whether a person is in close contact (within 6 feet of someone for 10 or more minutes) of someone who is infected. Technology researchers have found that cell site location information and GPS signals are unlikely to provide location estimates with the level of precision required to meaningfully predict the risk of Covid-19 transmission. While Bluetooth tracking technologies can be engineered to achieve significantly more accurate measurements, their accuracy may still degrade in the presence of other signal-transmitting devices and in areas with high levels of interference, such as high-density buildings or busy parks (especially in cities). Furthermore, proximity tracing alone says very little about the nature of the interaction, such as whether people were in a closed space or outdoors, whether they were wearing masks or not, or whether someone sneezed during the interaction.
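The precision problem can be made concrete with the log-distance path-loss model commonly used to convert Bluetooth received signal strength (RSSI) into a distance estimate. A few decibels of noise from walls, bodies, or interfering devices translates into large distance errors; the calibration constants below are typical but illustrative.

```python
import math

# Path-loss model: rssi = TX_POWER - 10 * N * log10(distance),
# inverted here to estimate distance from a measured RSSI.
TX_POWER = -59  # assumed RSSI at 1 meter, in dBm
N = 2.0         # path-loss exponent (free space)

def estimated_distance(rssi: float) -> float:
    """Invert the path-loss model to get distance in meters."""
    return 10 ** ((TX_POWER - rssi) / (10 * N))

# A device actually 2 meters away, measured with 0 and +/-4 dB of noise.
true_rssi = TX_POWER - 10 * N * math.log10(2.0)
for noise_db in (0, -4, 4):
    d = estimated_distance(true_rssi + noise_db)
    print(f"noise {noise_db:+} dB -> estimated {d:.1f} m")
```

With only 4 dB of noise the estimate for a device 2 meters away swings from roughly 1.3 to 3.2 meters, crossing the close-contact threshold in both directions, which is why proximity classifications from RSSI alone are unreliable.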

The wide variance in how people use cell phones may also make location tracking efforts ineffective. For example, tracking may assume that each device is unique to a single individual. However, in Sierra Leone researchers found that call detail records were an unreliable proxy for Ebola transmission during the 2014 to 2016 outbreak because many people had multiple mobile phones, or lent, traded, and passed around their devices among family and friends. In locations with weak signals, including conflict areas where cell phone towers may be strategic targets, people often use multiple SIM cards or phones.

Will Public Health Responses that Unduly Rely on Mobile Location Tracking Discriminate Against Minorities?

Disparities in mobile phone use, digital literacy, and tech uptake could also exclude vulnerable or marginalized populations from public health responses that unduly rely on mobile location tracking. These disparities are particularly pronounced for contact tracing apps, which assume that users enjoy access to smartphones that meet minimum technical specifications and a reliable mobile or internet connection.

According to the GSM Association, the industry body that represents mobile network operators worldwide, the percentage of the world's population that connects to the internet using mobile phones was 49 percent at the end of 2019. In some regions, such as Sub-Saharan Africa, penetration rates are as low as 26 percent. Industry analysts estimate that Bluetooth tracing will be out of reach for as many as 2 billion mobile phone users whose devices are not configured to support this technology. That is roughly a quarter of all mobile phones in use today.

Disparities in access and use of mobile devices based on location (urban versus rural) and gender are also well documented and generally reflect and entrench broader patterns of inequality. Older people – a group that is at increased risk of severe disease and death in the Covid-19 pandemic – are also less likely to use specialized apps or have smartphones or even access to the internet. In the US, a 2019 Pew survey found that 68 percent of people ages 55 to 73 own smartphones, compared to 93 percent of people ages 23 to 38. In Italy, which has one of the lowest internet penetration rates in Europe, the government acknowledged the limited effectiveness of its voluntary contact tracing app because one-sixth of the population does not use the internet and older people were generally unlikely to download it. In China, older people without smartphones have been unable to take the public bus (which now requires the Health Code app) or enter public hospitals (which now require online appointments).

Women are up to 31 percent less likely to have internet access than men in some countries, and worldwide, about 327 million fewer women than men have a smartphone. Women's use of cell phones is constrained by factors including lower literacy – globally, of an estimated 781 million people aged 15 and over who are illiterate, almost two-thirds are women and girls. If governments and companies mandate contact tracing apps as a condition of entry into public or private spaces, vulnerable and marginalized populations that are less able to download these apps will face discrimination.

Human Rights Watch has also cautioned that the use of incomplete and discriminatory datasets can misdirect public health efforts in ways that endanger the rights of the poorest and most vulnerable people. For example, stricter enforcement of social distancing measures in low-income counties could unduly penalize front line workers, people struggling to find shelter, or unemployed people traveling to food banks or welfare agencies because their movements may appear abnormal or in violation of social distancing norms when in fact they have to be more mobile to meet basic needs.

Recommendations

These technologies are intended for a praiseworthy purpose: protecting public health at a time of public emergency, a situation that can justify some restrictions on rights. But the long history of emergency measures shows that they often go too far, fail to meet their objectives, and, once approved, often outlast their justification. No matter how compelling the situation, it is incumbent on public authorities and private actors to ensure that measures do not overstep the permitted legal restrictions on individual rights.

This means that governments should not use or approve technologies using mobile location data to combat Covid-19 until they have demonstrated that they are necessary and proportionate to combat the spread of the disease and have enacted adequate safeguards to prevent human rights abuses. They should address the more fundamental question of whether such technologies are truly effective in curbing the spread of Covid-19 or may in fact misrepresent an individual's risk of infection or mislead the public. They should also assess whether there are ways to combat the pandemic that are less intrusive on rights, such as privacy and freedom of movement, than deploying location tracking technologies. The international legal standard for restricting these rights contains these elements:

  • The restrictions are lawful – that is, they are neither arbitrary nor discriminatory in design or application, and they are enacted in law with sufficient specificity to give people a clear idea of what is prohibited and offer meaningful limits to official discretion
  • The restrictions are necessary in the sense that they would be effective, grounded in scientific evidence, and there are no alternatives that would have a lesser impact on the rights concerned
  • The restrictions are proportionate to the risk to public health and in no way compromise the essence of the right in question
  • They are required to achieve a legitimate objective, in this case the protection of public health (rather than a xenophobic or discriminatory agenda)
  • The measures and the restrictions on rights they entail are limited in duration to the time of the emergency
  • The technology and its approved uses are respectful of human dignity
  • The technology is transparent and subject to review as well as oversight, and measures of remediation for rights abuse are available

Guiding Questions to Assess Proposed Programs Utilizing Mobile Location Data

Human Rights Watch has considerable doubt about whether the programs using mobile location data described in this Q&A can satisfy this threshold. Nonetheless, governments around the world are pursuing such programs at breakneck speed. When analyzing proposed or actual mobile location tracking technology, it is critical that the public, the media, the scientific and engineering community, and public policymakers ask the following questions as a way of interrogating whether any given tool or program presents undue risks to human rights.

Preliminary Questions

Governments, companies, and others assisting in the development of programs that propose utilizing mobile location data should first ascertain if the underlying technology is capable of tracking individuals' exposure to Covid-19 with sufficient accuracy. Is the way the program identifies at-risk individuals consistent with what we know about Covid-19 transmission (e.g. proximity tracking versus symptom tracking)? Are its measurements able to correct for or otherwise take into account variations in how someone might interact with an infectious person (e.g. in high-density buildings or busy parks) or use their phone (e.g. shared devices or high cell phone turnover rates)? What errors might these programs make? How might they interfere with someone's ability to seek testing and treatment, or with the broader public health response?

To ensure that a program would be epidemiologically sound and to help avoid issues of bias and error, governments and companies should engage relevant stakeholders (including civil society, representatives of vulnerable and marginalized populations, computer scientists, and epidemiologists) in meaningful and transparent dialogue. Some of the relevant inquiries would be whether these programs will be linked to proper institutional responses – is there, for example, an accessible path to testing and treatment for those flagged as at risk of Covid-19 exposure? Or would the program instead divert resources from non-technical measures, such as manual contact tracing and public messaging on social distancing, and to what effect?

Stakeholders should also ask if programs are truly voluntary and whether people would face any official sanction or disadvantage as a result of their decision to participate in the program. For example, it is important to understand if the program would impose punitive measures or undue restrictions on movement, access to health care, and other rights, particularly for vulnerable and marginalized populations.

At the Design Phase

If a program is already in development, in addition to the questions above, it is important to consider whether it incorporates privacy-by-design principles. These would include data minimization: collecting only data that is adequate, relevant, and limited to what is necessary for a scientifically established public health objective. Another relevant consideration is whether the program places strict limitations on how data can be collected, used, aggregated, retained, and shared, including with other users, other government agencies, and the public. Yet another is whether there are clear time limitations, including plans for the program to be deactivated and the accompanying data deleted once it is no longer needed. Engaging data protection authorities to develop guidelines on how to protect privacy when using personal data in response to the pandemic is an important step.

Giving users control over what information they share, and the ability to stop sharing it, is important. Does a program allow users to provide meaningful, fully informed consent, through terms that are transparent and in clear and plain language, allowing users to opt in rather than opt out? Are its privacy functions, including settings determining what data will be collected, who will have access to it, how long it will be retained, and how to delete it, easy to understand? In the case of contact tracing apps, it is important that collection, aggregation, retention, and analysis of personal and health data is not centralized within a single authority, such as a government ministry. If data collected under the program is used to analyze and communicate to an individual their risk of infection, it should provide meaningful information about the limits of this analysis and direct individuals to relevant public health resources, such as government health advisories.

Anonymizing and securing data are important design areas that deserve close scrutiny. Data collected should be anonymized to the greatest extent possible and the risks of de-anonymization communicated to users in an understandable and accessible manner. Has the source code been made available so that the public can assess whether it does what it intends to do? Have developers disclosed meaningful information about the anonymization protocols so that the broader public can verify that they are effective? Developers should also disclose how the data collected is protected against external parties who may wish to exploit or tamper with it. For example, has the product been designed with sufficient information security controls (such as end-to-end encryption) and are those measures subject to regular audit?
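One common decentralized design broadcasts short-lived, rotating identifiers derived from a secret that never leaves the device, so no central authority can link beacons back to a person. The sketch below is loosely modeled on such schemes (e.g. DP-3T or the Apple/Google exposure notification design); the key size, epoch count, and derivation are illustrative assumptions, not any app's actual protocol.

```python
import hashlib
import hmac
import secrets

def daily_key():
    # A fresh random secret per day; in these designs it stays on the device
    # unless the user tests positive and chooses to share it.
    return secrets.token_bytes(32)

def ephemeral_ids(day_key, epochs_per_day=96):
    """Derive short-lived broadcast identifiers (one per 15-minute epoch)
    from the daily key. Observers see only unlinkable 16-byte beacons."""
    ids = []
    for epoch in range(epochs_per_day):
        digest = hmac.new(day_key, epoch.to_bytes(4, "big"), hashlib.sha256).digest()
        ids.append(digest[:16])  # truncate to a compact beacon payload
    return ids

key = daily_key()
ids = ephemeral_ids(key)
```

Because derivation is deterministic, a user who tests positive can publish only the daily key, letting other devices recompute the identifiers locally and check for matches without any central contact database.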

At the Deployment Phase

If a program is already in place, it should be examined or re-examined to assess whether it is compliant with the standards above. Developers should consider questions about the social and political context in which it operates and ascertain that protections and safeguards are in place to prevent abuse. For example, are users able to challenge the collection, aggregation, retention, and use of their data, and do they have access to effective remedies for abuses? Can they withdraw from the program and delete their data? Can communities and users audit the tools themselves in order to verify that the technology is trustworthy and does what it purports to do?

Aarogya Setu Data Only Shared With Government Officials Directly Involved In COVID-19 Interventions, "Highly Encrypted" Says Niti Aayog CEO - BW Businessworld

Posted: 11 May 2020 12:00 AM PDT

The Central government's Aarogya Setu mobile application is based on "privacy-first by design" principle keeping in mind the safety and privacy of users' data, said Amitabh Kant, CEO of Niti Aayog. He added that the user data from the app would only be provided to those government officials who were directly in charge of containing the spread of the Coronavirus in India.

In an exclusive interaction with ANI, Kant informed that Aarogya Setu mobile application has been built to ensure privacy and security of personal information that was collected from people. It is based on "privacy-first by design" principle. "Aarogya Setu has a clearly defined protocol for access to data. National Informatics Centre (NIC) is the fiduciary of the data, and data is only shared with government officials directly involved in COVID-19 related medical and administrative interventions on a strictly need-to-know basis and limited in scope only to their direct work," said Kant.

Concerns over the Aarogya Setu app were raised when a French "ethical hacker" claimed to have accessed users' data and highlighted security bugs within the app that could have privacy ramifications.

Clarifying apprehensions that some users may have related to data security, Kant said: "When an individual provides his/her mobile number for registration, the Aarogya Setu server assigns an anonymous, randomized unique device identity number (DiD) and associates it with their mobile device. This pair – the mobile number and DiD – and other personal information is securely stored in a highly encrypted server."

After registration, the app asks for your name and mobile number (any name that you want to be called by, not your legal name). In addition, it asks for your age and gender (both have a direct correlation to COVID-19 impact), profession (to ensure people who are in essential services are proactively assisted), countries visited in the last 30 days, and willingness to volunteer in times of need.

"All contact tracing and location information that might have been uploaded to the Aarogya Setu server is permanently deleted 45 days from the date of upload if you have not tested positive for COVID-19 within that period of time. If you are infected, all contact tracing and location information pertaining to you are permanently deleted from the server 60 days after you are declared cured of COVID-19," added Kant.
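The retention rules Kant describes can be expressed as a simple deletion policy. The sketch below is a hypothetical illustration of that logic, not Aarogya Setu's actual code; the record schema and field names are assumptions.

```python
from datetime import datetime, timedelta, timezone

# Retention windows as described: 45 days from upload for users who never
# test positive; 60 days after a positive user is declared cured.
RETAIN_NEGATIVE = timedelta(days=45)
RETAIN_CURED = timedelta(days=60)

def is_expired(record, now):
    """record is a dict with 'uploaded_at', 'tested_positive', and an
    optional 'cured_at' field (hypothetical schema for illustration)."""
    if record["tested_positive"]:
        cured_at = record.get("cured_at")
        return cured_at is not None and now - cured_at > RETAIN_CURED
    return now - record["uploaded_at"] > RETAIN_NEGATIVE

now = datetime(2020, 5, 11, tzinfo=timezone.utc)
stale = {"tested_positive": False, "uploaded_at": now - timedelta(days=46)}
fresh = {"tested_positive": False, "uploaded_at": now - timedelta(days=10)}
cured = {"tested_positive": True,
         "uploaded_at": now - timedelta(days=120),
         "cured_at": now - timedelta(days=61)}
```

Note that under this logic, data for a positive user who is never formally declared cured would be retained indefinitely – exactly the kind of edge case the guiding questions above ask reviewers to probe.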

While the app requests users to share location, the app does not use location data for contact tracing. "The app has clearly defined and delimited how location information is used - only on an anonymous or aggregate basis and for the specific purpose of identifying hotspots so that proactive increased testing and sanitization of these locations can be done," he said, adding that the app does not continuously monitor any user's location.

According to government data to date, the Aarogya Setu app has registered about 96 million users since its launch on April 2. However, contact tracing data has been fetched for only 12,000 users who tested positive for COVID-19, constituting less than 0.1% of all users. "Unless a person turns COVID-19 positive, this information is never accessed or pushed to the server and is permanently deleted from the phone 30 days after it is collected," he said.

"The central feature of the app is location history and bluetooth-based contact tracing in the fight against the virus. The Bluetooth interaction between two phones on which the app is installed is performed anonymously, using a randomized and secure Device Identification Number (DID) that has been assigned to the devices at the time of registration," added Kant.

Along with the user's location history which is sampled sparingly (once every 30 minutes), this information is securely encrypted using the native key chain of the phone's operating system and is stored on the phone itself.

"The Aarogya Setu engine is designed to respect the privacy of COVID-19 positive patients. The backend of the App is integrated with ICMR database through an API, and information about patients who have tested COVID-19 positive is received in real-time. It is this ICMR database which is the source from which the App receives information about all COVID-19 positive cases," added Kant.

"It is only in the event there is a requirement for individual medical intervention that the anonymized personal information is re-identified. The team is exploring moving from a one-time DID to dynamically generated DIDs for every user, to further enhance privacy," added Kant in an interview to ANI.

Explaining significant predictions made by Aarogya Setu, Kant said, "In the last 6 weeks, Aarogya Setu App has emerged as a key technology solution aimed at combating COVID-19. Through this app, several potential emerging and hidden hotspots were identified. The engine predicted 130 hotspots across India at the sub-post office-level between April 13th to April 20th. Every forecasted hotspot has since been declared a real hotspot and acted upon by the health ministry."

(ANI)


Obscure, decade-old vulnerability finally unearthed in GLPI asset management app - The Daily Swig

Posted: 18 May 2020 03:12 AM PDT

John Leyden 15 May 2020 at 11:28 UTC
Updated: 15 May 2020 at 13:13 UTC

RCE achieved by smuggling web shell into WiFi network comment

Obscure vulnerability finally unearthed in GLPI asset management app

Security researchers discovered that the backup feature of GLPI, an open source IT asset management app, contained a long-dormant, critical vulnerability that lay undiscovered for more than 10 years.

All versions of GLPI released since the software was first put together in 2010 are vulnerable to a remote code execution flaw (CVE-2020-11060) that became exposed through a backup feature.

The vulnerability – discovered by a security researcher at French consultancy Almond – would have been tricky to exploit even before a recently issued patch.

Rather than focusing on its seriousness, however, the point of interest in the web security flaw stems from its arcane source and complexity.

Under the microscope

GLPI was put together by a French developer and is widely used by enterprises, especially in France.

During an engagement for one of Almond's clients, the researcher, who operates under the moniker @myst404, discovered a static encryption key vulnerability (CVE-2020-5248) in GLPI, as explained in an advisory from the consultancy.

The flaw arose because GLPI used a hard-coded, static cryptographic key to encrypt sensitive data.

This meant, for example, that the LDAP password used for external authentication is stored encrypted in the database with the static key.
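The weakness here is the key, not the cipher. The toy sketch below (deliberately simplified; it is not GLPI's actual encryption routine, and the key and password strings are made up) shows why a key hard-coded in publicly available source code offers essentially no protection: every installation shares it, so anyone who reads the source can decrypt any installation's stored secrets.

```python
import base64

# Toy cipher for illustration only -- NOT GLPI's actual algorithm.
HARDCODED_KEY = b"app-wide-static-key"  # shipped with the application source

def xor(data, key):
    # Repeating-key XOR: symmetric, so the same function encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt(plaintext):
    return base64.b64encode(xor(plaintext.encode(), HARDCODED_KEY)).decode()

def decrypt(ciphertext):
    return xor(base64.b64decode(ciphertext), HARDCODED_KEY).decode()

# What lands in the database looks opaque...
stored = encrypt("ldap-admin-password")
# ...but an attacker with the public source recovers the secret directly.
recovered = decrypt(stored)
```

The fix is the same regardless of cipher strength: derive a unique key per installation and keep it out of the source tree.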

Polyglot files and web shells

The cryptographic flaw – resolved by a recent version 9.4.6 update from GLPI – led @myst404 to look more closely at the security of the technology, an exercise that allowed him to uncover a far more interesting web security flaw.

As explained in a technical write-up, the Almond researcher discovered that "an arbitrary path and a hashed path disclosure can be abused to execute code on a GLPI host, by creating a PHP/GZIP polyglot file".

Almond's researchers developed an exploitation method for the somewhat obscure security flaw that uses a technician account to achieve remote code execution (RCE) through a "specially-crafted gzip/php web shell in a WiFi network comment".

The attack relies, in part, on a cross-site request forgery (CSRF) security flaw.
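The polyglot trick exploits a mismatch between parsers: gzip tools read only the file's structured fields, while PHP scans the entire byte stream for `<?php ... ?>` tags. The sketch below is a benign illustration of the file-format trick, not the Almond proof of concept; it builds a well-formed gzip file whose optional comment field (FCOMMENT, per RFC 1952) carries a harmless PHP tag.

```python
import gzip
import struct
import zlib

PHP_TAG = b'<?php echo "polyglot"; ?>'  # harmless payload for illustration

def gzip_with_comment(payload, comment):
    """Build a valid single-member gzip file whose FCOMMENT field carries
    arbitrary text. Decompressors skip the comment; a PHP interpreter
    scanning the raw bytes would find and execute the embedded tag."""
    raw_deflate = zlib.compress(payload)[2:-4]  # strip zlib header/checksum
    # 10-byte gzip header: magic, CM=deflate, FLG=FCOMMENT, MTIME, XFL, OS
    header = struct.pack("<BBBBIBB", 0x1F, 0x8B, 8, 0x10, 0, 0, 255)
    trailer = struct.pack("<II", zlib.crc32(payload), len(payload) & 0xFFFFFFFF)
    return header + comment + b"\x00" + raw_deflate + trailer

blob = gzip_with_comment(b"backup contents", PHP_TAG)
assert gzip.decompress(blob) == b"backup contents"  # still a valid archive...
assert PHP_TAG in blob                              # ...yet it contains PHP
```

This is why writing attacker-influenced text (such as a WiFi network comment) into a backup file that later lands somewhere the web server will execute can turn a benign feature into a web shell.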


The vulnerability has been patched with the release of GLPI v9.4.6.

In an advisory, GLPI's developers acknowledged the backup-related security flaw while downplaying its significance.

"An attacker can execute system commands by abusing the backup functionality," the developers state.

"Theoretically, this vulnerability can be exploited by an attacker without a valid account by using a CSRF."

The developers conclude: "Due to the difficulty of the exploitation, the attack is only conceivable by an account having maintenance privileges and the right to add WiFi networks."

Far from trivial

Almond's researchers agreed that the backup-related vulnerability is far from trivial to exploit.

"It was a bit hard to exploit, but since the release of the article/PoC there should be no problem to reproduce the issue, for an attacker having maintenance privileges and the right to add WiFi networks," @myst404 told The Daily Swig.

"The technician profile has these privileges and is created by default along with the default associated user tech (password: 'tech'). Of course, the admin GLPI account also has these privileges."

"However, the vulnerability is still very hard to exploit, but theoretically possible, for an attacker without any valid account," he concluded.

