Location data is essential for solving environmental and humanitarian challenges. It differs from other types of data in its focus on a location reference: for example, points (coordinates) marking disease occurrences or areas (boundaries) delineating floods. Because humanitarian and environmental concerns are interconnected, location data is more than just a dot on a map; it is also a proxy for socio-demographic, political, and environmental factors. Location data therefore raises unique privacy challenges, such as the ease of tracking people's movements and making inferences about their "social, economic or political behaviour" (1). Furthermore, the growing use of automated decision-making systems built on location data challenges fairness and justice: biases within these systems can lead to inequitable distribution of resources.
In this blog post, we discuss how location data has become a battleground where justice and privacy are traded off, viewed through Rawls' theory of distributive justice in the humanitarian domain. The conflict arises because location data is increasingly used to decide who receives aid, while at the same time posing privacy challenges for both individuals and groups. This matters all the more given the number of humanitarian disasters that specifically endanger vulnerable communities, coupled with the projected increase in such disasters due to climate change.
Unpacking Justice: Is This What the Vulnerable Signed Up For?
Data-driven decision-making is becoming the norm in humanitarianism, promising justice via fairness: aid and resources are to be distributed equitably by identifying and resolving the injustices that amplify community vulnerability, in line with Rawls' principles of justice as fairness. The use of approaches such as vulnerability science in humanitarian disaster risk reduction and management reflects the same ambition. But numbers can lie, especially when data is collected and analyzed in a biased manner through selective sampling, misinterpretation, and respondents' subjectivity.
For example, in Yemen (2), authorities (including governmental bodies, local leaders, and donors) acted as gatekeepers, deciding where and how data was collected. To meet leaders' expectations, data was frequently adjusted, leaving some high-risk areas as data deserts. Another concern was the misrepresentation of social groups, such as the gender imbalance in the data: one assessment team based its findings on household surveys and observations of 53 individuals, of whom only 7 were women. Local people, moreover, knew that organizations and donors would make decisions based on their interview and survey responses, and so answered in ways they believed would be most beneficial to themselves; combined with the underrepresentation of women among respondents, this made the resulting picture unjust towards women. A simple representativeness check, as sketched below, can surface such imbalance before the data informs aid decisions.
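A minimal sketch of such a check in Python, assuming for illustration a roughly 50/50 gender split in the affected population (the source does not give the actual share), compares each group's slice of the sample to its share of the population:

```python
# Representativeness check using the figures cited above:
# 7 women among 53 respondents. The 50/50 population benchmark
# is an assumption for illustration, not a figure from the Yemen study.
surveyed = {"women": 7, "men": 46}
population_share = {"women": 0.50, "men": 0.50}  # assumed benchmark

total = sum(surveyed.values())
for group, count in surveyed.items():
    sample_share = count / total                    # group's slice of the sample
    ratio = sample_share / population_share[group]  # 1.00 = proportional
    print(f"{group}: {count}/{total} = {sample_share:.0%} of sample "
          f"(representation ratio {ratio:.2f})")
```

A representation ratio far below 1.00, as for women here (about 0.26), is an early warning that the sample cannot speak for that group.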
For aid efforts to be truly effective and just, we need a comprehensive and unbiased view. We must consistently ask: does the 'justice' we strive to deliver truly align with the needs of those it is meant to serve?
The Hidden Privacy Cost: How Much Privacy is Lost?
Location data can unveil demographic characteristics by highlighting the routine behaviors of social groups. When combined with other datasets, location data can hint at a person's health status, religious beliefs, political leanings, and social connections, potentially leading to undue profiling and surveillance. While someone might willingly share their whereabouts, they might not realize the extent of the information they're disclosing. This oversight can have unintended consequences.
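To make concrete how such inferences arise, below is a minimal, hypothetical sketch in Python: it joins one person's location pings with a small points-of-interest list and flags repeated proximity to places with sensitive semantics. The coordinates, place names, and the 100 m threshold are all invented for illustration; real inference pipelines combine far richer data.

```python
import math
from collections import Counter

# Hypothetical week of (day, hour, lat, lon) pings from one person.
trace = [
    ("Mon", 9,  52.0116, 4.3571),   # near "clinic"
    ("Wed", 9,  52.0116, 4.3571),   # near "clinic" again
    ("Fri", 13, 52.0080, 4.3640),   # near "place_of_worship"
    ("Sat", 20, 52.0200, 4.3500),   # elsewhere
]

# Hypothetical points of interest with sensitive semantics.
pois = {
    "clinic":           (52.0117, 4.3570),
    "place_of_worship": (52.0081, 4.3641),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Count pings within 100 m of each sensitive place.
visits = Counter()
for _, _, lat, lon in trace:
    for name, (plat, plon) in pois.items():
        if haversine_m(lat, lon, plat, plon) <= 100:
            visits[name] += 1

# Repeated proximity already suggests a pattern, e.g. recurring
# clinic visits may hint at a health condition.
for name, n in visits.items():
    if n >= 2:
        print(f"Recurring visits near {name}: {n} -> possible sensitive inference")
```

Even this toy join illustrates the mechanism: the person never discloses a health condition, yet recurring proximity to a clinic invites exactly that inference.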
In practice, aid workers aiming to assist vulnerable groups might unintentionally place them at greater risk. In conflict areas, a mere geolocation tag could turn a community into a target. During the COVID-19 pandemic, certain marginalized communities experienced increased discrimination and stigmatization due to the misuse and misinterpretation of health data (3).
Quantifying the true cost of these privacy breaches is complex, and the full cost may well exceed what we can measure. We therefore need not only more transparent methods of data collection but also a vigilant approach to data storage and processing.
Beyond Individual Privacy: Are There Other Values at Stake?
In data-sharing, our immediate concern is often individual privacy. But is that the only cost? Beyond it lies a significant security risk: in the wrong hands, location data can become a weapon. It is not just about the malicious use of individual routines; it can also jeopardize collective safety.
A case in point is the 2022 cyber-attack on the International Committee of the Red Cross (ICRC), which exposed the data of more than 515,000 individuals, including victims of war and natural catastrophes (4). The breach compromised their privacy and posed direct threats to their safety, with particularly vulnerable groups, such as unaccompanied minors, at heightened risk.
Moreover, the costs do not end with safety. Public trust erodes when organizations lack transparency or appear to exploit location data, and people then become hesitant to share data, even when it is crucial for humanitarian efforts. In short, the conversation around location data should not be limited to privacy alone; it is a complex web in which values like privacy, safety, and trust are deeply interconnected.
The Cost of "Not Paying": The Consequences of Withholding Data
In the quest for humanitarian aid and fairness, both humanitarians and communities often grapple with the cost of privacy. In conflict or disaster-stricken areas, location data is instrumental for timely interventions. However, some groups may choose to withhold their location data, even knowing the potential consequences, in order to protect their privacy. This absence of data can hinder rapid response efforts, leaving vulnerable populations at heightened risk and invisible to humanitarian aid.
Consider the plight of refugees. In countries marred by strong xenophobic sentiment, refugees often find themselves between a rock and a hard place. While unauthorized border crossings and concealing personal data might seem like the only way to evade discrimination, these actions carry their own risks (5): refugees not only face physical harm during perilous escapes but also mental distress from the constant fear of discovery and anti-refugee hostility.
In humanitarianism, there is a clear tension between recognition and privacy: people must be visible enough to receive aid, yet invisible enough to stay safe (6). In the delicate balance between revealing and concealing our location data, we seem destined to relinquish something, often more than we initially perceive.
Implications: Traversing the Web of Values
In the digital realm, our locations become intricate intersections of privacy, justice, and other social values. Our choices concerning our location data are not just binary decisions of sharing or withholding; they are situated within a complex web of considerations that reflect our evolving social ethos. Every piece of shared location data can bolster humanitarian efforts, aid in crisis management, or even enhance urban planning. Yet, the same data, when misused, can infringe on individual rights, leading to undue surveillance or even discrimination.
The real challenge lies in navigating this delicate balance of values. As we tread this path, one must wonder: in our pursuit of one value, are we inadvertently sacrificing others, diminishing their importance in the broader scheme of things?
References
1. Georgiadou, Y., de By, R.A. and Kounadi, O., 2019. Location Privacy in the Wake of the GDPR. ISPRS International Journal of Geo-Information, 8(3), p.157.
2. Paulus, D., de Vries, G., Janssen, M. and Van de Walle, B., 2023. Reinforcing data bias in crisis information management: The case of the Yemen humanitarian response. International Journal of Information Management, 72, p.102663.
3. Strassle, P.D., Stewart, A.L., Quintero, S.M., Bonilla, J., Alhomsi, A., Santana-Ufret, V., Maldonado, A.I., Forde, A.T. and Nápoles, A.M., 2022. COVID-19–related discrimination among racial/ethnic minorities and other marginalized communities in the United States. American Journal of Public Health, 112(3), pp.453-466.
4. International Committee of the Red Cross (ICRC), 2022. 'Hacking attack on Red Cross exposes data of 515,000 vulnerable people'. The Guardian, 19 January.
5. Kosher, R., 2019. 'Technology and Privacy in Refugee Aid'. RightsViews, 28 May.
6. Weitzberg, K., Cheesman, M., Martin, A. and Schoemaker, E., 2021. Between surveillance and recognition: Rethinking digital identity in aid. Big Data & Society, 8(1), p.20539517211006744.
About the authors
Yunxuan Miao is a Ph.D. candidate in the Section of Ethics and Philosophy of Technology in the Department of Values, Technology and Innovation at TU Delft. She works on value change in socio-technical systems, with a particular focus on battery technologies.
Brian is a Ph.D. candidate at the Faculty of Geo-information Science and Earth Observation, University of Twente. His Ph.D. research, titled Accountable Geo-intelligence, aims to understand how to use geodata and AI responsibly by ensuring privacy and mitigating biases in the context of humanitarian action.