Estimating a population’s mobility can be useful for policymakers, particularly during times of crisis such as a natural disaster, disease outbreak, or conflict. After an earthquake, governments or NGOs may need to know where people in need of assistance are moving, and during a pandemic, governments may look to model the spread of disease and track compliance with lockdown orders. Policymakers are increasingly turning to mobility statistics generated from mobile phones, satellites, and other data sources to aid in humanitarian relief. Mobility matrices, which quantify population movement between regions over a fixed period of time, are particularly useful, but personal mobility data can be sensitive. Privacy protections are vital for humanitarian initiatives and should be balanced with decision-makers’ ability to provide timely aid to those most in need.
Differential privacy is considered the gold standard for privacy-preserving statistics, making it a prime candidate for producing private mobility matrices. However, because these methods add carefully calibrated noise to statistics, they induce a “privacy-accuracy tradeoff.” This tradeoff is well-known, but is often characterized in amorphous and unhelpful ways: for a policymaker attempting to deliver aid to people most in need, what matters is not data accuracy, but decision accuracy.
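To make the tradeoff concrete, here is a minimal sketch of how calibrated noise might be added to a mobility matrix of region-to-region trip counts. It uses the standard Laplace mechanism and assumes each person contributes at most one trip (sensitivity 1); the mechanism, sensitivity analysis, and parameters used in the actual study may differ.

```python
import numpy as np

def privatize_mobility_matrix(counts, epsilon, seed=None):
    """Add Laplace noise to a matrix of region-to-region trip counts.

    Illustrative sketch: assumes each person contributes at most one
    trip, so the L1 sensitivity is 1 and the noise scale is 1/epsilon.
    """
    rng = np.random.default_rng(seed)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon, size=counts.shape)
    # Round and clip, since trip counts cannot be negative.
    return np.clip(np.round(counts + noise), 0, None)

# Hypothetical trip counts between 3 regions over one week
true_counts = np.array([[120, 40, 5],
                        [35, 200, 10],
                        [8, 12, 90]])
private_counts = privatize_mobility_matrix(true_counts, epsilon=1.0, seed=0)
```

Smaller values of `epsilon` give stronger privacy but noisier counts, which is exactly the tradeoff the project sets out to quantify in decision terms.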
This project aims to define and calculate how differential privacy impacts intervention accuracy: the accuracy and effectiveness of policy decisions based on private mobility data rather than standard (non-private) data. To do this, researchers rigorously document the implications of using private mobility statistics in two types of humanitarian settings and provide concrete guidance for effectively deploying differentially private algorithms in practice.
Researchers assessed responses to (1) a pandemic and (2) a short-term shock requiring aid delivery, using both non-private and private call detail record (CDR) data. CDR data provide the time and location of phone calls, enabling the team to construct mobility matrices. Under assumptions about how a government would respond to each scenario, researchers assessed the likelihood that using private data would lead to a different decision. In one case study, the research team used data from Afghanistan in 2020 and assumed anti-contagion policies such as lockdowns would be put in place whenever 20% of a region became infected. They also looked at settings where a short-term shock necessitated aid delivery, including the battle of Kunduz in 2015 and the Lake Kivu earthquake in 2008. In all three cases, researchers found that intervention accuracy remained high, between 72% and 99%, when using differentially private mobility matrices.
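The comparison above can be sketched as a decision-agreement check: apply the same threshold rule to the true and the private statistics, and count how often the decisions match. This is an illustrative simplification, with hypothetical numbers; the study's actual decision rules and accuracy metrics are more involved.

```python
import numpy as np

def intervention_accuracy(true_stats, private_stats, threshold=0.20):
    """Fraction of regions where the private statistic triggers the
    same decision (e.g., a lockdown) as the true statistic.

    Illustrative sketch: a decision fires when a region's statistic
    meets or exceeds the threshold (here, 20% infected).
    """
    true_decisions = np.asarray(true_stats) >= threshold
    private_decisions = np.asarray(private_stats) >= threshold
    return float(np.mean(true_decisions == private_decisions))

# Hypothetical infection rates per region, with and without noise
true_rates    = np.array([0.25, 0.10, 0.18, 0.30])
private_rates = np.array([0.27, 0.08, 0.21, 0.29])
print(intervention_accuracy(true_rates, private_rates))  # → 0.75
```

Here the noise flips the decision in one of four regions, so intervention accuracy is 75%, even though every individual estimate is slightly off: what matters is whether the decision changes, not the size of the numeric error.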
The tradeoff between privacy and accuracy is well known, but not easily translated into practice. NGOs want to send humanitarian aid to the locations most in need while providing strong privacy protections. Traditional differential privacy calculations let them bound the worst-case privacy loss an individual could experience, but the practical implications of these numbers are understood only by a small group of privacy experts. Intervention accuracy, on the other hand, allows NGOs to decide on an acceptable margin for error while providing the strongest privacy protections possible under those constraints. By reframing statistical accuracy as intervention accuracy, differential privacy becomes more actionable.
Interested readers can consult the working paper for more details about the privacy method developed, the experiments conducted, and the impact of different privacy parameters.