Afghanistan: Humanitarian work at risk after Taliban took control of ‘keys to the server room’ with biometric data

From ethical dilemmas on data security to worst-case scenarios unfolding in real time – the Taliban’s rise to power in Afghanistan is spurring urgent concern about the safety of data that aid groups have collected over the past 20 years.

Data protection experts warn that aid groups must quickly review and safeguard sensitive information on Afghans who have received emergency relief and other services. Humanitarian agencies are among those that have tracked, stored, and shared data linked to millions of Afghans – including precise biometric data like fingerprints or iris scans.

Crucially, some of this data has been collected by the now-deposed Afghan government – raising concern that the Taliban have inherited databases and technology that could be used to identify people linked to previous regimes or international forces, or members of persecuted groups who have received aid.

“The Taliban have been given the keys to the server room, so to speak,” said Karl Steinacker, a former official with the UN’s refugee agency, UNHCR. He now advises civil society organisations on digital identity.

The New Humanitarian spoke with Steinacker and Katja Lindskov Jacobsen, a senior researcher at the University of Copenhagen’s Centre for Military Studies, to unpack the issues. In the interview, parts of which are excerpted below, they discussed the potential risks, why aid groups collect so much data in the first place, and the right to be forgotten.

It’s unclear exactly how much data aid agencies have collected and shared over the years, or what the Taliban have access to now, which underscores the need for a swift review, Steinacker said.

But aid groups or international donors have had their hands in an enormous range of data through two decades of programming: registration for millions who received food aid or mobile cash transfers; digitised government identity cards linked to biometric data; or iris scans for refugees in neighbouring Pakistan, for example.

UNHCR did not respond to a request for comment. Other agencies, including the UN's migration agency (IOM) and the World Food Programme, said they were not able to respond to questions before publication.

The Taliban have promised an amnesty, and a spokesperson said there is no “hit list”. But rights groups already report reprisal killings and threats.

In early August, the Taliban seized US military biometric devices that might help uncover people who worked with international forces, The Intercept reported. And when Kabul fell, Taliban soldiers searched for files at the national intelligence agency and the communications ministry, The New York Times reported.

Today’s risks should underscore wider data privacy questions for the entire aid sector, which has often embraced the benefits of digitised records and biometrics while overlooking the dangers, according to researchers who study data security in aid settings.

How long is data stored? Who else has access? Is there adequate consent from people receiving aid – often newly displaced with few other options? Are policies future-proofed to protect against unforeseen risks? The Taliban’s rapid takeover in Afghanistan has brought these and other questions to the forefront again.

In June, a Human Rights Watch investigation detailed how biometric data UNHCR collected from Rohingya refugees was shared with the country they fled, Myanmar.

“With biometrics, the concern is, you can take a new name, but you can’t really take a new iris,” said Jacobsen, whose research often focuses on humanitarian interventions and technology, including biometrics.

Her 2015 study highlighted potential flaws with a first-of-its-kind UNHCR biometrics programme for Afghan refugees in Pakistan. The system used iris scans – stored anonymously – to determine whether returning refugees had already received aid. Jacobsen’s research warned that “false positives” – where a person’s iris is erroneously found in the system – could essentially deny aid.

It was a programme Steinacker supported as UNHCR’s head of global registration in 2004. Now, both he and Jacobsen are calling for an urgent review of data in Afghanistan, and for a deeper re-evaluation of the use of biometrics across the aid sector.

The New Humanitarian: What should be the immediate priority for aid agencies when it comes to evaluating data security in Afghanistan? How much – and what kinds – of data are we talking about?

Karl Steinacker: What is important is that the agencies sit together and assess first what data there is, and where it is. Every big organisation would say: “There’s no need to worry. We have data security in place.” But is that so? What about the data which is in common databases: a child protection database, let’s say, where you can maybe trace single mothers, or victims of sexual violence – things that are quite delicate issues.

The other issue is the data used through commercial service providers – cash programmes in particular. Were they cash programmes for very specific vulnerable groups who might be targeted by the Taliban, because they were war veterans, or sexual minorities – whatever it is.

But this process has to start somewhere. Someone has to say, “Since we haven’t done what we should have done before we implemented these programmes, let’s now retroactively look at what we have, what can be accessed by the Taliban, and how can we mitigate the problem?”

What should have happened is that the data protection impact assessment was done before they started these programmes. But we know from experience that no humanitarian agency does these impact assessments.

  • The New Humanitarian report