Data Responsibility: Let’s not wait for another wake-up call
How would you feel if someone you’d only just met, after asking your name, asked how much money you have in your pocket or what your weekly earnings are? Depending on your tolerance for privacy intrusions, you’d probably feel anything from mild surprise to strong annoyance. If you chose to answer at all, you might also be left wondering how that information was going to be used.
This seemingly unlikely scenario is an everyday occurrence for millions of vulnerable people being registered and assessed for their eligibility to receive humanitarian assistance. Increasingly, alongside their personal details they are also asked to hand over their biometric data in exchange for access to aid. Often they have no choice: if they don’t provide this information, they may be passed over by humanitarian organisations.
The collection of data happens in all types of humanitarian programming. But in cash and voucher assistance (CVA), the amount of personally identifiable data collected, processed and shared by international organisations, private sector actors, local partners and others is higher than in other delivery modalities. To a large extent, CVA programmes collect more data in order to provide end-to-end traceability of funds from donors right through to recipients at the last mile. The pressure for transparency is greater than in other assistance modalities, given the widespread but unjustified misperception that CVA carries increased risk compared to in-kind assistance. Customer Due Diligence (CDD) and Know Your Customer (KYC) processes also drive the need to handle more sensitive personal data on recipients of aid.
So there are often legitimate and compelling reasons for these data requests, and those collecting data are becoming more aware of data collection principles and protocols. There is also no shortage of existing guidance, with some excellent resources available (see the list at the end of this article for a good selection). Yet in recent discussions with CaLP member agencies, including colleagues from the ICRC and the OCHA Centre for Humanitarian Data, many shared a concern that there is a significant gap in practical, hands-on resources to help practitioners reconcile the high moral ground of data responsibility policies with the day-to-day pressure to deliver the programme.
At the same time, we still do not have satisfactory answers to a number of questions. For example, how do we turn “informed consent” from a tick box on a screen during registration into a meaningful process, respectful of people’s rights to privacy and protection? How can we ensure that affected populations understand how their data will be used, and are reassured that it won’t be mistreated by the third parties with whom we work? And what should we do if we learn that the third party we have chosen to work with uses programme participant personal information to either sell them unwanted services, or worse still, refuse certain potentially beneficial services to this group?
To provide some answers, CaLP convened a Data Responsibility event on April 8th in Geneva, with generous support from the ICRC and excellent facilitation from the OCHA Centre for Humanitarian Data. The workshop brought together twenty participants representing 12 CaLP member agencies, and set out to understand the gaps in the existing guidance on data responsibility in CVA. Participants worked through three case studies (Somalia, Bangladesh and Yemen) to identify how CVA data flows between key actors in each response, where risks and bottlenecks exist, and what actions or products are needed to resolve them.
Although data responsibility is seen as important in the sector, the workshop participants agreed that it is still by and large considered somewhat theoretical and more relevant to the world of big tech – despite recent wake-up calls. Participants felt that preparedness should be prioritised now to avoid what Nathaniel Raymond has referred to as the ‘digital Goma’. It shouldn’t take a crisis for us to begin to define what pre-emptive action could look like when it comes to data responsibility in CVA and beyond, even if we know that the technology to manage data evolves far faster than the policy frameworks that govern its use.
An important step on this journey is to begin breaking down the ‘unknown unknowns’ into ‘known unknowns’. In other words, we need to develop methods and channels for reporting data-related incidents so that we can begin to understand harms as they occur both internally within an organisation and as a broader cash and humanitarian community. The Data Responsibility conversations convened by OCHA in recent months are a good starting point.
It is a myth that humanitarians, by virtue of their mandate, are immune from doing harm through their data practices. Workshop attendees agreed on the need to step up collective efforts to document real-life stories of harm to programme participants, to begin breaking down this idea and the culture of silence it perpetuates. CaLP could perhaps play the role of “custodian”, helping to collate, anonymise and share relevant stories. Good practice examples, such as the global data sharing agreement recently finalised between UNHCR and WFP, should also be shared more widely and effectively, to ensure that learning continues to flow between CVA actors in the response.
Although there were more UN agencies than NGOs in the room (and as organisers we were left wondering why more NGOs chose not to engage), those who did attend emphasised the need to bring all actors together to develop solutions. They suggested using regional and country Cash Working Group platforms, and finding ways to bring in perspectives from other, non-humanitarian CVA actors. One idea was to include a data responsibility risk assessment as an integral part of the Humanitarian Programme Cycle’s overall protection risk assessment.
So we made a good start in Geneva – bringing the cash and data conversations together and beginning to define what Doing No Digital Harm could look like in this space. CaLP will continue to engage in this area, and is looking forward to the conversation at Wilton Park later in May. CaLP is also working to set up a Digital Cash Working Group so that members (both from the humanitarian and the private sector) have a platform to share their experiences and navigate some of the challenges specific to digital cash.
We’d love to hear from you – tell us what role you think CaLP should play in helping to move the dialogue forward around data responsibility and digital cash.
List of resources:
- Data Starter Kit, Electronic Cash Transfer Learning and Action Network (ELAN)
- Protecting Beneficiary Privacy, CaLP
- Handbook on Data Protection in Humanitarian Action, ICRC
- Working Draft OCHA Data Responsibility Guidelines, UNOCHA
- Mapping and Comparing Responsible Data Approaches, NYU GovLab and Leiden University Centre for Innovation
- Building Data Responsibility into Humanitarian Action, OCHA Think Brief
- The Signal Code: A Human Rights Approach to Information During Crisis, Harvard Humanitarian Initiative
- The Signal Code: Ethical Obligations for Humanitarian Information Activities, Harvard Humanitarian Initiative
- IASC Policy on Protection
- The EU General Data Protection Regulation
- Responsible Data Management training pack, Oxfam
- Guide to Personal Data Protection and Privacy, WFP
- Policy on the Protection of Personal Data of Persons of Concern, UNHCR
- Data Responsibility Policy, 510 (Netherlands Red Cross)
- Data Protection, Privacy and Security for Humanitarian and Development Programs, World Vision
- Data Security Guidance: Protecting Beneficiaries, USAID
- Privacy Impact Assessment of UNHCR cash interventions, UNHCR
- Conducting Mobile Surveys Responsibly, WFP
- Select Digital Payments Resources, Mercy Corps