Getting data protection right

At Africa’s Voices, we deal with unstructured digital data from diverse and often marginalised members of East African societies. We are able to reach these citizens through interactive media and mobile technology, and analyse their input to help media and development organisations understand, engage with, and respond to their needs.

This puts us in the unique position of being able to give a voice to the often under-represented; it also puts us in a position of responsibility towards those who give us access to their personal data, which includes spontaneous opinions and demographic information.

One challenge we face is striking a balance between legal, ethical, and practical concerns associated with data protection. In this post, we want to go beyond technical security and encryption, to give a broader sense of how we protect the interests of our participants.

In being candid about what we do and the challenges we face, we hope to open up a dialogue about best practice in terms of the specific contexts and participants we work with. Such discussions should help us to make more informed, positive decisions about how we handle digital data for development.

Data Protection Challenges

Various legal and ethical guidelines are available, from those provided by the Economic and Social Research Council (ESRC) to those set out in the Data Protection Act 1998. They are worded differently, but their core messages overlap widely. Let’s start with one of the key principles for any organisation wishing to collect personal data:

1. Do No Harm: Avoid doing or enabling (physical, mental, financial, etc.) harm.

We are always thinking about what impact our insights will have on citizens to maximise the positive effects on their lives. This involves keeping track of how our insights are being used by client organisations and promoting respect for participants’ dignity and rights.

It remains a challenge to reach and represent all citizens: some do not have access to technology, are sceptical, or are illiterate. For some, it is not culturally acceptable to participate in public forums. Additionally, the data we deal with is often digitised, and therefore removed from its context. We are constantly refining our data collection and analysis processes to be inclusive and to do the most good, using a mix of approaches to broaden our scope and gain richer insights.

Currently, we work closely with organisations to shape the way they work with the data, and then follow up on how the information is incorporated into their decisions, all the while considering whether these decisions will have a positive impact on the well-being of the people they want to serve.

2. Data Integrity: Check, document, and back up data.

The insights we draw from analysing the data are only valuable if data integrity is maintained. There are two aspects to data integrity: making sure the data is protected, and making sure it is accurate. We ensure that the data we collect is protected and accurate by:

a. Checking the data at key points in the data collection, analysis, and reporting chain (e.g., checking for changes at upload; a sketch of such a check follows this list); and

b. Maintaining a high standard of security to protect the data from internal leaks and interference from others.
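To make the first of these points concrete, here is a minimal sketch (in Python) of the kind of change-detection check mentioned above: record a checksum when a dataset is exported and verify it again after upload and retrieval. The data and names are purely illustrative, not a description of our actual pipeline.

```python
import hashlib

def sha256_of_bytes(data: bytes) -> str:
    """Return a SHA-256 digest that can be recorded at export and re-checked later."""
    return hashlib.sha256(data).hexdigest()

# Record the digest when the raw export is produced, then recompute it on the
# copy retrieved from storage: a mismatch means the data changed along the way.
export = b"sender,message\nid_01,Water is a big problem here\n"   # hypothetical SMS export
recorded = sha256_of_bytes(export)

retrieved = export   # in practice, the copy re-downloaded from storage
assert sha256_of_bytes(retrieved) == recorded, "Data changed between upload and retrieval"
```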

Maintaining high security is not always straightforward; it needs to be considered in terms of particular risks and priorities such as the likelihood of losing data versus having data surveilled. A number of decisions need to be made, including:

  • Location: Where should data be stored, given the type of data and potential vulnerabilities? What are the implications of the data for the people you’re working with?
  • Ultimate ownership of storage: What are the pros and cons of third-party storage? Do you have local data storage?
  • Back-ups: No matter how data is stored, it is essential to back it up. What level of back-up is required?
  • Access: What are the necessary levels of access?
  • Data lifecycle: What is the data lifecycle? Who will manage the data until the end of its life?
  • Saving data: What is the saving policy? If downloaded, is data encrypted? (A sketch of one approach follows this list.)
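On the last of these questions, the sketch below shows one way a downloaded copy of data could be encrypted at rest, using the third-party Python cryptography package. It is an illustration under simplified assumptions, not our actual setup; in practice the key would be loaded from a managed secret store and kept well away from the data itself.

```python
from cryptography.fernet import Fernet   # third-party package: pip install cryptography

# Encrypt the local copy as soon as it lands on disk, and store only the ciphertext.
key = Fernet.generate_key()               # in practice, load this from a managed key store
fernet = Fernet(key)

downloaded = b"sender,message\nid_02,Education matters here\n"   # hypothetical downloaded export
encrypted = fernet.encrypt(downloaded)                            # keep this, not the plaintext
assert fernet.decrypt(encrypted) == downloaded                    # only key holders can read it back
```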

Additionally, Africa’s Voices tries to work only with organisations whose practices and codes of ethics we feel we can trust, and which we ask to see before taking on a project.

3. Privacy and Anonymity: Protect the identity of participants.

Many of the communities we interact with have their own priorities and may not care or even know about privacy rights. Nevertheless, we follow rules regarding access control, tools for communication, encryption, and anonymisation early in the data flow. In aid of this, we have developed and continue to refine our data privacy framework with guidelines and a checklist for how data is collected, used, and retained.
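To give a flavour of what anonymisation early in the data flow can look like, here is an illustrative sketch that replaces raw phone numbers with a stable keyed hash before any analysis or sharing. The field names and key handling are assumptions made for the example, not a description of our production systems.

```python
import hashlib
import hmac

def pseudonymise(phone_number: str, secret_key: bytes) -> str:
    """Replace a raw phone number with a stable keyed hash.

    The same number always maps to the same pseudonym, so responses can be linked
    across a project, but the mapping cannot be reversed without the key.
    """
    return hmac.new(secret_key, phone_number.encode("utf-8"), hashlib.sha256).hexdigest()

SECRET_KEY = b"replace-with-a-project-specific-secret"   # kept separate from shared datasets
record = {"phone": "+254700000000", "message": "Water is a big problem here"}   # hypothetical
record["participant_id"] = pseudonymise(record.pop("phone"), SECRET_KEY)        # raw number removed
```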

Sometimes a compromise needs to be made between protecting data and engaging citizens. For example, asking participants not to text their names into radio stations for privacy reasons can disrupt their normal freedoms to engage in a public discussion and to be recognised in their community; taking this away discourages participation, which reduces the opportunities for gaining insights from the data. To deal with this tension we:

  • Take careful steps to anonymise the data after it has been collected, which allows it to be shared whilst preserving privacy.
  • Constantly ask ourselves what makes information sensitive and to whom. Data that is not sensitive to us might be sensitive to the participant. Data may also become sensitive in the future.

We also cannot be naïve about data anonymisation: aggregated data can be de-anonymised by piecing together enough information to identify individuals or groups of individuals who might then be unfairly targeted. We deal with this by only sharing data in a secure and controlled environment, whilst sharing non-sensitive and less detailed anonymised data in more public environments.
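One common safeguard against this, sketched below purely as an illustration, is to suppress small groups before any aggregated counts leave the controlled environment. The threshold and field names here are assumptions for the example, not our actual policy.

```python
from collections import Counter

MIN_GROUP_SIZE = 10   # hypothetical threshold; the right value depends on the dataset and context

def safe_counts(records, group_key):
    """Aggregate counts by a demographic key, withholding any group small enough
    that it could single out individuals when combined with other information."""
    counts = Counter(r[group_key] for r in records)
    return {group: n for group, n in counts.items() if n >= MIN_GROUP_SIZE}

# Hypothetical example: both districts fall below the threshold, so nothing is
# released publicly and the full breakdown stays inside the secure environment.
records = [{"district": "Banadir"}, {"district": "Banadir"}, {"district": "Bay"}]
print(safe_counts(records, "district"))   # -> {}
```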

4. Transparency and Informed Consent: Be fair and open about data collection, and ensure voluntary participation.

There should be specific, legitimate, and lawful grounds for collecting and using personal data, and this data should be handled in ways that participants could reasonably expect.

This can be a challenge in a development context because the target audience might not be aware of the implications of their data being collected and used, might have little awareness of their digital rights, and might have little power over the process.

Interactive radio shows, SMS, and social media limit how much information can be provided to participants, and therefore affect the extent to which participants are in a position to make informed choices or to provide consent. Further, it can be unclear to us if a participant has understood what their data will be used for, and therefore if they are giving explicit consent for us to collect and analyse it.

To help participants make an informed choice, Africa’s Voices provides mechanisms for consent and opting out. In media forums, we do this first by communicating how the audience’s responses (e.g. SMS) will be used. Second, in follow-up SMS surveys, we repeat the purpose of the research and include an ‘opt-out’ message.
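As a rough sketch of what honouring an opt-out can look like in practice, the example below drops every message from any sender who has texted an opt-out keyword, not just the opt-out message itself. The keyword and message structure are assumptions made for illustration; a real deployment would also accept local-language variants.

```python
OPT_OUT_KEYWORD = "stop"   # hypothetical keyword; real deployments accept several variants

def partition_by_consent(messages):
    """Split incoming SMS into opted-out senders and messages kept for analysis.

    A sender who texts the opt-out keyword is excluded from the dataset entirely.
    """
    opted_out = {m["sender"] for m in messages if m["text"].strip().lower() == OPT_OUT_KEYWORD}
    kept = [m for m in messages if m["sender"] not in opted_out]
    return opted_out, kept

messages = [
    {"sender": "id_01", "text": "Education matters for girls"},
    {"sender": "id_02", "text": "STOP"},
    {"sender": "id_01", "text": "Thank you"},
]
opted_out, kept = partition_by_consent(messages)   # id_02 and their messages are excluded
```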

There are tensions here too, however, as some organisations we work with request that full disclosure of the research is not given – they need to remain anonymous for security reasons. This is one example of where our black-and-white data protection policy enters a grey area, and we must take other measures to ensure informed consent is obtained.

5. Veracity and Objectivity: Tell the truth and avoid bias.

Once data is collected, we think carefully about how we represent participants’ voices in the way we analyse the data and report insights. This means avoiding bias (cultural, self-serving, statistical, etc.) and thinking about how the data is channelled and how this shapes what gets reported.

For example, does the legal framework channel the data in a particular way? Strict adherence to a centralised regulatory legal system that extends its reach geographically can have the undesirable effect of limiting citizens’ ability to discuss issues important to them.

It may be tempting to focus only on the broader societal benefits of such research, but Africa’s Voices is committed to our responsibility to reflect on the issues for each community we work with. This means ensuring that the voices of African citizens, and not just our client organisations, are heard.

The On-going Challenge

It is easy to start feeling overwhelmed by inconsistent or impractical advice, especially if security breaches or lost data do not seem likely. But data protection matters: it protects your participants, as well as your own reputation and that of the organisations you work with, because digital rights are human rights.

It can be tempting for those belonging to larger institutions to rely heavily on ethical review boards and to stop thinking about the ethical issues as soon as approval is obtained, forgetting about their on-going responsibility.

Rather than ticking boxes, Africa’s Voices is continually thinking about data protection and respecting the rights and dignity of participants.

Our data protection policy is always under development. We therefore encourage input from our readers with regards to how we handle digital data as a small start-up charity.

While it can be hard to know where to start or whose advice to follow, hopefully, by opening up a dialogue, we have taken our first steps in the right direction.

Africa’s Voices presented our data protection and privacy policy at the Centre for Research in the Arts, Social Sciences and Humanities at the University of Cambridge
