Whenever we collect data, we follow our data protection policy, which was developed with support from DataShift and The Engine Room. Our motivation for high data protection standards is respect for the participants who share their opinions with us, and sensitivity to their personal information.
We inform audiences of the purpose of gathering their opinions and how their responses will be used. For radio projects, the radio host reads out this information as part of their script. To obtain consent to use their data, we offer every participant the opportunity to ‘opt out’ of having their answers included in the research; for example, they can send an SMS with ‘delete’.
Despite our efforts to ensure participants know how their data is being used, we can never be sure that all of them fully understand the research process. Furthermore, there are instances when we cannot reveal the exact purpose of the research. This may be due to the potential impact on people’s answers (e.g. biasing them) or, more seriously, for security reasons (such as a radio station being linked to a negatively perceived international NGO).
Our findings support our partners in a range of ways: improving their communications for development, serving as baseline and formative research, measuring their impact, and gathering citizen feedback for monitoring their programmes. Here are a couple of direct quotes from organisations we’ve worked with that illustrate how our insights have been used:
Africa’s Voices insights have fed into a larger baseline study, and will be used to inform the advocacy and campaigning approaches in different regions. This will also be used to inform dialogue with County and National Governments. After working with Africa’s Voices, we have definitely seen new opportunities to gather and incorporate citizens’ voices into our programmes.
Working with Africa’s Voices has opened a new world of insight and potential, hiding in data we already had but couldn’t interpret. To be able to decipher the collective meaning within our audience correspondence is like listening at the keyhole of a giant conversation. The new clarity this has given us, deepened and enhanced by the skilled professional support of the Africa’s Voices team, has helped us refine our purpose and our methods, and given us a powerful new account of the impact of our work.
Africa’s Voices’ ability to stimulate inclusive dialogue and collect opinions of marginalised communities in local languages through simple technologies has been invaluable in informing how we engage with our target beneficiaries. The Africa’s Voices approach – where they combine the basic mobile phone “mulika mwizi” and community radio – has helped us gather knowledge levels and opinions of communities towards oil and gas exploration in Kenya. Africa’s Voices analysis has also helped us design targeted messages, which we are then able to track using their approach.
This is a question we’re often asked. With any innovative research approach, especially one that gathers and analyses digital data, the credibility of the data and insights is often scrutinised.
We start by acknowledging that Africa’s Voices data is skewed: male, younger, and more educated people are more likely to participate in public forums, including radio and social media discussions. This reflects the realities of the audiences and where social influence and power lie in their society.
However, our approach also reaches and engages people at the bottom of the pyramid, so that their voices are also heard. We can identify biases in participation (e.g. overrepresentation of men) and take them into account when drawing conclusions about beliefs of different groups.
Our approach values the richness of diverse voices over statistical generalisations, so representativeness is not the main criterion to assess the validity of our findings. People who choose to participate in the discussions are heterogeneous, making it possible for us to identify collective ideas from different social groups, and how they change over time.
The credibility of our findings rests on the robustness of our methodology, coupled with meaningful data gathered in a real context.
For more on this topic, take a look at this segment of a presentation given by our Head of Research, Claudia Lopes.
We are careful not to be extractive, and do not want to gather data from citizens without giving back findings. As well as seeing this as part of respecting participants, we think that closing the feedback loop fosters accountability and supports greater citizen engagement in the future. Therefore, we are constantly seeking new, accessible, and engaging ways to feed back knowledge and insights to citizens.
The way we feed back findings depends on the nature of the project and the audience. For radio projects, we air an additional show at the end of a series that shares insights, addresses questions or misconceptions that have been raised, and informs the audience of how these findings will be used by our partner organisation. For social media projects, we might create an infographic that summarises key findings and share it online.
The approach that AVF has developed is highly scalable. Developing tools for low-resource African languages is a cumulative process: the more work that is done, the stronger the analysis becomes. This leads to an ever-growing base of linguistic and domain-specific resources, which can then be reused in future projects with our clients on a larger scale.
It is possible to include more channels of engagement (e.g. more radio stations, or have multiple channels such as social media) and deliver a project over a longer period of time. We can also scale up to different regions and countries. In fact, scaling up our approach can be more cost-effective, as costs of analysis do not increase proportionally with the level of participation – but insights get richer and deeper.
Access to technology is increasing rapidly, including in African countries. By using SMS as a data gathering channel, people can participate in our interactive forums with even the most basic mobile phones. All text messages sent or received are free for the participant.
That said, we know that our approach cannot reach everyone. As well as lacking access to communication technologies, many people in our target populations may lack literacy, both in reading and writing and in digital skills.
We check estimates of how many people in a given population have access to technologies, and use this to inform our research design and the approach we take. It also means that we are fully aware of what percentage and which sectors of a population we are not reaching. We triangulate our research with that of others whose methods are more representative, to compare and contrast the different findings. This helps to generate a complete picture of a population and their views on an issue, at the deeper level of our qualitative insights as well as the more quantitative insights from others’ surveys. See also answer to: How does Africa’s Voices ensure the credibility of its data and insights?