By Matt Haikin
Digital Citizen Engagement (DCE) is the buzzword of the moment. Whether it’s using technology to enhance participatory budgeting, creating SMS feedback systems for public services, or publishing government data in open formats so civil society can use it – it’s everywhere, and more and more funders are starting to pay attention and support it.
This is creating demands for new skills, putting additional pressure on program managers to add DCE elements to their existing programs, and creating huge opportunities in the M&E realm.
Is it worth it? Does it live up to the hype?
Is Citizen Engagement a “game changer for development” (as the World Bank’s Coursera course puts it)? And does adding ICT into the mix make it more effective? Or does going digital exclude those on the wrong side of the digital divide, and simply divert funding and attention away from more tangible projects such as health and education?
Evaluating the effects of digital citizen engagement
In 2015, the World Bank commissioned Aptivate to lead a consortium of practitioners and academics (including IDS, ICA:UK and independent researchers) to explore this theme. The World Bank wanted to come up with a way to both evaluate the effects of Digital Citizen Engagement programs and support those designing, planning, managing or commissioning new DCE initiatives.
The resulting Evaluating Digital Citizen Engagement: A practical guide has just been published on the World Bank’s Open Knowledge Repository.
So what’s it all about?
The guide provides practical tools and guidelines for use in evaluating the expanding field of digital citizen engagement (DCE) and is a helpful resource for anyone seeking to better understand the role of digital technology in citizen engagement. Five ‘lenses’ are used as a vehicle throughout the guide — different perspectives through which DCE interventions might be viewed:
- Objective: What is the program objective, and is it logical, reasonable, and appropriate?
- Control: Who owns, controls, and influences the digital engagement process?
- Participation: Who participates and how?
- Technology: How effective and appropriate is the choice and delivery of the technology?
- Effects: What effects and impact do citizens have on processes and outcomes?
Considering DCE programs through each lens can help to build up a more comprehensive picture and uncover important themes right from the earliest stages of evaluation or program design. While recognizing that evaluation and program development should be iterative processes, the guide includes chapters offering guidance for every stage of a typical evaluation lifecycle – Scoping, Designing, Planning & Implementing, Analyzing, and Sharing, Reflecting & Learning – punctuated with anecdotal sectoral advice from organizations. These organizations include the Commonwealth Telecommunications Organisation, Intel, Indigo Trust, mySociety, Indian Institute of Technology, University College London, Oxfam UK and many more.
A contribution of particular relevance to this article came from Robert Chambers at the Institute for Development Studies in the UK who offers some vital questions to ask when designing or evaluating these projects:
- Who took part and who did not?
- Why did non-participants not take part?
- How were results affected by who took part and who did not?
- What were the likely views of citizens who did not take part?
- What influenced or distorted the responses of those who did participate?
- Who owned and who had access to the data?
- Who analysed the data and who was it shared with?
- Who gained and who lost from the process?
Asking the right questions is the hardest part
Asking these kinds of questions at all stages – from program design to evaluation – cuts to the heart of how to ensure digital citizen engagement achieves its game-changing potential, rather than ending up as simply an ineffective and distracting addition to people’s workloads.
Perhaps the most surprising part of the research was realizing how critical it is to simply prompt people to ask the right questions. Some people need support with various aspects of design and evaluation, but many are experts in this and just need to focus their attention on the right things by asking the right questions.
To make sure evaluators and program designers can do this, we included a Question Bank as one of the Guide’s toolkits. This contains a wealth of challenging questions that are quite similar to those raised by Chambers above. We hope this will help to ensure that ignorance can no longer be an excuse for getting this stuff wrong.
So, can digital citizen engagement be a game-changer for development? Absolutely.
In the research for the guide and the field evaluations we did to inform its development, we often found examples where digital technology has improved the relationship between citizen and state, where it has created new spaces for deliberation, and where it seems to have had tangible impact on hard outcomes such as spending decisions, policy changes, and improved service delivery.
But we also found countless examples where digital citizen engagement had either not lived up to expectations or in some cases was just a colossal waste of time and money. It may even have damaged the chances of doing something successful in that area in future.
So which will it be: game changer or digital distraction?
The answer is down to those of us implementing these projects. Do we deliver them well, in partnership with the affected communities and with a constant eye on evaluating their impact? Or do we just plough on, assuming we know best, and wait to see what happens?
We think the answer is clear, and we hope that the Evaluating Digital Citizen Engagement: A practical guide will help you make better choices, deliver better citizen engagement, and make a bigger positive impact with your work.