Ideas to Impact’s evaluators from Itad have been refocusing their prize evaluation plans. In this post, they explain what they hope to be reporting on when the programme draws to a close in 2020.
As you read the posts on this blog, you may spot a recurring theme: stay curious and don’t be afraid to adapt your plans as you learn more about using prizes for development. It is certainly advice that the Evaluation & Learning Team have been following.
Last December, we shared our interim findings on the UK Aid-funded Ideas to Impact prizes. Since then, we have been questioning our next steps and how to prioritise our evaluation resources so we come away with some solid and useful findings about using prizes for development.
Since the first evaluation plans were developed, prize designs have evolved, new prizes have come on board, and we've discovered more about the most promising areas for learning within the programme and gained a sense of what works (or doesn't) when trying to evaluate a prize.
We recently agreed with Ideas to Impact's funder, the UK Department for International Development (DFID), our revised evaluation approach for stage 2 of the programme. It includes a new set of programme-level evaluation questions and a specific area of focus for each prize. Our evaluators are now busy developing interview schedules, questionnaires and other data collection tools. While we may not have the answers yet, here are the five key questions we plan to answer before Ideas to Impact wraps up.
1. Did all the Ideas to Impact prizes achieve what they set out to do?
Each evaluation report will open with the prize’s story: what it was setting out to achieve, its design and how that changed over time, its results, what went well (or not so well). Critically, we’ll tell you what you need to know about where the prize was operating and what was going on at the time, so the findings can be put into context.
2. What are prizes good for?
We'll be reporting on how Ideas to Impact's prizes have been used to achieve a range of effects. These include raising awareness, promoting best practice and influencing the policy environment, as well as any other positive, if unintended, effects they had. For example, how successful was the Adaptation at Scale prize at promoting effective climate adaptation practices in Nepal?
3. Do the effects of prizes end when they are awarded?
We're going to tell you, for at least one prize, whether the effects a prize had, or the innovations it stimulated, are sustained beyond the date of the award ceremony. The Climate Information Prize ends this autumn, so next year we'll see if the services developed by participants to make climate data useable and accessible are still helping key stakeholders in Kenya.
4. Is support to solvers necessary for prizes to be successful?
The prize teams have been grappling with this issue for some time, and the introduction of support to solvers is an important distinction between prizes for development and those used in other settings. The prizes are taking different approaches to supporting solvers to help them participate. We'll find out whether this support made the prizes more inclusive for typically underrepresented participants and helped them stay the course, in line with the 2030 Agenda for Sustainable Development's pledge to leave no one behind. We should also be able to identify where support to solvers might have been useful in helping participants overcome barriers, enhancing participation and widening inclusion.
5. Do prizes offer value for money (VFM) when you compare them to alternative forms of funding?
This is probably the most critical question for anyone considering investing in prizes, and one we have spent a lot of time thinking about, and seeking advice on, in order to answer satisfactorily. Our thinking has moved on since we started exploring VFM in August 2017. Our intention is to identify (non-prize) development programmes, e.g. grants, to use as comparators in the VFM analysis for each of the prizes and, in at least one case, to look at the costs to participants.
In fact, this question is so thorny that it warrants a post of its own. We’ll share our plans for how we will analyse VFM in our next blog post later this year. Watch this space!
Cover photo: In April 2016, Sam Owilly (third from left) took first place in the Wazo Prize, the first stage of the Climate Information Prize (CIP). CIP will provide a good opportunity to evaluate whether a prize's effects last beyond the award ceremony.