The landscape of evaluation in the UN: Commonalities, opportunities and challenges
Jos Vaessen
Evaluation as a practice has a long history in the UN system, and in the last ten years or so the evaluation function has become more firmly institutionalized. For example, most entities in the UN (e.g. Funds and Programmes, Regional Commissions, Specialized Agencies) now have in place dedicated evaluation policies, frameworks, and centralized and decentralized evaluation functions. In addition, the United Nations Evaluation Group has made significant progress in promoting collaboration in evaluation, institutionalizing norms and standards, and promoting evaluation use among decision makers. Notwithstanding these developments, evaluation in the UN system faces several key challenges relating to the resourcing, planning, implementation, quality and use of evaluations. Moreover, with Delivering as One an ongoing process, the practice of collaborative evaluations of joint programmes remains particularly challenging in many cases.
This presentation focuses on a subset of challenges associated with evaluating the impact of policy interventions. In recent years, widespread efforts to adopt results-based management practices within the international development community have gone hand in hand with an increased interest in the impact of policy interventions. Key drivers behind this trend include the collective endorsement of internationally agreed development goals (e.g. the MDGs, the SDGs) with corresponding indicators and targets at the outcome and impact level, increased pressure on public budgets allocated to development aid, new developments in technologies and systems of data collection, and repeated references to the paucity of available evidence on what works (e.g. CGD, 2006; Jones et al., 2009).
The presentation highlights common challenges in UN M&E systems relating to the supply of (and, to a lesser extent, the demand for) evidence on impact, and subsequently offers potential solutions. One important point is that, in order to strengthen the evidence base on impact (including its relation to the SDGs), it is neither sufficient nor cost-effective to ‘merely’ promote the practice of impact evaluation. Instead, the marginal utility of making existing (non-IE) M&E tools more ‘impact-oriented’ is higher. Not only would this by itself improve the evidence base on impact, it would also strengthen the basis for identifying strategically important gaps in the evidence base and opportunities for cost-effective impact evaluation exercises.
About the speaker
Jos Vaessen (Ph.D., Maastricht University) is Principal Evaluation Specialist at the Internal Oversight Service of UNESCO in Paris and a lecturer at Maastricht University, The Netherlands. After completing his M.Sc. in 1997 (Wageningen University) and prior to starting his current position at UNESCO in 2011, he was involved in research, teaching and evaluation activities in the field of international development at Antwerp University and, more recently, Maastricht University. Over the last fifteen years or so, he has worked for several multilateral and bilateral international organizations, mostly on evaluation-related assignments. His fields of interest include evaluation theory and practice, impact evaluation, rural development and the environment. In addition to managing and conducting evaluations, Jos regularly serves on reference groups of evaluations for different organizations. He has (co-)authored more than 30 publications, including three books. Recent publications include: Impact evaluations and development – NONIE guidance on impact evaluation (2009, co-author, with F. Leeuw; NONIE), Mind the gap: perspectives on policy evaluation and the social sciences (2009, co-editor, with F. Leeuw; Transaction Publishers), and Dealing with complexity in development evaluation (2015, forthcoming, co-editor, with M. Bamberger and E. Raimondo; SAGE Publications).
Venue: UNU-MERIT, Boschstraat 24, room 1.23
Date: 16 November 2015
Time: 10:00 - 11:00 CET