Training Course
Design and Evaluation of Innovation Policy:
Evaluating the Impacts of Science, Technology and Innovation Programs


Application deadline: 6 February 2009

The International Development Research Centre (IDRC), Canada, the United Nations University – Maastricht Economic and Social Research and Training Centre on Innovation and Technology (UNU-MERIT), Maastricht, The Netherlands, and the National Research and Innovation Agency (ANII), Uruguay, are jointly organizing a training course on the Design and Evaluation of Innovation Policy (DEIP) in Uruguay from 30 March to 3 April 2009. The focus of this course will be explicitly on the evaluation of science and technology policies, including evaluation strategies, methodological approaches and statistical techniques.

The main objective is to foster the ex-post evaluation capabilities of Offices of Science and Technology (OST) in the Latin American and Caribbean (LAC) region by providing them with methodological options and examples of different ways to assess the development impact and effectiveness of science, technology and innovation policies (STIP).

The methodology will aim at answering the following question: how can one evaluate whether a given STI program is actually working? This course deals with methodological techniques that have been developed to estimate the causal impact of a generic STI “intervention” on one or more outcomes of interest in the presence of selection decisions by agents (be they firms, public research organizations, universities or individual researchers). After highlighting the “evaluation problem” and the challenges it poses to the analyst, the training will focus on the empirical methods to solve it and on the “research infrastructure” needed to carry out successful and informative evaluations. For each of these approaches, the training will provide the basic intuition, discuss the assumptions needed for its validity, highlight the question it answers, and assess its strengths and weaknesses, drawing on example applications in the literature.
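As an illustration of the selection problem described above, the following minimal simulation (not part of the course materials; all variable names and numbers are invented) shows why a naive comparison of participants and non-participants can badly overstate a program's effect when more capable agents self-select into it:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Latent firm "capability": drives both program take-up and outcomes.
ability = rng.normal(0, 1, n)

# Selection: more capable firms are more likely to join the program.
treated = (ability + rng.normal(0, 1, n)) > 0

# Outcome: the true causal effect of the program is 2.0.
true_effect = 2.0
outcome = 5.0 + 3.0 * ability + true_effect * treated + rng.normal(0, 1, n)

# Naive evaluation: compare mean outcomes of participants vs. non-participants.
naive = outcome[treated].mean() - outcome[~treated].mean()

print(f"true effect:    {true_effect:.2f}")
print(f"naive estimate: {naive:.2f}")  # biased well above the true effect
```

The naive estimate absorbs the capability gap between the two groups on top of the program effect; the empirical methods taught in the course exist precisely to strip that selection component out.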

ANII's Conference Room, Montevideo, Uruguay

30 March to 3 April 2009

Target Clientele:

The potential participants are:

  1. Senior- and middle-level officials of ministries in Latin American countries dealing with science, technology and innovation policies;
  2. State officials involved in technology policy making;
  3. Officials of state science and technology councils; and
  4. Personnel from the private sector involved in strategic planning and business excellence, and managers of science parks and technology incubators.

Selection of the candidates:
The candidate's qualifications, experience, and the relevance of their present work to innovation and policy making will be the main selection criteria.

Course Completion Certificate
All participants who successfully complete the course will be given a course completion certificate.

Reading Materials
Participants will be provided with a compilation of key reading materials.

Course fee
There is no course fee for participants. Participants from outside Uruguay are expected to arrange their own travel funding; however, twenty scholarships will initially be available for participants from low-income and distant countries. A tripartite academic committee comprising UNU-MERIT, IDRC and ANII will allocate these scholarships.

The language of the training will be English (in particular for the international speakers), although local speakers may lecture in Spanish; simultaneous translation will also be provided. The trainers will be a combination of international and regionally based speakers.

Level of Knowledge Required
Basic statistical knowledge, including some familiarity with regression analysis, is recommended.

Training Course Topics
The training will cover the following basic topics:

1. Introduction to the evaluation of innovation policies. What is innovation? Why evaluation? Evaluation as part of innovation policy. The evaluation logic model. The evaluation problem. Natural experiments.

2. Metrics. Science, technology and innovation indicators for evaluation (innovation surveys, R&D surveys, bibliometrics, patents, etc.).

3. Particularities of STI program evaluation. What to evaluate (inputs, behaviors, outputs), when to evaluate (short-, medium- and long-term impacts) and best practices on how to evaluate (scope, rigor, type of impacts, control groups and counterfactuals, internal vs. external evaluations, transparency, replicability, etc.).

4. Alternative approaches. A brief description of alternative evaluation methods (case studies, cost-benefit analysis, bibliometrics, historical tracing and peer review) and the reasons for focusing on the statistical approach, used in combination with the other approaches.

5. Impact evaluation in action. This will cover several applications that can be tailored to each country's specific data availability and infrastructure: for example, applications when only cross-sectional data are available (instrumental variables, matching methods); applications when panel data are available (before-and-after, difference-in-differences); and applications based on projects' ranks or scores (regression discontinuity designs). In each case, data requirements will be analyzed and the advantages and disadvantages of each application will be assessed. Lectures will make extensive use of real-life examples and will show how the applications can be implemented in standard statistical software.

6. Brainstorming and interaction. Participants will also be asked to present their main national support programs for innovation, including a discussion of how these policies are evaluated and the main obstacles they face when setting up their evaluation plans.

7. Infrastructure for impact evaluation. The importance of building linkages with national statistical offices for questionnaire and sampling harmonization; data linking of existing databases (examples: the DTI-ONS Business Data Linking program in the UK, Argentina's Labour Dynamics Observatory, and the IPEA-IBGE program in Brazil); an introduction to the concept of micro-aggregation; and the results of Eurostat's experience with the CIS.

8. Review and General Lessons: How ex-post evaluation can help the overall science and technology policy process. The EU experience. The political economy of evaluation.
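The panel-data methods named in topic 5 can be sketched in a few lines. The simulation below (a hypothetical example, not course material; the firm-subsidy setting and all numbers are invented) contrasts a simple before-and-after comparison, which confounds the program effect with a common time trend, against a difference-in-differences estimate, which nets the trend out:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Hypothetical panel: firm R&D spending before and after a subsidy program.
# Treated firms start from a higher baseline (selection), and a common time
# trend lifts every firm; both would bias a naive comparison.
treated = rng.random(n) < 0.4
baseline = 10.0 + 2.0 * treated + rng.normal(0, 1, n)     # pre-period outcome
trend, true_effect = 1.5, 0.8
followup = baseline + trend + true_effect * treated + rng.normal(0, 1, n)

# Before-and-after on treated firms alone mixes the effect with the trend.
before_after = (followup[treated] - baseline[treated]).mean()

# Difference-in-differences subtracts the control group's change, removing
# both the baseline gap and the common trend.
did = ((followup[treated] - baseline[treated]).mean()
       - (followup[~treated] - baseline[~treated]).mean())

print(f"before-after: {before_after:.2f}")  # ≈ trend + effect
print(f"diff-in-diff: {did:.2f}")           # ≈ true effect
```

The difference-in-differences estimate recovers the true effect here only because treated and control firms share the same trend, the "parallel trends" assumption that the course's discussion of counterfactuals and control groups addresses.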

Application deadline: 6 February 2009
Selected candidates will be notified before 19 February 2009.
Please send your application form and CV to any of the contacts listed below. The application form can also be downloaded from the following institutional links.

application form (Word)
application form (PDF)

Ms. Eveline in de Braek
Secretary to the Training Programmes
Keizer Karelplein 19, 6211 TC Maastricht , the Netherlands.
Fax: + 31-43-388 4499
Tel: +31-43-388 4400

Alternatively, application documentation can be also sent to:

Ms Clara Saavedra
Program Assistant
International Development Research Centre (IDRC)
Latin American and the Caribbean Regional Office (LACRO)
Tel: +598-2-7090042 – Ext: 320
Av. Brasil 2655
11.300 Montevideo, Uruguay


Ms Maria Laura Fernandez
International Cooperation Officer
National Research and Innovation Agency (ANII)
Rincon 528, Piso 2
Codigo Postal:11000
Montevideo Uruguay
Tel: +598-2-9166916
Fax: +598-2-9169115
