Impact-Oriented Monitoring

Pre-conference training during the Uganda Evaluation Week 2014:
Training on Impact-Oriented M&E
May 2014, Uganda

Trainers:
Dr. Felipe Isidor-Serrano
Christina Stern
Results-Based Management
Background: why more Monitoring & Evaluation (M&E) and
Results-Based Management (RBM) in many countries?
• Improved planning
• Improved learning processes ("not reinventing the wheel")
• Improved performance-based decision making ("no blindfold
policy making")
What has been done?
• Many conferences have taken place and many agreements have been
made that led to more RBM and M&E:
Marrakesh 2004 (Managing for Development Results), Paris 2005
(Paris Declaration on Aid Effectiveness), Hanoi 2007 (Managing for
Development Results), Accra 2008 (Accra Agenda for Action), Busan 2011
Results-Based Management
RBM is a management strategy aimed at achieving
important changes in the way institutions operate, with the aim of
improving performance at different levels. RBM is...

...about planning
Mark Twain: "After having lost sight of our targets, we doubled our efforts."

...about implementation

...about monitoring and evaluation
Bill Gates: "Success depends on knowing what works."
Results-Based Management
The RBM cycle (diagram):
• Planning (objectives & goals, resources)
• Implementation (deadlines, efficiency and effectiveness through
lessons learnt, teamwork, etc.)
• Monitoring & Evaluation (M&E frameworks and tools,
data/information management)
• Feedback (focus on performance and results)
At the centre: awareness, understanding, acceptance (leadership,
ownership, motivation, ...)
Results-Based Management
What is results orientation?
Focus not only on what has been done (activities) or produced
(outputs) but also on changes and benefits regarding attitudes,
awareness, understanding and behaviour (outcomes and impacts).
Example (results chain):
Hiring a consultant → Providing good services → ? → Behavioural
change → ? → Clients' performance improved
Practical Exercise: 1-2-3
Monitoring vs. evaluation vs. impact evaluation
Results-Based Management
Definition of Monitoring:
Monitoring is a continuing function of observing and
assessing to provide the management and the main
stakeholders of an ongoing intervention with
indications of the extent of progress, the achievement
of objectives, and progress in the use of allocated
funds.
Results-Based Management
Components of a monitoring framework:
• Logic model (results)
• Indicators and targets
• Data management specifications
Results-Based Management
Definition of Evaluation:
Evaluation is an assessment of ongoing or
completed projects, programmes or policies.
Evaluations should be conducted as systematically
and objectively as possible by applying certain
principles and criteria. An evaluation covers design,
implementation and outcomes.
Results-Based Management
Definition of Impact Evaluation:
Impact evaluation is intended to determine whether
the program had the desired effects on individuals,
households, and institutions and whether
those effects are attributable to the program
intervention. To do so, it is necessary to net out the
effect of the intervention from other factors.
Impact-Oriented Monitoring
Why are different levels of results needed?
• ... to avoid a black-box scenario (where nobody knows why
certain results have been achieved)
• ... to be able to systematically track results that occur at
different points in time
• ... to align the results of projects and programmes with
higher-level results
Impact-Oriented Monitoring
1. Goals/Impacts: contribution at large
2. Objectives/Outcomes: benefits, changes, behaviour... (use of the
goods and/or services)
3. Deliverables/Outputs: goods and/or services
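The three result levels can be sketched as a small data structure. This is a minimal illustration in Python; the class names, fields, and example statements are all hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Result:
    """One result in a chain: an output, outcome, or impact."""
    level: str      # "output", "outcome", or "impact"
    statement: str  # formulated as an achievement, not a process

@dataclass
class ResultsChain:
    outputs: list = field(default_factory=list)   # goods and/or services
    outcomes: list = field(default_factory=list)  # use of the goods/services
    impacts: list = field(default_factory=list)   # contribution at large

chain = ResultsChain(
    outputs=[Result("output", "Teachers trained in vocational skills")],
    outcomes=[Result("outcome", "Improved quality of vocational teaching")],
    impacts=[Result("impact", "Increased youth employment")],
)
```

Each statement is phrased as an achievement, not an activity, in line with the formulation guidance that follows.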
Impact-Oriented Monitoring
How to formulate objectives and goals?
Increased....
Improved...
Advanced...
Effective...
Better...
...
• Objectives and goals should be formulated in a way
that expresses an achievement (emphasis on
results, not processes)
Impact-Oriented Monitoring
Different levels of results
(Diagram: a results hierarchy in which several outputs (Output 1-4)
feed into several outcomes (Outcome 1-3), which in turn contribute to
several impacts (Impact 1-2).)
Impact-Oriented Monitoring
Different stakeholders
(Diagram: two parallel results chains. Stakeholder A: Outputs A1 and
A2 → Outcome A3 → Impact A1. Stakeholder B: Output B1 → Outcome B3 →
Impact B1.)
Practical Exercise: Theory of Change
Please formulate a possible results chain for an intervention in
the education sector aiming at increasing youth employment.
(Template: Intervention/Stakeholder → Outputs A1, A2 → Outcome →
Impact)
Impact-Oriented Monitoring
How can we measure our results?
Indicators: operationalised descriptions of desired results;
variables that provide a measure of the extent to which
results have been achieved at a certain point in
time.
In simple words: indicators make results
measurable.
• Important: stakeholders should have a
common understanding of how to interpret
the results
Impact-Oriented Monitoring
Indicators must be SMART...
Specific
The indicators identified should be a true reflection of the desired
achievement.
• Do the indicators really reflect the objective?
• Are more indicators required to reflect the objective?
• If the indicators were measured by different persons, would they
come to the same result?
Measurable
The indicators identified must be measurable.
• Are the indicators assessable?
• Are data to measure the indicators accessible?
• Is the measuring unit clear (e.g. USD or EUR)?
Impact-Oriented Monitoring
SMART
Achievable
The specified quantitative and qualitative targets must be achievable
(realistic). Targets that are set too low may make results look
evident, but they are of little use for measuring the success of the
intervention.
• Are the targets associated with the indicators generally
achievable? Are there any benchmarks?
• Are the targets realistic given the capacities and the financial and
human resources to implement the project/programme?
Relevant
The indicator must deliver information that is important to
decision-makers.
• Does the measurement of the indicators deliver important
information for decision-makers?
• Is the indicator really necessary, or can the information be
provided by another indicator?
Timely
Information must be available in time to allow management to take
corrective action.
• Can the indicator's information be collected within a realistic
timeframe?
Impact-Oriented Monitoring
Formulation of indicators:
• Number of... (e.g. number of training participants)
• Percentage of... (e.g. 20% drop-out rate)
• Yes/No (e.g. ... is in place)
• Perception (e.g. 80% of interviewees indicate that...)
• Time/frequency (e.g. time to obtain a certification)
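Each formulation type maps onto a simple computation. A minimal sketch in Python, where the monitoring records and variable names are hypothetical:

```python
# Hypothetical monitoring records for a training course
participants = ["Amina", "Ben", "Carol", "David", "Esther"]
dropped_out = ["David"]
policy_in_place = True
survey_answers = {"satisfied": 4, "not satisfied": 1}

# Number of...: a simple count
n_participants = len(participants)                        # 5

# Percentage of...: a count relative to an explicit denominator
dropout_rate = 100 * len(dropped_out) / len(participants)  # 20.0

# Yes/No: a binary indicator
has_policy = "Yes" if policy_in_place else "No"

# Perception: share of interviewees giving a certain answer
n_interviewees = sum(survey_answers.values())
satisfied_pct = 100 * survey_answers["satisfied"] / n_interviewees  # 80.0
```

Note that every percentage indicator forces a choice of denominator, which is exactly the question raised later under data analysis.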
Impact-Oriented Monitoring
• Compromise between accuracy and measurability
• Practical requirements:
  • Sufficient resources for the measurement?
  • Required measurement capacities?
• Political requirements:
  • Same understanding of the indicators among all stakeholders?
  • Acceptance of the indicators among all stakeholders?
• Indirect indicators (proxies):
  • Proxies help to simplify complex indicators, as they allow
conclusions to be drawn about the fulfilment of indicators that are
difficult to measure
Practical Exercise: Indicators
Please formulate one indicator for each of the results in your
results chain.

(Template: Intervention/Stakeholder → Outputs A1, A2 (output
indicators) → Outcome (outcome indicator) → Impact (impact
indicator))
Impact-Oriented Monitoring
Components of a data management process:
• Data gathering
• Data analysis
• Data dissemination
Together, these make up data management.
Impact-Oriented Monitoring
Data gathering: the main challenge
• Asking the right questions and applying appropriate tools
(questionnaire, interview, use of secondary data, observation, etc.)
Data analysis: the main challenge
• How to distinguish between "good" and "bad" data?
Data dissemination: the main challenges
• How to make reporting effective and efficient?
• What other ways of data dissemination can be applied?

Coordination is key...
Impact-Oriented Monitoring
Classical methods of data gathering, e.g.
• Questionnaires
• Interviews
• Document analysis (e.g. collection of quantitative information)
• Observation
• Tests
• Feedback reports from workshops, seminars, conferences, etc.
• Discussions at stakeholders' and target groups' meetings
Impact-Oriented Monitoring
What is sampling?
In the case of large target groups, representative results can be
achieved via a representative sample:
• "A small quantity of something such as customers, data,
people, products, or materials, whose characteristics
represent (as accurately as possible) the entire batch, lot,
population, or universe."
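A simple random sample can be drawn with the standard library. A minimal sketch in which the population figures (10,000 trainees, ~30% employed) are hypothetical:

```python
import random

random.seed(42)  # fixed seed so the example is reproducible

# Hypothetical target group: 10,000 trainees, 30% of whom found a job
population = ["employed"] * 3000 + ["unemployed"] * 7000

# Simple random sample: every member has the same chance of selection
sample = random.sample(population, k=500)

# The sample share estimates the population share (close to 0.30)
share = sample.count("employed") / len(sample)
```

Surveying 500 randomly selected trainees instead of all 10,000 yields an estimate close to the true population share, which is the point of representative sampling.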
Impact-Oriented Monitoring
Biased results (survey design):
Getting "good" survey results requires asking the right questions:
• Answers to the questions must really refer to the problem to be
examined.
• Suggestive questions, which push the interviewees in a certain
direction, have to be avoided.
Selection bias:
• Undercoverage
• Nonresponse bias
• Voluntary response bias
Note: random sampling helps produce representative samples by
eliminating voluntary response bias and guarding against
undercoverage bias.
Impact-Oriented Monitoring
Crucial questions when analysing data:
• What data should be analysed?
• What do you need to know for management decisions?
• How many people/sources were questioned? What is the
denominator?
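The denominator question matters because the same count yields very different percentages depending on what it is divided by. A small illustration with hypothetical figures:

```python
# Hypothetical survey: 1,000 questionnaires sent, 400 returned,
# 300 of the returned ones report improved income
sent, returned, improved = 1000, 400, 300

# Share of *respondents* reporting improved income
share_of_respondents = 100 * improved / returned  # 75.0

# Share of the *whole sample* (treats non-response as no evidence)
share_of_sample = 100 * improved / sent           # 30.0
```

Both figures describe the same 300 people; a report should therefore always state which denominator was used.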
Impact-Oriented Monitoring
What is data dissemination?
Data dissemination means “making data available
to the information users”.
• It is the process of extracting (and interpreting) information from
a database for reporting.
Impact-Oriented Monitoring
Dissemination tools:
• Mailing lists
• Newsletters
• Websites
• Briefings
• Conferences
• Reports
• Workshops
• One-to-one
Impact-Oriented Monitoring
Different forms of dissemination
• Basic information dissemination, such as reports
  • The main purpose is to quickly, efficiently and effectively
inform users about the 'main points of the news'.
• Analytical information dissemination
  • The significance of the data is analysed through statistical
techniques; these may take the form of papers or articles, either
in printed or in electronic format on the website.
• Methodological information dissemination
  • Sets out the methods and standards used to compile statistics;
these may also be in printed format or as metadata on the website.
Practical Exercise: Monitoring Plan
Please develop a monitoring plan for three of your indicators.
Impact-Oriented Monitoring
General monitoring approach:
1. Developing monitoring systems and tools
2. Developing capacities (monitoring unit)
3. Institutionalising M&E (manual)
Evaluation
Criteria        Definition
Relevance       Are we doing the right things?
Effectiveness   Are the objectives of the intervention being achieved?
Efficiency      Are the objectives being achieved economically by the
                development intervention?
Impact          Does the development intervention contribute to
                reaching higher-level development objectives?
Sustainability  Are the positive effects sustainable?
Evaluation
• Moving away from just "assuming"
• Correlations vs. causalities
• There is often a misunderstanding of what is meant by impact
evaluation:
  • Only measuring the impact level of the results chain or logframe?
  • Only benefits?
  • Outputs, outcomes and impacts?
• Counterfactual: the hypothetical situation that shows what would
have happened without the intervention (not a before-and-after
comparison)
Evaluation
"With and without" as the leading principle of causality
identification:

The before-and-after approach overestimates the results of the
intervention if a general positive trend has taken place, whereas it
underestimates the effect if the indicator value would have decreased
in the absence of the intervention.
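A small numeric illustration with hypothetical figures: when a general positive trend exists, before-and-after overstates the effect, while with-and-without nets the trend out:

```python
# Hypothetical employment rates (%) before and after an intervention
treated_before, treated_after = 50, 70        # group with the intervention
comparison_before, comparison_after = 50, 60  # similar group without it

# Before-and-after: attributes the whole change to the intervention
before_after_effect = treated_after - treated_before       # 20

# With-and-without: nets out the general trend via the comparison group
general_trend = comparison_after - comparison_before       # 10
with_without_effect = before_after_effect - general_trend  # 10
```

Here the before-and-after estimate (20 percentage points) is double the with-and-without estimate (10), because half of the observed improvement would have happened anyway.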
Link between Monitoring
and (Impact-)Evaluation
(Diagram: the results chain Inputs → Activities → Outputs → Outcomes →
Impact, annotated with the evaluation criteria relevance,
effectiveness, production efficiency, allocation efficiency, impact
and sustainability.)
Link between Monitoring
and (Impact-)Evaluation
Efficiency (diagram): a results chain for Stakeholders A and B
(Outputs A1, A2 and B1 → Outcomes A3 and B3 → Impact A1), with
illustrative cost figures (10-40) attached to the individual links.
Link between Monitoring
and (Impact-)Evaluation
Efficiency (diagram): the same results chain with an alternative
allocation of the illustrative cost figures across the links.
Link between Monitoring
and (Impact-)Evaluation
(Diagram: two parallel results chains, as before. Stakeholder A:
Outputs A1 and A2 → Outcome A3 → Impact A1. Stakeholder B: Output
B1 → Outcome B3 → Impact B1.)