By Anibal Velásquez, Fabiola Cáceres, Úrsula Martinez
1. Introduction
The creation of the Ministry of Development and Social Inclusion (MIDIS) in October 2011 marked a milestone in the institutionalization of social inclusion as a state priority for the Peruvian government and consolidated the beginning of a reform towards an evidence-based development and social inclusion policy focused on targeting, inter-agency and intergovernmental coordination, and rigorous evaluation of results.
In that context, MIDIS created the General Direction of Monitoring and Evaluation (DGSE) as the unit in charge of monitoring and evaluating policies, programs and projects related to development and social inclusion at all levels of the Peruvian government. The mission of the Direction is to improve the impact, efficiency, quality, equity and transparency of development and social inclusion policies and programs and, to that end, to provide both citizens and policymakers with the results of evaluations and the evidence generated by the Monitoring and Evaluation System.
Nevertheless, aware that the use of evidence in social policy is not an automatic event but a process that unfolds in a political and institutional context shaped by diverse visions, actors and interests, the Monitoring and Evaluation System designed by MIDIS has emphasized the development of instruments to ensure that the evidence generated not only meets the highest quality standards but is also translated into a source of learning and continuous improvement for public interventions in development and social inclusion.
2. The experience of MIDIS in the use of evidence
Traditionally, monitoring and evaluation systems throughout the world have focused on the timely production of high-quality, independent information and evidence. Those efforts have resulted in great improvements in the professionalism and academic rigor of the scientific evidence generated about the efficiency and efficacy of public interventions on social issues.
However, internationally it is increasingly common to identify a crisis in the use of evidence in public policy decisions. This situation suggests that even though the generation of evidence is a necessary piece of the puzzle, it is not enough to ensure informed public policies capable of improving the quality of social expenditure, since the use of evidence depends on multiple factors that can increase or decrease the probability of influencing public policy.
According to Weiss (1999), four factors influence the use of evaluation results: interests, ideologies, institutional culture and information sources. As the author notes, if the evidence generated conflicts with any of those four factors, the probability that it will be used in public policy decisions decreases considerably.
From a different perspective, Innvær (2002) argues that the "two communities thesis" explains the limited use of evidence in public policy decisions. According to the author, there is a collision between science and politics that obstructs the use of evidence in the decision-making process. Innvær argues that scientists see themselves as rational, objective and innovative, and perceive politicians as interest-driven actors immune to innovation and scientific evidence. Politicians and policymakers, on the other hand, see themselves as responsible, action-oriented and pragmatic leaders, and perceive scientists as naïve and commonly disconnected from the reality on the ground.
Accordingly, in order to promote the use of evidence in public policy decisions, Innvær considers it essential to create conditions that increase the probability of use, fostering, for example, spaces for dialogue and interaction between science and politics. However, in the author's view, the responsibility for promoting further dialogue cannot be placed on either side alone; a third actor is needed to mediate and facilitate.
In that context, in the design of the Monitoring and Evaluation System, MIDIS has conceived DGSE both as a unit that generates high-quality evidence and as a mediator between evaluators and policymakers, in order to promote the use of evidence in social policy. Through this dual role of DGSE, MIDIS intends to close the gap between the scientific world of academia and the policymaker's reality, translating evidence into clear, timely and viable recommendations that can be understood and, above all, used by policymakers.
Therefore, unlike traditional monitoring and evaluation systems, the DGSE concept constitutes an innovation: the final goal that justifies the existence of the unit goes beyond the production of information and aims to improve the impact, efficiency, efficacy, quality, equity and transparency of social and development interventions through the use of evidence.
As shown in the logical framework developed by DGSE (Figure 1), the unit has conceptualized several products and services to produce systematic information and evidence on opportunities for improvement in social interventions. Those results will allow DGSE to contribute to the development of Performance Improvement Plans and to report on the effectiveness of social policies and programs, with the objective of increasing the impact, efficiency, quality, equity and transparency of governmental interventions in development and social inclusion matters. The design of the DGSE model was validated and supported by all of MIDIS's internal stakeholders, who made several contributions to "MIDIS's Guidelines for Monitoring, Evaluation and Use of Evidence".
In order to ensure the independence of the evaluations, impact evaluations of MIDIS's social programs and policies are funded directly by the Ministry of Finance or by multilateral agencies. However, DGSE participates actively in the evaluations and works closely with the Ministry of Finance, providing technical assistance to ensure the quality of the evidence generated.
Figure 1: DGSE logical framework
In particular, it is important to note that the most innovative element of DGSE with respect to traditional evaluation units lies in the evidence and recommendations management component, since this line of action is the one that influences the Performance Improvement Plans and the use of evidence and, therefore, allows evidence to be translated into greater impact, efficiency, quality, equity and transparency of public interventions in social and development matters.
DGSE has thus evolved from the production of information to the use of evidence, and has learned that what justifies the existence of monitoring and evaluation systems is the final goal: improving the quality, efficiency, efficacy and equity of policies, programs and services through evidence- and results-based management.
In this context, DGSE is determined to be part of the decision-making process for policies and programs, providing reliable evidence and information in a timely manner in order to feed into planning, design and operational decisions.
As shown in Figure 2, DGSE's model is not only centered on the evaluation cycle but also considers the policy and program cycle, the public administrative system cycle and the political context. DGSE therefore provides evidence and information to programs and organizational units according to the phase of the cycle they are going through. For instance, for programs in a design or redesign phase, DGSE provides evidence to identify and implement adjustments where necessary, while for programs focused on operation, DGSE provides products designed to identify and solve specific problems affecting the efficiency and efficacy of the intervention.
Figure 2: DGSE in the cycle of policy and programs
In terms of the products and services designed around the evaluation cycle, DGSE has developed a Performance Improvement Strategy, shown in Figure 3. As can be observed, the cycle begins by identifying potential areas of improvement in policies and social programs, a process that can be initiated either by the programs themselves or by DGSE. In the second phase of the cycle, DGSE and the programs decide together on the best instruments to produce the expected evidence in a timely manner.
Once the evidence production phase concludes, the cycle moves to an intermediate step that DGSE has conceived between the production and use of evidence, in which the Recommendations Technical Reports are prepared and presented. These management reports have been designed to provide the policymakers in charge of the design and/or operation of public interventions in development and social inclusion with clear and timely recommendations that consider both the political and economic viability of implementation.
Through this innovative design, MIDIS expects to connect scientific evidence with the operational reality of social interventions, with the goal of increasing the use of evidence in the decision-making process.
Figure 3: DGSE's Performance Improvement Cycle
Another innovative element of the Peruvian Monitoring and Evaluation design lies in the development of Performance Improvement Plans, which demand close coordination and negotiation between DGSE and the programs or units in charge of operating the evaluated interventions. The Performance Improvement Plans have been designed as a management tool that, based on the opportunities for improvement identified in the Recommendations Technical Reports, consolidates the commitments assumed by the operators of the evaluated interventions as well as the mechanisms for monitoring progress on the implementation of the reforms.
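As a purely illustrative sketch of how such a plan could be kept as a management record (the structure, status categories and example commitments below are hypothetical assumptions, not MIDIS's actual information system), each commitment might be tracked together with its responsible operator, deadline and implementation status, and progress could then be summarized in shares of the kind Table 1 later reports:

```python
# Hypothetical sketch of a Performance Improvement Plan tracker.
# All names, statuses and commitments are illustrative, not MIDIS's real system.
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    IMPLEMENTED = "implemented"
    IN_PROGRESS = "in progress"
    NEEDS_FURTHER_STUDY = "requires further study or cross-sector coordination"
    NOT_STARTED = "not started"


@dataclass
class Commitment:
    recommendation: str   # opportunity identified in a Recommendations Technical Report
    operator: str         # program or unit responsible for implementing the reform
    deadline: str         # agreed completion date (ISO format)
    status: Status = Status.NOT_STARTED


def progress_summary(plan: list[Commitment]) -> dict[str, float]:
    """Return the share of commitments in each status category (assumes a non-empty plan)."""
    total = len(plan)
    return {s.value: sum(c.status is s for c in plan) / total for s in Status}


# Illustrative plan with three hypothetical commitments.
plan = [
    Commitment("Adjust targeting criteria", "Program A", "2013-06-30", Status.IMPLEMENTED),
    Commitment("Revise service quality protocol", "Program A", "2013-09-30", Status.IN_PROGRESS),
    Commitment("Coordinate referrals with the health sector", "Program B", "2013-12-31",
               Status.NEEDS_FURTHER_STUDY),
]
print(progress_summary(plan))
```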
The following phase of the cycle concerns the implementation of the commitments assumed in the Performance Improvement Plans. Even though the implementing actions depend mostly on the operators of the evaluated programs and interventions, DGSE is expected to provide technical assistance during the process to ensure correct and timely implementation.
Finally, the DGSE model includes a closing stage in which the effect of the evidence-based improvements implemented in accordance with the Performance Improvement Plans is measured and evaluated in terms of the impact, efficiency, quality, equity and transparency of the development and social inclusion interventions. This final step is also intended as a means of evaluating the success of DGSE itself as an evaluation unit.
This last component constitutes an innovation as well, since in the traditional evaluation unit model success is commonly measured in terms of the number of evaluations performed or their scientific rigor and quality, overlooking the fact that real success goes beyond the unit's ability to generate high-quality evidence in a timely manner and must be measured in terms of its contribution to greater impact, efficiency, quality, equity and transparency of the policies and programs evaluated.
Despite MIDIS's relatively short existence, the DGSE model has already shown encouraging results. In the context of the redesign of social programs, in 2012 DGSE carried out 16 evaluations that resulted in seven Technical Reports prepared by DGSE containing evidence-based recommendations for social programs.
As can be observed in Table 1, 65% of the recommendations made by DGSE have been implemented or are in the process of implementation, while 15% require further studies or coordination with other sectors in order to be implemented.
Table 1: Use of DGSE's recommendations, 2012
Other clear examples of how the DGSE model, focused on the use of evidence, can actually have an impact on policy are the evidence-based improvements introduced to Cuna Mas in 2013 and the incorporation of evidence provided by DGSE into the Policy Guidelines against Chronic Child Malnutrition developed by MIDIS in 2012. In the first case, the evidence-based recommendations made by DGSE, drawn from the 2012 evaluations of the previous Wawa Wasi program, were used as an input to introduce several improvements to the quality of the daycare service provided. In the second case, the national and international evidence provided by DGSE served as a basis to identify effective interventions against child malnutrition and to develop a Policy Tool that guides national and subnational government agencies in the design and implementation of their social policy.
3. References
Horton, D. et al. (2008). Evaluación del desarrollo de capacidades. Experiencias de organizaciones de investigación y desarrollo alrededor del mundo. Centro Internacional de Agricultura Tropical (CIAT), Colombia.
Innvær, S. (2002). Health policy-makers' perceptions of their use of evidence: a systematic review. Health Services Research Unit, National Institute of Public Health, Oslo, Norway.
Weiss, C.H. (1999). The interface between evaluation and public policy. Evaluation, 5(4): 468-486.