The evaluation of an early intervention policy in poor schools
Research Proposal
Presented to PEP-AusAID Policy Impact Evaluation Research Initiative
Paula Ines Giovagnoli and
Roxana Maurizio, Irene Kit, Valeria Lahitte, Martin Scasso, Evelyn Vezza
Argentina∗
May 22, 2008
Abstract
The question of whether early retention policies have a positive impact on subsequent
school progression remains controversial, especially in developing countries like Argentina,
where repetition rates are remarkably high. Whether a better policy for improving the
quality of education exists should be investigated. This research proposes an impact
evaluation of Supportive Promotion, an integrated effort to re-allocate existing inputs
to assist students at risk. The project, named Todos Pueden Aprender, will be implemented
in Corrientes province and scaled up by 2011. We suggest taking advantage of the phased-in
implementation design to draw the control group. The findings will provide a guide for
assessing the overall efficiency of government investment in education and will shed light
on possible applications in other developing countries. The study will also contribute to
reducing poverty through its impact on human capital accumulation.

∗We want to thank the PEP anonymous referees for very useful comments and patience. Elda Gallese, Marco
Manacorda, Sandra McNally, Steve Pischke, Valeria Paula Dieulefait, Giulia Ferrari, Hugo Labate, and
researchers at UNGS, UNR and UNLP provided very valuable insights. Special mention to Sarah Bailey for
English corrections and to Sergio Espana, who dedicated part of his time to coordinating several
arrangements, meetings and communication with the Government and UNICEF Argentina. Without his help,
this project would not have been possible.
Contents
1 Aims 4
1.1 Study overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Main research questions and core research objectives . . . . . . . . . . . . . 4
2 Background and policy relevance 5
2.1 Literature review directly relevant to main research questions . . . . . . . . 5
2.2 Explanation of what are the gaps in this literature . . . . . . . . . . . . . . 6
2.3 Explanation of how filling these gaps is relevant to policy issues . . . . . . . 9
3 Methods 9
3.1 General description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
3.2 The intervention . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.2.1 What the intervention will do . . . . . . . . . . . . . . . . . . . . . . 10
3.2.2 How it works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
3.2.3 Who will do it . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
3.2.4 Potential pitfalls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
3.3 Data collection methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3.1 Baseline data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3.2 Population under study . . . . . . . . . . . . . . . . . . . . . . . . . 14
3.3.3 Sampling design, sample size and statistical power . . . . . . . . . . 14
3.3.4 Key and additional data . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.4 Modeling and Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.4.1 Hypothesis to be tested . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.4.2 Empirical methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.4.3 Potential empirical problems . . . . . . . . . . . . . . . . . . . . . . 17
3.5 Human subjects concerns . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
4 Consultation and Dissemination Strategy 17
5 The study team 18
5.1 Main researcher . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.2 Key researcher staff . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
5.3 Expected capacity building and main tasks assigned to each team member . 20
5.4 Collaborators and arrangements . . . . . . . . . . . . . . . . . . . . . . . . . 22
5.5 Past, current and pending projects in related areas . . . . . . . . . . . . . 23
6 Timeline 24
7 Budget 24
8 References 26
1 Aims
1.1 Study overview
In Latin American countries (LAC), around 45 percent of students are retained at least
once during their formal schooling1 (Patrinos and Psacharopoulos, 1992). According to
UNESCO (2004), this means that the region's retention rates are among the highest in the
world, after West and Central Africa and Eastern and Southern Africa. The implementation
cost for LAC is estimated at 4.6 billion US dollars per year (Wolff, Schiefelbein and
Schiefelbein, 2002), which represents a significant hidden allocation of resources. In fact,
it can be argued that grade retention is an inefficient policy if the costs of spending an
additional year in the school system are higher than the benefits (Haddad, 1979). Yet,
despite its relevance, the examination of its effectiveness has received virtually no
attention in LAC.
1.2 Main research questions and core research objectives
The question of whether early retention policies have a positive impact on subsequent
school progression remains controversial, especially in developing countries like
Argentina, where repetition rates are remarkably high. Repetition rates are higher during
the first grades of primary school and among the poor (Kit and Scasso, 2006).
Since 2003, a pilot school project named TPA, Todos Pueden Aprender (Everybody Can
Learn)2, has been implemented in a small sample of Argentine public schools with a high
percentage of poor pupils. It is in poor public schools where the prevalence of repetition
is significantly high, reaching at least 4 out of 10 students. The innovative project,
supported by UNICEF and provincial governments, has been carefully designed and implemented
by a non-governmental organization, Educacion para Todos (Education for All). One of its
components focuses mainly on children at high risk of repetition during the first cycle
of primary education (see Kit and Giovagnoli 2005 and Giovagnoli 2007 for a description
of the education system in Argentina). The idea is to prevent repetition and instead
provide Promocion Asistida (that is, Supportive Promotion, SP). In addition, the program
proposes a new way of teaching in the classroom in order to improve the quality of learning
of the whole group of pupils3.
Our main research objective is to evaluate TPA, a SP policy, in Corrientes province

1Grade retention, grade repetition, academic failure, non-promotion, flunked, failed, retained and held
back are all used synonymously in the literature to refer to the educational policy of having a child
repeat a grade.
2See http://www.educacionparatodos.org.ar/tpa/index.html
3Section 3.2 describes the program in detail.
where the idea is to gradually implement this project in public primary schools within 3
years - enough time to detect treatment effects. Randomizing the order of phase-in will
allow us to carry out a proper impact evaluation.
2 Background and policy relevance
2.1 Literature review directly relevant to main research questions
It is useful to apply the production function approach, a simple conceptual model widely
used among economists, to clarify how previous empirical studies analysed factors affecting
student outcomes. The production function approach originated in the Coleman Report
(1966) and was reviewed in Hanushek (1986). This function examines the relationship between
the inputs and outputs of the educational process. A distinct feature of the educational
process is that it is cumulative: "inputs applied sometime in the past affect students'
current level of achievement" (ibid., p. 1150). This suggests that the educational output Y for
the ith student at time t is linked to past and current educational inputs:
Y_it = f(F_i(t), S_i(t), O_i(t), e_it)
where f is a function relating the inputs to output; t refers to inputs cumulative to
time t; F is a vector of the student's family background and family educational inputs; S is
a vector of the student's teacher and school inputs; and O is a vector of other relevant
inputs such as peer-group characteristics. Under this model, one could think of
many other unobservable factors that could potentially - and hopefully randomly - influence
the outcome, leading to deviations from the hypothesised relationship between educational
outcomes and inputs. These influences are captured in e_it, the stochastic term reflecting
all other factors affecting educational outcome that cannot be observed by the researcher
and are assumed to be generated by a random process. Note that as Wild and Pfannkuch
(1999) state clearly in their discussion of modeling variation “[T]he level at which we impose
randomness in a model is the level at which we give up on the ability to ask certain types of
question, questions related to meaning and causation" (ibid., p. 241). The empirical
estimation of this production function requires an extremely large amount of data; thus a
value-added specification is usually analysed instead, in which the lagged student outcome is
included on the right-hand side of the equation and only incremental inputs are considered.
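To make the value-added idea concrete, the simulation below regresses a current test score on the lagged score and one incremental input. All numbers (a persistence coefficient of 0.6, an input effect of 2.0) are hypothetical; the point is only that, under the maintained assumption that the error term is random, ordinary least squares on the value-added equation recovers the incremental-input effect. This is an illustrative sketch, not part of the proposal's estimation plan.

```python
import random

def solve(A, b):
    # Gauss-Jordan elimination with partial pivoting for a small linear system.
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    # Ordinary least squares via the normal equations X'X b = X'y.
    k = len(X[0])
    XtX = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    Xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

random.seed(1)
rows, scores = [], []
for _ in range(5000):
    y_lag = random.gauss(50, 10)      # last year's score (lagged outcome)
    inp = random.gauss(0, 1)          # incremental input, e.g. hours of support
    y_now = 5 + 0.6 * y_lag + 2.0 * inp + random.gauss(0, 3)  # random error
    rows.append([1.0, y_lag, inp])
    scores.append(y_now)

const, b_lag, b_inp = ols(rows, scores)  # recovers roughly (5, 0.6, 2.0)
```

If the error were instead correlated with the input (e.g. weaker pupils receiving more support), the same regression would be biased, which is precisely the concern raised about repetition studies below.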
2.2 Explanation of what are the gaps in this literature
To empirically assess the effect of repetition on subsequent outcomes, past studies reviewed
in meta-analyses (see Holmes and Matthews (1984), Holmes (1989), Jimerson (1999) and
Jimerson (2001)) include a binary variable on the right-hand side of this equation
indicating whether the student repeated a grade. Using a standard individual-level
regression, they estimate the coefficient on the repetition variable, interpreting it as
the effect of grade retention on individual outcomes, ceteris paribus. On this basis, they
conclude that the policy has been shown to be ineffective in the United States. To
interpret their results correctly, however, one must accept that factors not included in
the model affect the outcome only randomly, so that on average, conditional on observables,
they have zero effect. If this assumption does not hold, the simple regression analysis
may dramatically bias the estimated effect of retention.
Looking beyond the United States, the empirical evidence is extremely scarce. An
exception is Gomes-Neto and Hanushek (1994), who use data from rural Northeast Brazil to
estimate the effect of repeating the second grade on academic achievement over a two-year
period. In this case, the authors explicitly recognise that their results "should be viewed
as a crude approximation of the effects of repetition" (ibid., p. 128). Given the data
at hand, they are not able to evaluate whether or not repetition works as a strategy for
improving subsequent educational outcomes.
As discussed in Brady (2004), following Rubin and Holland, the definition of causality
requires "the description of a counterfactual condition and a comparison of what did happen
with what would have happened had the cause been absent" (ibid., p. 57). That is,
in the case of repeaters, neither Gomes-Neto and Hanushek (1994) nor the reviewed US
studies can say much about what would have happened to the outcome had the repeaters
been promoted. Repetition may itself be affected by performance - it is not exogenously
determined - violating one of the conditions required to claim a causal effect.
One possible solution to this issue is to carry out an experimental design.
It has been argued that this method is "the best mechanism for justifying causal
conclusions" (Cook, 2003, p. 144). However, as the same author previously noted, the
method "is hardly practiced in education, especially for assessing the impact of educational
interventions of obvious policy-relevance" (Cook, 2001). In fact, in the case of repetition,
additional evidence comes instead from quasi-experimental studies that try to mimic
experimental designs (Eide and Showalter, 2001; Jacob and Lefgren, 2004; and Manacorda,
2006). These studies try to develop solutions that directly address the previously
identified weaknesses and problems.
Eide and Showalter (2001) find some evidence of beneficial effects of repetition - namely,
lower dropout rates and higher earnings in the labor market. Their results, however,
do not allow them to conclude that retention is an effective policy, as their estimates are
not significantly different from zero. They use an Instrumental Variables (IV) research
design to mitigate the problem of endogenous selection, exploiting a source of exogenous
treatment assignment. According to the authors, the exogenous variation across US states
in kindergarten entry dates, combined with the typically uniform distribution of birthdays
across days of the calendar year, creates a kind of random assignment of students to
retention. The instrument (ndays) is the difference in days between the cut-off date for
kindergarten entry in the state where the child lives and the child's date of birth.

The authors report their findings for four demographic groups: white men,
white women, black men and black women. Their results suggest that the instrument works
quite well for the former two groups but is very weak for the black groups, for whom
the estimated relationship between retention and ndays is not different from zero at
conventionally acceptable levels of confidence. The paper does not clarify why the
instrument fails for blacks. This indicates that even though the IV design is a promising
way to identify the real effect of retention on school dropout, the difficulty of finding
a good instrument can, in some cases, cast doubt on the results.
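The logic of an ndays-style instrument can be sketched with simulated data. The data-generating process and every coefficient below are invented for illustration: retention depends on unobserved ability, so OLS is biased, while a Wald/2SLS estimate using an instrument that shifts retention but not the outcome recovers the assumed true effect of -2.

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(x, y):
    mx, my = mean(x), mean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)

random.seed(2)
z, d, y = [], [], []
for _ in range(20000):
    ability = random.gauss(0, 1)       # unobserved by the econometrician
    ndays = random.uniform(0, 365)     # gap between entry cut-off and birthday
    # Stylized selection rule: retention falls with ability and (here,
    # arbitrarily) rises with ndays; only instrument relevance/exogeneity matter.
    retained = 1.0 if (ndays / 365 - ability + random.gauss(0, 1)) > 1.0 else 0.0
    outcome = ability - 2.0 * retained + random.gauss(0, 1)  # true effect: -2
    z.append(ndays)
    d.append(retained)
    y.append(outcome)

beta_ols = cov(d, y) / cov(d, d)  # biased: retained pupils have lower ability
beta_iv = cov(z, y) / cov(z, d)   # Wald / 2SLS with a single instrument
```

Here OLS overstates the harm of retention because low-ability pupils are both more likely to be retained and more likely to have poor outcomes; the IV estimate, using only the variation induced by the instrument, lands near the assumed effect.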
A second study, by Jacob and Lefgren (2004), aims to evaluate the short-term effects of
grade retention on educational outcomes. Rather than looking at outcomes such as
high-school dropout or earnings, they focus on the impact of retention on student
achievement, measured by standardised test scores, a couple of years after the students
have received the treatment (i.e., have been held back). Using data from Chicago Public
Schools (CPS) and examining third and sixth grades, they provide evidence that retention
has a positive net impact on third-graders' achievement in math and reading, but the effect
decreases considerably two years after retention. Findings for sixth graders are quite
different: retention appears to have had no effect on achievement, either the year after
retention or two years later. To deal with endogeneity, the authors identified exogenous
variation generated by a decision rule applied in CPS since 1996, allowing them to use a
regression discontinuity design. The rule tied promotion decisions to performance on
standardised tests, resulting in a highly non-linear relationship between current
achievement and the probability of being retained. Thus, the probability of treatment
(being retained) depends on a student's current achievement: a student who scores just
below the promotional cut-off is retained, while one who scores just above is promoted.
The impact of retention can therefore be identified by comparing students who scored just
below and just above the promotional cut-off. If differences around the cut-off are random
(luck, a "good/bad day"), then one has a quasi-experimental design with "random" treatment
for the group in the vicinity of the cut-off. It is worth noting that the main assumption
behind this design is that unobservable factors do not vary discontinuously around the
cut-off. The validity of this assumption is questioned by Hauser, Frederick and Andrew
(2006), who remark that Chicago's system is characterised by a variety of programmes to
enhance student achievement, such as after-school tutoring for retained students.
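A stylized regression discontinuity sketch (hypothetical data, cut-off and effect sizes) illustrates the identification idea: a naive retained-versus-promoted comparison is confounded by the running variable itself, while local linear fits on each side of the cut-off recover the effect at the margin.

```python
import random

def fit_line(xs, ys):
    # Simple OLS line y = a + b*x, in closed form.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

random.seed(3)
cutoff, h = 50.0, 10.0  # promotional cut-off and bandwidth (both hypothetical)
data = []
for _ in range(20000):
    score = random.uniform(0, 100)   # running variable: current test score
    retained = score < cutoff        # deterministic promotion rule
    later = 20 + 0.3 * score + (2.0 if retained else 0.0) + random.gauss(0, 3)
    data.append((score, later))

# Naive comparison of all retained vs all promoted: confounded by the trend.
ret = [y for s, y in data if s < cutoff]
pro = [y for s, y in data if s >= cutoff]
naive = sum(ret) / len(ret) - sum(pro) / len(pro)   # strongly negative

# RD: local linear fit on each side, gap between predictions at the cut-off.
left = [(s, y) for s, y in data if cutoff - h <= s < cutoff]
right = [(s, y) for s, y in data if cutoff <= s < cutoff + h]
a_l, b_l = fit_line([s for s, _ in left], [y for _, y in left])
a_r, b_r = fit_line([s for s, _ in right], [y for _, y in right])
tau = (a_l + b_l * cutoff) - (a_r + b_r * cutoff)   # effect at the margin
```

The naive gap mostly reflects that low scorers do worse later anyway; the RD estimate isolates the assumed retention effect for pupils near the cut-off, which is exactly the "cut-off vicinity" logic described above.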
Applying a similar research design, albeit with data from a developing country (Uruguay),
Manacorda (2006) finds that repetition appears to have a negative impact on subsequent
school outcomes. According to his results, grade repetition leads to substantial dropout
and lower educational attainment four and five years after the repetition occurred.
In this case, the discontinuity arises from a rule in Uruguay's schools that establishes
automatic grade failure for students who are absent from school more than 25 days a year.
Thus, 25 days is a threshold above which repetition is automatic, irrespective of school
outcomes. The author compares students to the left and right of the discontinuity point,
and follows their school progression from 1996 to 2001. To validate these results, one
needs to accept the assumption that differences between students who missed 25 versus 26
days are random; that is, controlling for observable characteristics, the remaining
difference between a student who missed 25 days and one who missed 26 is due to
"unexpected" shocks. The author tests this assumption by moving the estimates away from the
discontinuity point and by exploiting a change in the retention rule. Even though the paper
demonstrates a solid research design, some data limitations give reason for pause. The only
data available to the researcher cover individuals within public non-vocational schools,
lacking information about vocational and private schools. Hence the researcher cannot
distinguish those leaving the educational system from those moving to other schools
outside the non-vocational system (i.e., vocational or private schools). In fact, the
strong assumption behind the results is that most of those who leave the non-vocational
system are not moving to a vocational or private school, but are instead abandoning the
formal educational system permanently. If this is not true, then the estimated effects of
retention on subsequent outcomes are overestimated.
The analysis of previous empirical studies clearly shows that, unlike the simple multiple
regression approach, Instrumental Variables and Regression Discontinuity designs are
promising ways of evaluating the effectiveness of the policy. Their main limitation,
however, is the requirement of truly exogenous variation and good-quality data.
Experimental settings constitute the best option in methodological terms. In the case of
repetition, this means that, given a group of eligible students, we should randomly
assign some of them to repeat and others not to repeat. After a couple of years, and under
certain assumptions, the differences in educational outcomes between the two groups could
be said to reflect the effect of repetition. For various reasons, this kind of experiment
is not under consideration. An alternative way to evaluate repetition is, however,
available in our case.
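Under random assignment, the comparison described above reduces to a difference in mean outcomes, whose significance can be checked without distributional assumptions via a permutation test. Everything below (the score scale, the +3-point effect, the group sizes) is invented for illustration only.

```python
import random

random.seed(4)
n = 400
control = [random.gauss(50, 8) for _ in range(n)]  # hypothetical test scores
treated = [random.gauss(53, 8) for _ in range(n)]  # assumed true effect: +3

observed = sum(treated) / n - sum(control) / n     # difference in means

# Permutation test: under the null of no effect, group labels are
# exchangeable, so re-shuffle labels and count how often a difference at
# least as large as the observed one arises by chance.
pooled = treated + control
reps, extreme = 2000, 0
for _ in range(reps):
    random.shuffle(pooled)
    diff = sum(pooled[:n]) / n - sum(pooled[n:]) / n
    if abs(diff) >= abs(observed):
        extreme += 1
p_value = extreme / reps   # small p-value: effect unlikely to be chance
```

The permutation approach mirrors the randomization itself, which is why it is a natural companion to an experimental or phased-in design.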
2.3 Explanation of how filling these gaps is relevant to policy issues
The evaluation of SP will have a significant impact on policy decision-making. Being able
to answer whether children under SP have better educational outcomes than pupils under
current policies will be an important contribution to the allocation of education resources.
The findings will provide a guide for assessing the overall efficiency of government
investment. For instance, if results show that grade repetition produces worse educational
outcomes than SP, the system can be enhanced by reallocating resources to pursue the new
policy.
Results could also shed light on possible applications in other developing countries where
repetition policies are being implemented. Last but not least, this investigation will
contribute to the search for better ways of improving the quality of education in Argentina
and other developing countries, which will affect poverty through its impact on human
capital accumulation.
3 Methods
This section has greatly benefited from comments provided by PEP referees.
3.1 General description
Todos Pueden Aprender has already been implemented in 96 schools and 747 divisions,
totalling 20,857 pupils in the first and second grades of primary school, in 4 different
provinces (Misiones, Chaco, Tucuman and Jujuy; implementation recently began in Formosa)4.
Because of design details, only a descriptive and retrospective evaluation was possible.
However, in Corrientes province the government has expressed its interest in implementing
a pilot project next year and scaling it up to all public primary schools by 2011,
providing an opportunity to set up an impact evaluation. The shortage of existing human
resources to provide teacher training leads the government to phase in TPA over time.
The effectiveness of the program depends ultimately on whether students' learning outcomes
are significantly improved.

4These provinces are the poorest in the country; at least 70 percent of children under 14 years old are
poor according to the Current Household Survey (EPH).
After one year of implementation, we expect to find short-term effects reflected in
improvements in test scores in treatment schools compared to control schools. A mid-term
effect may be registered after the second and third years of implementation. It can be
argued that a more persistent effect will reach the treatment group in the form of lower
dropout rates during secondary school.
Administrative data provided by the provincial Ministry of Education will be used to
identify public primary schools and their main structural characteristics prior to the
intervention, which will serve for stratification (rural/urban and gender). A baseline
survey will be carried out in all schools to collect data on the outcome of interest before
the intervention takes place (tests of numeracy and literacy skills for first graders).
Post-test data will also be collected after one, two and three years of the intervention.
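One way to operationalise the phased-in design is a stratified random draw of the wave in which each school enters the program. The sketch below assumes hypothetical school identifiers and a single rural/urban stratum label; the actual sampling frame and strata are those described in Section 3.3.3.

```python
import random
from collections import defaultdict

def assign_waves(schools, n_waves=3, seed=42):
    """Randomly assign schools to phase-in waves, stratified by school type.

    schools: list of (school_id, stratum) pairs; stratum is a hypothetical
    label such as 'urban' or 'rural'. Returns {school_id: wave}, with waves
    1..n_waves balanced within each stratum (counts differ by at most one).
    """
    rng = random.Random(seed)
    by_stratum = defaultdict(list)
    for sid, stratum in schools:
        by_stratum[stratum].append(sid)
    assignment = {}
    for stratum, ids in sorted(by_stratum.items()):
        rng.shuffle(ids)                      # random order within the stratum
        for i, sid in enumerate(ids):
            assignment[sid] = i % n_waves + 1  # deal out waves like cards
    return assignment

# Hypothetical roster: 30 urban and 21 rural schools.
roster = ([(f"u{i}", "urban") for i in range(30)]
          + [(f"r{i}", "rural") for i in range(21)])
waves = assign_waves(roster)
```

Schools drawn into later waves serve as the control group for earlier waves until their own treatment begins, which is the comparison the phased-in design relies on.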
3.2 The intervention
3.2.1 What the intervention will do
The spirit of the intervention is to adapt the school and its teaching practices to the
children, especially first-generation learners5, without a fundamental transformation of
the system or an injection of new resources (see Duflo, Glennerster and Kremer, 2007 for
interventions in the same spirit). It is based on the belief that the poor educational
outcomes observed in Argentine public schools are not mainly related to physical inputs
but rather to teaching methods. It is also stressed that the first years of learning are
critical for a successful path along the academic track. Good-quality instruction in basic
skills, together with a comfortable and happy learning experience in the early grades,
may reduce disparities later in the life cycle.
Specifically, the treatment has three main components:
1. Protection of children's transitions at school: to identify, assist and follow up
those children who have experienced difficulties in school transitions.
2. Teaching process improvement: to emphasize knowledge sequences in Language and
Maths. The main methodological proposals are focused on initial literacy, numeric
system, and mathematical operations.
3. Changes in school institutional organization: to support principals in the coordination
and implementation of the new teaching process in the key areas mentioned above,
as well as the SP.

5In the last two decades, enrollment growth was higher than population growth across all age groups;
Giovagnoli and Kit (2005).
3.2.2 How it works
The SP components are developed as follows:
1. Teachers receive training on how to develop key pedagogical sequences in Maths and
Language, together with periodic assessment during implementation. The teacher training
takes place at the beginning of the academic year. There is also a monitoring process
throughout the whole period to assess progress, correct problems, and enhance the
implementation in a timely manner.
2. Schools' institutional enhancement consists of training to develop principals'
management skills in pedagogical updating, coordination, communication, and information
analysis during the process. For example, an information system to follow up the SP
is provided for them.
3. After a careful assessment of students' achievements in Language and Maths
(evaluations), the Promotion Groups identify students at risk from the pedagogical
perspective: students with some achievement deficits. They then formulate a pedagogical
action plan for these students in order to assist them through the following cycle. The
purpose is to address the identified weaknesses through a specific pedagogical team.
It is important to note that SP does not intend to prohibit repetition, as it is based
on pedagogical considerations rather than an administrative act. The new ways of
instructing have been designed taking into account the materials and books recommended in
the Learning Priorities Core (Nucleos de Aprendizajes Prioritarios - NAP). The NAP are
agreements on minimum curriculum contents negotiated among the Federal Council of
Education, the National Ministry of Education and provincial governments.
• Who are the beneficiaries?
The direct beneficiaries are the cohorts of students starting first grade in public
schools located in the selected province. Furthermore, teachers and principals will
benefit from human capital formation through the training courses.
The project also provides schools with simple software designed to follow the progress
of children's learning throughout the academic year. The database is a valuable
tool for principals and teachers.
• How will they benefit?
It is expected that not only weaker students will benefit from the intervention, but also
the rest of the pupils in the classroom. Teachers in treatment schools will apply different
teaching techniques to the whole classroom without changing the curriculum contents.
These techniques are designed to be implemented following a set of coherent sequences
in Math and Language across grades 1 to 3. This is a totally different approach from
the current system, where each academic year is self-contained and, in many cases, the
same classroom is exposed to different teachers across the cycle.
In order to help weaker students continue within the same classroom, additional
support in Math and Language is provided during extra-curricular hours. This intervention
might affect high-ability students negatively if teachers re-allocate their time in favour
of low-ability students. On the other hand, high-ability students might be positively
affected by the better performance of their peers.
• How do we draw the control group to compare with the treated group? We will
take advantage of the phased-in implementation design to draw the control group (see
Section 3.3.3 for more details).
3.2.3 Who will do it
The intervention will be carried out by provincial teams under the coordination of Educacion
Para Todos.
TPA is a project supported and financed by UNICEF, with the collaboration of provincial
governments. The program is low cost: based on previous implementations, the total cost
is around 25 US dollars per pupil per year.
It should be noted that there is an institutional and formal setting in which the
intervention is implemented, endorsed by Decree Nr. 105/2006, in which the National
Ministry of Education encourages the TPA initiative.
3.2.4 Potential pitfalls
One potential problem, usually present in this type of intervention, emerges if
the timing of the expansion is so rapid that it does not allow the effect of the
intervention to be observed. This will not be a problem in our case, given that the way
the policy will be rolled out means we will have up to 3 years to detect treatment
effects. Furthermore, we expect some short-term effects to emerge during the first year
of the intervention.
Another potential problem could appear if parents relocate their children between
treatment and control schools, contaminating the comparison. In the context in which the
program will be implemented - poor public primary schools - this is unlikely to happen;
at least, this kind of behavior was not present in previous applications of the program
in similar provinces.
Even so, we have two types of action planned. First, we will follow each student. Second,
for those students who leave the school, we plan to collect data on the reasons for changing
or leaving school and to analyze how this behavior affects the results.
Hawthorne effects could instead appear through the behavior of teachers, principals
and trainer groups: because of the external evaluation, they may work harder during the
course of the program than they would if the program were implemented without any
external evaluation. One option to identify this effect and isolate it from the true effect of
the program is to follow up on the performance of eligible schools and collect key data even
after the formal evaluation ends. We will explore the practical feasibility of this option.
Similarly, a behavioral change may occur in the control group (John Henry effects) if
substitutes are available to them. In this respect, it is rare for poor families attending
public schools to consider enrolling their children in private schools. Furthermore, there
are very few private schools in the province, which limits the range of alternatives: according
to official data from the National Ministry of Education, only 5 percent of schools in
Corrientes are private, and they are concentrated in only two cities (the Capital and
Santo Tome).
An additional and valid concern refers to the usefulness of the findings, in particular
whether the results obtained for this specific province will hold in schools located in other
provinces or under different settings. It should be noted that the SP policy is designed to
act in poor public primary schools (first cycle, grades 1 to 3). In provinces like Formosa,
Corrientes, Chaco, Santiago del Estero, Jujuy and Tucuman, most public schools receive
a high proportion of poor pupils. To generalize the results to any primary school in
Argentina, we could expand the program to provinces such as Buenos Aires and select
a different set of schools (for instance, those located in better-off socio-economic
neighborhoods). These types of schools, however, already have much better educational
outcomes, which makes them a much less attractive universe for implementation.
As discussed in Duflo, Glennerster and Kremer (2006), the way in which a program is
implemented is another factor affecting how easy the results are to generalize, and also
how replicable they are. In this respect, Educacion Para Todos developed a set of handouts
explaining each step in full detail in order to ensure a replicable implementation by
different teams. Additionally, specific guidance materials are provided to trainers, principals
and teachers. Together with these guides, a system called SITESA (Sistema de Itinerarios
Escolares Asistidos, Aided School Itineraries System) was developed to monitor students'
performance and the implementation of the program.
3.3 Data collection methods
One of the main advantages of the previous application of TPA is that a well-structured data
collection system was designed, tested and implemented in the schools participating in the
project. A similar system will be applied in the control schools as well. Additionally, most
of the data-capturing instruments have already been developed and tested. This is essential
when working in poor areas, where keeping the questionnaires simple should be a priority.
3.3.1 Baseline data
An important source of administrative data, collected by the provincial ministry every year,
will be used for a first identification of public primary schools (geographic location,
socioeconomic status, gender composition, and the number of sections and pupils in the
school). Additionally, a baseline survey will be conducted just before the beginning of the
program, with further rounds at the beginning and end of each academic year in both
treatment and control schools. The survey will be conducted on an individual basis. General
information on school characteristics which is not available in the administrative records
will also be recorded.
3.3.2 Population under study
The population under study is composed of all students starting first grade in public urban
schools in Corrientes province.
3.3.3 Sampling design, sample size and statistical power
The in-phase expansion of TPA and random assignment will ensure that treatment and
comparison groups are similar in expectation. We will randomly divide public primary
schools into 3 equally-sized groups:6 Group 1 schools will receive the treatment during year
1, Group 2 schools during year 2 and Group 3 during year 3. The impact of the project
will be evaluated by comparing the results from Group 1 schools in year 1, with Groups 2
and 3 acting as comparisons, and the results of Groups 1 and 2 in year 2, with Group 3
schools acting as a comparison.
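The phased, stratified assignment just described can be sketched in a few lines. Everything below is illustrative only: 90 hypothetical schools and two toy stratification variables stand in for the real administrative data on socioeconomic status and enrollment.

```python
import random

random.seed(42)  # fix the seed so the assignment is reproducible

# Toy school list; the real strata would come from the provincial
# administrative records (socioeconomic status, number of pupils).
schools = [{"id": i, "ses": i % 2, "size_tercile": i % 3} for i in range(90)]

def assign_phases(schools, n_groups=3):
    """Map school id -> phase-in group (1..n_groups), randomized within strata."""
    strata = {}
    for s in schools:
        strata.setdefault((s["ses"], s["size_tercile"]), []).append(s["id"])
    assignment = {}
    for ids in strata.values():
        random.shuffle(ids)
        for rank, school_id in enumerate(ids):
            assignment[school_id] = rank % n_groups + 1  # group 1, 2 or 3
    return assignment

phases = assign_phases(schools)
counts = {g: sum(1 for p in phases.values() if p == g) for g in (1, 2, 3)}
print(counts)  # {1: 30, 2: 30, 3: 30} -- each phase gets a third of the schools
```

Group 1 would then be treated in year 1, Group 2 in year 2 and Group 3 in year 3, exactly as in the comparison scheme above.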
In terms of statistical power it is important to bear in mind that we are randomizing over
groups (schools) rather than individuals, while the outcome, test scores, is measured at the
individual level. We need to take into account that pupils in the same classroom (a cluster)
face similar shocks, so the error term may not be independent across individuals. We will
follow the advice presented in Duflo, Glennerster and Kremer (2007), Section 4, and
suggestions provided by statisticians at the University of Rosario. A discussion with the
PEP specialists on specific issues will be of great advantage at this stage.

6 Note that we will stratify schools by socioeconomic variables and number of pupils, using
administrative data.
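To fix ideas on why the clustering matters, a back-of-the-envelope calculation may help: sampling m pupils per school with intra-class correlation rho inflates the variance of a group mean by the design effect 1 + (m - 1) * rho. The numbers below (25 pupils per school, rho = 0.2, 30 schools) are illustrative assumptions only; the actual intra-class correlation must be estimated from the baseline data.

```python
# Design-effect arithmetic for cluster-randomized designs (illustrative only).

def design_effect(m, rho):
    """Variance inflation from sampling m correlated pupils per school."""
    return 1 + (m - 1) * rho

def effective_n(n_schools, m, rho):
    """Number of independent pupils the clustered sample is 'worth'."""
    return n_schools * m / design_effect(m, rho)

print(round(design_effect(25, 0.2), 1))    # 5.8
print(round(effective_n(30, 25, 0.2), 1))  # 129.3: 750 tested pupils behave like ~129
```

With an intra-class correlation of this order, adding schools raises power far more effectively than testing more pupils within each school, which is why the number of schools per phase group is the binding constraint.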
3.3.4 Key and additional data
The key data we will collect concern the main outcome of interest, children's learning,
measured through tests administered at the beginning and end of each academic year.
These evaluations, applied in both treatment and control schools, will be designed to test
numeracy and literacy skills.
Additional individual and school data will be collected in a baseline survey. This data
will play an important role in producing more precise estimations of the effect of the pro-
gram. In particular, variables which we expect to have substantial explanatory power for
test scores are the most important ones to collect.
A complete dataset will be created in which each child is uniquely identified with a numeric
code indicating which school he/she attends, which cohort he/she belongs to, and which
grade he/she is in.
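One possible encoding of such a code, purely as a hypothetical illustration (the digit widths for school, cohort year, grade and within-school sequence number are assumptions, not the project's actual scheme):

```python
def pupil_code(school_id, cohort_year, grade, seq):
    """Pack school (3 digits), cohort year (4), grade (1) and a within-school
    sequence number (4) into one integer: SSS YYYY G NNNN."""
    assert 0 <= school_id < 1000 and 1000 <= cohort_year < 10000
    assert 1 <= grade <= 9 and 0 <= seq < 10000
    return ((school_id * 10000 + cohort_year) * 10 + grade) * 10000 + seq

code = pupil_code(school_id=42, cohort_year=2009, grade=1, seq=7)
print(code)  # 42200910007

# Each component can be recovered unambiguously from the code:
seq = code % 10000
grade = code // 10000 % 10
cohort = code // 100000 % 10000
school = code // 10**9
print(school, cohort, grade, seq)  # 42 2009 1 7
```

A positional code of this kind keeps school, cohort and grade recoverable from the identifier itself, which simplifies merging the test-score files with the baseline survey.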
3.4 Modeling and Testing
3.4.1 Hypothesis to be tested
The main hypotheses we will test are the following:
1. Average Effects: A first null hypothesis to be tested is that average post-test scores
in the treatment and comparison groups are equal, tested at the 1 percent significance
level at the end of the first, second and third years of TPA implementation.
2. Similarly to hypothesis 1, we would like to test whether students improved more,
relative to what they would have been expected to achieve on the basis of their pre-test
scores, in treatment schools than in comparison schools.
3. Distributional Effects: An additional hypothesis to test is the equality of the post-test
distributions at the end of each year. We expect to reject the hypothesis that the
comparison group's post-test distribution stochastically dominates that of the
treatment group.
4. Because of an idiosyncratic characteristic of the program (students at risk receive
additional support) it is important to test for heterogeneous effects as well.
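Hypothesis 3 can be tested by bootstrapping the Kolmogorov-Smirnov statistic under the null of equal distributions (the method is described in Section 3.4.2). A minimal sketch on made-up scores follows; the sample sizes, score scale and treatment shift are illustrative assumptions only.

```python
import bisect
import random

random.seed(1)

def ks_stat(a, b):
    """Maximum absolute gap between the two empirical CDFs."""
    a, b = sorted(a), sorted(b)
    return max(abs(bisect.bisect_right(a, t) / len(a)
                   - bisect.bisect_right(b, t) / len(b)) for t in a + b)

def bootstrap_ks_pvalue(a, b, n_boot=300):
    """Resample both groups from the pooled scores (imposing the null of equal
    distributions) and locate the observed statistic in that distribution."""
    observed = ks_stat(a, b)
    pooled = a + b
    exceed = sum(ks_stat(random.choices(pooled, k=len(a)),
                         random.choices(pooled, k=len(b))) >= observed
                 for _ in range(n_boot))
    return exceed / n_boot

# Illustrative post-test scores with the treatment distribution shifted up.
treat = [random.gauss(65, 10) for _ in range(100)]
ctrl = [random.gauss(55, 10) for _ in range(100)]
p = bootstrap_ks_pvalue(treat, ctrl)
print(p < 0.05)  # a shift this large is easily detected
```

Because the resampling imposes the null, the bootstrap p-value is simply the share of resampled statistics at least as large as the observed one.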
3.4.2 Empirical methods
In this subsection we briefly discuss the empirical methods we will use to test the hypotheses.
1. A first estimate of the average treatment effect of the program will use the simple
difference of post-test scores between the treatment and control groups. A difference-
in-differences specification will also be estimated (the difference between treatment
and control groups in the change from pre-test to post-test scores). If randomization
is effectively applied, the double difference should provide results similar to those of
the simple difference.
2. In order to identify any systematic variation in the average treatment effect across
different subgroups, we will include an interaction between the treatment dummy and
the subgroup under analysis.
3. Value Added specification: If randomization is successful and the control variables
are uncorrelated with the treatment, then the inclusion of controls in the estimations
should not change the point estimates of the effect of the program. Nevertheless, it
might generate more precise estimates of the causal effect of interest. This is because
we expect controls to have substantial explanatory power for test scores; including
them therefore reduces the residual variance, which in turn lowers the estimated
standard error. In the education literature, a commonly used specification of this kind
is called 'value added': the test score of student i in school s before the implementation
of the project (Y_is^PRE) enters on the right-hand side of the equation together with
the treatment dummy (D_s, equal to 1 if the school receives the program and 0
otherwise), while the main outcome of interest, the post-test score (Y_is^POST),
enters on the left-hand side. In particular, we will have the following:
Y_is^POST = λ + α D_s + β Y_is^PRE + ε_is^POST
This specification will be estimated using Ordinary Least Squares (OLS).
4. Following Banerjee, Cole, Duflo and Linden (2006), we will implement bootstrap tests
of the equality of distributions, using the Kolmogorov-Smirnov statistic to measure
the discrepancy between the hypothesis of equal distributions and the data.
5. We will evaluate whether, beyond the average treatment effect, heterogeneous
treatment effects exist. Under certain conditions, quantile treatment effects can be
used to estimate such heterogeneous effects.
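Items 1 and 3 above can be sketched on simulated data. The sample size, score scale and a true effect of 4 points are illustrative assumptions only; with successful randomization the three estimates land close together.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
treated = rng.random(n) < 0.5                 # simplified pupil-level assignment
y_pre = rng.normal(50, 10, n)                 # baseline (pre-test) score
y_post = 10 + 4.0 * treated + 0.6 * y_pre + rng.normal(0, 5, n)

# Item 1: simple difference of post-test means, and difference-in-differences
# (the post-test gap net of the pre-test gap).
simple_diff = y_post[treated].mean() - y_post[~treated].mean()
did = simple_diff - (y_pre[treated].mean() - y_pre[~treated].mean())

# Item 3: value-added OLS, y_post = lambda + alpha*D + beta*y_pre + eps.
X = np.column_stack([np.ones(n), treated.astype(float), y_pre])
lam, alpha, beta = np.linalg.lstsq(X, y_post, rcond=None)[0]

print(round(simple_diff, 1), round(did, 1), round(alpha, 1))  # all close to 4
```

In the actual study, randomization occurs at the school level, so the standard errors must be clustered by school (Section 3.3.3); pupil-level OLS standard errors like those implied here would be too small.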
3.4.3 Potential empirical problems
Some potential empirical problems should be noted. Differential attrition between the
treatment and comparison groups could bias the results. Based on previous experience,
however, almost no attrition was observed in primary school projects; attrition is more of
an issue in secondary school, where temporary and permanent dropout is more common
among young people.
3.5 Human subjects concerns
The project is designed to use training and reflection techniques to enhance teaching
practices. Given the manner and context of implementation, there is low social risk in
terms of possible conflicts with teachers or parents.
It might be argued that participation in the project could affect teachers' employment
conditions in a negative way; e.g., teachers may perceive that the project increases their
workload. However, schools under TPA reallocate time and tasks to ensure teachers do
not have to work extra hours. Furthermore, previous experience with TPA in other
provinces did not reveal any such risk.
As the TPA project was carefully designed taking the NAP into account, it complies with
provincial regulations and minimum learning contents. As already pointed out, the project
is endorsed by a national decree.
4 Consultation and Dissemination Strategy
1. How, in the elaboration and execution of your project, will you consult with policy
makers, civil society representatives and other parties interested in the research issues
you examine?
Given the nature of the proposal, we are in constant communication with NGOs,
UNICEF, policy makers and academic researchers.
During April and May 2008 we organised specific events to present the proposal to
different actors, and we talked with key individuals to get feedback (see the attached
letters from UNLP, UNGS, UNR and UNICEF). Policy makers provided useful suggestions
on what they expect to learn from the study. For instance, they would like to know
whether the program is more effective in certain types of schools, mainly those with
pupils from low socio-economic backgrounds. We will take this suggestion into account
when we design the sample, together with the UNR statisticians. Academic researchers
raised similar concerns, which have already been pointed out in the PEP comments.
We have also compiled a list of email addresses and other contact details for the
organisation of future events.
2. How and where research results will be disseminated to academics, policy-makers and
the public: publications, policy briefs, seminars, conferences, etc.
A wide range of channels will be used to disseminate our findings.
• A web page will be designed to post not only the results but also all materials and
datasets generated by this project, conditional on PEP agreement. The existence
of this web page will be promoted in different universities and research centres.
We will also request that UNICEF, the Ministry of Education, the Education
Commission (National Congress) and the University Provincial Network of the
Ministry of Social Development add a link to this web page on their own web
pages.
• We would like to organise a one day workshop on impact evaluation in education
and invite an international expert to talk to students and teachers.
• We will organise other events during the execution of the project to discuss
ideas and results, bringing together academics, policy makers and representatives
of international organisations, such as the World Bank and the Inter-American
Development Bank.
• A special space will be requested to present the project proposal at the Network
of Inequality and Poverty (World Bank / Inter-American Development Bank /
LACEA) and the Asociación Argentina de Economía Política (AAEP, Argentine
Association of Political Economy). Once results are available, they will be discussed in
local and international congresses and seminars in public and private universities.
• We will distribute a short policy brief.
• We expect to publish at least 2 papers in well-known international journals. The
Working Paper Series at PEP and UNLP will be a first step in this respect.
• We will arrange face to face meetings to present our findings to different national
and local authorities. Since part of our team has worked for international organ-
isations, many useful contacts have been previously developed. This will allow
us to request specific meetings to present the results to them as well.
5 The study team
5.1 Main researcher
Paula Ines Giovagnoli (32, female). Paula is a senior economist specialising in education.
She is currently finishing her PhD at the London School of Economics, studying the impact
of retention policy on dropout in secondary school in Argentina. Her research experience
is focused on the evaluation of educational and social programmes. During two years in
the Human Development Department at the World Bank, Paula was responsible for
developing and implementing a rapid series of household surveys to monitor socioeconomic
conditions in Argentina. She worked for Educate Girls Globally as general coordinator of
all surveys applied in the Argentina study. Paula has vast experience in applied econometric
analysis and has published papers in national and international journals.
5.2 Key researcher staff
• Roxana Maurizio (36, female). Roxana is a senior economist specialised in poverty
and labor markets. She is currently working as a Researcher and Professor at
Universidad Nacional de General Sarmiento (UNGS). Roxana's experience includes
an extensive career providing technical assistance to the Ministry of Economy and
the Ministry of Labor. She completed her PhD in Economics at Universidad Nacional
de La Plata with high-quality training in econometric techniques.
• Irene Kit (45, female). Irene is an Educational Specialist and an expert in the area
of early childhood education. Her work experience as a policy maker includes the design
and implementation of programs focused on the educational needs of the poor. She worked
at the National Ministry of Education as National Director of Compensatory Programs
and as a consultant for provincial and national governments as well as international
organizations. She is now director of Educacion para Todos.
• Evelyn Vezza (29, female). Evelyn holds an MSc in Economics from UNLP, with a
strong background in empirical data analysis. She recently completed two years at the
World Bank, where she provided technical assistance and carried out impact evaluation
activities on the longer school schedule in secondary schools in the Province of Buenos
Aires, as well as on the Brazil Quality of Education work. Her analytical and research
skills will be an important input for our team.
• Valeria Lahitte (31, female). Valeria is a social worker and holds a Diploma in
Curriculum and School Practices from Facultad Latinoamericana de Ciencias Sociales
(FLACSO). She has done extensive fieldwork, especially in poor neighborhoods.
Valeria also worked for the Social Promotion Secretariat in Rosario, Argentina, as well as
in the Centro de Organizaciones de la Comunidad. She recently finished a qualitative
study for the General Secretariat of the National Presidency aimed at understanding the
needs of the poor in order to improve the design of social policies and increase their
impact on welfare. She is currently working on the operational management of Todos
Pueden Aprender for the provinces of Chaco, Jujuy, Misiones and Tucuman.
• Martin Scasso (27, male). Martin is a sociologist who worked as a consultant at Centro
de Implementacion de Politicas Publicas para la Equidad y el Crecimiento (CIPPEC)
in the education sector. He is very well trained in processing and analysing official
data provided by the Ministry of Education and linking it with other databases, and
has focused on making data communication accessible to the general public. Martin
has also been involved in the preparation and dissemination of materials used in the
Todos Pueden Aprender project applied in the four provinces.
5.3 Expected capacity building and main tasks assigned to each team
member
The participation of team members in this project builds distinct capacities along many
dimensions.
It is an inter-disciplinary team made up of economists, an educational specialist, a social
worker and a sociologist, which naturally gives rise to a more comprehensive approach to
the problem being studied. This has produced constructive debate since the beginning of
the project, with all team members learning from each other. We expect further gains
and productive insights from integrating these different disciplines in the next steps
of the project. The interdisciplinary team also opens up a variety of networks and contacts,
providing the opportunity to disseminate the project further.
Furthermore, the implementation of the survey will strengthen the relationships among
different national universities (UNR, UNLP and UNGS), government representatives and
NGOs. It is expected that the seminars organised to present the evolution of the project
will also serve as a place where individuals from different institutions can relate to and
communicate with each other, creating additional local and national links.
This evaluation will build an unprecedented network among policy makers, researchers,
and academic and non-government institutions in Argentina. Building such a network
and, above all, demonstrating that interaction is possible and positive, will create the
preconditions necessary for similar collaborations in the future within Argentina.
It is important to note that, to date, most of the research carried out to identify more
appropriate ways of providing better quality education has been descriptive. This project
widens the scope and offers significant insights into this issue, providing academics and
policy makers with a useful tool for better practices in policy design.
In the following items we briefly describe the specific research capacities expected to
be built by the PEP researchers and the main tasks assigned to each team member during
the execution of the project:
• Roxana Maurizio
Roxana will revise the existing literature on the analysis of experimental data. She
will mainly focus on understanding in depth the theory and practice of Quantile
Treatment Effect estimation, which will also mean incorporating new tools and software
to apply to the data. As a product of this task, she will organise a set of short classes
for the economists and statisticians directly or indirectly involved in the project and
will incorporate the knowledge into her teaching activities at UNGS. Roxana will also
help write the interim and final reports, and she will be responsible for preparing
communication materials to be presented to academia and government officials (mainly
those in the National Ministry of Economy).
• Evelyn Vezza
Evelyn will join Roxana in the literature review and will expand her knowledge of
impact evaluation research. She will participate in the design of the experiment,
reviewing the material that statisticians from UNR will provide to us. Evelyn will
analyse the quality of the data collected and will work jointly with Roxana and Paula
on the estimation of the quantitative impact of the program. She will need to
incorporate specific knowledge of statistical software (mainly R) in order to work on
Quantile Treatment Effects, using this project as an opportunity to significantly
upgrade her statistical skills. She will also be involved in the preparation of the
interim and final reports, and she will be responsible for presenting the preliminary
results to academia.
• Martin Scasso
Martin will be responsible for compiling the first descriptive analysis of the data
collected and discussing it with the team. He will also design a simple strategy to
communicate the research findings to policy makers (mainly the Ministry of Education)
and to help them understand the implications of the evaluation. He is now studying for
his MSc in Applied Statistics at the National University of Tres de Febrero (UNTREF),
and he aims to use the work carried out on this project towards his master's thesis.
This project will also help him to incorporate relevant theoretical and empirical
literature on impact evaluation. Martin will work actively with the team on writing
the interim and final reports, together with short policy notes to be distributed among
policy makers.
• Irene Kit
As an educational specialist, Irene will revise the documents and instruments to be
applied in the fieldwork and the policy notes prepared for policy makers. She will
follow the implementation of the project and present preliminary and final results to
non-academic audiences. Irene has vast experience in dialogue with national and
provincial governments, international organisations and UNICEF. She will enhance
her capacity to analyse and understand the theory and practice of impact evaluation
projects.
• Valeria Lahitte
Valeria will enhance her research capacity by supporting Irene in the revision of
documents and instruments to be used in the fieldwork. She will prepare a literature
review focused on children's development, and she will incorporate knowledge on
how to integrate qualitative research into impact evaluations. In the schools under
treatment, Valeria will be responsible for carrying out an analysis of specific cases,
with the aim of understanding children's learning processes, how retention policy
acts in these processes, and the perceptions and beliefs of the different actors.
• Paula Ines Giovagnoli
Paula will coordinate and supervise the tasks described above. She is responsible
for writing the interim and final reports and for presenting the final results in
seminars. This project gives her the opportunity to put theoretical knowledge of
impact evaluation into practice, from the design of the experiment to the impact
estimation. She must ensure that all PEP researchers take full advantage of
participating in the project, make valuable contributions and incorporate knowledge.
It is an opportunity to create a network linking people from different areas, from
academia to national and international organisations. She will also improve her
presentation skills by discussing the project in different environments, including top
international universities. Last, but not least, she is responsible for making things
happen.
5.4 Collaborators and arrangements
• UNICEF. The institution is one of the collaborators that has already supported and
financed TPA in other provinces. A letter is attached from the Director of the
Education Sector, Elena Duro, showing interest in carrying out the evaluation
presented here.
• Educacion para Todos. The NGO technically designed the project, and its national
team provides training to the provincial teams who will be responsible for carrying out
the teacher training in each school. The team has shown interest in collaborating in
future applications in Corrientes. The president of the NGO, Irene Kit, has also opened
a dialogue with national policy makers (Ministry of Education), with some positive
responses on the possibility of co-financing the implementation of the project.
• National Ministry of Education and authorities of Corrientes province. Both expressed
interest in implementing the program in order to obtain results from the evaluation. The
province is quite similar to Formosa in terms of socioeconomic characteristics as well
as educational performance. We believe the interest shown by the National Ministry
of Education and UNICEF will be of great advantage in guaranteeing the agreement
and its application.
• Faculty of Economic Sciences and Statistics, National University of Rosario.
Researchers at UNR are happy to collaborate and work on the implementation of the
survey, including the sample design (see the letter signed by Dr. Elda Gallese, a senior
researcher). The University has a long trajectory and experience in implementing
this kind of research: last year they evaluated a new teaching technique at the
University using a quasi-experimental design, and the results were useful in approving
changes to the Math curricula and the way it is taught within Economics and Statistics.
We consider that this interaction at the national level with another public university,
its teachers and researchers, will also create positive externalities by disseminating
the project among academic audiences, and will give teachers tools to show how to
apply theory to concrete projects and to inform policy makers.
5.5 Past, current and pending projects in relating areas
Past Projects:
• “Argentina Social Inclusion - Youth”, financed by SIPU International (Swedish
Institute for Public Administration). Paula Giovagnoli and Roxana Maurizio.
• U.CO.PRO. Project: “Programa para la Modernizacion del Estado en la Provincia
de Cordoba”, Education component, impact evaluation financed by the Provincial
Government of Cordoba, Argentina. Paula Giovagnoli, Irene Kit and Martin Scasso.
• “Urban Female Employment in Argentina” financed by Educate Girls Globally. Paula
Giovagnoli and Irene Kit.
• “Diagnosis of the impact of repetition in Misiones Province”, financed by the
provincial government and the World Bank. Irene Kit.
• “Improvement of schooling retention”, financed by the World Bank. Irene Kit.
• “The Annual State of the Children”, financed by UNICEF. Irene Kit and Roxana
Maurizio.
Current Project:
• “Todos Pueden Aprender in Chaco, Jujuy, Misiones and Tucuman”, financed by
UNICEF. Irene Kit, Martin Scasso and Valeria Lahitte.
6 Timeline
We identify 5 key milestones in the proposed evaluation:
1. The analysis of administrative data in order to characterise the potential universe of
schools.
2. The design and implementation of the baseline survey and the experiment (including
student testing in all schools under analysis - treatment and controls). This will allow
us to evaluate the success of the random assignment. We will produce a report with
the analysis of this baseline data to be discussed with PEP specialists.
3. The launch of the program in selected schools.
4. The follow-up survey (including intermediate variables) together with subsequent test-
ing procedures.
5. First analysis of short term effects of the program.
Note that although short-term effects could be evaluated within the first year of imple-
mentation, given the characteristics of this specific program, a minimum of two years
(ideally three) is required to provide useful insights into the net benefits of the complete
implementation.
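Step 2 of the timeline requires verifying the success of the random assignment before the program launches. A minimal sketch of such a baseline balance check follows; the school counts, covariate names (repetition rate, enrollment) and simulated values are illustrative placeholders, not the actual survey instruments:

```python
# Sketch of a baseline balance check: under successful random assignment,
# treated and control schools should look similar on baseline covariates.
import numpy as np

rng = np.random.default_rng(0)

# Simulated baseline data for 60 schools, half assigned to treatment at random.
n = 60
treated = rng.permutation(np.repeat([0, 1], n // 2)).astype(bool)
baseline = {
    "repetition_rate": rng.normal(0.15, 0.05, n),
    "enrollment": rng.normal(300.0, 80.0, n),
}

# Two-sample t-statistic (unequal variances) for each covariate:
# values well below ~2 in absolute terms are consistent with balance.
t_stats = {}
for name, x in baseline.items():
    x1, x0 = x[treated], x[~treated]
    se = np.sqrt(x1.var(ddof=1) / len(x1) + x0.var(ddof=1) / len(x0))
    t_stats[name] = (x1.mean() - x0.mean()) / se
    print(f"{name}: t = {t_stats[name]:+.2f}")
```

In the actual evaluation the same comparison would be run on the administrative and baseline-survey variables collected in steps 1 and 2, and the resulting balance table would form part of the report discussed with PEP specialists.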
7 Budget
• Funds for the experimental component will be used to finance the first years of the
evaluation, including survey work, design of instruments, data collection and data entry.
Annex I presents an estimated budget summary of how funds will be used. We expect
the implementation of the program itself to be co-financed by other institutions.
• The core research grant (CAN 20,000) will be used to pay for any specific train-
ing/courses required by the team for the benefit of the evaluation, and to remunerate
professional and support staff.
• The additional funding (up to CAN 30,000) will be used to cover participation in
PEP meetings, study visits and other events, and the cost of disseminating results
(including local seminars, printing of policy notes for distribution, CD data
publications, communications and any administrative costs related to the project).
8 References
Brady, H. E. (2004), “Doing Good and Doing Better: How Far Does the Quantitative
Template Get Us?” In Brady, H. E. and D. Collier (Eds.), Rethinking Social Inquiry:
Diverse Tools, Shared Standards. Rowman and Littlefield Publishers, Inc.
Banerjee, A., S. Cole and E. Duflo (2006), “Remedying Education: Evidence from Two
Randomized Experiments in India.” CEPR Discussion Paper No. 5446.
Coleman, J. S., Campbell, E. Q., Hobson, C. J., McPartland, J., Mood, A. M., Wein-
feld, F. D. and R. L. York (1966), Equality of Educational Opportunity. Washington DC,
Government Printing Office.
Cook, T. D. (2003), “Why Have Educational Evaluators Chosen Not to Do Randomized
Experiments?” The Annals of the American Academy of Political and Social Science,
589, 1: 114-149.
Duflo, E., R. Glennerster and M. Kremer (2006), “Using Randomization in Development
Economics Research: A Toolkit.” Forthcoming in Handbook of Development Economics,
Volume 4, edited by Robert E. Evenson and T. Paul Schultz. Amsterdam: North-Holland.
Eide, E. R. and M. H. Showalter (2001), “The Effect of Grade Retention on Educational
and Labor Market Outcomes.” Economics of Education Review, 20, 6: 563-576.
Giovagnoli, P. (2007), “Failures in School Progression.” Working Paper 50, CEDLAS,
Universidad Nacional de La Plata.
Giovagnoli, P. and I. Kit (2005), “The Education System in Argentina and Factors Affecting
Students’ Performance.” In Female Urban Employment in Argentina. EGG-IADB. Mimeo.
Gomes-Neto, J. B. and E. A. Hanushek (1994), “Causes and Consequences of Grade Repetition:
Evidence from Brazil.” Economic Development and Cultural Change, 43, 1: 117-148.
Hanushek, E. A. (1986), “The Economics of Schooling: Production and Efficiency in Public
Schools.” Journal of Economic Literature, 24, 3: 1141-1177.
Holmes, C. T. (1989), “Grade-Level Retention Effects: A Meta-Analysis of Research Studies.”
In L. A. Shepard and M. L. Smith (Eds.), Flunking Grades: Research and Policies on
Retention (pp. 16-33). London: Falmer Press.
Holmes, C. T. and K. M. Matthews (1984), “The Effects of Nonpromotion on Elementary and
Junior High School Pupils: A Meta-Analysis.” Review of Educational Research, 54: 225-236.
Jacob, B. A. and L. Lefgren (2004), “Remedial Education and Student Achievement: A
Regression Discontinuity Analysis.” The Review of Economics and Statistics, 86, 1:
226-244.
Jimerson, S. (1999), “On the failure of failure: Examining the association of early grade
retention and late adolescent education and employment outcomes.” Journal of School Psy-
chology, 37 (3): 243-272.
Jimerson, S. R. (2001), “Meta-Analysis of Grade Retention Research: Implications for
Practice in the 21st Century.” School Psychology Review, 30, 3: 420-437.
Kit, I. and M. Scasso (2006), Situación Educativa de las Provincias Argentinas. CD-ROM.
Educación para Todos and UNICEF.
Manacorda, M. (2006), “Grade Failure, Drop out and Subsequent School Outcomes: Quasi-
experimental Evidence from Uruguayan Administrative Data.” Mimeo.
Patrinos, H. A. and G. Psacharopoulos (1992), “Socioeconomic and Ethnic Determinants
of Grade Repetition in Bolivia and Guatemala.” Technical Department Working Paper
Series 1028, The World Bank, Washington, DC.
UNESCO (2004), Global Education Digest. Institute for Statistics (UIS). Montreal: UIS.
Wild, C.J and M. Pfannkuch (1999), “Statistical Thinking in Empirical Inquiry.” Interna-
tional Statistical Review, 67, 3: 223-265.
Wolff, L., E. Schiefelbein and P. Schiefelbein (2002), “Primary Education in Latin
America: The Unfinished Agenda.” Sustainable Development Department Technical Papers
Series EDU-120, May. Inter-American Development Bank, Washington, DC.