Three Extensions to Evaluating Educational Interventions

Creative Commons Attribution 4.0 (CC BY 4.0) license
Abstract

Educational intervention research addresses the many barriers that under-resourced communities face in education. In this dissertation, I test three methods for improving educational program design and evaluation so that programs can have substantial and lasting impacts on student outcomes. I review current challenges in educational program evaluation research and present three extensions to the methodologies researchers use to improve their practice. To address these challenges, I draw on perspectives and methodological innovations from adjacent fields, including causal inference methods from policy research, measures from implementation science, and practical insights from research-practice partnerships. I contribute three separate studies as potential extensions to different parts of the process of discovering what works, for whom, and when.

Study 1 describes methods to overcome the challenge of conducting long-term follow-up evaluations of early skill-building interventions in mathematics. I empirically test different intervention design features and analytical decisions to determine which combinations improve the accuracy of forecasting the medium-term impacts of math interventions using nonexperimental data.

Study 2 embraces the challenge of identifying which mechanism or lever to intervene on, and of determining how much nonexperimental data can inform the theory of change used to design interventions. In Study 2, I use data from the Baby's First Years study, a longitudinal randomized controlled trial (RCT) of an unconditional cash gift given to mothers of newborn children living in poverty. I compare nonexperimental estimates of the impact of income on child development and maternal well-being, derived from control-group data, to the experimental estimates from the RCT. This study helps us understand and document the importance and difficulty of formulating interventions from theory and nonexperimental data. I discuss the implications of this work for attempting to craft evidence-based policy from corresponding experimental and nonexperimental estimates.

Study 3 embraces the challenge of program effect variation across different contexts. I conduct semi-structured interviews and a comparative case analysis to capture evidence about which adaptations and modifications are made to math programs as researchers iterate through program design and implementation. Using the Framework for Reporting Adaptations and Modifications-Enhanced (FRAME-IS; Miller et al., 2021), I describe when adaptations are made, who decides to make them, what each adaptation is, where it is made, and how it is made. I develop a set of guiding principles that can inform researchers about which adaptations to anticipate when designing and implementing math programs.
