Developmental Evaluation in Theory versus Practice: Lessons from Three Developmental Evaluation Pilots

Bibliographic Details
Published in: Journal of MultiDisciplinary Evaluation, Vol. 17, No. 40, pp. 16-33
Main Authors: Esper, Heather L.; Fatehi, Yaquta K.; Baylor, Rebecca
Format: Journal Article
Language: English
Published: The Evaluation Center at Western Michigan University, 26 April 2021

Summary:
Background. Developmental Evaluation (DE) practitioners turn to DE theory to make design and implementation decisions. However, practitioners can find it difficult to work out from theory alone how to implement DE, because DE is method agnostic (Patton, 2016); it is instead a principle-based approach.

Purpose. This article presents an empirical examination of how DE theory was (or was not) applied during three DE pilots. Our analysis aims to better understand how DE theory is used in practice, to expand the evidence base and strengthen future DE implementation.

Setting. A consortium of three organizations implemented three DE pilots through the United States Agency for International Development (USAID) from November 2016 to September 2019. The authors, who participated in the consortium, did not implement the DEs themselves but instead conducted a study, or meta-evaluation, across the DE pilots.

Data Collection and Analysis. This article focuses on the results of an ex post facto analysis of the three DE pilots based on the entire DE implementation experience. For each DE studied, we used mixed methods to collect data on the effectiveness of the DE approach, to identify adaptations that would strengthen DE implementation in the USAID context, and to measure its value to stakeholders. Data included more than 100 hours of interviews, 465 pages of qualitative data, and 30 surveys completed by DE participants.

Findings. We find that the ability to apply the DE principles in practice is influenced by, in no particular order, DE participants' buy-in to the DE, the Developmental Evaluator's aptitude, the support and resources available to the Developmental Evaluator, and the number of DE participants. We also find that buy-in can change over time and should be closely monitored throughout a DE to inform decisions about whether the DE should be paused or ended early.

Keywords: Developmental Evaluation; developmental evaluator skills; buy-in; DE practice; DE funder; meta-evaluation
ISSN: 1556-8180
DOI: 10.56645/jmde.v17i40.685