No Impacts?: How to Enhance What You Learn from Your Evaluation

Publication Date: December 13, 2022

Introduction

The primary goal of an impact evaluation is to learn about program effectiveness. Impact evaluations usually aim to show that a program achieves credible impacts on key outcomes. But not all programs demonstrate favorable impacts at the end of an evaluation, and some evaluations yield impacts that are small or not statistically significant.

Purpose

This brief provides suggestions for grantees and evaluators who have completed an impact evaluation but did not find favorable, statistically significant impacts of their program. It suggests exploratory analyses that could help you understand why you didn’t find impacts and explains how to acknowledge the results of those analyses in your reporting. The brief also offers guidance on disseminating your findings and on additional analyses that can help you learn as much as possible from the data you’ve collected. The brief is directed primarily toward evaluators of adolescent pregnancy prevention programs, but the concepts presented should apply to a wide variety of social programs.

Key Findings and Highlights

This brief recommends the following to evaluators:

  • Carefully assess implementation data to identify potential reasons you may not have observed impacts.
  • Revisit decisions related to measurement to see whether your measures were well aligned with the population and intervention, and revisit your analytic decisions to see whether they affected your results.
  • Disseminate your findings, both the program impacts and other lessons learned, to inform the field. When reporting findings:
    • Make sure you unpack the reasons the findings are small or not statistically significant. Tell the reader the results of any exploratory analyses that helped you understand why impacts might be smaller than expected.
    • Make sure you discuss the ways your exploratory analyses supplemented your original analysis plan. Transparency about your analyses will help ensure the reader interprets the supplemental results with caution.
    • Report the original minimum detectable impacts the study was designed to detect. You can report what you powered your study to detect at the design phase (for example, the study was powered to detect impacts of 0.20 standard deviation units, which you thought was justifiable based on a set of assumptions that you should articulate). You can also report what you ultimately observed (for example, you observed impacts of only 0.10 standard deviations). A sketch of this kind of power calculation appears after this list.
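
To illustrate the last point, below is a minimal sketch, not drawn from the brief itself, of how a minimum detectable effect (MDE) in standard deviation units is commonly computed for a two-arm, individually randomized trial using a normal approximation. The sample sizes, significance level, and power target are illustrative assumptions, and the formula ignores clustering and covariate adjustment.

```python
# A minimal sketch (illustrative, not from the brief) of the normal-approximation
# MDE formula for a two-arm individually randomized trial with a continuous
# outcome. Assumes a two-sided test, no clustering, and no covariate adjustment.
from scipy.stats import norm

def mde_standardized(n_treatment: int, n_control: int,
                     alpha: float = 0.05, power: float = 0.80) -> float:
    """Minimum detectable effect, expressed in standard deviation units."""
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for a two-sided test
    z_power = norm.ppf(power)           # z-value associated with the power target
    # Standard error of a standardized mean difference between two groups
    se = (1 / n_treatment + 1 / n_control) ** 0.5
    return (z_alpha + z_power) * se

# Example with hypothetical sample sizes: roughly 400 youth per arm
# yields an MDE near 0.20 standard deviations.
print(round(mde_standardized(400, 400), 2))  # ~0.20
```

Comparing the MDE implied by your achieved sample against the original design target (for example, 0.20 standard deviations) helps readers see whether an observed impact of 0.10 standard deviations was ever likely to reach statistical significance.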

Citation

Knab, Jean, Russell Cole, and Emily LoBraico. (2022). “No Impacts?: How to Enhance What You Learn from Your Evaluation.” OPRE Report Number 2022-149. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.