Child Welfare Evidence-Building Academy: A Training on Rigorous Evaluation Design and Implementation

Publication Date: October 20, 2021

Introduction 

Demand is growing for more and better evidence about what truly helps children and families involved in the child welfare system. Demonstration projects and interventions under development require strong evaluations to build effective programs and services. The Child Welfare Evidence-Building Academy convened child welfare agency administrators (“implementers”) and evaluators for a series of 15 training modules hosted online in summer and fall 2020. The goal was to increase participants’ knowledge of how to design and implement a rigorous evaluation in a child welfare setting.

Modules included the following:

  1. Introduction to Evaluation for Child Welfare Program Administrators (PDF): This opening webinar for child welfare administrators—“implementers”—covers the basic building blocks of evaluation.
  2. Key Concepts (PDF): This module introduces key analytic concepts that are the foundation for building evidence: have a theory of change, ask a good question, know your population, and think longitudinally.
  3. What Types of Evaluation Can Answer Your Research Questions? (PDF): Module 3 uses the principle “ask a good question” to discuss different types of evaluations and how to select one according to the questions you have. There is also an Evaluation Type Quick Reference Guide (PDF) that provides a one-page summary of the information from the webinar.
  4. Implementation and Fidelity (PDF): Following the overview of evaluation types, module 4 dives into the topics of implementation and fidelity. It includes considerations for process evaluation as one important type of evaluation—one that explicitly tests the “so I plan to...” part of a theory of change.
  5. Aligning the Target Population, Study Population, and Analysis Sample (PDF): Module 5 uses the theory of change to guide participants through the process of identifying a study population and making sure it aligns with the intervention under evaluation.
  6. Beyond the Classroom: RCT Design in the Real World (PDF): This module for evaluators briefly describes the benefits and challenges of randomized controlled trials (RCTs). It describes the difference between intent-to-treat versus treatment-on-the-treated analysis, and it focuses more deeply on the referral and randomization process, some approaches for common problems in RCT implementation, threats to randomization, and internal versus external validity.
  7. Implementing RCTs (PDF): This module for implementers focuses on defining an RCT, the advantages of conducting an RCT when program resources are limited, and how to administer an RCT’s critical features on the ground.
  8. Evaluation Design: QED (PDF): This module reviews four common quasi-experimental designs (QEDs): regression discontinuity, difference-in-differences, interrupted time series, and matched comparison group. It highlights the strengths and limitations of each design.
  9. What to Include in an Evaluation Plan (PDF): This module outlines the components of a thorough evaluation plan, for both implementation and outcome studies.
  10. Power Analysis (PDF): This webinar meant for evaluators demonstrates the basic components of a simple power analysis for continuous outcomes. There is also a handout (PDF) that highlights the key points addressed in the slide deck.
  11. Measuring Outcomes (PDF): This module uses a life course or event history perspective to discuss child welfare outcomes and their measurement.
  12. Types of Data Used for Impact Evaluation (PDF): This module provides an overview of primary and secondary data sources commonly used in evaluation. In particular, the module examines the relative strengths and limitations of various data sources, such as surveys, assessments, and administrative data.
  13. Writing a Strong Research Report or Journal Article (PDF): This module provides participants with knowledge and strategies for producing written findings that contribute to the evidence base.
  14. How to Critically Appraise the Evidence about What Works in Child Welfare (PDF): Using examples of evidence from the field, this module outlines strategies for appraising evidence and using it to improve future child welfare programming.
  15. Presenting Data Visually (PDF): This module describes different types of visualizations and their uses, as well as strategies for avoiding graphics that convey data ineffectively or inaccurately.
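To illustrate the kind of material the modules cover, the simple power analysis for continuous outcomes described in module 10 can be sketched as a short calculation. The function below uses the standard normal-approximation formula for a two-arm comparison of means; the function name and defaults are illustrative and do not come from the Academy materials.

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_arm(effect_size, alpha=0.05, power=0.80):
    """Approximate sample size per arm for a two-arm comparison of a
    continuous outcome, via the normal approximation:
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2,
    where d is the standardized effect size (Cohen's d)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_power = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_alpha + z_power) / effect_size) ** 2)

# A medium effect (d = 0.5) at alpha = .05 and 80% power needs
# about 63 participants per arm; smaller effects need far more.
print(sample_size_per_arm(0.5))  # 63
print(sample_size_per_arm(0.2))  # 393
```

Calculations like this one show why "know your population" and "ask a good question" matter early: the expected effect size drives the required sample, and child welfare samples are often small.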

The Academy is part of a broader project, Child Welfare Evidence Strengthening Team (CWEST), aimed at increasing the number of evidence-supported interventions for the child welfare population. The CWEST project is led by the Urban Institute in partnership with the University of Chicago, Child Trends, and Chapin Hall, with support from the Office of Planning, Research and Evaluation (OPRE) and the Children’s Bureau.

Purpose

After participating in the Academy’s 15 topic modules, participants should

  • be able to identify rigor in evaluation designs;
  • be able to apply core scientific principles and tools required to produce evaluation evidence; and
  • have greater capacity to conduct and support rigorous evaluation.

Glossary

The Academy also produced a glossary of key research and evaluation terms (PDF) used during the module presentations.

Citation

Urban Institute et al. (2021). Slide Deck Session 1: Introduction to Evaluation for Child Welfare Program Administrators - Child Welfare Evidence-Building Academy. OPRE Report 2021-142, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 2: Key Concepts - Child Welfare Evidence-Building Academy. OPRE Report 2021-108, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 3: What Types of Evaluation Can Answer Your Evaluation Questions? - Child Welfare Evidence-Building Academy. OPRE Report 2021-109, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 4: Implementation and Fidelity - Child Welfare Evidence-Building Academy. OPRE Report 2021-110, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 5: Aligning the Target Population, Study Population, and Analysis Sample - Child Welfare Evidence-Building Academy. OPRE Report 2021-111, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 6: Beyond the Classroom: RCT Design in the Real World - Child Welfare Evidence-Building Academy. OPRE Report 2021-112, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 7: Implementing RCTs: What Agency Administrators Need to Know - Child Welfare Evidence-Building Academy. OPRE Report 2021-113, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 8: Quasi-Experimental Evaluation Designs - Child Welfare Evidence-Building Academy. OPRE Report 2021-114, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 9: What to Include in an Evaluation Plan - Child Welfare Evidence-Building Academy. OPRE Report 2021-115, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 10: Power Analysis - Child Welfare Evidence-Building Academy. OPRE Report 2021-143, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 11: Measuring Outcomes - Child Welfare Evidence-Building Academy. OPRE Report 2021-116, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 12: Types of Data Used for Impact Evaluation - Child Welfare Evidence-Building Academy. OPRE Report 2021-117, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 13: Writing a Strong Evaluation Report or Journal Article - Child Welfare Evidence-Building Academy. OPRE Report 2021-118, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Urban Institute et al. (2021). Slide Deck Session 14: How to Critically Appraise the Evidence about What Works in Child Welfare - Child Welfare Evidence-Building Academy. OPRE Report 2021-119, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Schwabish, Jonathan (2021). Slide Deck Session 15: Data Visualization Done Differently - Child Welfare Evidence-Building Academy. OPRE Report 2021-120, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.