NSF Workshop on Implementation Evaluation

The Future of Implementation Evaluation:
Towards a Science of “Continuous Improvement Research in Education”

A Day-Long Workshop Sponsored by the EHR Evaluation Group

Monday, October 28
National Science Foundation
Arlington, VA
Room 375

Outcome evaluation looks at results: it measures the direct effects of project or program activities on targeted recipients. But results don’t tell the whole story. An evaluation that focuses only on outcomes is sometimes called a “black box” evaluation because it leaves process and implementation unexamined. Outcome evaluation alone cannot explain why a program did or did not work; those are implementation evaluation questions. Disappointing outcome results can often be illuminated by examining how the treatment was implemented. Attention to implementation integrity offers other equally important benefits: it allows researchers to develop interventions that can withstand variation in treatment settings, to design studies that better account for such variation, and to refine the treatment itself so that the intervention achieves maximum effect. This workshop provided an overview of issues to consider when assessing the methodological strength of EHR proposals that could benefit from looking inside the “black box” of implementation.

8:30-9:00am   Registration
9:00-9:15am   Welcome

Janice Earle, National Science Foundation (bio)

9:15-10:00am   Overview of fundamental issues to consider

Barbara Schneider, ARC and Michigan State University (bio)

10:00-10:15am   Break
10:15-11:15am   What is Best Quasi-experimental Practice for Evaluating Educational Programs? Comparisons with Experimental Estimates

Tom Cook, Northwestern University (bio)

Kelly Hallberg, AIR (bio)

11:15-noon   Response to Cook and Hallberg (w/ Q&A)

Larry Hedges, ARC and Northwestern University (bio)

noon-12:30pm   Lunch break
12:30-1:30pm   Implementation of Interventions and Continuous Improvement

Developing and Using Implementation Evidence in Research and Development Efforts

Bill Penuel, University of Colorado at Boulder (bio)

The Top Ten Considerations for Evaluating Implementation Measurement and Analysis Strategies

Jeanne Century, CEMSE and University of Chicago (bio)

1:30-2:30pm   Discussion Groups (Led by Schneider, Hedges, Century, & Penuel)
2:30-3:15pm   Report Back and Q&A

Barbara Schneider, ARC and Michigan State University

3:15-3:30pm   Adjourn