Using Evaluation in Implementation Efforts to “Fall Forward” Instead of Standing Still


The Clearinghouse website, www.militaryfamilies.psu.edu, is our primary access point. It provides detailed information on more than 1,300 programs (the largest repository of its kind) for individuals and families facing a broad range of issues and challenges, including parenting programs, programs to improve couple relationships, interventions to decrease domestic and family violence, psychological treatments, and suicide prevention. These 1,300 programs were rigorously vetted by our research and evaluation scientists for the amount and quality of evidence of their effectiveness and placed on our Continuum of Evidence. Professionals working with service members and their families can search for programs on the Continuum and select the ones with the strongest evidence of effectiveness. Professionals can then use a summary of information about relevant programs to make a reasoned decision about which to implement. Why is the Continuum so important? The Clearinghouse provides a large and diverse audience with science-based information on programs that have proven effective in addressing challenges faced by military personnel and their families.

The Clearinghouse focus on evaluation is grounded in a phrase my mother said to me while I was growing up: "It is better to fall forward than stand still." Evaluating where we are going is essential to ensuring that we progress effectively. In other words, it is better to start collecting and examining data about what a program is doing than to collect no data at all while waiting for a better way to evaluate the program. "Falling forward" means moving in the right direction in assessing whether a program is effective. So while not every program can or should be evidence-based*, every program should at least be informed by evaluation. For example, the rationale for an opioid prescription take-back program is obvious: it gets opioid doses out of homes and off the streets. The program still needs to be evaluated, however, to determine whether the drop-off points are in the right places and whether everybody knows they are available.

* Evidence-based: describes a program or service whose effectiveness has been tested through rigorous evaluation and that is grounded in experience, other research studies, theory, and logic models

Evaluation Frameworks to Support Implementation Practice: The Applied Developmental Science Perspective and the Dynamic Sustainability Framework

All of the work of the Clearinghouse is grounded in two main frameworks: the Applied Developmental Science Perspective (ADS; Fisher et al., 1993, p. 4) and the Dynamic Sustainability Framework (DSF; Chambers et al., 2013). ADS is:

  1. The programmatic synthesis of research and applications to describe, explain, intervene, and provide preventive and enhancing uses of knowledge about human development.

  2. Applied because it has implications for what individuals, practitioners, and policy makers do.

  3. Developmental because it focuses on systematic and successful changes within human systems that occur across the life span.

  4. Science because it is grounded in a range of research methods designed to collect reliable and objective information that can be used to test the validity of theory and application.

The Dynamic Sustainability Framework lays out a systematic road map for how the adaptation of interventions may occur over time through sound implementation. It also conveys the role of continuous monitoring in supporting the integration and sustainability of interventions as they are adapted to the ever-changing context in which they are delivered, including changes in the delivery setting, the target population, the evidence base, the political context, and other key variables that are known to shift over time (Chambers et al., 2013).

This article was featured in our monthly Implementation in Action bulletin.
