Course – Lesson 5, Part III: Data and Evaluating Innovations
LESSON 5, PART III – Q&A WITH AUTHORS KATHERINE ORTEGA COURTNEY, PHD, AND DOMINIC CAPPELLO
Why is the evaluation plan important to consider prior to implementation?
Katherine Ortega Courtney, PhD responds:
Prior to implementing your experiment or initiative, you need a solid evaluation plan. An evaluation plan maps out how you will measure the fidelity and effects of your experiment. Most of the time, evaluation plans include some type of data tracking to measure fidelity (discussed in previous lessons) as well as data related to short-term, intermediate, and long-term outcomes. If your experiment is implemented before you have developed a strong evaluation plan, you may find, when it comes time to assess results, that you don't have the data you need for a sound evaluation.
The first step of creating an evaluation plan is to consult your logic model and ensure that each outcome — short-term, intermediate, and long-term — can be measured or assessed in some way.
What type of evaluation plan would be appropriate to measure short-term outcomes?
Dominic Cappello responds:
Within a few months of implementing your experiment, you wouldn't expect to see improvements in your target performance measure quite yet, such as higher high school graduation rates, but you might be able to assess small changes that have occurred as a result of your activities.
Perhaps you have provided supplemental training for staff on more effective crisis management with clients. In this example, the short-term outcome you would be evaluating is that staff demonstrate a better understanding of crisis situations with clients.
Pre- and post-training surveys measuring staff knowledge and skills in this area can help you evaluate the effectiveness of your training activity.
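As a minimal illustration (not part of the authors' method), the sketch below tallies a hypothetical pre/post survey comparison in Python. The scores and the 0-100 scale are assumptions; a real evaluation would use your own survey instrument and data.

```python
# Hypothetical pre- and post-training knowledge scores (0-100 scale) for the
# same five staff members; in practice these would come from your own surveys.
pre_scores = [62, 70, 55, 68, 74]
post_scores = [78, 85, 71, 80, 88]

def mean(scores):
    return sum(scores) / len(scores)

improvement = mean(post_scores) - mean(pre_scores)
print(f"Average pre-training score:  {mean(pre_scores):.1f}")
print(f"Average post-training score: {mean(post_scores):.1f}")
print(f"Average improvement:         {improvement:.1f} points")
```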
What type of evaluation plan would be appropriate to measure intermediate outcomes?
Katherine Ortega Courtney, PhD responds:
Intermediate outcomes should go beyond measurable changes in knowledge and skills and begin to demonstrate changes in behavior. For example, if one is evaluating the impact of an afterschool job training program, one would not only evaluate the skill level of the participants but also measure whether the participants are able to apply for and attain employment.
In addition, you could administer surveys to the staff and participants in the afterschool program to assess which specific actions were most helpful to workshop graduates in attaining employment, or, in certain situations, why some workshops are not successful in achieving their goals.
Perhaps another activity, designed to reduce high school dropout, is offering remedial lessons after school whenever a teacher identifies students struggling to pass certain key subjects. In this example, you might want to track how many of these after-school lessons result in a plan to keep the student from failing a course; how often staff meet with the students to discuss their challenges; and whether the combination of after-school lessons and teacher-student conversations leads to students finding success with challenging subjects.
In addition, you could administer surveys to the after-school teaching staff and the students who participate in the lessons to assess which specific actions were most helpful in achieving success with particular topics, or, in certain situations, why some after-school lessons are not successful in achieving their goals.
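As a minimal illustration of the kind of tracking described above, the Python sketch below tallies a few hypothetical lesson records. The field names and figures are assumptions, not data from any actual school, and real tracking would follow whatever system your school already uses.

```python
# Hypothetical tracking records for after-school remedial lessons; field names
# are illustrative only and would depend on how your school records this work.
lesson_records = [
    {"student": "A", "plan_created": True,  "check_ins": 4, "passed_course": True},
    {"student": "B", "plan_created": True,  "check_ins": 1, "passed_course": False},
    {"student": "C", "plan_created": False, "check_ins": 0, "passed_course": False},
    {"student": "D", "plan_created": True,  "check_ins": 3, "passed_course": True},
]

total = len(lesson_records)
with_plan = sum(r["plan_created"] for r in lesson_records)
passed = sum(r["passed_course"] for r in lesson_records)
avg_check_ins = sum(r["check_ins"] for r in lesson_records) / total

print(f"Lessons resulting in a plan: {with_plan} of {total}")
print(f"Average teacher-student check-ins per student: {avg_check_ins:.1f}")
print(f"Students passing the challenging course: {passed} of {total}")
```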
What type of evaluation plan would be appropriate to measure long-term outcomes?
Dominic Cappello responds:
Long-term outcomes should incorporate measures which are directly related to your experiment area. This could include a comparison of current data with data collected at baseline.
If a high school's chosen experiment area is increasing the graduation rate, it might look at the graduation rate data provided by the district to see whether that measure is trending in the desired direction.
Long-term outcomes should also include the assessment of ongoing, consistent change in the school implementing their experiment: Are the new activities and interventions happening consistently? What unanticipated or unintended consequences have occurred as a result of the new practice? Is the new way of doing things sustainable over time, or are additional resources needed to continue?
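As a minimal illustration, the Python sketch below compares a hypothetical baseline graduation rate with district-reported rates from later years to see whether the measure is trending in the desired direction. All years and figures are assumptions, not actual district data.

```python
# Hypothetical baseline and district-reported graduation rates (percent);
# real figures would come from the district's own data.
baseline_year, baseline_rate = 2019, 78.0
later_rates = {2020: 79.5, 2021: 81.2, 2022: 83.0}

for year in sorted(later_rates):
    change = later_rates[year] - baseline_rate
    direction = "up" if change > 0 else "down" if change < 0 else "flat"
    print(f"{year}: {later_rates[year]:.1f}% ({direction} {abs(change):.1f} "
          f"points from the {baseline_year} baseline)")
```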
What are the challenges of managing by data?
Katherine Ortega Courtney, PhD responds:
The success of the data-driven approach depends on the quality of the data gathered and the effectiveness of its analysis and interpretation. Errors can creep into the data analytics process at any stage, and serious issues can result when they do.
How does data-informed decision-making support CQI?
Dominic Cappello responds:
CQI is fundamentally about assessing and solving problems; it starts with diagnosing a problem. The entire CQI process is, in many ways, a series of decisions, and basing those decisions on data makes the process more effective.
How does one help one’s agency staff or any organization get comfortable using data?
Dominic Cappello responds:
Start with simple exercises and presentations. For example, have staff spend time with data such as the Youth Risk and Resiliency Survey. Pick one piece of data and have staff talk about it. Reserve time at every staff meeting to make sure data is a topic. Some of our communities have started integrating results of the Resilient Community Experience Survey into their City Council meetings. The more you talk about data in ways that demonstrate its relevance, the more comfortable stakeholders will be with it.
How does one track and adjust implementation of innovations?
Katherine Ortega Courtney, PhD responds:
If there is a good plan and structure in place, one can document the data, analysis, implementation, challenges, and results in one place where all involved can go to see the information. As the experiment runs its course, checking the progress at agreed-upon intervals will allow participants to discuss possible adjustments to practice.
Where does one find support and technical assistance to help strengthen the data-informed process?
Dominic Cappello responds:
Ideally, technical assistance should be available from the initiative's sponsoring organization or a local institution of higher learning.