Effective Program Evaluation


Educators increasingly recognize the importance of making decisions based on reliable, accurate data. This short guide provides a blueprint for evaluating academic programs, practices, or strategies within a simple, effective framework. It includes a step-by-step walkthrough of the program evaluation cycle and an appendix that explains vital concepts and vocabulary in accessible language.

List price: $23.99

Your Price: $19.19

You Save: 20%

 

8 Slices


1. Defining Program Evaluation


As legislation and departments of education place increasing emphasis on performance and accountability, school leaders are more frequently called on to make decisions about the effectiveness of school programs, practices, and strategies. How good those decisions turn out to be is largely a function of the quality of the information on which they are based. Program evaluation provides school leaders with the information they need to make good decisions about programs, practices, and strategies in use or being considered for use at a school or district. It can answer questions such as:

As in these examples, the specific questions addressed by the evaluation process will vary, but the process used to find the answers will be similar. The goal of program evaluation is to use data to guide decisions about how well school programs are working and to do so in a way that is both time- and cost-effective.

Program evaluation is an essential tool in many professions, including medicine, science, government, engineering, and business. Learning how to conduct an evaluation enables leaders “to create the best possible programs, to learn from mistakes, to make modifications as needed, to monitor progress toward program goals, and to judge the success of the program in achieving its short-term, intermediate, and long-term outcomes” (Centers for Disease Control and Prevention, 2005, p. 5). Evaluation can be used to assess key aspects of a program’s progress and implementation, and it can provide vital data to inform next steps. It can:

 

2. Creating and Completing a Program Evaluation Blueprint


A tool that will be helpful in planning for and carrying out your program evaluation is the program evaluation blueprint. The blueprint identifies the:

In addition to being helpful in planning the evaluation, the blueprint is also useful in communicating about the evaluation. Questions about the evaluation can often be answered by simply providing a completed copy of the blueprint.

The blueprint will also help those responsible for implementing the evaluation resist scope creep—the expansion of a project’s breadth or depth beyond its original objectives. At some point in the evaluation, an evaluation team may be tempted to “improve” on the existing plan by adding new questions, grade levels, or data collections. Sticking to the plan outlined in the blueprint will help prevent the plan from becoming too large or too complex. Figure 2.1 (page 8) shows the template.

As we move through a description of each step of the evaluation in this chapter, we will illustrate how that step would appear in an example blueprint.
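For teams that prefer to keep the blueprint in an electronic, reusable form, the sketch below models one blueprint row as a simple record. The field names are illustrative assumptions rather than the actual template shown in figure 2.1; adjust them to match the headings your team uses.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class BlueprintRow:
    """One row of a program evaluation blueprint (field names are assumptions)."""
    evaluation_question: str       # what the team wants to know
    data_sources: List[str]        # where the evidence will come from
    collection_methods: List[str]  # surveys, interviews, observations, and so on
    timeline: str                  # when the data will be collected
    person_responsible: str        # who owns this part of the plan

# Hypothetical row for a K-3 math program review
row = BlueprintRow(
    evaluation_question="Does the K-3 math program prepare students for grade 4?",
    data_sources=["grade 4 readiness assessments", "teacher interviews"],
    collection_methods=["assessment records review", "structured interviews"],
    timeline="September through November",
    person_responsible="evaluation team lead",
)
print(row.evaluation_question)
```

Keeping the rows fixed once the plan is approved is one simple guard against the scope creep described above.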

 

3. Gathering Data


Once the planning is complete and the blueprint drafted, the data collection phase of the program evaluation can begin. While not every program evaluation will make use of surveys, interviews, focus groups, or observational data, many will. This chapter will focus on how to create and use these specific data collection tools, but you may also consider many other forms of data, such as curriculum data, student work, and assessment data.

Start by looking at your blueprint to see what information will be collected and what tools will be used to collect it. It also helps to think ahead to the data analysis phase (discussed in chapter 4) since that might provide some clues that help refine the data collection tasks.

To avoid making decisions based on limited data or a single data point, collect sufficient data from multiple sources to answer each question. For example:

Evidence is deemed sufficient for decision making only after it has been collected from at least two different sources and is not contradicted by any other data.
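As a rough illustration of this sufficiency rule, the sketch below records, for each evaluation question, which sources support a finding and which contradict it, then applies the two-source test. The data layout and the question wording are hypothetical examples, not a required format.

```python
# Hypothetical evidence log for two evaluation questions
evidence = {
    "Are students meeting grade-level math standards?": {
        "supporting_sources": ["state assessment", "district benchmark"],
        "contradicting_sources": [],
    },
    "Is the program being implemented as designed?": {
        "supporting_sources": ["classroom observations"],
        "contradicting_sources": ["teacher survey"],
    },
}

def is_sufficient(record):
    """Sufficient: at least two supporting sources and no contradicting data."""
    return len(record["supporting_sources"]) >= 2 and not record["contradicting_sources"]

for question, record in evidence.items():
    status = "sufficient for decision making" if is_sufficient(record) else "needs more or reconciled data"
    print(f"{question} -> {status}")
```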

 

4. Analyzing the Data


Once the evidence has been collected, the most exciting time of a program evaluation has arrived: it is now possible to begin drawing conclusions from the evidence. The evaluation team will now take stock of and organize the collected data, look for trends, ensure the data are corroborated, and begin to think about how to display the data so that answers to the evaluation questions emerge.

The team begins the data analysis by examining the data collected to ensure that all the evidence pertinent to the evaluation questions has been gathered. Here the evaluation team answers questions such as:

If changes were made to the evidence collection methodology, or if one or more data collection methods were dropped altogether, the team must consider the effect that the loss of data will have on its ability to answer the evaluation questions. If the team discovers that data crucial to understanding the whole picture have not been captured, it may be necessary to plan to collect that information.
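One concrete way to run that completeness check is to compare the data collections planned in the blueprint with those actually completed. The collection names below are hypothetical; the point is simply to surface gaps before the analysis continues.

```python
# Hypothetical sets of planned and completed data collections
planned = {"student survey", "teacher interviews", "classroom observations", "benchmark scores"}
completed = {"student survey", "classroom observations", "benchmark scores"}

missing = planned - completed
if missing:
    # The team must weigh whether the remaining evidence can still answer the
    # evaluation questions, or whether additional collection should be scheduled.
    print("Missing data collections:", ", ".join(sorted(missing)))
else:
    print("All planned data collections are complete.")
```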

 

5. Using the Results of the Program Evaluation


Once the evaluation team has completed the analysis of the data, the review moves into the final stages: reporting, recommending, and decision making. In the decision-making stage, the results of the data analysis are used to determine the performance level of the program, practice, or strategy and what, if any, actions should be taken to address performance deficits.

Consider a program review whose purpose is to evaluate the effectiveness of an existing math program. The data analysis might show that while not all students are meeting grade-level standards, there has been a steep and steady increase over time in math scores both across grade levels and by all subgroups. On the other hand, the data analysis might show that there is less improvement than anticipated, that not all students are benefiting equally from the program, or that the school may be doing relatively well in math compared to the rest of the state but not as well as other similar schools.
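To make the idea of a steep and steady increase across grade levels and subgroups concrete, the brief sketch below checks whether mean scores rise every year for each subgroup. The scores and subgroup labels are invented for illustration; a comparison against similar schools or the state would follow the same pattern with different reference data.

```python
# Hypothetical mean math scale scores by subgroup over three years
scores = {
    "All students":       {2021: 512, 2022: 527, 2023: 541},
    "Students with IEPs": {2021: 478, 2022: 490, 2023: 503},
    "English learners":   {2021: 485, 2022: 499, 2023: 512},
}

for subgroup, by_year in scores.items():
    years = sorted(by_year)
    # "Steady increase" here means every year's mean exceeds the previous year's
    steady = all(by_year[a] < by_year[b] for a, b in zip(years, years[1:]))
    change = by_year[years[-1]] - by_year[years[0]]
    print(f"{subgroup}: {'steadily rising' if steady else 'mixed trend'} ({change:+d} points since {years[0]})")
```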

The evaluation team will need to determine whether the current outcomes or results of the program, practice, or strategy are satisfactory or not, and what actions the team will recommend as a result. The results and recommendations can be classified into four groups according to whether a program, practice, or strategy has been shown to be:

 

6. Communicating Results


Once the collected information has been analyzed, and recommendations and decisions have been made, it is time to communicate that information to stakeholders. The findings should be communicated in such a way that multiple audiences can easily understand them. This aids in building support should any programmatic changes be called for.

Every program evaluation should include a brief written report summarizing the reason for the evaluation, the research questions, the data collection and analysis, the evaluation team’s recommendations, and decisions resulting from the evaluation. Consider the different audiences to whom the report will be disseminated (such as teachers, parents, and community members), and strive for an understandable tone that is not filled with jargon or acronyms so that the report conveys a clear message to all groups. The report should be succinct and present the information objectively. Figure 6.1 provides an outline of an evaluation team report.

Figure 6.1: Sample program evaluation report outline.

 

7. Looking Back and Planning Forward


While a program evaluation has a discrete beginning and end, it is only one component in an ongoing, data-driven school improvement cycle. Often the team process used in the evaluation grows to become a routine way of making research-based decisions. In this chapter, we look back on the program evaluation process and discuss strategic planning processes that should be embedded in the fabric of a high-performing school.

A well-done program evaluation will generate considerable information about the program and, perhaps more broadly, about the school. Most likely you will find many other uses for the information gathered. Consider the example in which a school’s K-3 math curriculum was reviewed to determine the degree to which it readied students for grade 4. The information derived may provide important insights into the efficacy of the textbook or other instructional materials in use, the need for tighter instructional alignment between and within grade levels, the effectiveness of existing intervention and enrichment programs and materials, or the need for professional development.

 

Appendix: The Vocabulary of Program Evaluation


The Vocabulary of Program Evaluation

Program evaluation, like every field, has its own vocabulary. This vocabulary ensures that everyone working on the evaluation shares a common understanding when discussing program evaluation and its concepts and procedures. Keeping the language of program evaluation in mind as you plan for and conduct the evaluation will help keep you on track toward producing a quality product. Some of the terms may be familiar to you, while others may be new. Use this handout as a reference throughout the program evaluation process.

An ability to calculate averages is important for working with and understanding the significance of sets of data, such as that illustrated in figure A.1.

Figure A.1: Calculating the mean of a sample data set.

The mean is what most people think of when they say average. To calculate the mean, add all of the numbers in a set, and then divide by the number of values in the set. In figure A.1, there are eleven numbers in the set.
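Because the data set in figure A.1 is not reproduced here, the short example below uses eleven made-up scores to show the calculation.

```python
# Eleven hypothetical scores standing in for the data set in figure A.1
scores = [72, 85, 91, 68, 77, 88, 95, 73, 82, 79, 90]

# Mean: add all the values, then divide by how many values are in the set
mean = sum(scores) / len(scores)
print(f"Mean of {len(scores)} scores: {mean:.1f}")  # 900 / 11 = 81.8
```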

 

Details

E-book formats:

ePub (DRM): Encrypted: true; SKU: 2370003851359; ISBN: 9781935542926; File size: 0 Bytes; Printing: Disabled; Copying: Disabled; Read aloud: No

ePub: Encrypted: No; Printing: Allowed; Copying: Allowed; Read aloud: Allowed; SKU, ISBN, and file size: in metadata