Program Evaluations
Jennifer Glenski
The Next Step Public Charter School
Contents

Introduction
Program Evaluation Guidelines
Make Goals Explicit
Evaluation Design
Evaluation Focus
Develop an Evaluation Timeline
Data Collection
Types of Data
Making a Data Collection Plan
Data Analysis
Data Analysis Best Practices
Evaluation Conclusions
Develop an Action Plan
References
Introduction

This document provides an introduction to designing, conducting, and analyzing program evaluations. It is intended for teachers and staff at schools looking to evaluate school programs using data. While some practical suggestions are offered in this document, it is important to keep in mind that program evaluations should be tailored to each school and occasion. Schools can draw on program evaluations used elsewhere, but should adjust them to their particular school environment.
Program Evaluation Guidelines

Rockwood School District (RSD) describes some guidelines to keep in mind when conducting a program evaluation:

- Care should be taken to ensure that the evaluation goals are addressed, and that sufficient, accurate, and valid information is provided to aid sound decision making.
- The evaluation should be fair and balanced, and whenever possible, multiple data sources and data types should be obtained to corroborate the findings.
- Steps should also be taken to avoid bias, and in cases where bias or other problems are unavoidable, those weaknesses should be clearly identified in the evaluation so that decision makers can draw valid conclusions.
- Only consider using those informational resources that are defensible.
- The report should be clearly written so that essential information is easily understood. (2010, p. 2)
Make Goals Explicit

The first step in designing a program evaluation is putting together a small team of people to organize, conduct, and analyze the evaluation. Once a team is established, Ball (2011, p. 6) emphasizes the importance of making the goals of the program evaluation explicit. To assist in setting clear goals, RSD offers three key questions to consider:

1. What is the purpose of this evaluation?
2. What do we desire to know about the program to be evaluated?
3. How has staff development impacted student achievement? (2010, p. 3)
Evaluation Design

After defining clear goals for the evaluation, the evaluation team may find the following list of questions from RSD helpful in designing the program evaluation:

- What are the characteristics and distinctive features of the program to be evaluated?
- How will the results of the evaluation be used?
- What data and information are currently available? What critical new information is needed?
- What resources will be needed to conduct the evaluation?
- What are the time limits? When are the results needed?
- What decisions might be based on the evaluation? Who should be involved in making the decisions?
- Will the data collected adequately answer the questions asked about the program?
- What forms of staff development (administrative, professional, support staff, etc.) are necessary to support effective implementation of the program? (2010, p. 3)
Evaluation Focus

Whether or not a program has met its defined goals is not the only area on which a program can be evaluated. The following are additional possible areas of focus, provided by RSD, that may be considered valuable in evaluating the program:

- Views of the staff involved with the program.
- Comparison of actual program results with expected results.
- Reviews of currently available student achievement data or other data.
- Evaluation of the current instructional materials or proposed new materials for the program.
- Views of those affected by the program.
- Evaluation of student effort toward learning.
- Comparison of the current program with its actual program design or original goals.
- Professional evaluation of the program by (external, non-district) colleagues.
- Comparison of the program with similar programs in other schools or districts.
- Comparison of program participant performance to nonparticipant performance at similar schools within the district.
- Where applicable, evaluation of the effectiveness of staff development on student achievement. (2010, p. 4)
Develop an Evaluation Timeline

Once an evaluation plan is created, it is necessary to develop a timeline and project management plan for the program evaluation. As RSD cautions, “Care should be taken to outline key procedures within several of the steps [of the program evaluation plan] to ensure that they are implemented in a logical and efficient manner” (2010, p. 5). A tip from Ball is to “make the program staff active participants in the evaluation process,” to encourage staff to be open to and trusting of the evaluation process (2011, p. 8). When developing a timeline, be sure to communicate with the staff members who will be responsible for conducting the various steps in the evaluation plan, in order to set realistic expectations of what is required and practical deadlines.
Data Collection

Provini (2011) offers some guidelines on collecting data for the program evaluation:

- Use multiple formal and informal data collection tools (e.g., observation, record review, survey, interview).
- Use multiple informants (students, parents, and teachers).
- Measure at multiple levels (individual, small-group, class, grade, and population/school levels). Measure at the group level to evaluate efforts that reach smaller groups; measure at the population level to evaluate efforts that reach the entire school.
- Track process indicators (reflect upon how implementation is going) so that corrections can be made early in the process.
- Track both short-term and long-term outcome indicators (assess what immediate and longer-term effects efforts are having on students).

In addition, RSD recommends, “if and when possible, triangulate your information, i.e., collect information/data on the same questions or issues from different sources to corroborate the evidence gathered” (2010, p. 4). A small sketch of what triangulation can look like in practice follows.
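Neither RSD nor Provini prescribes a format for triangulation; the following is a minimal sketch, assuming the team keeps a simple evidence log. Every question, source, and finding in it is invented for illustration.

```python
from collections import defaultdict

# Hypothetical evidence log: each finding is tagged with the evaluation
# question it addresses and the source it came from.
evidence = [
    {"question": "Does tutoring improve homework completion?",
     "source": "attendance records",
     "finding": "Tutored students attended 9 of 10 sessions."},
    {"question": "Does tutoring improve homework completion?",
     "source": "teacher survey",
     "finding": "8 of 11 teachers report more homework turned in."},
    {"question": "Does tutoring improve homework completion?",
     "source": "student interviews",
     "finding": "Students say tutoring helps them finish assignments."},
    {"question": "Is the program reaching the intended students?",
     "source": "enrollment records",
     "finding": "42 of 60 invited students enrolled."},
]

# Group findings by question so each question's evidence can be compared
# across sources -- the core of triangulation.
by_question = defaultdict(list)
for item in evidence:
    by_question[item["question"]].append((item["source"], item["finding"]))

for question, findings in by_question.items():
    sources = {source for source, _ in findings}
    status = "triangulated" if len(sources) >= 2 else "single source; seek corroboration"
    print(f"{question} [{status}]")
    for source, finding in findings:
        print(f"  - {source}: {finding}")
```

Grouping the log by question makes it immediately visible which conclusions rest on multiple sources and which still depend on a single one.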
Types of Data

Select the best sources of quantitative (numerical) and qualitative (descriptive/narrative) data to use in the program evaluation. See the examples below (RSD, 2010, pp. 4-5):

Quantitative
- Attendance and Tardiness
- Awards Received
- Budgetary and Other Financial Data
- College/Vocational Enrollment, Attrition, Completion and Placement
- Extracurricular Participation
- Documentation of program implementation
- Graduate Follow-up Data
- Graduation Rates
- Rates of Completion of Homework
- Staff Development and training workshops and attendance rates
- Structured observation
- Student Discipline Data
- Student Grades (and longitudinal data)
- Survey results - numerical ratings
- Time in the program/on task

Qualitative
- Case study information
- Documents, materials, lesson plans
- Focus group interviews
- Observations
- Parent compliments/complaints
- Structured interviews
- Student portfolios
- Survey results - commentary sections
- Visiting team reports
Making a Data Collection Plan

When designing a plan for collecting data, RSD recommends taking the answers to the following questions into consideration (one way to record the answers is sketched after the list):

- What data collection/scoring/analysis instruments and procedures already exist that we can use as examples or tap into?
- What data have already been collected that we can use?
- How will new information be collected? What scoring strategies should be utilized?
- What data will be needed to answer the evaluation questions?
- Who needs to provide information?
- How long will it take? (2010, p. 6)
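As one hypothetical way to record those answers, the sketch below keeps one row per evaluation question and writes the plan to a CSV file that can be shared with the staff responsible for each step. All field names, values, and the file name are assumptions, not part of RSD's guidance.

```python
import csv

# One row per evaluation question: where the answer will come from,
# the instrument used, who collects it, and the deadline.
plan = [
    {"evaluation_question": "Did reading scores improve?",
     "data_source": "existing benchmark assessments",
     "instrument": "district reading benchmark",
     "collector": "data coordinator",
     "deadline": "2024-03-15"},
    {"evaluation_question": "How do teachers view the program?",
     "data_source": "new data to be collected",
     "instrument": "staff survey and focus group",
     "collector": "evaluation team lead",
     "deadline": "2024-04-01"},
]

# Writing the plan to a shared CSV file gives every responsible staff
# member the same view of sources, instruments, and deadlines.
with open("data_collection_plan.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=plan[0].keys())
    writer.writeheader()
    writer.writerows(plan)
```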
Data Analysis

“The program evaluator [should think] through the question of data analysis before entering the data collection phase. Plans for analysis help determine what measures to develop, what data to collect, and even, to some extent, how the field operation is to be conducted” (Ball, 2011, p. 11). After data collection, RSD notes, the next steps are to:

- Gather the data collected and determine that it is valid, accurate, and reliable.
- Summarize the data.
- Develop a list of recommendations based upon the results. Take care to make valid and justifiable recommendations for program change or continuation. When writing recommendations, make sure they are practical, efficient, and capable of being implemented. (2010, p. 7)

A minimal sketch of the first two steps follows.
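The code below is a minimal sketch of checking validity and then summarizing, using invented 1-to-5 survey ratings screened for missing or out-of-range values. Real validity checks depend on the instrument used.

```python
# Hypothetical 1-to-5 survey ratings; None marks a skipped question.
raw_ratings = [4, 5, None, 3, 9, 4, 2, 5, 4, None, 3]

# Step 1: keep only valid, in-range responses and note how many were dropped.
valid = [r for r in raw_ratings if r is not None and 1 <= r <= 5]
dropped = len(raw_ratings) - len(valid)

# Step 2: summarize the cleaned data.
summary = {
    "responses_kept": len(valid),
    "responses_dropped": dropped,
    "mean_rating": round(sum(valid) / len(valid), 2),
}
print(summary)
```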
Data Analysis Best Practices

RSD lists some best practices for analyzing data (the sketch after this list works through several of them):

- Report verbatim responses for small surveys.
- Report results by topic or by question.
- Use measures of central tendency (mean, median, or mode) for reporting quantitative data.
- Report the total number of respondents, and when they are about 20 or more, report results to survey questions using percentages.
- When possible, to show that you do not have an unduly biased sample, report the total population sampled from, i.e., there are approximately 1,000 students in the program, 500 surveys were distributed to their parents, and 300, or 60%, responded.
- Use the standard deviation as a measure of variability to indicate how narrowly or broadly the results are spread out in relation to the mean.
- To report longitudinal growth, or year-to-year score gains, use Normal Curve Equivalent Scores. (2010, p. 7)
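The sketch below computes several of these statistics with Python's standard statistics module. The ratings are invented, the response-rate figures mirror RSD's own example, and the Normal Curve Equivalent conversion used is the standard one (a percentile rank mapped to a normal scale with mean 50 and standard deviation 21.06).

```python
from statistics import mean, median, mode, stdev, NormalDist

# Invented ratings from a parent survey (1 = poor, 5 = excellent).
ratings = [5, 4, 4, 3, 5, 4, 2, 4, 5, 3, 4, 4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

# Central tendency and spread, as RSD recommends.
print(f"n = {len(ratings)}")
print(f"mean = {mean(ratings):.2f}, median = {median(ratings)}, mode = {mode(ratings)}")
print(f"standard deviation = {stdev(ratings):.2f}")

# Response rate, mirroring RSD's example: 500 surveys sent, 300 returned.
sent, returned = 500, 300
print(f"response rate = {returned / sent:.0%}")

# With roughly 20 or more respondents, report answers as percentages.
favorable = sum(1 for r in ratings if r >= 4)
print(f"favorable (4 or 5): {favorable / len(ratings):.0%}")

# Normal Curve Equivalent (NCE): a percentile rank mapped onto a normal
# scale with mean 50 and standard deviation 21.06, giving an
# equal-interval scale for comparing year-to-year gains.
def nce(percentile: float) -> float:
    return 50 + 21.06 * NormalDist().inv_cdf(percentile / 100)

# Hypothetical student: 45th percentile last year, 55th this year.
print(f"NCE gain = {nce(55) - nce(45):.1f}")
```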
Evaluation Conclusions

At the conclusion of the program evaluation, prepare a report and share it with the stakeholders of the program evaluation (i.e., those most likely to use the results). Ball points out that “Possibly the most important principle in program evaluation is that interpretations of the evaluation’s meaning—the conclusions to be drawn—are often open to various nuances” (2011, p. 13). In order to be useful, program evaluations should be viewed as part of an ongoing cycle, not a one-time event.
Develop an Action Plan

After reviewing the results of the program evaluation with stakeholders, it is crucial to develop an action plan designed to address the areas of improvement identified in the program evaluation. If no further steps are taken, the program evaluation is essentially of no value. According to RSD, an effective action plan should answer the following questions (one way to capture the answers is sketched after the list):

- What will the plan accomplish (the objectives)?
- What activities are needed to achieve the objectives?
- Who will be responsible for carrying out the activities?
- What is the target completion date for each activity?
- What will serve as evidence of success for each activity?
- What further staff development is needed to support the program? (2010, p. 8)
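One hypothetical way to capture those answers is a small table with one row per activity, as sketched below; every objective, name, and date is invented for illustration.

```python
# One row per activity, answering RSD's action plan questions.
action_plan = [
    {"objective": "Raise homework completion to 85%",
     "activity": "Weekly after-school homework lab",
     "responsible": "Ms. Rivera",
     "target_date": "2024-05-31",
     "evidence_of_success": "Homework completion rates from the gradebook",
     "staff_development": "Training on the gradebook's completion reports"},
]

for row in action_plan:
    for question, answer in row.items():
        print(f"{question}: {answer}")
```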
References

Ball, S. (2011, April). Evaluating Educational Programs.

Education World. (2011). Evaluation Planning Worksheet. Retrieved from Worksheet.pdf

Provini, C. (2011). Jump-Start Your School's Program Evaluation. Retrieved from evaluation-basics.shtml

Rockwood School District. (2010, February 2). Rockwood School District Program Evaluation Plan. Retrieved from Documents/Program%20Evaluation%20Plan.pdf