Program Evaluations
Jennifer Glenski, The Next Step Public Charter School

Contents
• Introduction
• Program Evaluation Guidelines
• Make Goals Explicit
• Evaluation Design
• Evaluation Focus
• Develop an Evaluation Timeline
• Data Collection
• Types of Data
• Making a Data Collection Plan
• Data Analysis
• Data Analysis Best Practices
• Evaluation Conclusions
• Develop an Action Plan
• References

Introduction
• This document provides an introduction to designing, conducting, and analyzing program evaluations. It is intended for teachers and staff who want to use data to evaluate school programs.
• While this document offers some practical suggestions, keep in mind that a program evaluation should be tailored to its school and occasion. Schools can draw on program evaluations used elsewhere, but should adjust them to their particular school environment.

Program Evaluation Guidelines
Rockwood School District (RSD) describes several guidelines to keep in mind when conducting a program evaluation:
• Care should be taken to ensure that the evaluation goals are addressed, and that sufficient, accurate, and valid information is provided to aid sound decision making.
• The evaluation should be fair and balanced; whenever possible, multiple data sources and data types should be obtained to corroborate the findings.
• Steps should be taken to avoid bias, and in cases where bias or other problems are unavoidable, those weaknesses should be clearly identified in the evaluation so that decision makers can draw valid conclusions.
• Only consider using informational resources that are defensible.
• The report should be clearly written so that essential information is easily understood. (2010, p. 2)

Make Goals Explicit
The first step in designing a program evaluation is putting together a small team to organize, conduct, and analyze the evaluation. Once a team is established, Ball (2011, p. 6) emphasizes the importance of making the goals of the program evaluation explicit. To assist in setting clear goals, RSD offers three key questions to consider:
1. What is the purpose of this evaluation?
2. What do we desire to know about the program to be evaluated?
3. How has staff development impacted student achievement? (2010, p. 3)

Evaluation Design
After defining clear goals for the evaluation, the evaluation team may find the following questions from RSD helpful in designing the program evaluation:
• What are the characteristics and distinctive features of the program to be evaluated?
• How will the results of the evaluation be used?
• What data and information are currently available?
• What critical new information is needed?
• What resources will be needed to conduct the evaluation?
• What are the time limits? When are the results needed?
• What decisions might be based on the evaluation? Who should be involved in making the decisions?
• Will the data collected adequately answer the questions asked about the program?
• What forms of staff development (administrative, professional, support staff, etc.) are necessary to support effective implementation of the program? (2010, p. 3)

Evaluation Focus
Whether a program has met its defined goals is not the only area on which a program can be evaluated. The following additional areas of focus, provided by RSD, may also be valuable:
• Views of the staff involved with the program.
• Comparison of actual program results with expected results.
• Reviews of currently available student achievement data or other data.
• Evaluation of the current instructional materials or proposed new materials for the program.
• Views of those affected by the program.
• Evaluation of student effort toward learning.
• Comparison of the current program with its actual program design or original goals.
• Professional evaluation of the program by external, non-district colleagues.
• Comparison of the program with similar programs in other schools or districts.
• Comparison of program participant performance to nonparticipant performance at similar schools within the district.
• Where applicable, evaluation of the effectiveness of staff development on student achievement. (2010, p. 4)
The participant/nonparticipant comparison is sketched below.
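As a minimal illustration of the participant/nonparticipant comparison in the list above, the following Python sketch computes the difference in mean scores on a common assessment and a rough standardized effect size (Cohen's d). The score lists are hypothetical placeholders, not data from the source.

```python
# Rough participant vs. nonparticipant comparison on a common assessment.
# Scores are hypothetical placeholders; substitute real data.
from statistics import mean, stdev

participants = [78, 85, 92, 70, 88, 81, 76, 90]
nonparticipants = [72, 80, 75, 68, 83, 77, 71, 79]

diff = mean(participants) - mean(nonparticipants)

# Pooled standard deviation, then Cohen's d as a standardized effect size.
n1, n2 = len(participants), len(nonparticipants)
pooled_sd = (((n1 - 1) * stdev(participants) ** 2
              + (n2 - 1) * stdev(nonparticipants) ** 2) / (n1 + n2 - 2)) ** 0.5

print(f"Mean difference: {diff:.1f} points")
print(f"Cohen's d: {diff / pooled_sd:.2f}")
```

Because schools rarely assign program participation at random, any such comparison should be reported with the bias caveats from the guidelines above.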

Develop an Evaluation Timeline
• Once an evaluation plan is created, develop a timeline and project management plan for the program evaluation. As RSD cautions, "Care should be taken to outline key procedures within several of the steps [of the program evaluation plan] to ensure that they are implemented in a logical and efficient manner" (2010, p. 5).
• A tip from Ball is to "make the program staff active participants in the evaluation process," which encourages staff to be open to and trusting of the evaluation (2011, p. 8).
• When developing the timeline, communicate with the staff members who will be responsible for conducting the various steps in the evaluation plan, in order to set realistic expectations of what is required and practical deadlines.

Data Collection
Provini (2011) offers some guidelines on collecting data for the program evaluation:
• Use multiple formal and informal data collection tools (e.g., observation, record review, survey, interview).
• Use multiple informants (students, parents, and teachers).
• Measure at multiple levels (individual, small-group, class, grade, and population/school levels). Measure at the group level to evaluate efforts that reach smaller groups; measure at the population level to evaluate efforts that reach the entire school.
• Track process indicators (reflect on how implementation is going) so that corrections can be made early in the process.
• Track both short-term and long-term outcome indicators (assess what immediate and longer-term effects efforts are having on students).
In addition, RSD recommends, "if and when possible, triangulate your information, i.e., collect information/data on the same questions or issues from different sources to corroborate the evidence gathered" (2010, p. 4).
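For illustration, here is a minimal Python sketch of triangulation: the same survey question is put to three informant groups, and the group means are compared to see whether the sources corroborate one another. The ratings are hypothetical examples, not data from the source.

```python
# Triangulation in miniature: compare the same 1-5 survey item across
# three informant groups. All ratings are hypothetical.
from statistics import mean

responses = {  # "The program improved my/my child's reading."
    "students": [4, 5, 3, 4, 4, 5],
    "parents":  [4, 4, 5, 3, 4],
    "teachers": [3, 4, 4, 4],
}

means = {group: mean(scores) for group, scores in responses.items()}
for group, m in means.items():
    print(f"{group:>8}: mean rating {m:.2f} (n={len(responses[group])})")

# If all group means point the same way, the finding is corroborated;
# a large gap between sources is itself worth investigating.
print(f"Spread across sources: {max(means.values()) - min(means.values()):.2f}")
```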

Types of Data
Select the best sources of quantitative (numerical) and qualitative (descriptive/narrative) data to use in the program evaluation. Examples:
Quantitative:
• Attendance and tardiness
• Awards received
• Budgetary and other financial data
• College/vocational enrollment, attrition, completion, and placement
• Extracurricular participation
• Documentation of program implementation
• Graduate follow-up data
• Graduation rates
• Rates of completion of homework
• Staff development and training workshops and attendance rates
• Structured observation
• Student discipline data
• Student grades (and longitudinal data)
• Survey results (numerical ratings)
• Time in the program/on task
Qualitative:
• Case study information
• Documents, materials, lesson plans
• Focus group interviews
• Observations
• Parent compliments/complaints
• Structured interviews
• Student portfolios
• Survey results (commentary sections)
• Visiting team reports
(RSD, 2010, pp. 4-5)

Making a Data Collection Plan
When designing a plan for collecting data, RSD recommends taking the answers to the following questions into consideration:
• What data collection/scoring/analysis instruments and procedures already exist that we can use as examples or tap into?
• What data have already been collected that we can use?
• How will new information be collected?
• What scoring strategies should be utilized?
• What data will be needed to answer the evaluation questions?
• Who needs to provide information? How long will it take? (2010, p. 6)
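One lightweight way to keep those answers visible is to record each evaluation question alongside its data source, instrument, owner, and deadline. The Python sketch below shows one possible structure; the field names and entries are hypothetical examples, not a format prescribed by the source.

```python
# A data collection plan as structured rows: each evaluation question is
# paired with its source, instrument, owner, and due date. All entries
# are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class PlanItem:
    evaluation_question: str
    data_source: str         # who or what provides the information
    instrument: str          # survey, interview, record review, ...
    already_collected: bool  # can existing data be reused?
    owner: str               # staff member responsible
    due: str

plan = [
    PlanItem("Did attendance improve?", "attendance records",
             "record review", True, "registrar", "2024-03-01"),
    PlanItem("Do parents see a benefit?", "parents",
             "survey (ratings + comments)", False, "counselor", "2024-04-15"),
]

for item in plan:
    print(f"{item.evaluation_question} -> {item.instrument} "
          f"({item.data_source}; owner: {item.owner}; due {item.due})")
```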

Data Analysis
• "The program evaluator [should think] through the question of data analysis before entering the data collection phase. Plans for analysis help determine what measures to develop, what data to collect, and even, to some extent, how the field operation is to be conducted" (Ball, 2011, p. 11).
• Once the data are collected, RSD notes, the next steps are to:
  - Gather the data collected and determine that it is valid, accurate, and reliable.
  - Summarize the data.
  - Develop a list of recommendations based upon the results, taking care to make valid and justifiable recommendations for program change or continuation.
  - When writing recommendations, make sure they are practical, efficient, and capable of being implemented. (2010, p. 7)
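As a small example of the "valid, accurate, and reliable" check before summarizing, this sketch screens hypothetical 1-5 survey ratings for missing or out-of-range values so that only usable responses feed the summary.

```python
# Screen raw survey responses before summarizing. The raw list mixes
# hypothetical valid ratings with missing and out-of-range entries.
raw = [4, 5, None, 3, 9, 4, 2, 5, "", 4]

valid = [r for r in raw if isinstance(r, int) and 1 <= r <= 5]
flagged = len(raw) - len(valid)

print(f"{len(valid)} usable responses; {flagged} flagged for follow-up")
```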

Data Analysis Best Practices
RSD lists some best practices for analyzing data:
• Report verbatim responses for small surveys.
• Report results by topic or by question.
• Use measures of central tendency (mean, median, or mode) for reporting quantitative data.
• Report the total number of respondents, and when there are about 20 or more, report results to survey questions using percentages.
• When possible, to show that you do not have an unduly biased sample, report the total population sampled from, e.g., there are approximately 1,000 students in the program, 500 surveys were distributed to their parents, and 300 (60%) responded.
• Use the standard deviation as a measure of variability to indicate how narrowly or broadly the results are spread out in relation to the mean.
• To report longitudinal growth, or year-to-year score gains, use Normal Curve Equivalent (NCE) scores. (2010, p. 7)
A minimal sketch of several of these calculations appears below.
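To make several of these practices concrete, here is a minimal Python sketch, using only the standard library, that reports central tendency, standard deviation, a response-rate percentage, and a percentile-to-NCE conversion. The ratings, counts, and percentile are hypothetical examples, not data from the source.

```python
# Summary statistics for hypothetical 1-5 survey ratings, plus a
# percentile-rank-to-NCE conversion for reporting year-to-year gains.
from statistics import mean, median, mode, stdev, NormalDist

ratings = [4, 5, 3, 4, 2, 5, 4, 4, 3, 5]
print(f"mean={mean(ratings):.2f} median={median(ratings)} mode={mode(ratings)}")
print(f"standard deviation={stdev(ratings):.2f}")

distributed, returned = 500, 300
print(f"response rate: {returned}/{distributed} = {returned / distributed:.0%}")

# NCE scores rescale percentile ranks onto an equal-interval scale with
# mean 50 and standard deviation ~21.06.
def percentile_to_nce(percentile: float) -> float:
    z = NormalDist().inv_cdf(percentile / 100)
    return 50 + 21.06 * z

print(f"75th percentile -> NCE {percentile_to_nce(75):.1f}")
```

Because NCEs form an equal-interval scale, gains can be averaged across students from year to year, which raw percentile ranks do not support.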

Evaluation Conclusions
• At the conclusion of the program evaluation, prepare a report and share it with the stakeholders of the program evaluation (i.e., those most likely to use the results).
• Ball points out that "Possibly the most important principle in program evaluation is that interpretations of the evaluation’s meaning—the conclusions to be drawn—are often open to various nuances" (2011, p. 13).
• To be useful, program evaluations should be viewed as part of an ongoing cycle, not a one-time event.

Develop an Action Plan
After reviewing the results of the program evaluation with stakeholders, it is crucial to develop an action plan that addresses the areas for improvement identified in the evaluation; if no further steps are taken, the evaluation is essentially of no value. According to RSD, an effective action plan should answer the following questions:
• What will the plan accomplish (the objectives)?
• What activities are needed to achieve the objectives?
• Who will be responsible for carrying out the activities?
• What is the target completion date for each activity?
• What will serve as evidence of success for each activity?
• What further staff development is needed to support the program? (2010, p. 8)

References
Ball, S. (2011, April). Evaluating Educational Programs.
Education World. (2011). Evaluation Planning Worksheet. Retrieved from Worksheet.pdf
Provini, C. (2011). Jump-Start Your School's Program Evaluation. Retrieved from evaluation-basics.shtml
Rockwood School District. (2010, February 2). Rockwood School District Program Evaluation Plan. Retrieved from Documents/Program%20Evaluation%20Plan.pdf