Program Evaluation Strategies to Improve Teaching for Learning
Rossi Ray-Taylor and Nora Martin
Ray.Taylor and Associates
MDE/NCA Spring School Improvement Conference 2008

Introductions
Overview for the session

Evaluation comfort levels

Research-driven school design: rigor, relevance, relationships

Evaluation can provide information about:
– Definition and evidence of the problem (needs assessment)
– Contextual information about the factors related to the problem and solutions
– The input factors and resources
– The interventions, processes and strategies employed
– Outcomes, results and effectiveness

Program evaluation can be a form of action research

Elements of a sound evaluation
Clear statement of what the project is intended to do and why
– Needs assessment
– Theory of action
– Clear, measurable statement of goal attainment
Appropriate evaluation methods and tools
Transparency

Evaluation can lay the groundwork for accountability

Key concepts
Identify the total system impact of program and financial decisions
Institutions led by data and evidence of results

Types of Data
Achievement
Demographic
Program
Perception
Process
Costs

Qualities of Data & Information
Time series, repeated data collection – how does the effect change over time?
Cohort analysis – overall effect on one group over time
Benchmarking & standards – makes the data relative, establishes context and comparables

Triangulation – looking for information in multiple indicators
Pattern & trend analysis
Leading & lagging indicators
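As a minimal illustration of several of these qualities (time series, cohort analysis and benchmarking), the sketch below works through a small, hypothetical table of assessment scores; the column names, score values and benchmark are assumptions for illustration only, not data from the presentation.

```python
# A hypothetical illustration of time-series, cohort, and benchmark analysis.
# Column names, scores, and the benchmark value are assumptions for the sketch.
import pandas as pd

scores = pd.DataFrame({
    "year":   [2005, 2006, 2007, 2006, 2007, 2008],
    "cohort": ["A", "A", "A", "B", "B", "B"],
    "score":  [61.0, 64.5, 68.2, 59.8, 63.1, 66.0],
})

BENCHMARK = 65.0  # assumed proficiency benchmark, for context only

# Time series: how does the overall effect change over repeated collections?
by_year = scores.groupby("year")["score"].mean()

# Cohort analysis: follow the same group across years.
by_cohort = scores.groupby(["cohort", "year"])["score"].mean().unstack()

# Benchmarking: make the data relative to an external standard.
gap_to_benchmark = by_year - BENCHMARK

print(by_year, by_cohort, gap_to_benchmark, sep="\n\n")
```

The same structure extends to triangulation: merging perception, demographic or process tables alongside achievement data lets a pattern be checked across multiple indicators rather than a single one.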

Practical matters
Accuracy
Reliability
Validity
Accessibility
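Reliability in particular can be checked quantitatively. The following sketch computes Cronbach's alpha for a hypothetical perception survey; the item responses and the common 0.7 rule of thumb are illustrative assumptions, not material from the deck.

```python
# A rough reliability check: Cronbach's alpha for survey items.
# The item matrix and the 0.7 rule of thumb are assumptions for illustration.
import numpy as np

# rows = respondents, columns = survey items (hypothetical 1-5 ratings)
items = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
], dtype=float)

k = items.shape[1]                        # number of items
item_vars = items.var(axis=0, ddof=1)     # variance of each item
total_var = items.sum(axis=1).var(ddof=1) # variance of total scores

alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")  # values near or above 0.7 are often treated as acceptable
```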

Evaluation
Employ evaluation strategies from the very beginning of a project, and assemble and review effectiveness data.
State measurable outcomes, document processes and review progress throughout the life of the project.

“Where outcomes are evaluated without knowledge of implementation, the results seldom provide a direction for action because the decision maker lacks information about what produced the outcomes (or lack of outcomes).” – Michael Quinn Patton, quoted in Data Analysis for Comprehensive Schoolwide Improvement, Victoria L. Bernhardt, 1998

Central Evaluation Questions
Did the program/project do what was intended?
– Did the project stick to the plan? There may be valid reasons for varying from the plan – if so, what are they?

What is the theory of change for the project? Why is this project being carried out this way, and why is it judged to be the most appropriate way?
– These questions are important because they help when judging the impact of changes in the plan.

What is the context for the project?
– Issues include past trends, local politics, resource distribution, threats, opportunities, strengths and weaknesses

What actually happened during the course of the project?
Who was served by the project? Why?
Who was not served by the project? Why?
Was the project valued by the intended audience?

What were the inputs and resources – both real (e.g., financial and material) and intellectual? What was the “cost” of the project?

What were the results/outcomes?
Were goals and objectives met?
What were the intended and unintended consequences?
What was the impact on the overall system?
Was there a process impact – did the project result in a change in the way that business is done?

A successful project begins and ends with a good evaluation design

How do these questions improve teaching for learning?

The best and most “sticky”, lasting interventions have the following components:
They are based in research and evidence
They are locally constructed and customized

They are targeted to a clear view of the problem to be solved
They are built for sustainability
They are “owned” – not just a matter of compliance
They are designed to build local capacity

Understand change and apply research about change to improve teaching for learning
Change takes trust
Change takes building relationships
Change takes endurance (time)
Change takes knowledge of research

Examine policies and practices that serve as barriers and those that serve as catalysts to achievement

Evaluate
Audit the system
Measure results
Change to goal, not just awareness or implementation

Meta Evaluation
Use meta-evaluation strategies to look for results across projects and interventions

Evaluation readiness
Identify and clearly state program goals and outcomes
Transform these goals into measurable objectives
Define program theory and supporting research

Develop the formative and summative evaluation plan
Develop the plan to gather data, deploy evaluation resources, and gather information from and report to stakeholders

Design for continuous feedback and transparency

Consider system evaluation policies and expectations
At the time of proposal, initiatives, programs and projects should be designed to include program evaluation

“It’s easy to make judgments – that’s evaluation. It’s easy to ask questions about impact – that’s evaluation. It’s easy to disseminate reports – that’s evaluation. What’s hard is to put all those pieces together in a meaningful whole which tells people something they want to know and can use about a matter of importance. That’s evaluation.” – Halcolm, quoted in Data Analysis for Comprehensive Schoolwide Improvement, Victoria L. Bernhardt, 1998

Program evaluation planning document
Goal: What does this project intend to achieve? State goals in observable terms. Limit the project to 3-5 overall goals.
Objective: State project objectives in measurable terms.
Measure/Criteria: How will the objective be measured? What tools or methods will be used? What are the criteria for success?
Evaluation activities & questions: What activities and tasks need to be accomplished to carry out evaluation of this objective? State the evaluation questions.
Time line: When will data be collected and reported?
Person responsible: Who is responsible for this task?
Resources and budget: What is the budget and what other resources will be needed?
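One way to keep a planning document like this consistent across objectives is to store it as structured data. The sketch below mirrors the fields listed above in a plain Python dictionary with a simple completeness check; the example entry and field names are hypothetical, for illustration only.

```python
# A hypothetical, machine-checkable version of the planning document fields.
# The example entry is illustrative only; it is not from the presentation.
PLAN_FIELDS = [
    "goal", "objective", "measure_criteria",
    "evaluation_activities_and_questions",
    "timeline", "person_responsible", "resources_and_budget",
]

plan_entry = {
    "goal": "Improve grade 3 reading achievement (stated in observable terms)",
    "objective": "Raise the share of students meeting the reading benchmark by 5 points in two years",
    "measure_criteria": "State reading assessment; success = +5 percentage points",
    "evaluation_activities_and_questions": "Collect and compare annual scores; did targeted students improve?",
    "timeline": "Data collected each spring; reported each June",
    "person_responsible": "Program evaluator",
    "resources_and_budget": "Evaluator time; data system access",
}

# Readiness check: every field should be filled in before the project starts.
missing = [f for f in PLAN_FIELDS if not plan_entry.get(f)]
print("Plan is complete" if not missing else f"Missing fields: {missing}")
```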

“In the beginning you think. In the end you act. In between you negotiate the possibilities. Some people move from complexity to simplicity and on into catastrophe. Others move from simplicity to complexity and onward into full scale confusion. Simplification makes action possible in the face of overwhelming complexity. It also increases the odds of being wrong. The trick is to let a sense of simplicity inform our thinking, a sense of complexity inform our actions, and a sense of humility inform our judgments…” – Michael Quinn Patton (p. 143 in Bernhardt), quoted in Data Analysis for Comprehensive Schoolwide Improvement, Victoria L. Bernhardt, 1998

Resources
Center for Evaluation & Education Policy
The Evaluation Center, Western Michigan University

More Resources
What Works in Schools: Translating Research into Action, by Robert J. Marzano
Data Analysis for Comprehensive Schoolwide Improvement, by Victoria L. Bernhardt

More Resources
The “Data Wise” Improvement Process: Eight steps for using test data to improve teaching and learning, by Kathryn Parker Boudett et al., Harvard Education Letter, January/February 2006, Volume 22, Number 1.

Rossi Ray-Taylor, PhD
Nora Martin, PhD
Ray.Taylor and Associates
2160 S. Huron Parkway, Suite 3
Ann Arbor, Michigan