Policy Evaluation: Determining if the Policy Works (Fowler, Ch. 11) – Dr. Wayne E. Wright, Royal University of Phnom Penh

Focus Questions
- Why should education leaders be knowledgeable about policy evaluation?
- How can one tell if a proposed or completed evaluation is of high quality?
- Why are evaluations always political?
- How can a leader facilitate the evaluation process?

A “Nervous-Making” Topic
- Evaluation is an integral part of the professional lives of all educators
  - Teacher evaluations
  - School inspectors
  - School labeling
- Educational policies must be evaluated too
- In an ideal world...
  - All policies would be thoroughly and fairly evaluated
  - Policy makers would carefully read and act on the findings
  - Policies would be modified or terminated based on the results
- Reality: our world is far from ideal
  - Many policies are never evaluated at all
  - Some are evaluated poorly
  - Often no one acts on the findings
  - However, sometimes it is done right

Policy Evaluation
Stufflebeam (1983): “We cannot make our programs better unless we know where they are weak and strong. … We cannot plan effectively if we are unaware of options and their relative merits; and we cannot convince our constituents that we have done good work and deserved continued support unless we can show them evidence that we have done what we promised and produced beneficial results. For these and other reasons, public servants must subject their work to competent evaluation.” (p. 140)

Definitions
- Evaluation: the systematic investigation of the worth or merit of an object
- Educational evaluation: a type of applied research in which the practices and rigorous standards of all research are used in a specific setting for a political purpose: determining to what extent a policy is reaching its goals
- Projects: policies are usually projects first – educational activities that are provided for a defined period of time
- Programs: projects that become institutionalized – educational activities that are provided on a continuing basis
- Stakeholders: individuals or groups that may be involved in or affected by a program evaluation

Brief History of Policy Evaluation
- Policy evaluations have been conducted in the U.S. since the late 1880s
- The Elementary and Secondary Education Act (ESEA) of 1965 required the evaluation of federally funded education projects and programs
  - Resulted in a boom of thousands of evaluations
  - But most had negative results: too much focus on “outcomes” and not enough on the implementation process
- Evaluation became professionalized as a field in the 1970s
  - Academic journals, textbooks, professional associations, graduate degree programs
- Today policy evaluation is a well-established field with a wealth of experience
  - It is possible to determine with a fair degree of accuracy how well a policy is meeting its objectives

Characteristics of Policy Evaluations
Basic steps in the policy evaluation process (a simple sketch of steps 1–3 follows this slide):
1. Determine the goals of the policy
2. Select indicators (measurements or signs that the goal has been reached)
3. Select or develop data-collection instruments
4. Collect data
5. Analyze and summarize data
6. Write the evaluation report
7. Respond to evaluators’ recommendations
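To make steps 1–3 concrete, here is a minimal sketch, assuming Python, of how an evaluator might keep track of goals, indicators, and instruments before data collection begins. The class names, the example goal, and all example values are hypothetical illustrations, not material from Fowler.

```python
# A minimal sketch (hypothetical, not from Fowler) of tracking
# steps 1-3 of the policy evaluation process as a simple plan.
from dataclasses import dataclass, field


@dataclass
class Goal:
    statement: str                                          # Step 1: a goal of the policy
    indicators: list[str] = field(default_factory=list)     # Step 2: signs the goal was reached
    instruments: list[str] = field(default_factory=list)    # Step 3: how the data will be collected


@dataclass
class EvaluationPlan:
    policy: str
    goals: list[Goal]

    def missing_pieces(self) -> list[str]:
        """Flag goals that still lack indicators or instruments (steps 2-3)."""
        problems = []
        for g in self.goals:
            if not g.indicators:
                problems.append(f"No indicators selected for goal: {g.statement}")
            if not g.instruments:
                problems.append(f"No instruments chosen for goal: {g.statement}")
        return problems


# Hypothetical example: a bilingual education pilot
plan = EvaluationPlan(
    policy="Bilingual education pilot",
    goals=[
        Goal(
            statement="Improve minority students' reading achievement",
            indicators=["Grade 3 reading scores", "Retention rate"],
            instruments=["Provincial reading test", "School enrollment records"],
        )
    ],
)
print(plan.missing_pieces())   # [] once steps 1-3 are complete for every goal
```

A check like `missing_pieces()` simply makes it obvious, before data collection starts (step 4), which goals still lack indicators or instruments.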

Criteria for Judging Evaluations
- Usefulness
  - Conducted by qualified evaluators
  - Meaningful data collected from stakeholders
  - Report is comprehensible to stakeholders: short, with an executive summary
- Feasibility
  - Can be done within the given time frame without causing too much interruption to the project/program
- Propriety
  - No conflict of interest between the evaluator and the project/program or its staff
- Accuracy
  - Shows understanding of the sociocultural and economic context
  - Provides enough detail about the sources of information so readers can determine their value
  - Conclusions are well supported by the data provided

Purposes of Evaluations
- Summative evaluation
  - Assesses the quality of a policy that has been in force for a long time
  - Purpose is to hold the implementers accountable
- Formative evaluation
  - Assesses the quality of the program on an ongoing basis throughout implementation
  - Enables implementers to make changes to improve the policy
- Pseudo-evaluations (unethical!)
  - Politically controlled study: data collection and dissemination of the final report are controlled to create the desired impression of the policy
  - Public relations evaluation: findings are dictated in advance; the data provided to evaluators are limited in terms of what they can look at, where they can go, and whom they can talk to; the purpose is to create a positive image for a school district or program

Methodologies Used in Policy Evaluations: Quantitative Research Designs
- Collection and statistical analysis of numerical data, for example (a worked example follows this slide):
  - Test scores
  - Retention rates
  - Attendance figures
  - Dropout rates
  - Per-pupil expenditures
  - Teachers’ salaries
  - Teacher-pupil ratios
  - Percentage of students on free or reduced-price lunch
  - Enrollment figures
  - Percentage of teachers with master’s degrees
- Strengths
  - People like and trust numbers
  - Quick and cheap
- Weaknesses
  - Not well suited for discovering unexpected facts about the policy, e.g., were teachers implementing the policy correctly, or at all?
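The following is a minimal sketch of the kind of comparison an evaluator might run on test-score data, assuming Python with SciPy installed. The scores are invented purely for illustration, and the choice of an independent-samples t-test is an assumption about tooling, not something prescribed by Fowler.

```python
# A hypothetical quantitative comparison: program group vs. comparison group
# on a post-test indicator. All numbers are invented for illustration only.
from statistics import mean
from scipy import stats   # assumes SciPy is installed

program_scores = [72, 68, 81, 75, 79, 70, 77, 83]      # hypothetical post-test scores
comparison_scores = [65, 70, 62, 74, 68, 66, 71, 69]   # hypothetical comparison group

t_stat, p_value = stats.ttest_ind(program_scores, comparison_scores)

print(f"Program mean:    {mean(program_scores):.1f}")
print(f"Comparison mean: {mean(comparison_scores):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Even a "significant" difference says nothing about whether the policy was
# actually implemented as intended -- the weakness noted on the slide above.
```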

Methodologies Used in Policy Evaluations: Qualitative Methods
- Types of data
  - Transcripts of interviews and focus groups
  - Observation field notes
  - Open-ended surveys
  - Personal statements
  - Diaries/journals
  - Meeting minutes
  - Official reports and documents
  - Books and materials
  - Student work
  - Photographs
- Strengths
  - Can yield rich, detailed, in-depth information on policy implementation and outcomes
- Weaknesses
  - More time consuming and expensive
  - Findings viewed as more subjective
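Qualitative analysis is interpretive, but the raw material still has to be organized. Below is an intentionally simplistic sketch, assuming Python, of tallying a few predefined codes across interview excerpts; the codes, phrases, and excerpts are all invented, and real coding is done by human analysts (often with dedicated qualitative-analysis software), not by keyword matching.

```python
# A hypothetical, oversimplified tally of predefined codes across interview
# excerpts, just to illustrate how qualitative data might be organized.
from collections import Counter

transcripts = [
    "The new materials arrived late, so we kept using the old curriculum.",
    "Parents said their children enjoy the bilingual lessons.",
    "We had no training on the new curriculum before the school year started.",
]

# Invented coding scheme: each code is triggered by a few example phrases.
codes = {
    "implementation_delay": ["arrived late", "no training"],
    "parent_support": ["parents said", "enjoy"],
}

tally = Counter()
for text in transcripts:
    lowered = text.lower()
    for code, phrases in codes.items():
        if any(p in lowered for p in phrases):
            tally[code] += 1

print(tally)   # e.g. Counter({'implementation_delay': 2, 'parent_support': 1})
```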

Methodologies Used in Policy Evaluations: Holistic Evaluation
- Also called mixed-methods evaluation: uses both quantitative and qualitative data
- Strengths
  - Draws on the strengths of both types of research
  - Weaknesses in one are covered by the other
- Weaknesses
  - Can be more time consuming
  - Many evaluators are trained in only one method or the other
  - Qualitative data may be given less value

The Politics of Evaluation
Three reasons why evaluations are political:
1. The programs and projects being assessed are products of the political process
2. Evaluation reports can influence what happens in the political arena: whether or not the policy is continued, and how much funding it will receive
3. The results can affect the jobs, careers, and reputations of many individuals

The Politics of Evaluation: Policy Evaluation Players
- Policy makers: request the evaluations and use the results
  - Self-interests: may want evaluations that make them or their bosses look good for policies they supported; may want negative evaluations to get rid of programs that are controversial or unpopular among voters
- Policy implementers
  - Self-interests: typically want a positive evaluation, as their jobs, careers, and reputations are on the line
- The clients: students and parents
  - Self-interests: want positive evaluations of programs that provide special services to their children, even if the programs are found not to be effective
- Evaluators
  - Self-interests: want to produce good research so that good policy decisions can be made; want to produce evaluations that enhance their careers and reputations as evaluators (if too negative, no one will want to hire them)

Tricks to Prevent a Quality Evaluation
- Block the evaluation: vote to cancel or postpone it, take away its funds, or reduce its funding
- Shape the criteria to be used for the evaluation to ensure the desired outcome
- Restrict what data can be accessed, who can be interviewed, what can be observed, etc.
- Make data collection difficult or impossible: cancel interview appointments, refuse to complete surveys, “lose” requested documents and records
- Attack the quality of a completed evaluation report

Suggestions for Completing a Sound Evaluation
Five key steps an educational leader can take to increase the likelihood of sound evaluations:
1. Building evaluation in early
2. Communicating with stakeholders
3. Selecting indicators
4. Building in data collection
5. Choosing evaluators
   - Internal evaluators
   - Internal evaluators from a specialized evaluation office
   - Outside agency or consultant
   - The organization that funded the new policy

Acting on an Evaluation Report
Educational leaders must decide what to do with an evaluation report once it is completed. Four ways of acting upon it:
1. Do nothing (inaction)
   - The quality of the evaluation is questionable
   - Changes are not feasible
2. Minor modifications
   - Changes which do not affect spending or staffing levels, or the objectives of the policy

Acting on an Evaluation Report
Four ways of acting upon it (continued):
3. Major modifications
   - Replacement: a new program with the same objectives replaces the old one
   - Consolidation: two or more programs or parts of programs are combined
   - Splitting: one aspect of the program is removed and developed into a separate program or project
   - Decrementing: a substantial cut in funding for the program or parts of the program
4. Termination
   - The program is shut down and nothing is put in its place

Final Points
To meet the demands of an age of accountability, school leaders must be literate about evaluation. They should take the lead in applying modern evaluation techniques to their own work and in acting on the findings of sound evaluations. The children in our schools, as well as the general public, deserve no less.

Activity
CARE’s work to develop bilingual education in Ratanakiri for ethnic minority children is in the process of being evaluated.
- Recall Jan Noorlander’s guest lecture
- Review the methodology and tools that will be used to evaluate the program
- Discuss:
  - Does this evaluation cover the basic steps in the policy evaluation process? (see Fig. 11.1, p. 313)
  - Who are the stakeholders? What are their self-interests in this evaluation?
  - Are the methods and tools used quantitative, qualitative, or both?
  - How effective do you think the methods and tools will be?
  - What do you think the evaluation results will show?
  - How do you think the evaluation results will be used?