1
Dr. Dan Bertrand
LEEA 554
Chapter 11 - Policy Evaluation: Determining If the Policy Works
2
Definitions
Evaluation - the systematic investigation of the worth or merit of an object.
Projects - educational activities provided for a defined period of time.
Stakeholders - individuals or groups that may be involved in or affected by a program evaluation.
3
History of Education Policy Evaluation
Early evaluation can be traced to pre-Civil War Boston.
The first large-scale evaluation was conducted by Joseph Rice to evaluate spelling instruction.
In the 1930s, Ralph Tyler (Ohio State University) directed the Eight-Year Study, which set the guidelines for program evaluation in terms of objectives.
4
History of Evaluation
The War on Poverty
Passage of the Elementary and Secondary Education Act of 1965 mandated the evaluation of Title I and Title II.
The Professionalization of Evaluation
Evaluation emerged as an integral part of the education profession.
Professional journals were established.
Government and universities established centers to conduct research and development.
In the mid-1990s, ASCD and PDK published evaluation handbooks.
5
Characteristics of Policy Evaluation
The evaluation process - 7 steps:
1) Determine the goals of the policy
2) Select indicators
3) Select or develop data collection instruments
4) Collect data
5) Analyze and summarize the data
6) Write the evaluation report
7) Respond to the evaluators' recommendations
6
Criteria for Judging Evaluations
From the Program Evaluation Standards of 1994: 4 categories with 30 standards.
1) Usefulness - the evaluation must be done by a qualified team.
2) Feasibility - the evaluation must be doable without imposing unreasonable strains on the school.
3) Propriety - the evaluation must conform to accepted norms for research.
4) Accuracy
7
Purposes of Evaluation
Summative evaluation - holds the implementers of the policy accountable; may assess the quality of the policy over time.
Formative evaluation - enables the implementers to make changes as needed to improve the policy; ongoing and recurrent.
Pseudo-evaluations - unethical in nature; politically controlled or conducted as public relations.
8
Methodologies Used
Quantitative - the collection and statistical analysis of numeric data.
Qualitative - the collection of verbal or pictorial data.
Triangulation - collecting several types of data for comparison.
Holistic - a combination of quantitative and qualitative methods.
9
Facilitating Meaningful Policy Evaluations
Evaluation is political:
Programs and projects are products of the political process.
Evaluation reports influence what happens in the political arena.
The careers and reputations of individuals depend on the outcome of the evaluation.
Political players in the evaluation arena: policy makers, policy implementers, clients, and evaluators.
10
Maneuvers to Prevent a Good Evaluation
Block the evaluation and prevent it from occurring.
Shape the criteria so the desired outcome is assured.
Mobilize clients against the evaluators.
Have implementers make data gathering impossible or difficult.
Attack the quality of the evaluation upon its completion.
11
5 Key Steps to a Good Evaluation
1) Build evaluation in early
2) Communicate with stakeholders
3) Select indicators at the start
4) Build in ongoing data collection
5) Choose evaluators - from inside or outside the organization
12
Acting on the Evaluation Report
Inaction - do nothing and maintain the policy
Minor modifications
Major modifications
Replacement
Consolidation
Splitting
Decrementing
Termination
13
Activity
Case study - page 329
News Story for Analysis - page 330