September 20, 2005 POLS 4300.06/PUBL 6800.03


Presentation transcript:

September 20, 2005 POLS 4300.06/PUBL 6800.03
Questions from last week?
Rigor versus relevance in P.E.
Maximum value for investment
Limits of rational comprehensiveness
Strategy for focusing the issues
‘Lines crossing on the graph’
An applied, not a basic, social science

September 20, 2005 (2) POLS 4300.06/PUBL 6800.03
P.E. and Organizational Improvement
Creating and Sustaining an Evaluation Habit of Mind
Positive versus Negative Effects of P.E. on Staff
The Manitoba School Improvement Program Inc. (MSIP) as a Case Study
Motivation to Improve Human Services?

September 20, 2005 (3) POLS 4300.06/PUBL 6800.03
Evaluator’s Handbook, chap. 1
Page 10: Forms of Program Evaluation
Stages of the Program Cycle to be Evaluated
The Request or Requirement: What Is Called for?
Formative and Summative Assessments
Quantitative and Qualitative Approaches

September 20, 2005 (4) POLS 4300.06/PUBL 6800.03
Evaluator’s Handbook, chap. 1 (continued)
Deciding what to observe and/or measure
Processes, outcomes, goals, decisions
Contexts, costs and constraints
User orientation and responsiveness
Central to ongoing needs assessments
The key to successful implementations
P.E. as mainly social and political

September 20, 2005 (5) POLS 4300.06/PUBL 6800.03
Evaluator’s Handbook, chap. 2
Formative and/or Summative: a matter of different but overlapping emphases
Phase A: Set the Boundaries of the Evaluation (8-10 tasks, pp. )
Phase B: Select Appropriate Methods ( tasks, pp. )
Phase C: Collect and Analyze Information

September 20, 2005 (6) POLS 4300.06/PUBL 6800.03
Phase D: Report Findings
Table 4: Timeline of Formative Evaluator’s Responsibilities (p. 40)
Table 5: Differences Between Formative and Summative Evaluation Reports (p. 41)
Structure in Context: the Essential Key to Responsible Program Evaluation, between Rigor and Relevance

September 20, 2005 (7) POLS 4300.06/PUBL 6800.03
Stecher and Davis, How to Focus an Evaluation (pp. 9-61)
Focus = Clarify and Sharpen
Elements of the Focusing Process
Beliefs/Expectations of Evaluator & Client
Information Gathering Activities
Formulating an Evaluation Plan (phases)
5 Approaches to Evaluation (pp. )

September 20, 2005 (8) POLS 4300.06/PUBL 6800.03
What is the Program being Evaluated?
Why precisely is it being evaluated?
What constraints limit the evaluation?
Information gathering using each method
Table 3: Questions to guide information gathering (pp. ) VERY USEFUL!
Questions and Comments?