Daniel L. Stufflebeam: CIPP Evaluation Model

CIPP Model Objectives:
- Be familiar with Stufflebeam's educator background
- Understand Stufflebeam's CIPP model
- Be able to discuss the HRD "essence" of the CIPP model

CIPP Model Pre-Test
1. What do the letters CIPP stand for?
2. What profession is Daniel L. Stufflebeam?
3. Name the three major steps for any evaluation.
4. Draw the matrix for the four decision-making settings.

CIPP Model Pre-Test (continued)
5. Describe the General Evaluation Model.
6. Classify each evaluation type within the ends, means, intended and actual matrix.
7. Name the four evaluation types and their decision-making purpose.

CIPP Model Stufflebeam Biography
- Daniel Leroy Stufflebeam, education educator
- Born in Waverly, Iowa, September 19, 1936
- BA, State University of Iowa, 1958
- MS, Purdue University, 1962; PhD, 1964; postgraduate study, University of Wisconsin, 1965

CIPP Model Stufflebeam Biography (continued)
- Professor and Director, Ohio State University Evaluation Center, Columbus, 1963-1973
- Professor of education and Director, Western Michigan University Evaluation Center, Kalamazoo, 1973-
- Author of monographs and 15 books; contributed chapters to books and articles to professional journals

CIPP Model Stufflebeam Biography (continued)
- Recipient of the Paul Lazarsfeld Award, Evaluation Research Society, 1985
- Member, American Educational Research Association; National Council on Measurement in Education; American Evaluation Association
- Served with the United States Army, 1960
- Children: Kevin D., Tracy Smith, Joseph

CIPP Model Key Components:
1. Evaluation definition
2. Three major steps for any evaluation
3. Decision-making settings
4. Types of decisions
5. General evaluation model
6. Types of evaluation
7. Total evaluation model

CIPP Model Definition: Evaluation is the process of delineating, obtaining and providing useful information for judging decision alternatives

CIPP Model Definition Key Terms:
- Evaluation: ascertainment of value
- Decision: act of making up one's mind
From the decision-maker's viewpoint, then: evaluation is the process of ascertaining the relative value of competing alternatives.

CIPP Model Evaluation is:
- Decision-making driven
- A systematic and continuing process
- Made up of three major steps/methodologies:
  1. Delineating
  2. Obtaining
  3. Providing

CIPP Model Definitions of Evaluation Steps:
1. Delineating - focusing the requirements for the information to be collected, through specifying, defining and explicating

CIPP Model Definitions of Evaluation Steps (continued):
2. Obtaining - making information available through processes such as collecting, organizing and analyzing, and through means such as statistics and measurement
3. Providing - fitting the information together into systems or sub-systems that best serve the needs or purposes of the evaluation
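The three steps above can be sketched as a simple pipeline. This is only an illustration; the function and variable names are hypothetical, not part of Stufflebeam's model.

```python
# Hedged sketch: the three evaluation steps (delineate, obtain, provide)
# modeled as stages of a pipeline. All names are illustrative only.

def delineate(decision_questions):
    """Focus the information requirements: specify what must be known."""
    return [f"data needed for: {q}" for q in decision_questions]

def obtain(requirements):
    """Make information available: collect, organize and analyze."""
    return {req: "collected data" for req in requirements}

def provide(information):
    """Fit the information together to serve the decision-makers."""
    return {"report": information}

report = provide(obtain(delineate(["Should the program continue?"])))
```

The point of the sketch is the ordering: delineating constrains what is obtained, and providing packages what was obtained for the decision at hand.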

CIPP Model Decision-Making Settings (information grasp × degree of change):
- High information grasp, large change: Metamorphic
- High information grasp, small change: Homeostatic
- Low information grasp, small change: Incremental
- Low information grasp, large change: Neomobilistic

CIPP Model Decision-Making Settings - Key Points:
- Driven by the relation of the useful information available to the degree of change to be effected
- The importance/consequences of the decision to be made drive evaluation extensiveness
- Little information available, or information not in useful form, drives more evaluation extensiveness

CIPP Model Decision-Making Setting Definitions:
1. Metamorphic - utopian, complete change in the educational system with full information/knowledge of how to effect the desired changes (low probability)
2. Homeostatic - small, remedial changes restoring the educational system to its normal state, guided by technical standards and routine data-collection systems (the prevalent "quality control" mode, with low risk)

CIPP Model Decision-Making Setting Definitions (continued):
3. Incremental - continuous improvement in an educational system intended to shift the program to a new norm (rather than correct back to a norm, as in the homeostatic setting), but guided by little available knowledge and ad-hoc/special-project in nature (allows "innovation" in a trial-and-error, iterative manner with acceptable risk, since small corrections can be made as problems are detected)

CIPP Model Decision-Making Setting Definitions (continued):
4. Neomobilistic - innovative activities for major change/new solutions to significant problems in an educational system, but supported by little theory and little knowledge; driven by great and compelling opportunities such as the knowledge explosion, critical conditions or world competition (becoming more prevalent in response to the need for higher rates of change under worthy risk)
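The four setting definitions above amount to a lookup over the two dimensions (information grasp × degree of change). A minimal sketch, with illustrative names only:

```python
# Hedged sketch: classifying a decision-making setting from the two
# dimensions in the slides above. Names are illustrative, not from the model.

def decision_setting(information_grasp, degree_of_change):
    """Return the decision-making setting for a (grasp, change) cell."""
    table = {
        ("high", "large"): "metamorphic",    # full knowledge, complete change
        ("high", "small"): "homeostatic",    # routine quality control
        ("low",  "small"): "incremental",    # trial-and-error improvement
        ("low",  "large"): "neomobilistic",  # innovative major change
    }
    return table[(information_grasp, degree_of_change)]

print(decision_setting("low", "large"))  # -> neomobilistic
```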

CIPP Model Types of Decisions: a matrix crossing ends vs. means with intended vs. actual

CIPP Model Types of Decisions Matrix:
- Forms a model of all decision-making categories an educational system may need, with mutually exclusive cells (ends vs. means, intended vs. actual)
- Provides for a generalizable evaluation design model

CIPP Model General Evaluation Model: a continuous cycle linking system activities, evaluation and decisions; evaluation provides information for decisions, and decisions in turn direct system activities

CIPP Model Types of Evaluation:
- Context Evaluation: to determine objectives
- Input Evaluation: to determine program design
- Process Evaluation: to control program operations
- Product Evaluation: to judge and react to program attainments
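The four types and their decision-making purposes can be summarized as a simple mapping; the dictionary form is just an illustration of the list above.

```python
# Sketch: the four CIPP evaluation types and their decision-making purposes.
# The data comes from the slide above; the code form is illustrative only.

CIPP_TYPES = {
    "Context": "determine objectives",
    "Input":   "determine program design",
    "Process": "control program operations",
    "Product": "judge and react to program attainments",
}

def acronym(types=CIPP_TYPES):
    """CIPP is the acronym formed by the four evaluation types, in order."""
    return "".join(name[0] for name in types)

print(acronym())  # -> CIPP
```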

CIPP Model Types of Decisions and Evaluations (ends vs. means × intended vs. actual):
- Intended ends: Context Evaluation
- Intended means: Input Evaluation
- Actual means: Process Evaluation
- Actual ends: Product Evaluation

CIPP Model Evaluation Design:
- Evaluations are designed after a decision has been made to effect a system change; the actual evaluation design is driven by the decision-making setting
- Generally, the greater the change and the lower the information grasp, the more formal, structured and comprehensive the evaluation required

Evaluation Type Objectives: CONTEXT EVALUATION
- Provides the rationale for determining objectives
- Defines the relevant environment
- Describes desired and actual conditions of the environment
- Identifies unmet needs
- Identifies unused opportunities

Evaluation Type Objectives: INPUT EVALUATION
- Determines how to use resources
- Assesses capabilities of the responsible agency
- Assesses strategies for achieving objectives
- Assesses designs for implementing a selected strategy

Evaluation Type Objectives: PROCESS EVALUATION
- Detects or predicts defects in the procedural design or its implementation
- Provides information for programming decisions
- Maintains a record of the procedure as it occurs

Evaluation Type Objectives: PRODUCT EVALUATION
- Measures attainments
- Interprets attainments
- Done as often as necessary during the program's life

A Total Evaluation Model:
1. Follows the general evaluation model's relationships among activities, evaluation and decisions, and uses the three major steps for any evaluation
2. Needs a full-time program evaluator

A Total Evaluation Model (continued):
3. Needs a continuous and systematic context-evaluation process, sponsored by the program planning body, for the purpose of deciding whether to change or continue with program goals and objectives
4. Initiates specific, ad-hoc input, process and product evaluations only after a planning decision to effect a system change

A Total Evaluation Model (continued):
5. Specific evaluation designs vary according to the setting for the change:
- Homeostatic (small changes with adequate information)
- Incremental (low information for small changes)
- Neomobilistic (low information for large changes)
(Metamorphic is excluded, since it has only theoretical relevance)

CIPP Model HRD Essence:
- HRD viewpoint
- Formative - Summative
- Evaluation traditions

CIPP Model HRD Viewpoint:
- Discrepancy
- Democratic
- Analytical
- Diagnostic
CIPP: a logical and research-based approach to the total training system

CIPP Model Formative - Summative:
- Context: formative
- Input: formative
- Process: summative
- Product: summative

CIPP Model Evaluation Traditions:
- Scientific - 1950s
- Systems - 1970s (CIPP)
- Qualitative - 1980s
- Eclectic - late 1980s

Post-Test
1. What do the letters CIPP stand for?
2. What profession is Daniel L. Stufflebeam?
3. Name the three major steps for any evaluation.
4. Draw the matrix for the four decision-making settings.
5. Describe the General Evaluation Model.
6. Classify each evaluation type within the ends, means, intended and actual matrix.
7. Name the four evaluation types and their decision-making purpose.