Models of Curriculum Evaluation Teaching & Learning Commons

Models of Curriculum Evaluation: Overview
• Concept of a model
• Need for models
1. Tyler's Model
2. CIPP Model
3. Stake's Model
4. Kaufman's Model
5. Scriven's Model
6. Kirkpatrick's Model
• Criteria for judging evaluation studies

Concept of a Model
A theory explains a process (why?); a model describes a process (how?). A model is a representation of reality, presented with a degree of structure and order.

Classification of Models
• Three-dimensional models: static or working; small scale or large scale
• Graphical models: diagrams and flow charts
• Mathematical models

Types of Models
• Conceptual models describe what is meant by a concept.
• Procedural models describe how to perform a task.
• Mathematical models describe the relationship between the various elements of a situation or process.

Why do we need a model for curriculum evaluation? To provide a conceptual framework for designing a particular evaluation, depending on the specific purpose of the evaluation.

1. Tyler's Model (1949)
Key emphasis: instructional objectives
Purpose: to measure students' progress towards objectives
Method:
1. Specify instructional objectives.
2. Collect performance data.
3. Compare performance data with the specified objectives/standards.
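
The comparison step can be pictured as checking each measured outcome against its stated standard. Below is a minimal Python sketch of that idea; the objective names, scores and mastery thresholds are invented for illustration and are not part of the original presentation.

```python
# Hypothetical sketch of Tyler's three-step method:
# specify objectives, collect performance data, compare the two.
# Objective names and mastery thresholds are illustrative only.

# Step 1: objectives with a required mastery level (proportion correct)
objectives = {
    "write_measurable_objectives": 0.80,
    "select_learning_experiences": 0.70,
}

# Step 2: performance data collected from an assessment (invented scores)
performance = {
    "write_measurable_objectives": 0.85,
    "select_learning_experiences": 0.62,
}

# Step 3: compare performance against each objective's standard
for objective, standard in objectives.items():
    score = performance.get(objective, 0.0)
    status = "met" if score >= standard else "not met"
    print(f"{objective}: scored {score:.2f} vs standard {standard:.2f} -> {status}")
```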

Limitations of Tyler's Model
1. Ignores the process.
2. Not useful for diagnosing the reasons why a curriculum has failed.

Tyler's Planning Model (1949)
• What educational goals should the school seek to attain? (Objectives)
• How can learning experiences be selected which are likely to be useful in attaining these objectives? (Selecting learning experiences)
• How can learning experiences be organised for effective instruction? (Organising learning experiences)
• How can the effectiveness of learning experiences be evaluated? (Evaluation of students' performance)
[Print, M. (1993), p. 65]

2. CIPP Model (1971)
The CIPP model of evaluation concentrates on:
• Context of the programme
• Input into the programme
• Process within the programme
• Product of the programme

Focus on decision making (intended vs actual; ends vs means):
• Ends, intended: planning decisions, to determine objectives (policy makers and administrators)
• Ends, actual: recycling decisions, to judge and react to attainments (policy makers, administrators, teachers, HODs and principals)
• Means, intended: structuring decisions, to design procedures (administrators, principals and HODs)
• Means, actual: implementing decisions, to utilise, control and refine procedures (teachers, HODs and principals)

Types of Decisions
• Intended ends (goals)
• Intended means (procedural designs)
• Actual means (procedures in use)
• Actual ends (attainments)

The CIPP matrix (intended vs actual; ends vs means):
• Ends, intended: context evaluation. Question: What? Environment and needs.
• Ends, actual: product evaluation. Question: Have we? Attainments and outcomes, both quality and significance.
• Means, intended: input evaluation. Question: How? Procedural designs, strategies and resources.
• Means, actual: process evaluation. Question: Are we? Procedures in use, monitoring and implementation.

Types of Evaluation
• Context evaluation
• Input evaluation
• Process evaluation
• Product evaluation

Context Evaluation
Objective: to determine the operating context; to identify and assess needs and opportunities in the context; to diagnose problems underlying the needs and opportunities.
Method: by comparing the actual and the intended inputs and outputs.
Relation to decision making: for deciding upon the settings to be served and the changes needed in planning.

Examples of context factors: needs of industry and society; future technological developments; mobility of the students.

Input Evaluation
Objective: to identify and assess system capabilities, available input strategies, and designs for implementing the strategies.
Method: analysing resources, solution strategies and procedural designs for relevance, feasibility and economy.
Relation to decision making: for selecting sources of support, solution strategies and procedural designs for structure-changing activities.

Examples of inputs: entry behaviour of students; curriculum objectives; detailed contents; methods and media; competencies of the teaching faculty; appropriateness of teaching/learning resources.

Process Evaluation
Objective: to identify process defects in the procedural design or its implementation.
Method: by monitoring the procedural barriers, remaining alert to unanticipated ones, and describing the actual process.
Relation to decision making: for implementing and refining the programme design and procedure, for effective process control.

Examples of process indicators: feedback to judge the effectiveness of teaching-learning methods; utilisation of physical facilities; utilisation of the teaching-learning process; effectiveness of the system for evaluating students' performance.

Product Evaluation
Objective: to relate outcome information to objectives and to context, input and process information.
Method: measuring outcomes against standards and interpreting them.
Relation to decision making: for deciding to continue, terminate, modify, build or refocus a change activity.

Examples of product measures: employability of technician engineers; social status of technician engineers; comparability of wage and salary structures; job adaptability and mobility.

Stufflebeam's CIPP Model (1971): Context, Input, Process and Product evaluation
• Key emphasis: decision making
• Purpose: to facilitate rational and continuing decision making
• Strengths: (a) sensitive to feedback; (b) rational decision making among alternatives
• Evaluation: identify potential alternatives; set up quality control systems for activities

Limitations of the CIPP Model: it overvalues efficiency but undervalues students' aims.

CIPP View of Institutionalized Evaluation

The CIPP approach recommends:
• Multiple observers and informants
• Mining existing information
• Multiple procedures for gathering data; cross-check qualitative and quantitative findings
• Independent review by stakeholders and outside groups
• Feedback from stakeholders

3. Stake's Model (1969)
• An antecedent is any condition existing prior to teaching and learning which may relate to outcomes.
• Transactions are the countless encounters of students with teachers, students with students, authors with readers, parents with counsellors.
• Outcomes include measurements of the impact of instruction on learners and others.

Stake's data matrices: a rationale, a description matrix (intents and observations) and a judgement matrix (standards and judgements), each applied to antecedents, transactions and outcomes.

Antecedents: conditions existing prior to teaching and learning, for example students' interests or prior learning, the learning environment in the institution, and the traditions and values of the institution.

Transactions: interactions that occur between teachers and students, students and students, students and curricular materials, and students and the educational environment. Transactions = the process of education.

Outcomes: learning outcomes and the impact of curriculum implementation on students, teachers, administrators and the community; immediate outcomes vs long-range outcomes.

Three sets of data:
• Antecedents: conditions existing before implementation
• Transactions: activities occurring during implementation
• Outcomes: results after implementation
Describe the programme fully; judge the outcomes against external standards.

Stake's Model
Key emphasis: description and judgement of data
Purpose: to report the ways different people see the curriculum
The focus is on responsive evaluation, which:
1. Responds to audience needs for information
2. Orients more toward programme activities than results
3. Presents all audience viewpoints (multi-perspective)
Limitations:
1. Stirs up value conflicts
2. Ignores causes

4. Kaufman's Model (Roger Kaufman): Needs Assessment
• Where are we now? Where do we want to be?
• A need is the discrepancy between the current status and the desired status.
• Discrepancies should be identified in terms of products of actual behaviours (ends), not in terms of processes (means).
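
To make the discrepancy idea concrete, here is a small Python sketch that computes and ranks the gap between current and desired status for a few outcomes; the outcome names and percentages are purely illustrative and are not drawn from the presentation.

```python
# Hypothetical needs-assessment sketch: a need is the gap between
# desired status ("where we want to be") and current status ("where we are now").
# Outcomes and percentages are invented for illustration.

desired = {"graduates_employed_within_6_months": 90, "industry_certifications_earned": 75}
current = {"graduates_employed_within_6_months": 72, "industry_certifications_earned": 70}

# Express needs in terms of ends (results), not the processes used to reach them,
# and rank them by the size of the discrepancy.
needs = {k: desired[k] - current.get(k, 0) for k in desired}
for outcome, gap in sorted(needs.items(), key=lambda item: item[1], reverse=True):
    print(f"{outcome}: current {current.get(outcome, 0)}%, desired {desired[outcome]}%, "
          f"discrepancy {gap} points")
```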

Deduction: the drawing of a particular truth from a general, antecedently known one (rule to examples).
Induction: rising from particular truths to a generalisation (examples to rules).

5. Scriven's Model: Goal-Free Evaluation (1973)
Proponent: Michael Scriven
Goals are only a subset of anticipated effects; effects comprise intended effects and unintended effects.

Roles of curriculum evaluation
Scriven differentiates between two major roles of curriculum evaluation: the "formative" and the "summative".
• Formative evaluation: during the development of the programme
• Summative evaluation: at its conclusion

Formative evaluation
It is carried out during the process of curriculum development, and its results may contribute to the modification or formation of the curriculum. For example, results of formative evaluation may help in:
• Selection of programme components
• Modification of programme elements

Summative evaluation
It is carried out after the curriculum has been offered once or twice, and summarises the merits and demerits of the programme. A curriculum that operates satisfactorily over a period of time may still become obsolete; to prevent this, permanent follow-up of the curriculum and quality control of the programme should be maintained.

Goal-free evaluation methodology:
• Determine what effects the curriculum had, and evaluate them whether or not they were intended.
• Evaluate the actual effects against a profile of demonstrated needs.
• Notice something that everyone else overlooked, or produce a novel overall perspective.
• Do not be under the control of management; choose the variables of the evaluation independently.

Criteria for judging evaluation studies:
• Validity
• Reliability
• Objectivity / credibility
• Importance / timeliness
• Relevance
• Scope
• Efficiency

6. Kirkpatrick's Four Levels of Evaluation
Assessing training effectiveness often entails using the four-level model developed by Donald Kirkpatrick (1994). In this model, each successive evaluation level builds on information provided by the level below it.

According to this model, evaluation should always begin with level one and then, as time and budget allow, move sequentially through levels two, three and four. Information from each prior level serves as a base for the next level's evaluation.

Level 1 - Reaction
Evaluation at this level measures how participants in a training programme react to it. It attempts to answer questions about the participants' perceptions, such as: was the material relevant to their work? This type of evaluation is often called a "smile sheet". According to Kirkpatrick, every programme should be evaluated at least at this level, to provide information for improving the training programme.
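
As a concrete, purely illustrative picture of a "smile sheet", the sketch below averages participants' ratings on a few reaction statements; the statements and the 1-5 rating scale are assumptions for this example, not part of Kirkpatrick's specification.

```python
# Hypothetical Level 1 "smile sheet": participants rate reaction statements
# on a 1-5 scale. Statements and responses are invented for illustration.

responses = {
    "The material was relevant to my work":  [5, 4, 4, 3, 5],
    "The trainer explained concepts clearly": [4, 4, 5, 4, 4],
    "I would recommend this programme":       [5, 3, 4, 4, 5],
}

for statement, ratings in responses.items():
    average = sum(ratings) / len(ratings)
    print(f"{statement}: average rating {average:.1f} / 5 (n={len(ratings)})")
```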

Level 2 - Learning
Assessing at this level moves the evaluation beyond learner satisfaction and attempts to assess the extent to which students have advanced in skills, knowledge or attitude.

To assess the amount of learning that has occurred due to a training programme, level two evaluations often use tests conducted before training (a pretest) and after training (a post-test).
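
One common way to summarise pretest/post-test results (not prescribed by Kirkpatrick, and shown here only as an illustration) is the average raw gain, or a normalised gain that accounts for how much room for improvement each learner had. A small Python sketch with invented scores:

```python
# Illustrative Level 2 calculation: raw and normalised learning gain
# from pretest and post-test scores (out of 100). All scores are invented.

pre_scores  = [45, 60, 55, 70, 30]
post_scores = [75, 80, 70, 85, 65]

# Raw gain: simple difference between post-test and pretest scores.
raw_gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

# Normalised gain: the fraction of the possible improvement actually achieved.
norm_gains = [(post - pre) / (100 - pre)
              for pre, post in zip(pre_scores, post_scores) if pre < 100]

print(f"Average raw gain: {sum(raw_gains) / len(raw_gains):.1f} points")
print(f"Average normalised gain: {sum(norm_gains) / len(norm_gains):.2f}")
```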

Level 3 - Transfer
This level measures the transfer that has occurred in learners' behaviour due to the training programme. Are the newly acquired skills, knowledge or attitude being used in the everyday environment of the learner?

Level 4 - Results
This level measures the success of the programme in terms that managers and executives can understand: increased production, improved quality, decreased costs, reduced frequency of accidents, increased sales, and even higher profits or return on investment.

Level four evaluation attempts to assess training in terms of business results. In the example originally shown, sales transactions improved steadily after training for sales staff took place in April 1997.

Methods for Long-Term Evaluation
• Send post-training surveys
• Offer ongoing, sequenced training and coaching over a period of time
• Conduct follow-up needs assessments
• Check metrics to measure whether participants achieved the training objectives
• Interview trainees and their managers, or their customer groups

Thank you
hansrajhr@ukzn.ac.za
http://talc.ukzn.ac.za