Instructional Design and Content Development Workshop

By: Edith Leticia Cerda


Instructional Design and Content Development Workshop Evaluating Learning Effectiveness

Topics
- Define a program evaluation
- Evaluation in learning
- Major reasons to conduct a program evaluation; why conduct an evaluation?
- General evaluation models
- Who is Kirkpatrick? Kirkpatrick's four-level learning evaluation model
- Write questions for each level to assess and evaluate the developed course

What is an evaluation? Evaluation is a systematic method for collecting, analyzing, and using information to answer basic questions about a program. It is valuable for strengthening the quality of the program and improving outcomes for the learners. Evaluation answers basic questions about a program's effectiveness, and evaluation data can be used to improve program services.

Evaluation In Learning Evaluation means assessing the effectiveness and possible improvement of a course. Evaluation is part of any instructional design model. It provides review checkpoints for each phase of ADDIE that allow the instructor to evaluate the work that has been produced. Without completing this portion of the ADDIE model, the e-course is incomplete: the course cannot be redesigned and improved unless evaluation is carried out.

MAJOR REASONS TO CONDUCT AN EVALUATION
- Evaluation finds out "what works" and "what does not work."
- Evaluation showcases the effectiveness of a program to the community.
- Evaluation improves staff's practice with participants.
- Evaluation increases transfer of learning to behavior/performance in order to maximize program results.
- Evaluation increases a program's capacity to conduct a critical self-assessment and plan for the future.
- Evaluation builds knowledge for the out-of-campus time field.

A process or outcome evaluation makes it possible to answer basic questions about a program's effectiveness, including:
- Are participants benefiting from program services?
- Do instructors have the necessary skills and training to deliver services?
- Are participants satisfied with the program?
- Are some sub-groups benefiting, but not others?

Knowing "what works" helps to focus resources on the essential components of the program that benefit participants. Knowing "what does not work" makes it possible to improve and strengthen the service delivery models. Not knowing what is working may waste valuable time and resources.

Why conduct an evaluation? The ADDIE model stresses the concept that good training programs require planning, review, and revision. The evaluation phase focuses on finding gaps so they can be closed. It increases transfer of learning in order to maximize learning / e-content results, and it measures the course's efficiency and locates opportunities to improve learners' performance.

Benefits of Evaluation
- Effectiveness: shows the progress made toward program goals and objectives.
- Best Practices: provides the ability to determine which program approaches are most effective.
- Improvement: provides ongoing assessment of program design and implementation to identify areas for improvement.
- Impact: demonstrates economic or human impact.
- Accountability: provides the basis for interpreting an organization's or program's worth to its stakeholders.
- Promotion & Advocacy: informs policymakers about a program's successes.
- Appraisal & Coordination: gives managers the performance information to make better operational decisions.

General Evaluation Models Many models describe the evaluation process and determine the information needs of the evaluation's intended audiences, for example:
- Tyler's early conception of assessing attainment of program objectives.
- Decision-making evaluation approaches.
- Naturalistic evaluation approaches.
- Kirkpatrick's four levels for evaluating program effectiveness, which suggest the most appropriate evaluation methodology to use.

Who is Kirkpatrick? Donald Kirkpatrick is a professor at the University of Wisconsin in the USA and a past president of the American Society for Training and Development. He developed a very popular evaluation model that has been widely used by the training/learning community. He focused on measuring four kinds of outcomes that should result from a highly effective training program.

Kirkpatrick’s four levels of evaluating learning Kirkpatrick’s model includes four levels or steps of outcome evaluation:
1. Reaction
2. Learning
3. Behavior
4. Results

Level one - Evaluate Learner Reaction How well did the learners like the learning process? The goal is to find out the learners' reaction to the instructor, course and learning environment. The purpose is not to measure what the learners have learned, but whether the delivery method was effective and appreciated. This type of evaluation can be incorporated at the end of the instruction and can be delivered online.

Level one - Evaluate Learner Reaction General questions can include the following:
- Did the instructor attend the sessions on time?
- Did the instructor respond to learners' comments and questions?
- Did the instructor deliver the information clearly and smoothly?
- Were there distractions?
- Did the learners feel comfortable in the surroundings?

E-content questions can include the following:
- Level of appeal of the instruction.
- Ease of navigation and use of tools.
- The ability of the course to motivate and retain interest.
- The amount of interactive exercises.
- The relevance of the objectives.
- Quality and relevance of multimedia.
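Reaction questions like these are often scored on a Likert scale and averaged per question to spot weak areas. A minimal sketch of that aggregation, with hypothetical question names and invented ratings:

```python
# Illustrative sketch (not part of the workshop materials): aggregating
# Level 1 reaction data from a 5-point Likert survey. Question names
# and ratings below are invented examples.
from statistics import mean

# Each response maps a question to a rating from 1 (poor) to 5 (excellent).
responses = [
    {"clarity": 5, "pacing": 4, "navigation": 3},
    {"clarity": 4, "pacing": 4, "navigation": 2},
    {"clarity": 5, "pacing": 3, "navigation": 4},
]

def reaction_summary(responses):
    """Return the mean rating per question across all learners."""
    questions = responses[0].keys()
    return {q: round(mean(r[q] for r in responses), 2) for q in questions}

summary = reaction_summary(responses)
# Questions averaging below a chosen threshold (3.5 here) flag areas
# of the course delivery that may need improvement.
needs_attention = [q for q, score in summary.items() if score < 3.5]
```

With the sample data above, navigation averages 3.0 and is flagged, while clarity and pacing score above the threshold.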

Level two - Evaluate Learning What did learners learn? (the extent to which they gain knowledge, skills and attitudes) The goal is to measure learning results. In other words: did the learners learn what was intended to be taught?

Level two - Evaluate Learning Measurement methods for level two include:
- Formal and informal testing.
- Learner self-assessments at the beginning (pre-test) and end (post-test). http://www.reap.ac.uk/reap/public/papers//DN_SHE_Final.pdf
- Interviews, observation and feedback.
- Product assessment: creation of a project as an authentic assessment to showcase the learner's knowledge and talents. This form of assessment evaluates whether the learner can apply the learned skills or concepts in a concrete fashion.
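One common way to report pre-test/post-test results is the normalized gain, (post − pre) / (max − pre), which expresses improvement relative to the room each learner had to improve. A sketch with hypothetical percentage scores:

```python
# Illustrative sketch (not from the workshop): computing normalized
# learning gain from Level 2 pre-test and post-test scores.
def normalized_gain(pre, post, max_score=100):
    """Fraction of the available improvement a learner achieved."""
    if pre >= max_score:
        return 0.0  # perfect pre-test score: no room to improve
    return (post - pre) / (max_score - pre)

# Hypothetical (pre, post) score pairs for three learners.
scores = [(40, 70), (60, 90), (80, 90)]
gains = [round(normalized_gain(pre, post), 2) for pre, post in scores]
average_gain = round(sum(gains) / len(gains), 2)
```

A low average gain suggests the instruction did not teach what was intended, pointing back to the Design and Development phases of ADDIE.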

Level three - Evaluate Performance What changes in performance resulted from the learning process? To what degree do participants apply what they learned during training when they are back on the job? The goal is to measure what happens when learners leave the classroom and return to their daily lives/jobs. Did the learners put their learning into effect when back on the job? Would the learners be able to transfer their learning to another person? How much transfer of learning occurs? This is the most obvious sign of a training program's effectiveness.

Level three - Evaluate Performance Measurement methods for level three include:
- Allowing time for a change in behavior to take place.
- Testing.
- Observation, interviews and surveys.
- The learner's interactions on the job.
- Evaluating both before and after the program, if that is practical.

Level four - Evaluate Results What are the tangible results of the learning process in terms of improved quality, increased production, efficiency, etc.? To what degree do targeted outcomes occur as a result of the learning event(s) and subsequent reinforcement? The goal is to find out whether the training program led to final results, especially business or environmental results. Level four outcomes include the major results that contribute to the well functioning of an organization: improved quality of work, increased productivity and profits. Measuring across an entire organization is more challenging, however, because many other factors affect organizational and business performance and can cloud the true cause of good or poor results.

Level four - Evaluate Results Measurement methods for level four include:
- Methods of Measuring Learning Outcomes Grid. How colleges and universities can measure and report on the knowledge and abilities their students have acquired during their college years is an issue of growing interest. http://web.mit.edu/tll/assessment-evaluation/methods-of-measuring-learning-outcomes-grid.doc
- http://www.elcamino.edu/academics/slo/docs/SLOFocusOnResults.doc
- Course-embedded assessment / Assurance of Learning Standards. http://www2.cortland.edu/dotAsset/f2e9f1ee-0100-40ab-b78d-48ead9c047d3.pdf
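Level four results are sometimes translated into monetary terms via a return-on-investment calculation (often treated as a fifth level, following Jack Phillips's extension of Kirkpatrick's model). A minimal sketch, with invented figures for illustration:

```python
# Illustrative sketch: expressing Level 4 results as ROI.
# All figures are hypothetical.
def roi_percent(program_benefits, program_costs):
    """ROI as a percentage: net benefits relative to program costs."""
    return (program_benefits - program_costs) / program_costs * 100

# e.g. a program that cost $20,000 and produced $50,000 in measured
# productivity gains returns 150% on the investment.
roi = roi_percent(50_000, 20_000)
```

The hard part in practice is not the arithmetic but isolating how much of the measured benefit is actually attributable to the training, as the slide above notes.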

To Sum Up

Checklist for effective questions

Do Not:
- Use complex phrases
- Frame questions in the negative
- Use abbreviations, contractions or symbols
- Mix different words for the same concept
- Use "loaded" words or phrases
- Combine multiple response dimensions in the same question
- Give the impression that you are expecting a certain response
- Bounce around between topics or time periods
- Insert unnecessary graphics or mix many font styles and sizes
- Forget to provide instructions for returning the completed survey

Do:
- Give clear instructions
- Keep question structure simple
- Ask one question at a time
- Maintain a parallel structure for all questions
- Define terms before asking the question
- Be explicit about the period of time being referenced by the question
- Provide a list of acceptable responses to closed questions
- Ensure that response categories are both exhaustive and mutually exclusive
- Label response categories with words rather than numbers
- Ask for number of occurrences, rather than providing response
- Save personal and demographic questions for the end of the survey

Activity Based on your study of Kirkpatrick's model for evaluating learning/training programs, apply the model to evaluate your course using its four levels. Write questions that explore the items within each level, and describe how you will collect information from participants.

ONLINE DETAILED COURSE MANUAL http://www.kirkpatrickpartners.com/Portals/0/Training-Events/Program%20Prework/Kirkpatrick%20Four%20Levels(tm)%20Evaluation%20Online%20Certificate%20Manual.pdf

Thanks for Attending