Program Evaluation: Making sure instruction works.

By: Edith Leticia Cerda

Presentation transcript:

Program Evaluation: Making sure instruction works.

Designers: Mark Bullard, Carmela DeRoy, Ashley McCormick, Lucia Salters, Susan Trejo

Overview of Proposed Evaluation Plan This is a new version of the EMT training course. The plan uses a variety of evaluation tools to verify that the new course materials are congruent with the goals and objectives set forth by the client, that the materials and technology are effective, and that the new course offers a positive return on investment (ROI) for the client.

Design Reviews At this stage of evaluation the design will be compared to the goals stated by the client. Functionality, congruence, pacing, relevancy, strategies for gaining and maintaining attention, and the tools, technologies, and media will all be reviewed.
Goal Review: Does the instruction meet the needs assessment?
Task Analysis Review: Are the prerequisite skills identified in the task analysis clearly stated? Are the learners accurately portrayed, and are the lessons correctly designed for the target audience?
Review of Assessments: Do the assessments and benchmarks precisely reflect the course objectives? Will the assessments clearly show which students are learning the material and which are not?
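
One lightweight way to make these reviews actionable is to record each review question as a pass/fail item with reviewer notes, so failed items become revision tasks. A minimal sketch in Python; the question wording comes from the slide above, but the data structure and field names are illustrative, not part of the plan:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReviewItem:
    """One design-review question with the reviewer's finding."""
    category: str                  # e.g. "Goal Review", "Task Analysis Review"
    question: str
    passed: Optional[bool] = None  # None until the review happens
    notes: str = ""

design_review = [
    ReviewItem("Goal Review", "Does the instruction meet the needs assessment?"),
    ReviewItem("Task Analysis Review", "Are prerequisite skills clearly stated?"),
    ReviewItem("Task Analysis Review", "Are lessons designed for the target audience?"),
    ReviewItem("Review of Assessments", "Do assessments reflect the course objectives?"),
]

# After the review meeting, any failed item becomes a revision task.
revision_tasks = [item for item in design_review if item.passed is False]
```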

Small Group Review At this stage of evaluation the design will be used by a focus group of students to test for understandability, clarity, and congruence, and to see whether a representative sample of students has the required prerequisite skills to succeed in the class. The focus group will complete a short questionnaire that uses a Likert scale to rate aspects of the course, followed by a discussion between the designers and the focus group that explores the questionnaire ratings in more depth. Revisions will be made according to the students' feedback. Sample questions:
Was the technology easily understood and usable?
Was the information presented above your prior knowledge level?
Was enough time spent on each section? Was too much time spent on any section?
Were the objectives clearly stated and an obvious part of the instruction?
Were the assessments clear, and did they correlate to the stated objectives? Were there any erroneous or confusing questions?
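
To turn the focus group's questionnaire into an agenda for the follow-up discussion, the ratings for each question can be averaged and sorted from weakest to strongest. A minimal sketch, assuming a 1-4 Likert scale; the question keys and response values are invented for illustration:

```python
from statistics import mean

# Hypothetical responses: one dict per focus-group student,
# rating each aspect of the course on a 1-4 Likert scale (4 = best).
responses = [
    {"technology_usable": 4, "pacing_adequate": 2, "objectives_clear": 3},
    {"technology_usable": 3, "pacing_adequate": 2, "objectives_clear": 4},
    {"technology_usable": 4, "pacing_adequate": 1, "objectives_clear": 3},
]

questions = responses[0].keys()
averages = {q: mean(r[q] for r in responses) for q in questions}

# Discuss the lowest-rated aspects first in the follow-up session.
for question, score in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{question}: {score:.2f}")
```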

Expert Review Before the course goes live, an expert in the Emergency Medical Technician field will be brought in to rate the course. Again, a Likert scale will be used, with space for additional comments. An instructional design review will ask the expert to rate the multimedia, the instructional strategies, and the pace of the course. A visual design review will ask the expert to rate the visual aspects of the materials, to ensure that all graphics are correct, pleasing, and relevant. Finally, a functionality review will ask the expert to rate the functionality of the learning course.

Level One Evaluation Level one evaluations are surveys given to students at the end of a course. Since this is a new course, facilitators need to know whether the class is effective, and students are in the best position to critique it. A Likert scale will be used to survey the students, with comment space at the end of each section so students can elaborate on their answers. The evaluations will be anonymous to ensure candid responses. Level one evaluations can be used to make minor changes to the course: facilitators can use them to suggest changes, and the client will use them for insight into which areas need improvement. Benchmarking will be applied, with any item scoring below 3 on a 4-point scale considered for change. Reasonable changes can be made during this phase, but major changes to the instructional strategy cannot be based on the level one evaluation alone.
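
The benchmarking rule on this slide (flag any item averaging below 3 on the 4-point scale) is simple to automate once the anonymous responses are tallied. A minimal sketch; the survey items and scores are invented for illustration:

```python
from statistics import mean

BENCHMARK = 3.0  # items averaging below this on the 4-point scale are flagged

# Hypothetical anonymous end-of-course ratings, one list per survey item.
survey_items = {
    "materials_quality":  [4, 3, 4, 4, 3],
    "pace_of_course":     [2, 3, 2, 2, 3],
    "assessment_clarity": [3, 4, 3, 4, 4],
}

flagged = {item: mean(scores)
           for item, scores in survey_items.items()
           if mean(scores) < BENCHMARK}

# Only minor revisions happen at this level; flagged items are
# candidates for change, not grounds for an instructional-strategy overhaul.
print(flagged)  # {'pace_of_course': 2.4}
```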

Level Three Evaluation Six months after completing the course, students will be assessed to measure knowledge retention. The results of this level three evaluation will show whether the training course better prepares EMTs for the field and whether the new course had a positive impact on participants. This will give the client insight into whether to continue offering the course as is or to overhaul it and make major changes in delivery methods. The baseline for the retention comparison will come from the level one evaluations and the assessment data collected during the class. Students will be observed on the job, and a checklist will be used to log performance data. Knowledge levels before and during training will be compared with knowledge levels six months after training.
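
One way to quantify that before/after comparison is to match each EMT's end-of-course assessment score against the six-month score and compute a retention ratio. A minimal sketch; the scores and the 90% retention threshold are assumptions for illustration, not figures from the plan:

```python
from statistics import mean

# Hypothetical matched scores per student: (end of course, six months later)
scores = {
    "student_a": (92, 88),
    "student_b": (85, 70),
    "student_c": (78, 75),
}

retention = {s: after / before for s, (before, after) in scores.items()}
print(f"mean retention: {mean(retention.values()):.0%}")

# Students retaining less than an assumed 90% of their end-of-course
# score may point to content that needs reinforcement or redesign.
low_retention = [s for s, r in retention.items() if r < 0.90]
print(low_retention)  # ['student_b']
```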

Level Four Evaluations The level four evaluation will determine whether the cost of training is offset by the outcomes of training. It will take place at least a year after initial training and will coincide with yearly performance reviews. The information it provides will show whether the training adequately prepared participants to work as EMTs. If students who took the new course perform better than students who took the previously offered course, officials can be confident that the new course provides a positive ROI, and the course will continue to be offered. The data used to compare participants' performance will be gathered during initial training through observation and assessment scores. A control group, made up of EMTs who did not participate in the new training, will serve as the performance baseline. All information will be gathered through the normal testing administered by the Emergency Services Department. Students from both the new training and the old training will be tracked for the first year to provide comparison data.
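
The ROI question can be framed with the standard training-ROI formula, ROI = (monetized benefit − training cost) / training cost, alongside a comparison of the new-course cohort against the control group. A minimal sketch; all scores and dollar figures are invented assumptions, not numbers from the plan:

```python
from statistics import mean

# Hypothetical first-year performance scores from the Emergency Services
# Department's normal testing, for the new-course cohort and the control group.
new_course_group = [88, 91, 84, 90, 87]
control_group    = [80, 83, 78, 85, 81]

performance_gain = mean(new_course_group) - mean(control_group)
print(f"mean gain over control: {performance_gain:.1f} points")

# Standard training-ROI formula; the benefit and cost figures are assumptions.
training_cost = 50_000       # cost of developing and running the new course
monetized_benefit = 65_000   # estimated value of improved field performance

roi = (monetized_benefit - training_cost) / training_cost
print(f"ROI: {roi:.0%}")     # a positive ROI supports keeping the course
```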

Contextual Factors That Affect Results
Level one evaluation: Implementation factors can affect the outcome. If participants are not given enough time, they will rush through the evaluation and not give thoughtful answers.
Level three evaluation: If there is a time lapse between completing the course and being instated as working EMTs, participants may lose knowledge in the interim.
Level four evaluation: The length of time between the completion of the course and the level four evaluation may pose a problem; management may want faster answers to their questions about the success of the course and its ROI. Level four evaluations are also cost-prohibitive, and it is difficult to separate out outside variables that affect the performance of the participants and the control group on any given day.