Lecture 8A: Designing and Conducting Formative Evaluations. English Study Program, FKIP, UNSRI, 2014.

The Dick and Carey Model: Assess Needs to Identify Goal(s); Conduct Instructional Analysis; Analyze Learners and Contexts; Write Performance Objectives; Develop Assessment Instruments; Develop Instructional Strategy; Develop and Select Instructional Materials; Design and Conduct Formative Evaluation of Instruction; Revise Instruction; Design and Conduct Summative Evaluation.

Objectives: Describe the various stages of formative evaluation. Describe the instruments used in a formative evaluation.

Concepts: Formative evaluation is the process designers use to obtain data that can be used to revise their instruction to make it more efficient and effective. The emphasis in formative evaluation is on the collection and analysis of data and the revision of the instruction.

Three basic phases of formative evaluation: a one-to-one (clinical) evaluation, a small-group evaluation, and a field trial.
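The lecture later attaches rough participant counts to each of these phases (three or more learners for one-to-one, eight to twenty for the small group, about thirty for the field trial). A minimal Python sketch of those guidelines as a lookup table; the function and field names are my own, not anything prescribed by the lecture:

```python
# Phase guidelines as given in the lecture; max of None means "no stated upper bound".
PHASES = [
    {"phase": "one-to-one", "min_learners": 3, "max_learners": None},
    {"phase": "small-group", "min_learners": 8, "max_learners": 20},
    {"phase": "field trial", "min_learners": 30, "max_learners": None},
]

def enough_learners(phase_name, n):
    """Check whether n learners meets the guideline for the named phase."""
    for p in PHASES:
        if p["phase"] == phase_name:
            lo, hi = p["min_learners"], p["max_learners"]
            return n >= lo and (hi is None or n <= hi)
    raise KeyError(phase_name)
```

For example, twelve learners satisfies the small-group guideline, while twenty-five exceeds it.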

Role of subject-matter, learning, and learner specialists in formative evaluation: A subject-matter expert (SME) comments on the accuracy and currency of the instruction. A specialist in the type of learning outcome involved can critique your instructional strategy in light of what is known about enhancing that particular type of learning. It is also helpful to share the first draft of the instruction with a person who is familiar with the target population.

One-to-one evaluation with learners. The purpose of the first stage of formative evaluation, the one-to-one stage, is to identify and remove the most obvious errors in the instruction and to obtain initial performance indications and learners' reactions to the content. Criteria: clarity, impact, feasibility.
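The three criteria above (clarity, impact, feasibility) can anchor the designer's note-taking during a one-to-one session. Below is a hypothetical sketch of such a session log; the class and method names are assumptions for illustration, not part of the lecture:

```python
from collections import defaultdict

class OneToOneLog:
    """Records a designer's observations from one session, keyed by criterion."""

    CRITERIA = ("clarity", "impact", "feasibility")

    def __init__(self, learner_id):
        self.learner_id = learner_id
        self.notes = defaultdict(list)  # criterion -> list of (page, observation)

    def note(self, criterion, page, observation):
        if criterion not in self.CRITERIA:
            raise ValueError(f"unknown criterion: {criterion}")
        self.notes[criterion].append((page, observation))

    def problems_for(self, criterion):
        return self.notes[criterion]

# Hypothetical usage with invented observations:
log = OneToOneLog("learner-1")
log.note("clarity", 4, "vocabulary in example 2 unfamiliar")
log.note("feasibility", 9, "audio clip would not play on lab machine")
```

Grouping notes by criterion makes it easier, after several sessions, to see whether recurring problems are about wording, learning impact, or practical delivery.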

One-to-one evaluation with learners. Selecting learners: During this stage, the designer works individually with three or more learners who are representative of the target population. Data collection.

One-to-one evaluation with learners. Procedures: The typical procedure in a one-to-one evaluation is to explain to the learner that a new set of instructional materials has been designed and that you would like his or her reaction to them. The first critical hallmark of the one-to-one formative evaluation is that it is almost totally dependent on the ability of the designer to establish rapport with the learner and then to interact effectively. The second critical hallmark of the one-to-one approach is that it is an interactive process.

One-to-one evaluation with learners. Other considerations at this stage: assessments and questionnaires, learning time, data interpretation, and outcomes.

Small-group evaluation. Purposes of the small-group evaluation: The first is to determine the effectiveness of changes made following the one-to-one evaluation and to identify any remaining learning problems that learners may have. The second is to determine whether learners can use the instruction without interacting with the instructor.

Small-group evaluation. Criteria and data: Typical measures used to evaluate instructional effectiveness include learner performance scores on pretests and posttests.
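Since the main measures are pretest and posttest scores, the data summary typically reduces to per-learner gains and group means. A minimal sketch, with hypothetical learner IDs and percent scores:

```python
def summarize_scores(pretest, posttest):
    """Return mean pretest, mean posttest, and mean gain (post - pre).

    pretest/posttest: dicts mapping learner id -> percent score.
    """
    learners = sorted(pretest)
    gains = {lid: posttest[lid] - pretest[lid] for lid in learners}
    n = len(learners)
    return {
        "mean_pre": sum(pretest.values()) / n,
        "mean_post": sum(posttest.values()) / n,
        "mean_gain": sum(gains.values()) / n,
        "gains": gains,
    }

# Invented scores for three small-group participants:
pre = {"s1": 40, "s2": 55, "s3": 35}
post = {"s1": 80, "s2": 85, "s3": 60}
result = summarize_scores(pre, post)
```

A small mean gain, or one learner with no gain at all, points the designer back to specific materials for revision.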

Small-group evaluation. Selecting learners: For the small-group evaluation, you should select a group of approximately eight to twenty learners. Procedures: The evaluator begins by explaining that the materials are in the formative stage of development and that it is necessary to obtain feedback on how they may be improved. Having said this, the instructor then administers the materials in the manner in which they are intended to be used when they are in final form. If a pretest is to be used, it should be given first.

Small-group evaluation. Other considerations: assessment and questionnaires, data summary and analysis, and outcomes.

Field trial. Purposes: One purpose is to determine whether the changes in the instruction made after the small-group stage were effective. Another is to see whether the instruction can be used in the context for which it was intended.

Field trial. Location of evaluation: In picking the site for a field evaluation, you are likely to encounter one of two situations. Criteria and data. Selecting learners: You should identify a group of about thirty individuals to participate in your field trial.

Field trial. Procedure for conducting the field trial: It is similar to that for the small group, with only a few exceptions. The primary change is in the role of the designer. The only other change might be a reduction in testing.

Field trial. Data summary and interpretation: Data summary and analysis procedures are the same for the small-group and field trials. Outcomes.

Formative evaluation in the performance context. The process described next, for doing formative evaluation in the performance context, could be used after any of the three phases of learning-context formative evaluation. The purpose of the in-context formative evaluation is fundamentally to determine three things.

Formative evaluation in the performance context. Selecting respondents. Procedure: At the completion of the formative evaluation, the learners should be told that they will be contacted sometime in the future to discuss the instruction they have just completed and its usefulness. Then, when sufficient time has passed to permit the skills to be used, the learners should be contacted. Outcomes.

Collecting data on reactions to instruction. Because the purpose of formative evaluation is to pinpoint specific errors in the materials in order to correct them, the evaluation design needs to yield information about the location of, and reasons for, any problems. Five questions are appropriate for all materials: (1) Are the materials appropriate for the type of learning outcome? (2) Do the materials include adequate instruction on the subordinate skills, and are these skills sequenced and clustered logically?

Collecting data on reactions to instruction. (3) Are the materials clear and readily understood by representative members of the target group? (4) What is the motivational value of the materials? (5) Can the materials be managed efficiently in the manner in which they are mediated?

The types of data you will probably want to collect: Test data collected on entry-behavior tests, pretests, posttests, and in the performance context. Comments or notations made by learners to you, or marked on the instructional materials, about difficulties encountered at particular points. Data collected on attitude questionnaires and/or debriefing comments in which learners reveal their overall reactions to the instruction and their perceptions of where difficulties lie with the materials and the instructional procedures in general.

The types of data you will probably want to collect (continued): The time required for learners to complete various components of the instruction. Reactions of the subject-matter specialist. Reactions of a manager or supervisor who has observed the learner using the skills in the performance context.
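Two of the data types above, completion times and attitude ratings, lend themselves to a simple screening pass that flags sections of the instruction needing revision. A hedged sketch; the thresholds, section names, and sample values are invented for illustration:

```python
def flag_sections(times, ratings, max_minutes=30, min_rating=3.5):
    """Return section names whose mean completion time exceeds max_minutes
    or whose mean attitude rating (1-5 scale) falls below min_rating.

    times/ratings: dicts mapping section name -> list of per-learner values.
    """
    flagged = []
    for section in times:
        mean_time = sum(times[section]) / len(times[section])
        mean_rating = sum(ratings[section]) / len(ratings[section])
        if mean_time > max_minutes or mean_rating < min_rating:
            flagged.append(section)
    return flagged

# Invented data for two sections of a lesson:
times = {"intro": [12, 15, 10], "practice": [45, 50, 38]}
ratings = {"intro": [4, 5, 4], "practice": [3, 2, 3]}
flagged = flag_sections(times, ratings)
```

A flagged section is only a starting point; the learners' marginal notes and debriefing comments tell you why it ran long or rated poorly.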

Formative evaluation of selected materials. Preparation for the field trial of existing materials should be made just as it would be for a field trial of original materials. An analysis should be made of existing documentation on the development of the materials, the effectiveness of the materials with a defined group, and particularly any description of procedures used during field evaluations.

Formative evaluation of selected materials. Descriptions of how the materials are to be used should be studied, any test instruments that accompany the materials should be examined for their relationship to the performance objectives, and the need for any additional evaluations or attitude questionnaires should be determined.

Formative evaluation of instructor-led instruction. The purposes here are much the same as for the formative evaluation of independent instructional materials: to determine whether the instruction is effective and to decide how to improve it. Once again, the formative evaluation of an instructional plan most nearly approximates the field-trial phase for instructional materials. Very often, the field testing of selected materials and the field testing of instructor-led instruction are interwoven.

Concerns influencing formative evaluation. Context concerns: 1. Ensure that any technical equipment is operating effectively. 2. It is also important in the early stages of formative evaluation, especially in the one-to-one trials, to work with learners in a quiet setting, one in which you can command their full attention.

Concerns influencing formative evaluation. Concerns about learners: 1. In selecting learners for participation in any phase of formative evaluation, avoid depending entirely on the instructor to assess the learners' entry knowledge. 2. Learners who do not have the entry behaviors should also be included in a formative evaluation.

Concerns influencing formative evaluation. Concerns about formative evaluation outcomes, and concerns with implementing formative evaluation: 1. The first consideration should be to determine whether any kind of formative evaluation can be conducted before formal use of the instruction. 2. If instruction is being used with the target population without the benefit of any formative evaluation, it is still possible to use that opportunity to gather information for revising the instruction.

Any questions? If not, end of lecture.