Developing Assessment Instruments. Instructional Design, Unit 3: Design Phase



Criterion-Referenced Tests
Designed to measure explicit behavioral objectives.
Allow instructors to judge how well learners have met the objectives that were set forth.
Used to evaluate:
◦ learner performance
◦ effectiveness of the instruction

Criterion-Referenced Tests
Also called objective-referenced or domain-referenced, because items refer directly to an explicit "criterion" or specified performance.
A criterion-referenced test must:
◦ match each test item to a performance objective
◦ indicate the learner's degree of mastery of the skill
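Because each criterion-referenced item maps to a specific performance objective, scoring can report degree of mastery per objective rather than a single total. A minimal sketch of such a scorer; the item-to-objective mapping, the sample responses, and the 80% mastery cutoff are illustrative assumptions, not fixed rules:

```python
# Sketch: score a criterion-referenced test by objective.
# The item map, responses, and 0.8 cutoff below are illustrative.

def mastery_by_objective(item_map, responses, cutoff=0.8):
    """item_map: {item_id: objective_id}; responses: {item_id: bool}."""
    totals, correct = {}, {}
    for item, objective in item_map.items():
        totals[objective] = totals.get(objective, 0) + 1
        correct[objective] = correct.get(objective, 0) + int(responses.get(item, False))
    return {obj: {"proportion": correct[obj] / totals[obj],
                  "mastered": correct[obj] / totals[obj] >= cutoff}
            for obj in totals}

item_map = {1: "obj1", 2: "obj1", 3: "obj2", 4: "obj2", 5: "obj2"}
responses = {1: True, 2: True, 3: True, 4: False, 5: True}
report = mastery_by_objective(item_map, responses)
# obj1: 2/2 correct -> mastered; obj2: 2/3 correct -> below the 0.8 cutoff
```

The same response data thus yields a mastery decision per skill, which is what the designer needs to locate ineffective instruction.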

Types of Criterion-Referenced Tests
Dick, Carey, and Carey discuss four types of criterion-referenced tests that fit into the design process:
◦ Entry behaviors test
◦ Pretest
◦ Practice tests
◦ Posttests

Types of Criterion Tests
Entry behaviors test:
1. Consists of items that measure entry behavior skills, drawn from the skills below the entry-behavior line in the instructional analysis (not the skills to be taught).
2. Helps determine the appropriateness of the required entry skills.
3. Used during the formative evaluation process; may be discarded in the final version of the instruction.

Types of Criterion Tests
Pretest:
1. Used to determine whether learners have previously mastered some or all of the skills to be included in the instruction.
◦ The entry behaviors test determines whether students are ready to begin your instruction.
◦ The pretest helps determine which skills in your main instructional analysis students may already have mastered.

Types of Criterion Tests
Practice tests:
1. Provide active learner participation during instruction.
2. Enable learners to rehearse the new knowledge and skills they are being taught.
3. Allow instructors to provide corrective feedback that keeps learners on track.

Types of Criterion Tests
Posttest:
1. Administered following instruction; parallel to the pretest.
2. Assesses all the objectives, focusing on the terminal objective.
3. Helps identify ineffective instructional segments.
4. Used during the design process; may eventually be modified to measure only the terminal objective.

Test Type | Designer's Decision | Objectives Typically Tested
Entry behaviors test | Are the learners ready to enter the instruction? Do they possess the required prerequisite skills? | Prerequisite skills: skills below the dotted (entry-behavior) line in the instructional analysis
Pretest | Have learners previously mastered the enabling skills? | Terminal objective; main steps from the goal analysis
Practice test | Are students acquiring the intended knowledge and skills? | A subset of objectives within the goal
Posttest | Have learners achieved the terminal objective? | Terminal objective; main steps and their subordinate skills

Self-check: Using the instructional analysis diagram on this slide, indicate by box number(s) the skills that should be used to develop test items for:
1. Entry behaviors test: ……….
2. Pretest: …………
3. Posttest: ………
(Diagram labels: "Skills for instruction" above the line, "Entry behaviors" below it.)

Designing Tests for Learning Domains
Intellectual skills & verbal information:
◦ paper-and-pencil, short-answer, matching, and multiple-choice items
Attitudes:
◦ state a preference or choose an option
Psychomotor skills:
◦ performance quantified on a checklist
◦ subordinate skills tested in paper-and-pencil format

Determining Mastery Levels
Approach #1:
◦ mastery defined as the level of performance normally expected of the best learners
◦ arbitrary; norm-referenced (group-comparison) methods
Approach #2:
◦ mastery defined in statistical terms: performance beyond mere chance
◦ the mastery level varies with the critical nature of the task (e.g., nuclear work vs. painting a house): it is the level required to be successful on the job
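The "beyond mere chance" idea in Approach #2 can be made concrete with a binomial calculation: on an n-item multiple-choice test with c options per item, find the smallest score whose probability of being reached by pure guessing falls below a chosen significance level. A sketch using only the standard library; the 10-item, 4-option, 5% figures are illustrative assumptions:

```python
from math import comb

def chance_probability(n_items, n_correct, p_guess):
    """P(at least n_correct right out of n_items by guessing alone)."""
    return sum(comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
               for k in range(n_correct, n_items + 1))

def minimum_mastery_score(n_items, n_options, alpha=0.05):
    """Smallest score whose chance probability falls below alpha."""
    p = 1 / n_options
    for k in range(n_items + 1):
        if chance_probability(n_items, k, p) < alpha:
            return k
    return n_items

# Example (illustrative): a 10-item test with 4 options per item.
cutoff = minimum_mastery_score(10, 4)
```

A statistically defensible floor like this is only a lower bound; for critical tasks the designer would set the mastery level at whatever performance the job itself demands.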

Writing Test Items
What should test items do?
◦ Match the behavior of the objective: use the correct verb to specify the behavior.
◦ Match the conditions of the objective.

Writing Test Items
How many test items do you need?
◦ Determined by the learning domain; intellectual skills require three or more items per objective.
◦ For objectives with a wide range of possible items, use a random sample.
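The "use a random sample" advice can be sketched as drawing a fixed number of items per objective from an item bank. The bank contents, the three-items-per-objective figure, and the fixed seed are illustrative assumptions:

```python
import random

def sample_test_items(item_bank, per_objective=3, seed=None):
    """Draw a random sample of items for each objective.

    item_bank: {objective_id: [item, ...]}. If an objective has fewer
    items than requested, all of its items are used.
    """
    rng = random.Random(seed)
    form = {}
    for objective, items in item_bank.items():
        k = min(per_objective, len(items))
        form[objective] = rng.sample(items, k)
    return form

# Illustrative bank: obj1 has 10 candidate items, obj2 only 2.
bank = {"obj1": [f"q{i}" for i in range(1, 11)],
        "obj2": ["q11", "q12"]}
form = sample_test_items(bank, per_objective=3, seed=42)
```

Fixing the seed makes a given test form reproducible; omitting it yields a fresh random form each time, which also helps with the cheating concerns noted later in this chapter.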

Writing Items (continued)
What item types (true/false, multiple choice, etc.) should you use?
◦ Clues are provided by the behavior listed in the objective.
◦ Review "Types of Test Items" in this chapter (p. 148) for each test: entry behaviors test, pretest, practice test, posttest.

Writing Items (continued)
Item types are tempered by:
◦ amount of testing time
◦ ease of scoring
◦ amount of time to grade
◦ probability of guessing
◦ ease of cheating
◦ availability of simulations

Writing Items (continued)
What item types are inappropriate?
◦ True/false items for testing definitions: they measure discrimination, not the ability to state a definition.
◦ When the "best possible" format is unavailable, choose an acceptable alternative: e.g., in place of a simulation, have learners list the steps.

Constructing Test Items
Consider:
◦ vocabulary
◦ setting of the test item (familiar vs. unfamiliar)
◦ clarity: include all necessary information
◦ avoiding trick questions: double negatives, misleading information, etc.

Other Factors
Sequencing items:
◦ consider clustering items by objective
Test directions:
◦ clear and concise
◦ general directions plus section-specific directions
Evaluating tests and test items

Measuring Performance, Products, & Attitudes
◦ Write directions to guide learner activities.
◦ Construct an instrument to evaluate those activities: a product, a performance, or an attitude.
◦ Sometimes the instrument covers both a process and a product.

Test Directions for Performance, Products, & Attitudes
Determine:
◦ the amount of guidance to provide
◦ special conditions (time limits, special steps, etc.)
◦ the nature of the task (i.e., its complexity)
◦ the sophistication level of the audience

Assessment Instruments for Performance, Products, & Attitudes
◦ Identify the elements to be evaluated (e.g., cleanliness, finish, tolerance of the item).
◦ Paraphrase each element.
◦ Sequence the items on the instrument.
◦ Select the type of judgment the rater will make.
◦ Determine how the instrument will be scored.

Formats for Assessments of Performance, Products, & Attitudes
◦ Checklist
◦ Rating scale
◦ Frequency counts
◦ etc.
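The checklist and rating-scale formats differ only in the judgment asked of the rater: present/absent versus a graded value. A minimal scoring sketch for both; the element names, observed values, and the 1-to-5 scale are illustrative assumptions:

```python
def score_checklist(observations):
    """observations: {element: bool}. Returns (points, max_points)."""
    return sum(map(int, observations.values())), len(observations)

def score_rating_scale(ratings, scale_max=5):
    """ratings: {element: int on 1..scale_max}. Returns percent of maximum."""
    for element, value in ratings.items():
        if not 1 <= value <= scale_max:
            raise ValueError(f"{element}: rating {value} outside 1..{scale_max}")
    return 100 * sum(ratings.values()) / (scale_max * len(ratings))

# Illustrative observations of a single product.
checklist = {"cleanliness": True, "finish": True, "within tolerance": False}
ratings = {"cleanliness": 4, "finish": 5, "within tolerance": 3}
```

A frequency count would simply tally how often each behavior occurs during an observation period, so the same dictionary-of-elements shape extends naturally to that format as well.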

Evaluating Congruency
Skills, objectives, and assessments should refer to the same behaviors.
To check for congruency, construct a congruency evaluation chart that includes subskills, behavioral objectives, and test items.

Design Evaluation Chart

Skill | Objective | Assessment Item(s)
1 | Objective 1 | Test item
2 | Objective 2 | Test item
3 | Objective 3 | Test item
Instructional goal | Terminal objective | Test item
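A congruency chart like this can also be checked mechanically: every skill should have an objective, and every objective at least one test item. A sketch of such a check; the chart data below are illustrative assumptions:

```python
def find_congruency_gaps(skills, objectives, items):
    """skills: list of skill ids; objectives: {skill: objective};
    items: {objective: [test item, ...]}. Returns a list of gap messages."""
    gaps = []
    for skill in skills:
        objective = objectives.get(skill)
        if objective is None:
            gaps.append(f"skill {skill}: no objective written")
        elif not items.get(objective):
            gaps.append(f"skill {skill}: objective has no test items")
    return gaps

# Illustrative chart: skill 2's objective has an empty item list,
# and skill 3's objective has no item entry at all.
skills = [1, 2, 3]
objectives = {1: "Objective 1", 2: "Objective 2", 3: "Objective 3"}
items = {"Objective 1": ["item 1a"], "Objective 2": []}
gaps = find_congruency_gaps(skills, objectives, items)
```

An empty `gaps` list means every row of the chart is complete; any message points the designer to a missing objective or test item before the instruction is built.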

Example
Objective 1: Given a research topic and a list of ten Google search results, select the three web sites most appropriate to the research topic.
1. What will learners need to do? Select web sites from a list of search results.
2. What conditions will need to be provided? A predetermined research topic and a list of actual Google search results related to that topic.
3. Domain: Intellectual skills (rules). Students must apply a set of criteria in order to make a decision. This objective will require a fill-in-the-blank test item, as the students will have to write down the three most appropriate sites based on those criteria.
Test Item 1:
◦ Look at the following Google search results (screen capture of search results). Which three web sites are likely to have specific and relevant information on the subject of life on Mars?