Developing Assessment Instruments (Instructional Design: Unit 3, Design Phase)


1 Developing Assessment Instruments (Instructional Design: Unit 3, Design Phase)

2 Criterion-Referenced Tests
Designed to measure explicit behavioral objectives.
Allow instructors to decide how well learners have met the objectives that were set forth.
Used to evaluate:
◦ learner performance
◦ effectiveness of the instruction

3 Criterion-Referenced
Also called objective-referenced or domain-referenced.
Refers directly to an explicit “criterion,” or specified performance.
A criterion-referenced test must:
◦ match each test item to a performance objective
◦ indicate the degree of mastery of the skill

4 Types of Criterion-Referenced Tests
Dick, Carey, and Carey discuss four different types of criterion-referenced tests that fit into the design process:
◦ Entry behaviors test
◦ Pretest
◦ Practice tests
◦ Posttests

5 Types of Criterion Tests
Entry behaviors test:
1. Consists of items that:
◦ measure entry behavior skills, rather than the skills to be taught
◦ draw from skills below the entry behavior line in the instructional analysis
2. Helps determine the appropriateness of the required entry skills.
3. Used during the formative evaluation process; may be discarded in the final version of the instruction.

6 Types of Criterion Tests
Pretest:
1. Used to determine whether learners have previously mastered some or all of the skills to be included in the instruction.
◦ The entry behaviors test determines whether students are ready to begin your instruction.
◦ The pretest helps determine which skills in your main instructional analysis students may already be familiar with.

7 Types of Criterion Tests
Practice test:
1. Provides active learner participation during instruction.
2. Enables learners to rehearse the new knowledge and skills they are being taught.
3. Allows instructors to provide corrective feedback to keep learners on track.

8 Types of Criterion Tests
Posttest:
1. Administered following instruction; parallel to the pretest.
2. Assesses all the objectives, focusing on the terminal objectives.
3. Helps identify ineffective instructional segments.
4. Used during the design process; may eventually be modified to measure only the terminal objectives.

9 Test Types at a Glance
Entry behaviors test
◦ Designer’s decision: Are the learners ready to enter instruction? Do learners possess the required prerequisite skills?
◦ Objectives typically tested: skills below the dotted line in the instructional analysis
Pretest
◦ Designer’s decision: Have learners previously mastered the enabling skills?
◦ Objectives typically tested: terminal objectives; main steps from the goal analysis
Practice test
◦ Designer’s decision: Are students acquiring the intended knowledge and skills?
◦ Objectives typically tested: a subset of objectives within the goal
Posttest
◦ Designer’s decision: Have learners achieved the terminal objectives?
◦ Objectives typically tested: terminal objectives; main steps and their subordinate skills

10 Using the instructional analysis diagram on this slide, indicate by box number(s) the skills that should be used to develop test items for:
1. Entry behaviors test: ……….
2. Pretest: …………
3. Posttest: ………..
[Diagram: instructional analysis with boxes numbered 1-14; boxes 1-4 sit below the entry behaviors line, and boxes 5-13 are the skills for instruction.]

11 Designing Tests for Learning Domains
Intellectual & verbal information:
◦ paper-and-pencil, short-answer, matching, and multiple-choice items
Attitudinal:
◦ state a preference or choose an option
Psychomotor:
◦ performance quantified on a checklist
◦ subordinate skills tested in paper-and-pencil format

12 Determining Mastery Levels
Approach #1:
◦ mastery defined as the level of performance normally expected from the best learners
◦ arbitrary (norm-referenced, group-comparison methods)
Approach #2:
◦ defined in statistical terms: performance beyond mere chance (see the sketch below)
◦ the mastery level varies with the critical nature of the task (e.g., nuclear work vs. painting a house)
◦ set at the level required to be successful on the job
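To make “beyond mere chance” concrete, here is a minimal Python sketch (not from the slides; the item count, four-option format, and cut score are hypothetical) that computes how likely a given score is under pure guessing:

from math import comb

def chance_probability(n_items: int, n_correct: int, p_guess: float = 0.25) -> float:
    """Binomial tail: probability of scoring at least n_correct out of
    n_items by pure guessing, with p_guess chance of success per item."""
    return sum(
        comb(n_items, k) * p_guess**k * (1 - p_guess) ** (n_items - k)
        for k in range(n_correct, n_items + 1)
    )

# Ten 4-option multiple-choice items: how likely is 8/10 by guessing alone?
print(f"{chance_probability(10, 8):.4f}")  # ~0.0004, so 80% is well beyond chance

A cut score whose guessing probability is near zero can reasonably be read as evidence of mastery rather than luck.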

13 Writing Test Items
What should test items do?
◦ match the behavior of the objective (use the correct verb to specify the behavior)
◦ match the conditions of the objective

14 Writing Test Items
How many test items do you need?
◦ determined by the learning domain
◦ intellectual skills require three or more items per objective
◦ for a wide range of possible items, select a random sample (see the sketch below)
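As one way to operationalize the random-sample advice, here is a minimal Python sketch; the item bank, objective names, and per-objective count are all hypothetical placeholders:

import random

# Hypothetical item bank keyed by objective; all labels are illustrative only.
item_bank = {
    "objective_1": ["item_1a", "item_1b", "item_1c", "item_1d", "item_1e"],
    "objective_2": ["item_2a", "item_2b", "item_2c", "item_2d"],
}

def draw_items(bank, per_objective=3):
    """Randomly sample up to per_objective items for each objective, so every
    intellectual-skill objective is still covered by multiple items."""
    return {
        obj: random.sample(items, min(per_objective, len(items)))
        for obj, items in bank.items()
    }

print(draw_items(item_bank))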

15 Writing Items (continued)
What item types (true/false, multiple choice, etc.) should you use?
◦ Clues are provided by the behavior listed in the objective.
◦ Review “Types of Test Items” in this chapter (p. 148).
This applies to each test type:
◦ Entry behaviors test
◦ Pretest
◦ Practice test
◦ Posttest

16 Writing Items (continued)
Item type choices are tempered by:
◦ amount of testing time
◦ ease of scoring
◦ amount of time to grade
◦ probability of guessing
◦ ease of cheating, etc.
◦ availability of simulations

17 Writing Items (continued)
What item types are inappropriate?
◦ true/false for a definition objective: it tests discrimination, not the definition itself
Choose acceptable alternatives when the “best possible” item type is not feasible:
◦ for simulations, for example, an alternative is to have learners list the steps

18 Constructing Test Items
Consider:
◦ vocabulary
◦ setting of the test item (familiar vs. unfamiliar)
◦ clarity (include all necessary information)
◦ avoiding trick questions (double negatives, misleading information, etc.)

19 Other Factors
Sequencing items:
◦ consider clustering items by objective (see the sketch below)
Test directions:
◦ clear and concise
◦ both general and section-specific
Evaluating tests and test items
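Clustering by objective can be done mechanically once each item is tagged with its objective. A minimal Python sketch, with hypothetical item and objective labels:

from collections import defaultdict

# Hypothetical (item, objective) pairs; labels are placeholders.
items = [("Q1", "obj_2"), ("Q2", "obj_1"), ("Q3", "obj_2"), ("Q4", "obj_1")]

clusters = defaultdict(list)
for item, objective in items:
    clusters[objective].append(item)

# Sequence the test so items for the same objective appear together.
sequenced = [item for obj in sorted(clusters) for item in clusters[obj]]
print(sequenced)  # ['Q2', 'Q4', 'Q1', 'Q3']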

20 Measuring Performance, Products, & Attitudes
Write directions to guide learner activities.
Construct an instrument to evaluate these activities:
◦ a product, performance, or attitude
◦ sometimes includes both a process and a product

21 Test Directions for Performance, Products, & Attitudes
Determine:
◦ the amount of guidance
◦ any special conditions (time limits, special steps, etc.)
◦ the nature of the task (i.e., its complexity)
◦ the sophistication level of the audience

22 Assessment Instruments for Performance, Products, & Attitudes
◦ Identify the elements to be evaluated (cleanliness, finish, tolerance of the item, etc.)
◦ Paraphrase each element
◦ Sequence the items on the instrument
◦ Select the type of judgment the rater will make
◦ Determine how the instrument will be scored

23 Formats for Assessments of Performance, Products, & Attitudes
◦ Checklist
◦ Rating scale
◦ Frequency counts
◦ etc.
A sketch contrasting the first three formats follows.
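For illustration only, here is a minimal Python sketch of how the three formats differ in the judgment they ask of a rater; the elements and behaviors are hypothetical, loosely based on the house-painting example from slide 12:

from collections import Counter

# Checklist: yes/no judgment per element.
checklist = {"surface cleaned": True, "edges taped": True, "coat applied evenly": False}
checklist_score = sum(checklist.values())  # 2 of 3 elements present

# Rating scale: graded judgment per element (here 1 = poor, 3 = excellent).
ratings = {"surface cleaned": 3, "edges taped": 2, "coat applied evenly": 1}
rating_score = sum(ratings.values())  # 6 of a possible 9

# Frequency count: tally how often a behavior is observed during the performance.
observed = ["drip", "drip", "missed spot"]
frequency = Counter(observed)  # Counter({'drip': 2, 'missed spot': 1})

print(checklist_score, rating_score, frequency)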

24 Evaluating Congruency
Skills, objectives, and assessments should refer to the same behaviors.
To check for congruency:
◦ construct a congruency evaluation chart that includes subskills, behavioral objectives, and test items

25 Design Evaluation Chart (Skill / Objective / Assessment Item(s))
◦ Skill 1 / Objective 1 / Test item
◦ Skill 2 / Objective 2 / Test item
◦ Skill 3 / Objective 3 / Test item
◦ Instructional goal / Terminal objective / Test item
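Such a chart is also easy to audit programmatically. A minimal Python sketch (the chart rows are placeholders, not from the book) that flags rows where skills, objectives, and assessments fail to line up:

# Hypothetical chart rows; skill, objective, and item labels are placeholders.
chart = [
    {"skill": "Skill 1", "objective": "Objective 1", "items": ["Test item 1"]},
    {"skill": "Skill 2", "objective": "Objective 2", "items": ["Test item 2"]},
    {"skill": "Skill 3", "objective": "Objective 3", "items": []},
]

def congruency_gaps(rows):
    """Return the skills whose row is missing an objective or any test item,
    i.e., places where skills, objectives, and assessments are not congruent."""
    return [r["skill"] for r in rows if not r["objective"] or not r["items"]]

print(congruency_gaps(chart))  # ['Skill 3'] -> no assessment item written yet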

26 Example
Objective 1: Given a research topic and a list of ten Google search results, select the three web sites most appropriate to the research topic.
1. What will learners need to do? Select web sites from a list of search results.
2. What conditions will need to be provided? A predetermined research topic and a list of actual Google search results related to that topic.
3. Domain: intellectual skills (rules). Students have to apply a set of criteria in order to make a decision. This objective will require a fill-in-the-blank test item, as students will have to write down the three most appropriate sites based on those criteria.
Test Item 1:
◦ Look at the following Google search results (screen capture of search results shown). Which three web sites are likely to have specific and relevant information on the subject of Life on Mars?

