Principles of Instructional Design: Assessment
Recap
Checkpoint Questions
Prerequisite Skill Analysis
Why do we need better assessments?
Students given instruction aimed at conceptual understanding do better on skills tests than students drilled on the skills directly.
The more we focus on raising test scores, the more instruction is distorted and the less credible the scores themselves become.
Assessment tasks should be redesigned to closely resemble real learning tasks.
Assessing Learner Outcomes
“How will we know when we’re there?” TESTING!
Reasons to Test
Diagnose student difficulty
Placement
Check student progress
Report what skills the student has mastered
Provide information about the type of revision required
Standards
Alignment
CONTENT – OBJECTIVE – TEST ITEM
Choosing an Assessment Method
Choose a method that
–matches the target of the instruction
–aligns with the method of instruction
Assessment Methods
Selected Response
–True/False
–Multiple Choice
–Matching
Performance Based – Constructed Response
–Short Answer
–Fill in the Blank
–Show your work
Assessment Methods
Performance Based – Motor Skills
–Plan and carry out a performance (athletic skills, dance, speeches, readings, demonstrations)
Product Based
–Plan, produce, and use a product (art exhibits, dioramas, research papers, poems, videotapes)
Assessment Methods
Process
–Use a process to examine individual steps/tasks
Portfolio
–Purposeful collections of student work
–Contains combinations of other assessments
Interview/Observation
Categories of Assessment
Criterion-referenced (objective-referenced)
Norm-referenced
Criterion-Referenced / Objective-Referenced
Tests and evaluates student progress
Provides information about the effectiveness of the instruction
Identifies gaps in learning
Measures the performance of all students compared to the number of subordinate skills or objectives passed
Measures competence
Norm-Referenced Tests
Compare the performance of students with each other.
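The two categories imply different score interpretations: a criterion-referenced score is read against the objectives themselves, while a norm-referenced score is read against the rest of the group. A minimal sketch of the contrast (the class scores and the 80% cut score are assumptions made for illustration, not values from the slides):

def criterion_referenced(objectives_passed: int, total_objectives: int, cut: float = 0.80) -> dict:
    """Report competence against the objectives themselves (criterion-referenced)."""
    proportion = objectives_passed / total_objectives
    return {"proportion_mastered": proportion, "mastered": proportion >= cut}

def norm_referenced(score: float, all_scores: list[float]) -> float:
    """Report standing relative to the other learners (percentile rank)."""
    below = sum(1 for s in all_scores if s < score)
    return 100 * below / len(all_scores)

scores = [55, 62, 70, 70, 78, 85, 91]   # hypothetical class scores
print(criterion_referenced(9, 12))      # e.g. 9 of 12 objectives passed
print(norm_referenced(78, scores))      # percentile rank within this class

The first function answers "what can this learner do?"; the second only answers "how does this learner rank?", which is why norm-referenced scores say little about mastery of specific objectives.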
Activity
On page 51 of the workbook, determine whether each item is criterion-referenced or norm-referenced.
Testing Types for Instructional Design
Entry skills tests
–Cover items below the entry behaviors line
Pretests
–Include all items appearing above the entry behaviors line
Posttests
–Do not include entry behaviors; include all subskills
Embedded tests
–Part of the instructional strategy; may appear every few pages or after a major sequence of instruction
Activity
Practice, page 95, Smith & Ragan
What Is Mastery?
How do instructional designers respond to the concept of the normal curve?
How would you set mastery for a course?
What are critical skills?
Test Construction
Issues related to validity of items
Number of items to write
Item types by learning domain
Validity
Do the test items actually measure what they are intended to measure?
Are the test items aligned with the objectives?
Are the items representative of the range of possible items?
Are the objectives adequately sampled?
Number & Type of Item (by learning domain)
Intellectual skills
Verbal information
Psychomotor skills
Cognitive strategies
Attitudes
Activity: Are These Items Valid?
Look closely at the objective and the possible test item on page 52 of the workbook.
Are they aligned? Can a valid assessment be obtained?
Reliability
Do the items consistently measure what they claim to measure, and is there a high degree of confidence in the scores produced?
EXAMPLE: If the same assessment were given on another day to the same learners, would their scores be basically the same?
Threats to Reliability
Using too few items to measure objectives
Using the wrong type of instrumentation to measure specific domains of knowledge
Incomplete directions
Unfamiliar vocabulary
Unfamiliar assessment formats
Uncomfortable room conditions
Practicality
Trade-offs / compromises
Scoring Assessments – Rubrics
When?
What are they?
Levels
Characteristics
Developing Rubrics
Develop your own rubric
–You are assessing the performance of students who are cooking a pizza.
–Create a 3-level rubric.
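As a hedged illustration of the shape such a rubric might take (the criteria, level descriptors, and scoring scheme below are assumptions for demonstration, not the workbook answer), a 3-level rubric can be represented as criteria mapped to level descriptors plus a simple scoring routine:

# Illustrative 3-level rubric for the pizza-cooking performance.
PIZZA_RUBRIC = {
    "crust":    {1: "undercooked or burnt", 2: "cooked but uneven", 3: "evenly golden and crisp"},
    "toppings": {1: "missing or sparse", 2: "present but unevenly spread", 3: "well chosen and evenly spread"},
    "safety":   {1: "unsafe handling observed", 2: "mostly safe, minor lapses", 3: "safe food handling throughout"},
}

def score_performance(ratings: dict[str, int]) -> int:
    """Sum the level awarded on each criterion (3 criteria x 3 levels = 3..9)."""
    return sum(ratings[criterion] for criterion in PIZZA_RUBRIC)

print(score_performance({"crust": 3, "toppings": 2, "safety": 3}))  # 8 of 9

Whatever the criteria, each level descriptor should describe observable performance so two raters applying the rubric reach the same score.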
Assessment Instrument Blueprint
Objectives
Forms of items
Number of items
Proportionality of items
Directions for administration
Scoring methods
Weighting
Mastery
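One way to picture the blueprint is as a single specification that ties each objective to its item form, item count, proportion of the test, and weight. A sketch under that reading (the objectives, counts, and weights are invented placeholders that only mirror the elements listed above):

# Illustrative assessment blueprint; values are assumptions for demonstration.
BLUEPRINT = [
    {"objective": "Classify test items as criterion- or norm-referenced",
     "item_form": "multiple choice", "items": 6, "weight": 0.3},
    {"objective": "Write aligned test items for a given objective",
     "item_form": "constructed response", "items": 3, "weight": 0.5},
    {"objective": "State reasons for testing",
     "item_form": "short answer", "items": 4, "weight": 0.2},
]

total_items = sum(row["items"] for row in BLUEPRINT)
for row in BLUEPRINT:
    proportion = row["items"] / total_items
    print(f'{row["objective"]}: {row["items"]} items ({proportion:.0%}), weight {row["weight"]}')

# Directions for administration, scoring methods, and the mastery criterion
# (e.g., a cut score on the weighted total) would be recorded alongside this table.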
Next Week
Read Chapter 7, Smith & Ragan
Workbook, Chapter 9
Project Assignment 3 due