Introduction to Indirect Student-Learning Assessment (Part I)
Dr. Wayne W. Wilkinson
September 14, 2016
ITTC Faculty Center, Arkansas State University

Biography
- Ph.D. in Social and I/O Psychology from Northern Illinois University
- Project Director at the NIU Public Opinion Laboratory (2007-2011)
- Psychometrics experience:
  - Member of the Psychometric Society
  - Primary instructor of undergraduate psychometrics course
  - Published research on scale item parceling in covariance structure modeling
- Current psychometrics interests:
  - Covariance structure modeling approaches to measurement invariance
  - Scale refinement using polytomous item response theory

Outline
- Direct and Indirect Assessment
- Properties of Indirect Assessment
- Methods of Indirect Assessment
I. Direct and Indirect Assessment
Forms of Assessment
- Two broad categories of student learning assessment:
  - Direct assessment
  - Indirect assessment
- Key difference: are you observing or inferring student learning?

Direct Assessments
- Students actively demonstrate achievement levels:
  - Collecting evidence from student work
  - Observing demonstration of skills or behavior
- Examples:
  - Exams
  - Course assignments
  - Capstone projects
  - Portfolios
  - Internships/practicums

Advantages of Direct Assessments
- Require active demonstration of learning
- Demand less abstract interpretation (e.g., through rubrics)
- Usually "easy" to develop and administer
- Direct assessments are the standard; indirect assessments complement but cannot replace them

Indirect Assessments
- Require the inference of student learning:
  - No direct evidence or demonstration
- Common topics of indirect assessments:
  - Perceptions of successfully meeting program outcomes
  - Satisfaction with and attitudes toward the program
  - Utility of the program

Common Forms of Indirect Assessment
- Interviews
- Focus groups
- Classroom assessment techniques
- Curriculum and syllabus analysis
- Surveys
II. Properties of Indirect Assessments
Are Indirect Assessments Useful?
- Indirect assessments can expand on or confirm what is discovered in direct assessments
- They can tell you how students feel:
  - Do students think the program is preparing them?
  - Do students and faculty agree on the program goals?
  - Do students like the delivery/administration of the program?

Advantages of Indirect Assessments
- Relatively easy to administer
- Can be administered to non-students
- Gain insight into subjective areas (e.g., retention)
- Provide clues for direct assessment topics

Disadvantages of Indirect Assessments
- Do not provide "hard evidence" of learning
- Responses may change over time
- Biased responding:
  - Response distortion
  - Non-response bias
- Can be time consuming

Implementing Indirect Assessments
- Two forms of implementation:
  - Embedded assessments: part of course/program requirements
  - Ancillary assessments: occur "outside" the program
III. Methods of Indirect Assessment
Interviews
- Forms:
  - Structured
  - Unstructured
  - Semi-structured
- Verification:
  - Observers
  - Triangulation
  - Results review
- Skill of the interviewer: guide rather than influence
- Interviews as part of further indirect assessment design
Focus Groups
- Carefully planned, in-depth discussion on a narrow topic
- What information is sought, and how will it be used?
- Funnel method
- Use in conjunction with other methods
- Representativeness of participants
- Role of moderators:
  - Impartial (students)
  - Familiarity with topic and purpose
  - Social skills
  - Open environment
- Online thinking

Classroom Assessment Techniques (CATs)
- Ungraded and anonymous exercises for day-to-day adjustments
- Three topic categories:
  - Program-related knowledge/skills
  - Students' self-awareness
  - Students' reactions
- Summarization and feedback
- 1st edition available in the A-State library
Example CAT
Curriculum & Syllabus Analysis
- Curriculum analysis: examining whether courses/experiences are related to program outcomes (e.g., curriculum maps)
- Syllabus analysis: provides assurance that each section/instructor is addressing program outcome material

Surveys
- The most common indirect assessment method
- Our focus next time
- Next session: September 28th