Introduction to Indirect Student-Learning Assessment (Part I)


Introduction to Indirect Student-Learning Assessment (Part I) Dr. Wayne W. Wilkinson September 14, 2016 ITTC Faculty Center Arkansas State University

Biography
Ph.D. in Social and I/O Psychology from Northern Illinois University
Project Director at the NIU Public Opinion Laboratory (2007-2011)
Psychometrics experience:
- Member of the Psychometric Society
- Primary instructor of an undergraduate psychometrics course
- Published research on scale item parceling in covariance structure modeling
Current psychometrics interests:
- Covariance structure modeling approaches to measurement invariance
- Scale refinement using polytomous item response theory

Outline Direct and Indirect Assessment Properties of Indirect Assessment Methods of Indirect Assessment

I. Direct and Indirect Assessment

Forms of Assessment
Two broad categories of student-learning assessment:
- Direct assessment
- Indirect assessment
Key difference: are you observing or inferring student learning?

Direct Assessments
Students actively demonstrate achievement levels:
- Collecting evidence from student work
- Observing demonstration of skills or behavior
Examples: exams, course assignments, capstone projects, portfolios, internships/practicums

Advantages of Direct Assessments
- Require active demonstration of learning
- Demand less abstract interpretation (rubrics)
- Usually "easy" to develop and administer
Direct assessments are the standard; indirect assessments complement, but cannot replace, direct assessments.

Indirect Assessments
Require the inference of student learning: no direct evidence or demonstration.
Common topics of indirect assessments:
- Perceptions of successfully meeting program outcomes
- Satisfaction with, and attitudes toward, the program
- Utility of the program

Common Forms of Indirect Assessment
- Interviews
- Focus groups
- Classroom assessment techniques
- Curriculum and syllabus analysis
- Surveys

II. Properties of Indirect Assessments

Are Indirect Assessments Useful?
Indirect assessments can expand on or confirm what is discovered in direct assessments. They can tell you how students feel:
- Do students think the program is preparing them?
- Do students and faculty agree on the program goals?
- Do students like the delivery/administration of the program?

Advantages of Indirect Assessments
- Relatively easy to administer
- Can be administered to non-students
- Gain insight into subjective areas (e.g., retention)
- Provide clues for direct-assessment topics

Disadvantages of Indirect Assessments
- Do not provide "hard evidence" of learning
- Responses may change over time
- Biased responding: response distortion, non-response bias
- Can be time consuming

Implementing Indirect Assessments
Two forms of implementation:
- Embedded assessments: part of course/program requirements
- Ancillary assessments: occur "outside" the program

III. Methods of Indirect Assessment

Interviews
Forms: structured, unstructured, semi-structured
Verification: observers, triangulation, results review
Skill of the interviewer: guide rather than influence
Interviews as part of further indirect-assessment design

Focus Groups
A carefully planned, in-depth discussion on a narrow topic:
- What information is sought, and how will it be used?
- Funnel method
- Conjunction with other methods
- Representativeness of participants
Role of moderators:
- Impartial (students)
- Familiarity with topic & purpose
- Social skills
- Open environment
- Online thinking

Classroom Assessment Techniques (CAT)
Ungraded, anonymous exercises for day-to-day adjustments.
Three topic categories:
- Program-related knowledge/skills
- Students' self-awareness
- Students' reactions
Summarization and feedback
1st edition in the A-State library

Example CAT

Curriculum & Syllabus Analysis
Curriculum analysis: examining whether courses/experiences are related to program outcomes (e.g., curriculum maps)
Syllabus analysis: provides assurance that each section/instructor is addressing program-outcome material
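As an illustrative sketch (not from the slides), a curriculum map can be represented as a simple mapping from courses to the program outcomes each one addresses, which makes coverage gaps easy to spot; the course and outcome names below are hypothetical.

```python
# Hypothetical curriculum map: the program outcomes each course addresses.
curriculum_map = {
    "PSY 101": {"written communication", "disciplinary knowledge"},
    "PSY 210": {"research methods", "quantitative reasoning"},
    "PSY 480": {"research methods", "written communication"},
}

# The full set of outcomes the program claims to develop.
program_outcomes = {
    "written communication",
    "disciplinary knowledge",
    "research methods",
    "quantitative reasoning",
    "ethical reasoning",
}

# Outcomes not addressed by any course reveal gaps in the curriculum.
covered = set().union(*curriculum_map.values())
gaps = program_outcomes - covered
print(sorted(gaps))  # -> ['ethical reasoning']
```

The same structure can be inverted (outcome -> courses) to check how many times each outcome is reinforced across the curriculum.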

Surveys
The most common indirect-assessment method, and our focus next time.
Next session: September 28th