1
Why Assessment Results Are Hard to Use (and what to do about it)
2
Hi…I’m the Assessment Director
3
The Theory
4
What Could Go Wrong?
5
Look familiar?
6
Assessment Design
7
Don’t outsource design (if you can help it)
8
Choosing an Achievement Scale
Pick a scale that corresponds to the educational progress we expect to see. Example:
0 = Remedial work
1 = Freshman / Sophomore level work
2 = Junior / Senior level work
3 = What we expect of our graduates
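To keep later analysis honest about what these numbers mean, the scale can be stored as an ordered category rather than a plain number. A minimal sketch in Python/pandas; the student data is invented:

```python
import pandas as pd

# Hypothetical rater scores on the 0-3 achievement scale above.
ratings = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "rating": [1, 3, 2, 0],
})

# Encode the scale as an ordered categorical so pandas treats it as
# ordinal (ranked levels), not as a number that invites averaging.
scale_labels = {
    0: "Remedial work",
    1: "Freshman / Sophomore level work",
    2: "Junior / Senior level work",
    3: "What we expect of our graduates",
}
ratings["level"] = pd.Categorical(
    ratings["rating"].map(scale_labels),
    categories=list(scale_labels.values()),
    ordered=True,
)

print(ratings["level"].value_counts(sort=False))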
9
Subjective measurements
Use authentic data that already exists (sacrifices reliability for validity)
Create subjective ratings in context (sacrifices reliability for validity)
Assess what you observe, not just what you teach! (recovers some reliability)
For complex goals like general education: www.coker.edu/assessment
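One way to check how much reliability the subjective ratings actually give up is to have two raters score the same sample of work and compute an agreement statistic. A sketch using scikit-learn's weighted Cohen's kappa; the rater scores here are invented:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical scores two raters gave the same ten artifacts on the
# 0-3 achievement scale.
rater_a = [0, 1, 1, 2, 2, 2, 3, 3, 1, 2]
rater_b = [0, 1, 2, 2, 2, 3, 3, 3, 1, 1]

# Quadratic weighting penalizes large disagreements (0 vs 3) more
# than adjacent ones (2 vs 3), which suits an ordered scale.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")
```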
10
Survey Attitudes and Behaviors
CIRP, BCSSE, Student Satisfaction Inventory
Link student ID numbers to other indicators
These can be easily and productively outsourced
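Linking by student ID is a simple join. A hypothetical sketch in pandas; the survey and record columns are illustrative, not the fields of any real instrument:

```python
import pandas as pd

# Hypothetical survey extract and institutional records, both keyed
# by student ID.
survey = pd.DataFrame({
    "student_id": [101, 102, 103],
    "satisfaction": [4, 2, 5],
})
records = pd.DataFrame({
    "student_id": [101, 102, 103],
    "gpa": [3.1, 2.4, 3.8],
    "retained": [True, False, True],
})

# Linking on ID turns a stand-alone survey into another indicator
# you can analyze alongside grades, retention, and the rest.
linked = survey.merge(records, on="student_id", how="inner")
print(linked)
```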
11
Analysis and Reporting
12
Some Tools
ANOVA
Logistic regression and ROC curves: http://statpages.org/logistic.html
Pivot tables
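The statpages.org link runs logistic regression in the browser; the same analysis in Python might look like the sketch below, which fits a logistic model and summarizes it with the ROC AUC. The retention data is simulated purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical question: does a 0-3 achievement rating predict retention?
rng = np.random.default_rng(0)
rating = rng.integers(0, 4, size=200)
# Simulate retention that rises with rating (an assumption for the demo).
retained = rng.random(200) < (0.4 + 0.15 * rating)

X = rating.reshape(-1, 1)
model = LogisticRegression().fit(X, retained)

# The ROC AUC summarizes how well the rating separates retained
# students from non-retained ones across every possible cutoff.
auc = roc_auc_score(retained, model.predict_proba(X)[:, 1])
print(f"AUC = {auc:.2f}")
```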
13
Longitudinal Analysis
15
Use data you already have
Grades
FAFSA, SAT, ACT items
Demographic data
ePortfolio statistics or work
Library circulation statistics
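A pivot table (one of the tools from the earlier slide) is often all it takes to combine these sources once they share a student ID. A sketch with invented data, reporting the proportion of ratings at or above 2 by group:

```python
import pandas as pd

# Hypothetical merged dataset: achievement ratings joined to
# demographic data the institution already holds.
df = pd.DataFrame({
    "class_year": ["Fr", "Fr", "So", "So", "Jr", "Jr", "Sr", "Sr"],
    "first_gen":  [True, False, True, False, True, False, True, False],
    "rating":     [0, 1, 1, 2, 2, 2, 3, 3],
})

# A pivot table of "proportion at or above 2" by group: a summary
# that respects the ordinal scale instead of averaging it.
df["meets_goal"] = df["rating"] >= 2
table = df.pivot_table(values="meets_goal", index="class_year",
                       columns="first_gen", aggfunc="mean")
print(table)
```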
16
Analysis: Example
17
You don’t have to average
Measurement requires:
Units
The ability to aggregate
What we actually do with achievement ratings is estimation, not measurement.
Without measurement we shouldn’t average; report the distribution (proportions, min/max) instead.
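A small illustration of why: two groups can share the same mean rating while telling very different stories. The numbers below are invented:

```python
import pandas as pd

# Two hypothetical sections with the same mean rating but very
# different distributions: averaging hides the shape.
a = pd.Series([0, 0, 3, 3, 3, 0, 3, 0])   # polarized
b = pd.Series([1, 2, 1, 2, 2, 1, 2, 1])   # clustered at the middle

print(a.mean(), b.mean())                  # both 1.5
# Reporting the distribution instead keeps the difference visible.
print(a.value_counts().sort_index())
print(b.value_counts().sort_index())
```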
18
Averages
19
Proportions
20
Min / max
21
Using comparative scales
22
Confidence Intervals
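For proportions like "share of students rated 2 or above," a Wilson score interval is a common choice. A sketch using statsmodels, with made-up counts:

```python
from statsmodels.stats.proportion import proportion_confint

# Hypothetical result: 42 of 60 sampled portfolios rated at or above
# the "what we expect of our graduates" threshold.
successes, n = 42, 60

# A Wilson score interval behaves well even at modest sample sizes,
# unlike the naive normal approximation.
low, high = proportion_confint(successes, n, alpha=0.05, method="wilson")
print(f"{successes / n:.2f} ({low:.2f}, {high:.2f})")
```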
23
Portfolio analysis: epic FAIL!
24
Multi-dimensional graphs
25
Analysis: Example
26
Proportions
27
Conclusions
Design it yourself
Use a sensible longitudinal scale
Combine with other data
Avoid averages
28
Last Requests?
highered.blogspot.com