Slide 1
Mary Allen, Qatar University, September 2011
Slide 2
Workshop participants will be able to:
- draft/revise learning outcomes
- develop/analyze curriculum maps
- develop/refine sustainable, multi-year assessment plans
- develop/refine rubrics and calibrate reviewers
- analyze assessment results
- use a variety of strategies to close the loop
- evaluate the impact of improvement actions
Slide 3
Assessment is an ongoing process designed to monitor and improve student learning.
Slide 4
Faculty:
- Develop SLOs
- Verify curriculum alignment
- Develop an assessment plan
- Collect evidence
- Assess evidence and reach a conclusion
- Close the loop
Slide 5
Standard 3.3.1
Slide 6
- Program goals
- Cohesive curriculum
- How students learn
- Course structure and pedagogy
- Faculty instructional role
- Assessment
- Campus support for learning
Slide 8
- Direct vs. indirect assessment
- Value-added vs. absolute learning outcomes
- Authentic assessment
- Formative vs. summative assessment
- Triangulation
Slide 10
- Clarify what faculty want students to learn
- Clarify how each outcome can be assessed
Slide 11
- Knowledge
- Skills
- Attitudes/Values/Predispositions
Slide 12
- CLOs: course learning outcomes
- PLOs: program learning outcomes
- ILOs: institutional learning outcomes
Slide 15
- List of goals and outcomes
- List of outcomes
- Typically 6-8 outcomes in all
Slide 17
1. Active verbs
2. Simple language
3. Real vs. aspirational
4. Aligned with mission
5. Avoid compound outcomes
6. Outcomes vs. learning processes
7. Focus on high-priority learning
Slide 18
- Coherence
- Synthesizing experiences
- Ongoing practice of learned skills
- Opportunities to develop increasing sophistication and to apply what is learned
Slide 19
I = Introduced
D = Developed & Practiced with Feedback
M = Demonstrated at the Mastery Level Appropriate for Graduation
Slide 20
Curriculum Map 2; Curriculum Map 3
Slide 21
- CLOs that align with relevant PLOs
- Faculty can provide artifacts for assessment
- Faculty teach courses consistent with the map
Slide 22
- Focuses faculty on curriculum cohesion
- Guides course planning
- Allows faculty to identify potential sources of assessment evidence
- Allows faculty to identify where they might close the loop
Slide 23
Except for NCATE-accredited programs
Slide 24
Who? What? Where? When? Why?
Slide 25
- Relevant samples
- Representative samples
- Reasonably-sized samples
Slide 26
- Anonymity
- Confidentiality
- Privacy
- Informed consent
Slide 27
Find examples of:
- Direct assessment
- Indirect assessment
- Formative assessment
- Summative assessment
- Authentic assessment
- Triangulation
Slide 28
- PLO
- When to assess
- What direct and indirect evidence to collect
- Who will collect the evidence
- How evidence will be assessed
- How decisions will be made
Slide 29
- Valid
- Reliable
- Actionable
- Efficient and cost-effective
- Engages students
- Interesting to faculty
- Triangulation
Slide 30
- Published tests
- Locally-developed tests
- Embedded assessment
- Portfolios
Slide 31
- Surveys
- Interviews
- Focus groups
Slide 32
- Holistic
- Analytic
Slide 33
- Rubric Packet
- AAC&U VALUE Rubrics
- Specialized Packets
Slide 34
- Efficiency
- Defines faculty expectations
- Well-trained reviewers use the same criteria
- Criterion-referenced judgments
- Ratings can be done by multiple people
Slide 35
- Assess while grading
- Assess in a group
Slide 36
- Columns are used for assessment
- Faculty can adapt an assessment rubric in different ways
- Faculty maintain control over their own grading
Slide 38
- Grading may require extra criteria
- Grading requires more precision
- Calibrate when doing assessment
Slide 39
- Speed up grading
- Clarify expectations to students
- Reduce student grade complaints
- Improve the reliability and validity of assessments and grades
- Make grading and assessment more efficient and effective
- Help faculty create better assignments
Slide 41
- Below Expectations
- Needs Improvement
- Meets Expectations
- Exceeds Expectations
Slide 43
- Adapt an already-existing rubric
- Analytic method
Slide 44
- Consider starting at the extremes
- Some words I find useful
Slide 45
- One reader per document
- Two independent readers per document
- Paired readers
Slide 46
- Collect the assessment evidence and remove identifying information.
- Develop and pilot test the rubric.
- Select exemplars of weak, medium, and strong student work.
- Consider pre-programming a spreadsheet so data can be entered and analyzed during the reading and participants can discuss results immediately.
Slide 47
- Correlation
- Discrepancy index
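The slide names two ways to summarize agreement when paired readers independently score the same student work, but does not define them. As a rough illustration only, the sketch below assumes "correlation" means the Pearson correlation between the two readers' rubric scores and "discrepancy index" means the proportion of paired scores differing by more than one rubric level; both definitions and all scores shown are assumptions for the example, not the presenter's.

```python
# Illustrative sketch (not from the presentation): agreement statistics for
# two readers who independently scored the same artifacts on a 4-point
# rubric (1 = Below Expectations ... 4 = Exceeds Expectations).
from statistics import correlation  # Pearson correlation, Python 3.10+

reader_a = [3, 2, 4, 1, 3, 2, 4, 3]  # hypothetical scores from reader A
reader_b = [3, 3, 4, 3, 2, 2, 3, 3]  # hypothetical scores from reader B

# Correlation: do the two readers rank the work similarly?
r = correlation(reader_a, reader_b)

# Discrepancy index (one common definition, assumed here): proportion of
# paired ratings that differ by more than one rubric level.
discrepant = sum(abs(a - b) > 1 for a, b in zip(reader_a, reader_b))
discrepancy_index = discrepant / len(reader_a)

print(f"Pearson r = {r:.2f}")
print(f"Discrepancy index = {discrepancy_index:.2%}")
```

If many pairs turn out to be discrepant, the usual response is another round of calibration against the exemplars before continuing the reading.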
Slide 49
How good is good enough?
Slide 50
- Celebrate!
- Change pedagogy
- Change curriculum
- Change student support
- Change faculty support
- Change equipment/supplies/space
Slide 53
1. Focus on what is important.
2. Don’t try to do too much at once.
3. Take samples.
4. Pilot test procedures.
5. Use rubrics.
6. Close the loop.
7. If you rely on adjunct faculty, include them in assessment.
8. Keep a written record.