Mary Allen, Qatar University, September 2011
Workshop participants will be able to:
- draft/revise learning outcomes
- develop/analyze curriculum maps
- develop/refine sustainable, multi-year assessment plans
- develop/refine rubrics and calibrate reviewers
- analyze assessment results
- use a variety of strategies to close the loop
- evaluate the impact of improvement actions
An ongoing process designed to monitor and improve student learning
Faculty:
- Develop SLOs
- Verify curriculum alignment
- Develop an assessment plan
- Collect evidence
- Assess evidence and reach a conclusion
- Close the loop
Standard 3.3.1
- Program goals
- Cohesive curriculum
- How students learn
- Course structure and pedagogy
- Faculty instructional role
- Assessment
- Campus support for learning
- Direct vs. indirect assessment
- Value-added vs. absolute learning outcomes
- Authentic assessment
- Formative vs. summative assessment
- Triangulation
- Clarify what faculty want students to learn
- Clarify how each outcome can be assessed
- Knowledge
- Skills
- Attitudes/Values/Predispositions
CLO (course learning outcomes), PLO (program learning outcomes), ILO (institutional learning outcomes)
- List of goals and outcomes
- List of outcomes
- Typically 6-8 outcomes in all
1. Active verbs
2. Simple language
3. Real vs. aspirational
4. Aligned with mission
5. Avoid compound outcomes
6. Outcomes vs. learning processes
7. Focus on high-priority learning
- Coherence
- Synthesizing experiences
- Ongoing practice of learned skills
- Opportunities to develop increasing sophistication and to apply what is learned
I = Introduced
D = Developed & Practiced with Feedback
M = Demonstrated at the Mastery Level Appropriate for Graduation
[Curriculum Map 2 and Curriculum Map 3: example curriculum maps]
- CLOs that align with relevant PLOs
- Faculty can provide artifacts for assessment
- Faculty teach courses consistent with the map
- Focuses faculty on curriculum cohesion
- Guides course planning
- Allows faculty to identify potential sources of assessment evidence
- Allows faculty to identify where they might close the loop
Except for NCATE-accredited programs
Who? What? Where? When? Why?
- Relevant samples
- Representative samples
- Reasonably-sized samples
- Anonymity
- Confidentiality
- Privacy
- Informed consent
Find examples of:
- Direct assessment
- Indirect assessment
- Formative assessment
- Summative assessment
- Authentic assessment
- Triangulation
- PLO
- When to assess
- What direct and indirect evidence to collect
- Who will collect the evidence
- How evidence will be assessed
- How decisions will be made
- Valid
- Reliable
- Actionable
- Efficient and cost-effective
- Engages students
- Interesting to faculty
- Triangulation
- Published tests
- Locally-developed tests
- Embedded assessment
- Portfolios
- Surveys
- Interviews
- Focus groups
- Holistic
- Analytic
- Rubric Packet
- AAC&U VALUE Rubrics
- Specialized Packets
- Efficiency
- Defines faculty expectations
- Well-trained reviewers use the same criteria
- Criterion-referenced judgments
- Ratings can be done by multiple people
- Assess while grading
- Assess in a group
- Columns are used for assessment
- Faculty can adapt an assessment rubric in different ways
- Faculty maintain control over their own grading
- Grading may require extra criteria
- Grading requires more precision
- Calibrate when doing assessment
- Speed up grading
- Clarify expectations to students
- Reduce student grade complaints
- Improve the reliability and validity of assessments and grades
- Make grading and assessment more efficient and effective
- Help faculty create better assignments
Below Expectations Needs Improvement Meets Expectations Exceeds Expectations
- Adapt an existing rubric
- Analytic method
- Consider starting at the extremes
- Some words I find useful
One reader/document. Two independent readers/document. Paired readers.
1. Collect the assessment evidence and remove identifying information.
2. Develop and pilot test the rubric.
3. Select exemplars of weak, medium, and strong student work.
4. Consider pre-programming a spreadsheet so data can be entered and analyzed during the reading and participants can discuss results immediately.
- Correlation
- Discrepancy index
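A minimal sketch of how these two reliability statistics might be computed during a calibration session, in place of a pre-programmed spreadsheet. Assumptions: two readers scored the same papers on a 1-4 rubric scale, and the discrepancy index is taken here to be the proportion of paired ratings differing by two or more points (definitions vary, so adjust the threshold to your own convention). The function names and sample scores are illustrative.

```python
# Inter-rater agreement statistics for a rubric calibration session.
from statistics import mean

def pearson_correlation(x, y):
    """Pearson correlation between two equal-length lists of ratings."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def discrepancy_index(x, y, threshold=2):
    """Proportion of paired ratings that differ by `threshold` or more.

    The 2-point threshold is an assumption; some programs count any
    disagreement larger than one rubric level as discrepant.
    """
    discrepant = sum(1 for a, b in zip(x, y) if abs(a - b) >= threshold)
    return discrepant / len(x)

# Hypothetical ratings from two readers on eight student papers (1-4 scale).
reader_1 = [4, 3, 2, 4, 1, 3, 2, 4]
reader_2 = [4, 3, 1, 4, 3, 3, 2, 3]

print("correlation:", round(pearson_correlation(reader_1, reader_2), 2))
print("discrepancy index:", discrepancy_index(reader_1, reader_2))
```

A high correlation with a low discrepancy index suggests the readers share the same criteria; frequent large gaps signal that further calibration on exemplars is needed.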
How good is good enough?
- Celebrate!
- Change pedagogy
- Change curriculum
- Change student support
- Change faculty support
- Change equipment/supplies/space
1. Focus on what is important.
2. Don’t try to do too much at once.
3. Take samples.
4. Pilot test procedures.
5. Use rubrics.
6. Close the loop.
7. If you rely on adjunct faculty, include them in assessment.
8. Keep a written record.