The Nuts and Bolts of Assessment
LAVC SLO Training, Spring 2010
Partially adapted from a presentation by Arend Flick, Assessment Coordinator, Riverside Community College District
An Introduction to Student Learning Outcomes
- As opposed to "objectives," SLOs emphasize application of knowledge (what a student can do at the end of a course of instruction).
- They are not discrete or highly specific skills but "complexes of knowledge, ability, and attitudes."
First Steps: Establishing Course SLOs
- SLO Statement
- Assessment Activity (done by students)
- Assessment Measure (done by instructors; e.g., rubrics)
- Done through the Valley College Curriculum Committee
- Sets the plan for assessment
Evaluation of Course SLOs
An effective evaluation method:
- Is directly related to the outcome and can realistically measure/document the outcome
- Is specific enough to show how the SLO is being assessed (e.g., it is not enough to simply write "exam" without showing how the exam will assess student learning)
- Will produce and/or document evidence of student learning
- Will produce manageable information and statistical knowledge
- Is a realistic, feasible way of collecting and analyzing evidence
- Can differentiate between different levels of achievement through the use of a rubric or other measure
Assessment: What is it and how do we do it?
Outcomes Assessment Is...
- Something you're already doing, even though you probably don't call it that.
- Not an end in itself, but a tool for educational and institutional improvement.
Assessment: A Definition
Assessment is best understood as a three-part process in which we...
1. Identify what we want students to be able to do, know, or think at the end of a unit of instruction. (That is, identify SLOs.)
2. Determine the extent to which they can do or know those things.
3. Adjust teaching practices and curriculum in light of that information.
Assessment Cycle
1. Articulate goals for student learning
2. Gather evidence about how well students are meeting the goals (and discuss/interpret this evidence)
3. Use this information to improve and to make learning visible
Why Do Assessment?
- To improve.
- To document learning.
- To assist in planning and resource allocation processes.
Methods of Assessment
- Many possible student activities, such as essays, reports, portfolios, recitals, oral presentations, and performances
- How will instructors evaluate the student activity? One possibility: rubrics
Types of Rubrics
- Holistic: performance traits are aggregated and a single score is given to each product
- Analytic: each primary trait has a separate score (more useful for assessment)
Steps for Creating a Rubric
1. Identify what learning outcome(s) you're assessing (e.g., critical thinking).
2. Identify an assignment that could enable students to demonstrate they've achieved that outcome.
3. Describe the best student product you could expect and the criteria you associate with that exemplary product.
4. Do the same with clearly inferior work, and then with marginally acceptable and marginally unacceptable work.
5. Categorize the criteria; a table is useful for that purpose.
6. Test the rubric, ideally asking colleagues who were not involved in its creation to use it, revising as needed to eliminate ambiguities.
Creating an Outcomes Assessment Rubric
Basic Assessment Rubric Structure:

SLO Criteria  | "Emerging" (0 points) | "Competent" (1 point) | "Exemplary" (2 points) | Total
Criterion #1  |                       |                       |                        |
Criterion #2  |                       |                       |                        |
Criterion #3  |                       |                       |                        |
Criterion #4  |                       |                       |                        |
A Simple Generic Rubric
4 (Strong): Clear evidence that the student has achieved the SLO.
3 (Marginal): Acceptable evidence that the student has generally achieved the SLO.
2 (Inadequate): Insufficient evidence that the student has achieved the SLO.
1 (Weak): Little or no evidence that the student has achieved the SLO.
Employing Rubrics to Do Course-Based Outcomes Assessment
- Involve everyone in the discipline (including adjuncts).
- You don't have to assess products from every student in every section; a random sampling is fine.
- Decide how samples will be collected and who will apply the rubric.
- Hold norming sessions with everyone who will be involved in the assessment.
- Aggregate the data and DISCUSS THEIR IMPLICATIONS: you aren't completing the assessment cycle unless you use results to improve teaching and learning.
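The sampling step above can be sketched in code. This is an illustration only, not part of the original presentation; the essay IDs, the 10% fraction, and the seed are all hypothetical choices:

```python
import random

def draw_sample(product_ids, fraction=0.10, seed=2010):
    """Draw a simple random sample of student products for rubric scoring.

    `fraction` and `seed` are illustrative; a fixed seed makes the
    sample reproducible, e.g., for a later norming session.
    """
    rng = random.Random(seed)
    k = max(1, round(len(product_ids) * fraction))  # sample size, at least 1
    return rng.sample(product_ids, k)

# Hypothetical population: 1,150 essays; a 10% sample yields 115 to score.
all_essays = [f"essay_{i:04d}" for i in range(1150)]
sample = draw_sample(all_essays)
print(len(sample))  # 115
```

Sampling without replacement (`random.sample`) ensures no student product is scored twice in the same round.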
Assessing vs. Grading
- A course grade is based on a student's achievement of the course objectives.
- It is possible for a student to pass a class but not meet a specific course outcome, and vice versa.
- Various assessment techniques can be used in a class that may or may not be part of the course grade.
- Grades are often based on more than learning outcomes (e.g., participation, improvement).
- The grade alone does not identify which component skills the student has mastered. Furthermore, overall grades do not give an instructor feedback on which skills the class as a whole found difficult. Assessment provides class-level analysis of strengths and weaknesses.
Assessing vs. Grading
- What would we look at to grade this assignment? Which student is not doing well?
- What would we look at for assessment? Which aspect is most problematic?

            Jim   Bob   Sue   Ann
Content      4     3     3     3
Structure    3     3     2     4
Grammar      2     3     2     2
Total        9     9     7     9
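The table above can be read two ways, and a short sketch (illustrative only, using the scores from the table) makes the contrast concrete: grading sums down each student's column, while assessment sums across each criterion's row:

```python
# Rubric scores from the table (criterion -> student -> score).
scores = {
    "Content":   {"Jim": 4, "Bob": 3, "Sue": 3, "Ann": 3},
    "Structure": {"Jim": 3, "Bob": 3, "Sue": 2, "Ann": 4},
    "Grammar":   {"Jim": 2, "Bob": 3, "Sue": 2, "Ann": 2},
}
students = ["Jim", "Bob", "Sue", "Ann"]

# Grading view: one total per student. Who is not doing well?
totals = {s: sum(row[s] for row in scores.values()) for s in students}
print(totals)  # {'Jim': 9, 'Bob': 9, 'Sue': 7, 'Ann': 9}

# Assessment view: one total per criterion. Which aspect is most problematic?
by_criterion = {name: sum(row.values()) for name, row in scores.items()}
weakest = min(by_criterion, key=by_criterion.get)
print(weakest, by_criterion[weakest])  # Grammar 9
```

The student totals single out Sue, but the criterion totals show Grammar is the weak spot for the whole class, which only the assessment view reveals.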
Sample SLO Assessment: English 1A, Riverside Community College
English 1A Assessment Pilot
- Chose specific SLOs to assess
- Developed rubric
- Developed plan to select representative sample of late-1A writing
- Met to read and score 115 essays (about 10% of those actually written in the course)
- Interpreted results, returned essays to instructors with comments, and reported back to the discipline on findings
Elements of the Assessment Rubric
- Effective use of quotation
- Effective use of MLA conventions in citing sources
- Effective control over conventions of written English (including grammar, punctuation, etc.)
- Suitable choice and conception of topic
- Sufficient cognitive engagement with topic
English 1A Assessment Results (N = 115)

                          Little or No    Inadequate    Adequate      Clear
                          Evidence (1)    Evidence (2)  Evidence (3)  Evidence (4)
Use of Quotation               20              29            45           21
Use of MLA Conventions         29              27            31           28
Grammar and Punctuation         9              25            30           51
Quality of Topic                8              21            37           49
Engagement with Topic          12              34            43           26
Assessment Results (Simplified)
Percentage of sample essays demonstrating clear or adequate evidence of SLO achievement:
- Choice of topic: 74.4
- Grammar, etc.: 70.4
- Engagement w/ topic: 68.7
- Use of quotation: 57.4
- Use of MLA conventions: 51
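As a sketch of how the simplified figures derive from the raw counts (not part of the original report), each percentage is the share of the 115 essays scoring adequate (3) or clear (4). Three rows from the results table are shown; values here keep one decimal, so MLA shows 51.3 where the slide rounds to 51:

```python
N = 115  # essays scored

# (little/none, inadequate, adequate, clear) counts from the results table
counts = {
    "Use of Quotation":        (20, 29, 45, 21),
    "Use of MLA Conventions":  (29, 27, 31, 28),
    "Grammar and Punctuation": (9, 25, 30, 51),
}

def pct_adequate_or_clear(row, n=N):
    """Percent of essays scoring 3 (adequate) or 4 (clear)."""
    return round(100 * (row[2] + row[3]) / n, 1)

for name, row in counts.items():
    print(f"{name}: {pct_adequate_or_clear(row)}")
# Use of Quotation: 57.4
# Use of MLA Conventions: 51.3
# Grammar and Punctuation: 70.4
```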
Interpretation of Results
Choice of topic:
- 75% of students sampled chose topics that seemed suitable for late-1A writing.
- But 25% did not.
Effective use of quotation:
- The 57% success rate is disquieting.
- Discussion during and after the reading suggested that students were having equal difficulty with the mechanics of quoting, the ability to contextualize and analyze quotation effectively, and the process of deciding when and how much (or little) to quote.
Use of MLA conventions:
- Barely half of the essays demonstrated competency in this area.
- Discussion at the reading speculated that we aren't spending enough time teaching MLA conventions and quotation methods, or holding students to sufficient standards in our grading practices.
Using the Results to Improve
- As a model for doing course-based assessment, this approach has been modified to assess learning in other English courses.
- The English 1A assessment report was sent to all English 1A instructors, underscoring evidence that we need to teach and assess quotation and MLA usage.
- Workshops on these SLOs in our writing labs.
- Discussion and professional development activities on how to teach to these SLOs in discipline meetings.
- Development of course handbooks to make expectations clearer and provide pedagogical advice to all instructors.
What to Do If Assessment Results Are Disappointing?
- Consider alternate teaching methods.
- Brainstorm with colleagues about their methods.
- Reconsider the curriculum itself.
- Reconsider the assessment method; maybe it gave you bad data.
Documenting the Process
- Course-level Assessment Report Form (available at www.lavc.edu/slo/Forms.html)
- Submitted to SLO Coordinator
- Reviewed by SLO Committee
The Form
Criteria for Evaluation of Assessment Reports
- Sampling: The sampling methodology is adequately described (i.e., how sampling was done, number of students and faculty/staff involved out of the total) and is appropriately done.
- Description of Methodology: The methodology is clearly described and contains detailed information about the tools used (e.g., student activity, rubric elements) and how inter-rater reliability was achieved.
- Valid Data: The assessment accurately measures what it was trying to measure.
- Effective Data: The data contribute to the improvement of teaching, learning, or institutional effectiveness.
- Collaborative Review: The data were collaboratively reviewed by members of the discipline/service area.
- Proposal of Improvements: A clear plan is presented for improvement that is based on the data given. The plan includes how results will be shared with others in the discipline/area.
Assessment Results
- Should be discussed at department meetings (documented in minutes)
- Are reported in annual plans and discussed in comprehensive program reviews
- Provide data for planning
Assessment Resources: Books
- Thomas Angelo and Patricia Cross, Classroom Assessment Techniques (Jossey-Bass, 1993)
- C. A. Palomba and Trudy W. Banta, Assessment Essentials (Jossey-Bass, 1999)
- Barbara Walvoord, Assessment Clear and Simple (Jossey-Bass, 2004)
Assessment Resources: Websites (all with useful links)
- The California Assessment Initiative: http://cai.cc.ca.us/
- The Center for Student Success of the California College Research and Planning (RP) Group: http://css.rpgroup.org
- Janet Fulks's excellent Bakersfield College website: http://online.bakersfieldcollege.edu/courseassessment/Default.htm
- North Carolina State's comprehensive website: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
- The Riverside CCD Assessment website: http://www.rcc.edu/administration/academicaffairs/effectiveness/assess/index.cfm
Assessment Resources: LAVC
- SLO website: www.lavc.edu/slo
- SLO Coordinator: Rebecca Stein, steinrl@lavc.edu, (818) 947-2538, Office: AH&S 305
- Dean of Research & Planning: Michelle Fowles, fowlesmr@lavc.edu, (818) 947-2437, Office: Administration Building