SLO Course Assessment in 5 Easy Steps Vivian Mun, Ed.D.
Step One - Know the Basics: Outcome, Student Activity, and Evaluation Tool
Look at the course’s approved SLO addendum. It will tell you the outcome you are assessing, the evidence you will gather to determine whether students have achieved the outcome, and how instructors will evaluate the student work.
Example 1: Foreign Language 1
SLO - Using the vocabulary and structures learned, students will be able to perform elementary everyday communicative functions in the target language, orally and in writing.
Evidence of SLO attainment - Compositions, oral presentations, role plays, sketches, and/or interviews in the target language on topics covered in the semester.
Evaluation tool - Rubrics

Example 2: Math 105
SLO - Students will be able to perform arithmetic operations without the use of a calculator.
Evidence of SLO attainment - Embedded questions from the final exam.
Evaluation tool - Item analysis
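For Example 2, "item analysis" is not spelled out on the slide itself; as a minimal sketch, assuming each embedded final-exam question is scored simply as correct (1) or incorrect (0), an item analysis can be as basic as the percent of students answering each item correctly:

# Hypothetical responses to embedded final-exam questions (1 = correct, 0 = incorrect).
# One dictionary per student; the question labels Q1-Q3 are illustrative only.
responses = [
    {"Q1": 1, "Q2": 0, "Q3": 1},
    {"Q1": 1, "Q2": 1, "Q3": 0},
    {"Q1": 0, "Q2": 1, "Q3": 1},
]

for item in responses[0]:
    correct = sum(student[item] for student in responses)
    percent = 100 * correct / len(responses)
    print(f"{item}: {percent:.0f}% of students answered correctly")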
Step Two - Randomly select student work samples
If the course has only one section: you do NOT need to randomly select student samples, because you will analyze ALL student work.
If the course has multiple sections: randomly select work from 1/3 of the total enrolled students. The randomly selected 1/3 must represent the diversity of the sections offered (e.g., morning/afternoon/evening/online sections; part-time/full-time faculty).
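For instance, here is a minimal sketch of drawing the 1/3 sample, assuming you have a combined roster of all enrolled students (the field names student_id and section are illustrative, not part of the report form):

import random

# Hypothetical roster: one entry per enrolled student across all sections.
roster = [
    {"student_id": "S001", "section": "morning"},
    {"student_id": "S002", "section": "evening"},
    {"student_id": "S003", "section": "online"},
    # ... remaining enrolled students ...
]

# Draw a simple random sample of roughly 1/3 of the enrolled students.
sample_size = max(1, len(roster) // 3)
sample = random.sample(roster, sample_size)

for student in sample:
    print(student["student_id"], student["section"])

After sampling, you would still check that the draw reflects the diversity of sections (time of day, online/on-campus, part-time/full-time faculty); stratified sampling, described on the next slide, builds that check into the draw itself.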
Different Ways of Sampling
Common types of sampling:
◦Simple Random Sampling: You randomly select a certain number of students or artifacts.
◦Stratified Sampling: Students are sorted into homogeneous groups and then a random sample is selected from each group. This process is useful when there are groups that may be underrepresented.
◦Systematic Sampling: You select every nth (e.g., every 7th, 9th, or 20th) student or artifact from an organized list.
◦Cluster Sampling: You randomly select clusters or groups (e.g., classes or sections), and you evaluate the assignments of all the students in those randomly selected clusters or groups.
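To make the distinctions concrete, the sketch below contrasts stratified and systematic sampling; the roster structure, the section field used for grouping, and the sample sizes are illustrative assumptions, not campus requirements.

import random
from collections import defaultdict

def stratified_sample(roster, per_group):
    # Sort students into homogeneous groups by section,
    # then draw a random sample from each group.
    groups = defaultdict(list)
    for student in roster:
        groups[student["section"]].append(student)
    sample = []
    for students in groups.values():
        sample.extend(random.sample(students, min(per_group, len(students))))
    return sample

def systematic_sample(roster, n):
    # Take every nth student from an organized (e.g., alphabetized) list.
    return roster[n - 1::n]

roster = [{"student_id": f"S{i:03d}", "section": s}
          for i, s in enumerate(["morning", "evening", "online"] * 10)]
print(len(stratified_sample(roster, per_group=3)))   # 3 students from each section
print(len(systematic_sample(roster, n=7)))           # every 7th student

Cluster sampling would instead pick whole sections at random (e.g., random.sample(list_of_sections, k)) and evaluate every student in the chosen sections.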
Step Three - The Rubric and Norming Process
If you are using a rubric to evaluate student work, the first step is to write and test the rubric. Then use a “norming process” to achieve inter-rater reliability. (Note: this does not apply to multiple-choice exams.)
What is the norming process?
Norming process procedures:
1) Choose at least five anonymous student work samples.
2) Faculty assess the anonymous student work samples according to a rubric or set evaluation criteria.
3) Have a dialogue about the scores given on the student work samples.
4) If consensus cannot be achieved, assess more anonymous student work samples or consider revising the rubric.
When all faculty come to an agreement on the scores for at least five student work samples, inter-rater reliability has been achieved.
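The slides treat consensus as a discussion outcome; if a quick numeric check is helpful during norming, one simple measure (an assumption here, not prescribed by the process above) is the share of samples on which every rater gives the same rubric score:

# Hypothetical norming scores: one row per work sample, one column per rater.
ratings = [
    [3, 3, 3],  # all raters agree
    [2, 3, 2],  # one rater differs
    [4, 4, 4],
    [1, 1, 1],
    [3, 3, 3],
]

agreed = sum(1 for sample in ratings if len(set(sample)) == 1)
print(f"Exact agreement on {agreed} of {len(ratings)} samples "
      f"({100 * agreed / len(ratings):.0f}%)")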
Step Four - Data Analysis
Score student work (according to the rubric, if applicable). Compile all the scores (by rubric dimension, if using a rubric to evaluate the data).
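As one way to compile the scores, the sketch below averages rubric scores by dimension; the dimension names and the 1-4 scale are placeholders, since the actual rubric comes from the course's SLO addendum.

import statistics
from collections import defaultdict

# Hypothetical scored samples: one dictionary per student work sample,
# with a 1-4 rubric score for each dimension (dimension names are illustrative).
scores = [
    {"vocabulary": 3, "structures": 2, "communication": 4},
    {"vocabulary": 4, "structures": 3, "communication": 3},
    {"vocabulary": 2, "structures": 2, "communication": 3},
]

# Compile all scores by rubric dimension and summarize each dimension.
by_dimension = defaultdict(list)
for sample in scores:
    for dimension, score in sample.items():
        by_dimension[dimension].append(score)

for dimension, values in by_dimension.items():
    print(f"{dimension}: mean = {statistics.mean(values):.2f}, scores = {sorted(values)}")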
Questions to Ask During Data Analysis
Review the data. The following questions are helpful to ask when analyzing the data:
◦What patterns/common themes emerge around specific items in the data?
◦What are the areas of strength? What are the areas for improvement?
◦Are there any deviations from these patterns?
◦What interesting stories emerge from the data?
◦Do any of the patterns/emergent themes suggest that additional data need to be collected?
How Results Are Going to Be Used for Course/Program Improvement
What actions will you take based on the data? How could the course be improved? Does the assessment process itself need revision? Does the SLO need to be revised?
Filling out the Report Form
Make sure to fill out the course-level assessment report form and submit it to Vivian Mun.
Step Five - Closing the Loop
Report your findings to other faculty. Collaborate and discuss how to improve teaching, learning, and institutional effectiveness.