The Nuts and Bolts of Assessment LAVC SLO Training Spring 2010 Partially adapted from a presentation by Arend Flick, Assessment Coordinator, Riverside Community College District.


The Nuts and Bolts of Assessment LAVC SLO Training Spring 2010 Partially adapted from a presentation by Arend Flick, Assessment Coordinator, Riverside Community College District

An Introduction to Student Learning Outcomes As opposed to “objectives,” SLOs emphasize application of knowledge (what a student can do at the end of a course of instruction). They are not discrete or highly specific skills but “complexes of knowledge, ability, and attitudes.”

First Steps: Establishing Course SLOs
- SLO Statement
- Assessment Activity (done by students)
- Assessment Measure (done by instructors; e.g., rubrics)
- Done through the Valley College Curriculum Committee
- Sets the plan for assessment

Evaluation of Course SLOs A sound assessment method:
- Is directly related to the outcome and can realistically measure/document it
- Is specific enough to show how the SLO is being assessed (e.g., it is not enough to simply write "exam" without showing how the exam will assess student learning)
- Will produce and/or document evidence of student learning
- Will produce manageable information and statistical knowledge
- Is a realistic, feasible way of collecting and analyzing evidence
- Can differentiate between different levels of achievement through the use of a rubric or other measure

Assessment: What is it and how do we do it?

Outcomes Assessment Is...
- Something you’re already doing, even though you probably don’t call it that.
- Not an end in itself, but a tool for educational and institutional improvement.

Assessment: A Definition Assessment is best understood as a three-part process in which we:
1. Identify what we want students to be able to do, know, or think at the end of a unit of instruction (that is, identify SLOs).
2. Determine the extent to which they can do or know those things.
3. Adjust teaching practices and curriculum in light of that information.

Assessment Cycle
1. Articulate goals for student learning.
2. Gather evidence about how well students are meeting the goals (and discuss/interpret this evidence).
3. Use this information to improve and to make learning visible.

Why Do Assessment?
- To improve.
- To document learning.
- To assist in planning and resource allocation processes.

Methods of Assessment
- Many possible student activities, such as essays, reports, portfolios, recitals, oral presentations, and performances.
- How will instructors evaluate the student activity? One possibility: rubrics.

Types of Rubrics
- Holistic: performance traits are aggregated and a single score is given to each product.
- Analytic: each primary trait has a separate score (more useful for assessment).

Steps for Creating a Rubric
1. Identify what learning outcome(s) you’re assessing (e.g., critical thinking).
2. Identify an assignment that could enable students to demonstrate they’ve achieved that outcome.
3. Describe the best student product you could expect and the criteria you associate with that exemplary product.
4. Do the same with clearly inferior work, and then with marginally acceptable and marginally unacceptable work.
5. Categorize the criteria (a table is useful for that purpose).
6. Test the rubric, ideally asking colleagues who were not involved in its creation to use it, revising as needed to eliminate ambiguities.

Creating an Outcomes Assessment Rubric: Basic Assessment Rubric Structure

SLO Criteria   | “Emerging” (0 points) | “Competent” (1 point) | “Exemplary” (2 points) | Total
Criterion #1   |                       |                       |                        |
Criterion #2   |                       |                       |                        |
Criterion #3   |                       |                       |                        |
Criterion #4   |                       |                       |                        |
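The analytic structure above maps naturally onto a small data structure. Below is a minimal sketch in Python, assuming the three-level, 0-to-2-point scale from the table; the criterion names and sample scores are illustrative, not from an actual LAVC rubric.

```python
# Minimal sketch of an analytic rubric scorer.
# The 0-2 point scale and level names mirror the table above;
# the criterion names and sample scores are hypothetical.

LEVELS = {0: "Emerging", 1: "Competent", 2: "Exemplary"}

def score_product(scores: dict) -> dict:
    """Given per-criterion point scores, return labeled scores and a total."""
    labeled = {crit: (pts, LEVELS[pts]) for crit, pts in scores.items()}
    return {"criteria": labeled, "total": sum(scores.values())}

result = score_product({
    "Criterion #1": 2,
    "Criterion #2": 1,
    "Criterion #3": 1,
    "Criterion #4": 0,
})
print(result["total"])                         # 4
print(result["criteria"]["Criterion #4"][1])   # Emerging
```

Keeping each criterion's score separate (rather than recording only the total) is exactly what makes the analytic rubric more useful for assessment than a holistic one.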

A Simple Generic Rubric
4 (Strong): Clear evidence that the student has achieved the SLO.
3 (Marginal): Acceptable evidence that the student has generally achieved the SLO.
2 (Inadequate): Insufficient evidence that the student has achieved the SLO.
1 (Weak): Little or no evidence that the student has achieved the SLO.

Employing Rubrics to Do Course-Based Outcomes Assessment
- Involve everyone in the discipline (including adjuncts).
- You don’t have to assess products from every student in every section; a random sample is fine.
- Decide how samples will be collected and who will apply the rubric.
- Hold norming sessions with everyone who will be involved in the assessment.
- Aggregate the data and DISCUSS THEIR IMPLICATIONS: you aren’t completing the assessment cycle unless you use results to improve teaching and learning.
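The random-sampling step above can be as simple as pooling all submissions across sections and drawing a fixed fraction. A sketch, assuming a hypothetical pool of 300 products and a 10% sample; the pool size, file names, and rate are illustrative only.

```python
# Sketch: drawing a simple random sample of student work for course-level
# assessment, instead of scoring every product from every section.
# The pool of 300 essays and the 10% rate are hypothetical.
import random

submissions = [f"essay_{i:03d}" for i in range(1, 301)]  # all sections pooled

random.seed(42)  # fixed seed so the draw can be reproduced for norming
sample = random.sample(submissions, k=round(len(submissions) * 0.10))

print(len(sample))  # 30
```

`random.sample` draws without replacement, so no student product is scored twice; recording the seed lets the discipline re-derive exactly which products were pulled.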

Assessing vs. Grading
- A course grade is based on a student’s achievement of the course objectives.
- It is possible for a student to pass a class but not meet a specific course outcome, and vice versa.
- Various assessment techniques can be used in a class that may or may not be part of the course grade.
- Grades are often based on more than learning outcomes (e.g., participation, improvement).
- The grade alone does not identify which component skills the student has mastered, nor does it give the instructor feedback on which skills the class as a whole found difficult. Assessment provides class-level analysis of strengths and weaknesses.

Assessing vs. Grading
- What would we look at to grade this assignment? Which student is not doing well?
- What would we look at for assessment? Which aspect is most problematic?

            Jim   Bob   Sue   Ann
Content      4     3     3     3
Structure    3     3     2     4
Grammar      2     3     2     2
Total        9     9     7     9
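Recomputing the table both ways shows why the two views differ: summing down a column (grading) flags Sue, while averaging across a row (assessment) flags Grammar as the class-wide weakness. A short sketch using the scores above:

```python
# The table above, recomputed both ways: grading looks at each student's
# total; assessment looks at the class mean for each criterion.
scores = {
    "Content":   {"Jim": 4, "Bob": 3, "Sue": 3, "Ann": 3},
    "Structure": {"Jim": 3, "Bob": 3, "Sue": 2, "Ann": 4},
    "Grammar":   {"Jim": 2, "Bob": 3, "Sue": 2, "Ann": 2},
}
students = ["Jim", "Bob", "Sue", "Ann"]

# Grading view: one total per student.
totals = {s: sum(scores[c][s] for c in scores) for s in students}
print(totals)  # {'Jim': 9, 'Bob': 9, 'Sue': 7, 'Ann': 9} -> Sue stands out

# Assessment view: class mean per criterion.
means = {c: sum(row.values()) / len(row) for c, row in scores.items()}
print(means)   # Grammar is lowest (2.25) -> class-wide weakness
```

The same raw scores answer two different questions, which is the point of the slide: a gradebook total cannot tell you that Grammar is the skill the whole class struggled with.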

Sample SLO Assessment English 1A Riverside Community College

English 1A Assessment Pilot
- Chose specific SLOs to assess.
- Developed a rubric.
- Developed a plan to select a representative sample of late-1A writing.
- Met to read and score 115 essays (about 10% of those actually written in the course).
- Interpreted results, returned essays to instructors with comments, and reported back to the discipline on findings.

Elements of the Assessment Rubric
- Effective use of quotation
- Effective use of MLA conventions in citing sources
- Effective control over conventions of written English (including grammar, punctuation, etc.)
- Suitable choice and conception of topic
- Sufficient cognitive engagement with topic

English 1A Assessment Results (N = 115)

                          Little or No   Inadequate    Adequate      Clear
                          Evidence (1)   Evidence (2)  Evidence (3)  Evidence (4)
Use of Quotation
Use of MLA Conventions
Grammar and Punctuation
Quality of Topic
Engagement with Topic

Assessment Results (Simplified): Percentage of sample essays demonstrating clear or adequate evidence of SLO achievement:
- Choice of topic: 74.4
- Grammar, etc.:
- Engagement w/topic: 68.7
- Use of quotation: 57.4
- Use of MLA conventions: 51
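Each percentage above is simply the share of sampled essays scored 3 (adequate) or 4 (clear) on the four-point rubric for that element. A sketch of the computation; the score list is hypothetical, not the actual English 1A data:

```python
# Sketch: how a "clear or adequate" percentage is derived for one rubric
# element, given each sampled essay's 1-4 rating. The ratings below are
# hypothetical, not the actual English 1A results.
def pct_adequate_or_clear(ratings):
    """Percent of ratings at 3 (adequate) or 4 (clear), to one decimal."""
    hits = sum(1 for r in ratings if r >= 3)
    return round(100 * hits / len(ratings), 1)

sample_ratings = [4, 3, 2, 3, 1, 4, 3, 2, 3, 3]  # hypothetical scores
print(pct_adequate_or_clear(sample_ratings))  # 70.0
```

Collapsing the four levels into a pass/concern split like this is what turns a per-essay rubric table into the simplified percentages the discipline discussed.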

Interpretation of Results Choice of Topic: 75% of students sampled chose topics that seemed suitable for late-1A writing. But 25% did not.

Effective Use of Quotation: The 57% success rate is disquieting. Discussion during and after the reading suggested that students were having equal difficulty with the mechanics of quoting, the ability to contextualize and analyze quotation effectively, and the process of deciding when and how much (or little) to quote.

Use of MLA Conventions: Barely half of the essays demonstrated competency in this area. Discussion at the reading speculated that we aren’t spending enough time teaching MLA conventions and quotation methods, or holding students to sufficient standards in our grading practices.

Using the Results to Improve
- As a model for doing course-based assessment, this approach has been modified to assess learning in other English courses.
- The English 1A assessment report was sent to all English 1A instructors, underscoring evidence that we need to teach and assess quotation and MLA usage.
- Workshops on these SLOs in our writing labs.
- Discussion and professional development activities on how to teach to these SLOs in discipline meetings.
- Development of course handbooks to make expectations clearer and provide pedagogical advice to all instructors.

What to Do If Assessment Results Are Disappointing
- Consider alternate teaching methods.
- Brainstorm with colleagues about their methods.
- Reconsider the curriculum itself.
- Reconsider the assessment method; maybe it gave you bad data.

Documenting the Process
- Course-level Assessment Report Form
- Submitted to the SLO Coordinator
- Reviewed by the SLO Committee

The Form

Criteria for Evaluation of Assessment Reports
- Sampling: The sampling methodology is adequately described (i.e., how sampling was done, the number of students and faculty/staff involved out of the total) and is appropriately done.
- Description of Methodology: The methodology is clearly described and contains detailed information about the tools used (e.g., student activity, rubric elements) and how inter-rater reliability was achieved.
- Valid Data: The assessment accurately measures what it was trying to measure.
- Effective Data: The data contribute to the improvement of teaching, learning, or institutional effectiveness.
- Collaborative Review: The data were collaboratively reviewed by members of the discipline/service area.
- Proposal of Improvements: A clear plan is presented for improvement that is based on the data given. The plan includes how results will be shared with others in the discipline/area.

Assessment Results
- Should be discussed at department meetings (documented in minutes)
- Are reported in annual plans and discussed in comprehensive program reviews
- Provide data for planning

Assessment Resources: Books
- Thomas Angelo and Patricia Cross, Classroom Assessment Techniques (Jossey-Bass, 1993)
- C. A. Palomba and Trudy W. Banta, Assessment Essentials (Jossey-Bass, 1999)
- Barbara Walvoord, Assessment Clear and Simple (Jossey-Bass, 2004)

Assessment Resources: Websites (all with useful links)
- The California Assessment Initiative
- The Center for Student Success of the California College Research and Planning (RP) Group
- Janet Fulks’s excellent Bakersfield College website
- North Carolina State’s comprehensive website
- The Riverside CCD Assessment website

Assessment Resources: LAVC
- SLO website
- SLO Coordinator: Rebecca Stein, (818), Office: AH&S 305
- Dean of Research & Planning: Michelle Fowles, (818), Office: Administration Building