
The goal of the Student Learning Outcomes (SLO) process at East Los Angeles College is to develop and implement innovative and effective assessments of our academic and support programs. These assessments will lead to increased student success through the improvement of our basic skills classes, general education courses, transfer programs, and workforce education programs, as demonstrated by our course completion, certificate, graduation, and transfer rates. This SLO process will serve our multicultural community with its educationally diverse needs, and prepare our students for the challenges of the 21st century.

Institutional Outcomes

 A clear statement of what students will be able to know, understand, and do as a result of a learning process.
 Connected to Core Competencies
 MEASURABLE

1. Articulate goals for student learning.
2. Gather evidence about how well students are meeting the goals (and discuss/interpret this evidence).
3. Use this information to improve and to make learning visible.

 Making expectations explicit and public
 Setting appropriate criteria and high standards for learning quality
 Systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards
 Analyzing the evidence of the degree to which expectations for student learning are met
 Planning improvements based on the analysis, using the resulting information to document, explain, and improve performance
 Institutional in scope

Assessment is not:
 Merely data-gathering
 A research project
 Grading
 A new form of faculty evaluation

Why assess?
 To improve.
 To document learning.
 To assist in planning and resource allocation processes.

Assessment types:
 Direct Measure: The student demonstrates achievement of the outcome.
 Indirect Measure: Others report perceptions of how well students have achieved the outcome.
 Quantitative Assessment: Assessment results are summarized in a numerical score.
 Qualitative Assessment: Assessment results are described verbally and may involve rubrics.
 Embedded Assessment: Assessment activities are embedded within courses.
 Value-Added Assessment: Student learning is demonstrated by determining how much students have gained through the learning experience (pre- and post-test).
 Absolute Attainment Assessment: Students exhibit mastery of the learning outcome at an acceptable level.

 Authentic Assessment: The assessment process is similar to or embedded in relevant real-world activities.
 Formative Assessment: Designed to give feedback to improve what is being assessed.
 Summative Assessment: Designed to provide an evaluative summary.

 Cognitive domain: Knowledge
 Psychomotor domain: Skills
 Affective domain: Attitude

 Embedded Questions
 Essay
 Speech
 Portfolios
 Surveys
 Projects

1. The assessment of student learning begins with educational values.
2. Assessment is most effective when it reflects an understanding of learning as multidimensional, integrated, and revealed in performance over time.
3. Assessment works best when the programs it seeks to improve have clear, explicitly stated purposes.

4. Assessment requires attention to outcomes but also, and equally, to the experiences that lead to those outcomes.
5. Assessment works best when it is ongoing, not episodic.
6. Assessment fosters wider improvement when representatives from across the educational community are involved.

7. Assessment makes a difference when it begins with issues of use and illuminates questions that people really care about.
8. Assessment is most likely to lead to improvement when it is part of a larger set of conditions that promote change.
9. Through assessment, educators meet responsibilities to students and to the public.

 A scoring tool that lays out the specific expectations for an assignment or for other assessment purposes.

Information in this presentation is adapted from Introduction to Rubrics by Dannelle D. Stevens and Antonia J. Levi, 2005.

 4 Strong: Clear evidence that the student has achieved the SLO.
 3 Marginal: Acceptable evidence that the student has generally achieved the SLO.
 2 Inadequate: Insufficient evidence that the student has achieved the SLO.
 1 Weak: Little or no evidence that the student has achieved the SLO.

1. Identify what learning outcome(s) you’re assessing (e.g., critical thinking).
2. Identify an assignment that could enable students to demonstrate they’ve achieved that outcome.
3. Describe the best student product you could expect and the criteria you associate with that exemplary product.
4. Do the same with clearly inferior work, and then with marginally acceptable and marginally unacceptable work.
5. Categorize the criteria; a table is useful for that purpose.
6. Test the rubric, ideally asking colleagues who were not involved in its creation to use it, revising as needed to eliminate ambiguities.

Basic Assessment Rubric Structure:

SLO Criteria  | “Emerging” 0 Points | “Competent” 1 Point | “Exemplary” 2 Points | Total
Criterion #1  |                     |                     |                      |
Criterion #2  |                     |                     |                      |
Criterion #3  |                     |                     |                      |
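The scoring arithmetic behind a structure like this is simply a sum of per-criterion points. The following Python sketch is an illustration only; the criterion names come from the table, but the function and variable names are hypothetical, not part of the presentation:

```python
# Point values for the three performance levels in the basic rubric structure.
SCALE = {"Emerging": 0, "Competent": 1, "Exemplary": 2}

def total_score(ratings):
    """Sum point values across criteria; ratings maps criterion -> level name."""
    return sum(SCALE[level] for level in ratings.values())

# Hypothetical ratings for one student across four criteria:
ratings = {
    "Criterion #1": "Exemplary",   # 2 points
    "Criterion #2": "Competent",   # 1 point
    "Criterion #3": "Competent",   # 1 point
    "Criterion #4": "Emerging",    # 0 points
}
print(total_score(ratings), "out of", 2 * len(ratings))  # 4 out of 8
```

Because each criterion is scored independently on the same scale, the maximum possible score is simply the top point value times the number of criteria.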

 Task Description (Outcome)
 Scales (Levels of Performance or Competency)
 Dimensions (Primary Traits of Evaluation/Criteria)
 Performance Descriptors (Qualifying Statements)

Task Description (Outcome)

Dimension (Criteria)   | Scale Level 1     | Scale Level 2     | Scale Level 3
Dimension 1 (Criteria) | Performance Level | Performance Level | Performance Level
Dimension 2 (Criteria) | Performance Level | Performance Level | Performance Level
Dimension 3 (Criteria) | Performance Level | Performance Level | Performance Level
Dimension 4 (Criteria) | Performance Level | Performance Level | Performance Level

 Involves some sort of performance by the student.
 Asks what you expect the student to do with the knowledge gained.
 For example, a course outcome: The student will write a multi-paragraph, in-class essay with an introduction, body paragraphs, and conclusion in response to a reading question.

 Photography 28: Create a single photographic print from a camera original and prepare it for presentation.

The scale describes how well or poorly any given task has been performed. General guidelines:
 Scale descriptors should be tactful but clear.
 Three levels of performance are usually sufficient, at least in the beginning.

 Exemplary, Acceptable, Unacceptable
 Proficient, Developing, Emerging
 Outstanding, Satisfactory, Unsatisfactory
 High, Average, Low
 Excellent, Average, Weak

Dimensions:
 describe the criteria that will be used to evaluate the work that students submit as evidence of their learning;
 can also convey the relative importance of each of the criteria;
 provide students with information on how their work will be evaluated and the relative importance of the skills they need to demonstrate.

For Example: The reading response contains an introduction whose thesis is well-developed and which provides a structure to the essay. The multiple body paragraphs contain topic sentences that are developed with relevant details and examples, use correct English word order, and use a variety of verb tenses and sentence types. The conclusion extends the thesis in some way.

 Photography 10
› All photographic work, including negatives and prints, must be current work produced by the individual student for the course. The negatives should demonstrate proficiency with camera operations, including focus and exposure, and the film should be properly developed with correct darkroom procedures. The prints should demonstrate printing proficiency, including proper exposure, contrast control, and cleanliness. The photofinishing should demonstrate precision.

 This area provides a description of what constitutes each level of performance in the rubric.
 The performance descriptors offer specific feedback on the dimensions of the task.

Criteria: Introduction
 Exemplary: Contains a well-developed thesis statement that outlines the development of the essay.
 Acceptable: Contains a thesis statement; may lack a controlling idea or organizing pattern.
 Unacceptable: Thesis statement may be vague or missing.

Criteria: Body
 Exemplary: Body paragraphs provide clear details that develop the thesis; transitions are used throughout.
 Acceptable: Body paragraphs contain details; use of transitions may be sporadic.
 Unacceptable: Details may be missing, vague, or irrelevant; few transitions are used.

Criteria: Conclusion
 Exemplary: Extends the thesis in some way.
 Acceptable: Restates the thesis but may not offer a concluding question or extension.
 Unacceptable: No conclusion evident; the student stops writing without coming to a conclusion.

Criteria: Language
 Exemplary: Language is consistently clear with few, if any, errors; contains variety in sentence patterns and control of verb tenses.
 Acceptable: Language is comprehensible; errors do not distract the reader; may lack sentence variety; control of verb tenses may be inconsistent.
 Unacceptable: May contain frequent or serious errors that distract the reader; sentence patterns may not vary; control of verb tenses may be weak.
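A rubric grid like this is, in effect, a two-dimensional lookup from (criterion, performance level) to a descriptor, which makes it easy to turn a set of ratings into written feedback. This Python sketch is purely illustrative; the descriptors are abbreviated from the essay rubric above, and the function name is hypothetical:

```python
# A rubric as a nested dictionary: criterion -> performance level -> descriptor.
# Descriptors are shortened versions of those in the essay rubric above.
rubric = {
    "Introduction": {
        "Exemplary": "Well-developed thesis that outlines the essay.",
        "Acceptable": "Contains a thesis; may lack a controlling idea.",
        "Unacceptable": "Thesis may be vague or missing.",
    },
    "Body": {
        "Exemplary": "Clear details develop the thesis; transitions throughout.",
        "Acceptable": "Details present; transitions may be sporadic.",
        "Unacceptable": "Details missing, vague, or irrelevant; few transitions.",
    },
    "Conclusion": {
        "Exemplary": "Extends the thesis in some way.",
        "Acceptable": "Restates the thesis without extension.",
        "Unacceptable": "No conclusion evident.",
    },
    "Language": {
        "Exemplary": "Consistently clear; varied sentences; verb tenses controlled.",
        "Acceptable": "Comprehensible; errors do not distract the reader.",
        "Unacceptable": "Frequent or serious errors distract the reader.",
    },
}

def feedback(ratings):
    """Turn a dict of {criterion: level} into descriptor-based feedback lines."""
    return [f"{c} ({lvl}): {rubric[c][lvl]}" for c, lvl in ratings.items()]

for line in feedback({"Introduction": "Exemplary", "Body": "Acceptable"}):
    print(line)
```

One benefit of treating the rubric as data is that the same descriptors the instructor uses for scoring can be handed back to the student verbatim, which keeps expectations explicit and public.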

Sample Report

 Darryl Kinney, Los Angeles City College Assessment Team
 Arend Flick, Riverside Community College District Assessment Coordinator
 RubiStar home page
 Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics: An Assessment Tool to Save Time, Convey Effective Feedback, and Promote Student Learning. Sterling, VA: Stylus Publishing.

 Thomas Angelo and Patricia Cross, Classroom Assessment Techniques (Jossey-Bass, 1993)
 C. A. Palomba and Trudy W. Banta, Assessment Essentials (Jossey-Bass, 1999)
 Barbara Walvoord, Assessment Clear and Simple (Jossey-Bass, 2004)

 The California Assessment Initiative
 The Center for Student Success of the California College Research and Planning (RP) Group
 Janet Fulks’s excellent Bakersfield College website
 North Carolina State’s comprehensive assessment website
 The Riverside CCD Assessment website

Course SLO assessment targets by semester:
 Fall 2007: 5%-10%
 Spring 2008: 15%-25%
 Fall 2008: 25%-40%
 Spring 2009: 45%-50%

 Have 50% of course SLO assessments developed by the spring accreditation visit
 Have the whole campus engaged
 Spread the word!

 Due December 5th
 Data needed to make the self-study report accurate
 Use the SLO Report Form only

 Based on “What to do with assessment data”
 Friday, October 17, 9 a.m. to 12 p.m. in E7-101
 Assessment results

 Please fill out!
 Thank you!