Measuring Student Learning
March 10, 2015
Cathy Sanders, Director of Assessment

Workshop Learning Outcomes
Upon conclusion of the workshop, participants will be able to:
1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report;
2) develop an analytic rubric;
3) link analytic rubrics and assessment data to Performance Outcomes; and
4) use assessment data to improve student learning.

Elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report
1) Student Learning Outcome
2) Effectiveness Measure (with the scoring rubric)
3) Assessment Methodology
4) Performance Outcome
5) Assessment Data
6) Continuous Improvement Plan

Student Learning Outcomes (SLOs) Are: competencies (i.e., knowledge, skills, abilities, and behaviors) that students will be able to demonstrate as a result of an educational program. SLOs need to be specific and measurable.

Sample SLO That is Not Measurable Because it is Stated Too Broadly: “Upon conclusion, workshop participants will be knowledgeable about student learning outcomes assessment.”

Sample Measurable SLOs
Workshop participants will be able to:
1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report;
2) develop an analytic rubric;
3) link analytic rubrics and assessment data to Performance Outcomes; and
4) use assessment data to improve student learning.

An Effectiveness Measure is: a student artifact (e.g., exam, project, paper, or presentation) that will be used to gauge students’ acquisition of the student learning outcome.

Important Attributes of an Effectiveness Measure
Validity: Does the Effectiveness Measure assess what it is supposed to (i.e., the knowledge, skill, ability, or behavior articulated in the SLO)? Example: An oral presentation would be a valid Effectiveness Measure for assessing students’ oral communication skills.
Reliability: Does the Effectiveness Measure consistently assess the knowledge, skill, ability, or behavior of interest? Analytic rubrics help to facilitate inter-rater reliability.

Sample Effectiveness Measure “Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.”

Assessment Methodology Is: a description of the department’s process for assessing the student learning outcome and reviewing the resulting assessment data. It includes:
1) when and in what course the assessment will be administered;
2) how the student artifact will be evaluated and by whom;
3) the department’s process for collecting, analyzing, and disseminating assessment data to department faculty; and
4) the department’s process for annually reviewing assessment data and deciding what changes/improvements to make.

Sample Assessment Methodology “Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample rubric, and explained the steps in constructing a rubric. Rubrics will be collected by the instructor, and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff, who will identify areas needing improvement and decide what changes to make prior to the next training workshop.”

A Performance Outcome Is: the percentage of students who will demonstrate proficiency on the student learning outcome, and the level of proficiency expected.

Sample Performance Outcome “80% of workshop participants will score ‘Acceptable (2)’ or higher on the Scoring Rubric for Analytic Rubric Activity”

Scoring Rubric Defined A scoring tool that uses a set of evaluation criteria, directly tied to the student learning outcome, to assess student learning. When the content of the rubric is communicated to students before they complete the work, the grading process is clear and transparent to everyone involved. When scoring rubrics are used, Performance Outcomes are to be tied to the rubric scale.

Scoring Rubric for Analytic Rubric Activity
Each criterion is rated on a 3-point scale: 1 = Needs Improvement, 2 = Acceptable, 3 = Accomplished.

Ability to develop a rubric scale
1 = Is unable to develop a minimum of a 3-point scale that progresses from least proficient to most proficient.
2 = Is able to develop a minimum of a 3-point scale that progresses from least proficient to most proficient.
3 = Is able to develop a scale greater than 3 points that progresses from least proficient to most proficient.

Ability to develop evaluation criteria
1 = Is unable to identify at least 3 relevant evaluation criteria.
2 = Is able to identify at least 3 relevant evaluation criteria.
3 = Is able to identify more than 3 relevant evaluation criteria.

Ability to develop cell descriptors
1 = Cell descriptors are missing for some evaluation criteria, and many are not sufficiently detailed to differentiate between the levels of proficiency.
2 = Cell descriptors are present for all evaluation criteria, but some do not clearly differentiate between the levels of proficiency.
3 = Cell descriptors are present for all evaluation criteria and clearly differentiate between the levels of proficiency.

Steps in Constructing an Analytic Scoring Rubric
1. Determine what kind of rubric scale to use (e.g., 3-point, 5-point).
2. Label each level of proficiency across the top of the rubric (e.g., “(1) Needs Improvement,” “(2) Acceptable,” “(3) Accomplished”).
3. List evaluation criteria down the left side of the rubric template.
4. Write cell descriptors of what the highest level of proficiency, “(3) Accomplished,” looks like for each criterion.
5. Write cell descriptors of what the lowest level of proficiency, “(1) Needs Improvement,” looks like for each criterion.
6. Write cell descriptors of what the middle levels of proficiency look like for each criterion.
7. Test it: use the rubric to evaluate student artifacts. Make note of important evaluation criteria that were omitted, existing criteria to eliminate, and cell descriptors that need greater specificity.
8. Revise the rubric to reflect the desired changes.
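For departments that keep rubrics in electronic form, the structure produced by these steps (a scale, a set of criteria, and a cell descriptor for each criterion at each level) can be captured as simple data. The Python sketch below is only an illustration of that idea; the class names and the sample “Organization” criterion are invented here and are not part of UNC Charlotte’s process.

```python
# Illustrative sketch only: representing an analytic rubric as data.
# The class names and the sample criterion below are hypothetical.
from dataclasses import dataclass


@dataclass
class Criterion:
    name: str
    descriptors: dict[int, str]   # scale level -> cell descriptor


@dataclass
class Rubric:
    title: str
    scale: dict[int, str]         # scale level -> proficiency label
    criteria: list[Criterion]


oral_presentation_rubric = Rubric(
    title="Oral Presentation Rubric",
    scale={1: "Needs Improvement", 2: "Acceptable", 3: "Accomplished"},
    criteria=[
        Criterion(
            name="Organization",
            descriptors={
                1: "Ideas are presented in no discernible order.",
                2: "Most ideas follow a logical order.",
                3: "All ideas follow a clear, logical order with smooth transitions.",
            },
        ),
        # At least two more criteria (e.g., delivery, use of evidence) would be
        # added to satisfy the workshop's minimum of 3 evaluation criteria.
    ],
)
```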

Analytic Rubric Activity Using the blank rubric template provided, develop an Oral Presentation Rubric with a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

Tying it All Together Into a Student Learning Outcomes Assessment Plan

SLO Assessment Plan
Student Learning Outcome: Workshop participants will be able to: 1) identify and define the elements of UNC Charlotte’s Student Learning Outcomes Assessment Plan and Report; 2) develop an analytic rubric; 3) link analytic rubrics and assessment data to Performance Outcomes; and 4) use assessment data to improve student learning.
Effectiveness Measure: Workshop participants will develop an analytic rubric to demonstrate their ability to apply their new knowledge of analytic rubrics acquired in the workshop. The analytic rubric will model rubrics used by faculty to evaluate student artifacts and will include a minimum of a 3-point rating scale and 3 evaluation criteria with cell descriptors for each criterion for each level of competency.

SLO Assessment Plan (continued)
Assessment Methodology: Workshop participants will develop an analytic rubric after the instructor has defined a rubric, shared a sample analytic rubric, and explained the steps in constructing a rubric. Rubrics will be collected by the instructor, and a Scoring Rubric for Analytic Rubric Activity will be used to assess workshop participants’ ability to develop a rubric. The instructor will summarize and disseminate the assessment findings to OAA staff, who will decide what changes to make prior to the next training workshop.
Performance Outcome: 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.

Reporting Assessment Data

Report assessment data in the same way that the Performance Outcome is stated.
Performance Outcome (PO): 80% of workshop participants will score “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.
Assessment Data: 89% of workshop participants scored “Acceptable (2)” or higher on the Scoring Rubric for Analytic Rubric Activity.
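As a worked example of the arithmetic behind the figures above, the short Python sketch below computes the percentage of participants scoring “Acceptable (2)” or higher and compares it with the 80% Performance Outcome. The list of scores is invented sample data (chosen so the result matches the 89% reported above); only the 80% target and the 2 = “Acceptable” threshold come from the example.

```python
# Illustrative sketch: checking assessment data against a Performance Outcome.
# The participant scores are hypothetical sample data; the 80% target and the
# "Acceptable (2)" threshold come from the example Performance Outcome above.

def percent_meeting_threshold(scores, threshold=2):
    """Return the percentage of rubric scores at or above the threshold."""
    meeting = sum(1 for score in scores if score >= threshold)
    return 100 * meeting / len(scores)


participant_scores = [3, 2, 2, 1, 3, 2, 3, 2, 3]   # ratings from the scoring rubric
achieved = percent_meeting_threshold(participant_scores)

print(f"{achieved:.0f}% of workshop participants scored 'Acceptable (2)' or higher")
print("Performance Outcome met" if achieved >= 80 else "Performance Outcome not met")
```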

Continuous Improvement Plan
During the annual review of assessment data, department faculty will:
1. determine whether students are meeting the Performance Outcome(s) for each Student Learning Outcome;
2. identify changes that are needed to improve future student learning;
3. develop the department’s continuous improvement plan for the upcoming year; and
4. the following year, document in the Student Learning Outcomes Assessment Plan and Report whether the changes made improved student learning.

Changes to Consider When Students Are Not Meeting Performance Outcomes
1. Change the assessment instrument
- Revise the assessment instrument (i.e., test questions or project requirements)
- Change to an assessment instrument that measures deeper learning (e.g., from test questions to a written paper)
- Revise or add rubric evaluation criteria
2. Change the assessment methodology
- Change what you are assessing (i.e., revise the SLO statement)
- Change when SLOs are assessed (e.g., junior year vs. senior year)
- Change how the assessment is administered (e.g., a videotaped oral presentation instead of a live one, so that students can watch the recording afterward to self-assess)

Changes to Consider When Students Are Not Meeting Performance Outcomes (continued)
3. Change the curriculum
- Revise courses to provide additional coverage in areas where students did not perform well
- Revise the curriculum to increase students’ exposure to SLOs by introducing, reinforcing, and emphasizing competencies throughout the curriculum
- Schedule an appointment with a CTL curriculum design specialist
4. Change the pedagogy
- Incorporate more active learning that provides students with opportunities to apply what they are learning
- Incorporate mini-assessments throughout the semester to gauge earlier whether students are grasping the material, and adjust accordingly
- Incorporate active learning opportunities for students to share/teach the material they are learning in your classroom

Questions/Discussion