The Role of Calibration in Advancing Faculty Learning About Student Learning
Terry Rhodes & Ashley Finley, AAC&U Institute on Integrative Learning.

Presentation transcript:

The Role of Calibration in Advancing Faculty Learning About Student Learning
Terry Rhodes & Ashley Finley
AAC&U Institute on Integrative Learning and the Departments, Portland, OR
"Our" Common Ground:

The Anatomy of a VALUE Rubric: Criteria, Levels, Performance Descriptors

The Calibration Training Process

Scoring Steps:
1. Review the rubric to familiarize yourself with its structure, language, and performance levels.
2. Ask questions about the rubric for clarification, or to get input from others regarding interpretation.
3. Read the student work sample.
4. Connect specific points of evidence in the work sample with each criterion at the appropriate performance level (if applicable).

Calibration Steps:
1. Review scores.
2. Determine the common score(s).
3. Hear from outliers.
4. Discuss.
5. Determine the final score.
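The calibration steps above can be sketched as a small routine: tally the panel's scores for one criterion, take the most common score, and flag the scorers whose ratings differ so the group knows whose reasoning to hear before settling on a final score. The panel names and scores below are hypothetical, purely for illustration.

```python
from collections import Counter

def calibrate(scores):
    """Summarize a panel's rubric scores for one criterion.

    Returns the modal (most common) score and the list of scorers
    whose ratings fall outside it -- the "outliers" the group
    should hear from before agreeing on a final score.
    """
    tally = Counter(scores.values())
    common_score, _ = tally.most_common(1)[0]
    outliers = [name for name, s in scores.items() if s != common_score]
    return common_score, outliers

# Hypothetical panel scoring one criterion on the 0-4 VALUE scale
panel = {"Reader A": 3, "Reader B": 3, "Reader C": 2, "Reader D": 3}
common, outliers = calibrate(panel)
print(common)    # 3
print(outliers)  # ['Reader C']
```

In practice the modal score is only a starting point for discussion, not an automatic decision: the point of calibration is the conversation with the outlying readers.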

The Ground Rules
1. This is not grading.
2. Think globally about student work and about the learning skill; think beyond specific disciplinary lenses or content.
3. We are not changing the rubric (today).
4. Our work is time sensitive. Go with your instinct.
5. Start with 4 and work backwards.
6. Pick one performance benchmark per criterion. Avoid ".5".
7. Zero does exist. Assign "0" if the work does not meet the benchmark (cell one) performance level.
8. N/A exists. Assign "not applicable" if the student work is not intended to meet a particular criterion.

Signature Assignments
1. The assignment should enable attainment of the criteria.
2. Break down the criteria to determine the key components for the assignment.
3. What should students do with the content to meet the criteria? E.g., what are the pieces to be analyzed, compared, integrated?
4. Will the assignment be used for more than one outcome?
5. What types of assignments will be most helpful for allowing students to demonstrate competency?

Example of Process (from Carroll Community College): sequential steps in the request, submission, and scoring of student artifacts for Learning Goal 4, Information and Technology Literacy.

Step 1: All Gen Ed courses reported as addressing and assessing Information Technology Literacy were identified as potential courses from which to request artifacts (54 courses).
Step 2: Of the courses identified, approximately 20% were randomly selected for the sample (10 courses, 36 total sections).
Step 3: Within each selected course, 2 students were randomly selected by roster number to submit artifacts (74 artifacts).
Step 4: At the start of the semester, department chairs were notified of the courses from which artifacts were to be requested; chairs worked with individual faculty to fulfill the request.
Step 5: Artifacts were submitted to the Director of Learning Outcomes for scoring (66 artifacts).
Step 6: The faculty scoring team met at the close of the spring semester for a norming session and scoring (62 artifacts).
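The two-stage sampling in Steps 2 and 3 (a random fraction of eligible courses, then a fixed number of students per selected course) can be sketched as follows. The course pool, roster sizes, and ID formats are invented for illustration; only the 20% fraction and the two-students-per-course rule come from the example above.

```python
import random

def draw_sample(courses, course_fraction=0.20, students_per_course=2, seed=None):
    """Two-stage random sample for artifact collection.

    `courses` maps a course ID to its roster (a list of student IDs).
    Stage 1 picks a random fraction of the courses; stage 2 picks a
    fixed number of students from each selected course's roster.
    """
    rng = random.Random(seed)
    n_courses = max(1, round(len(courses) * course_fraction))
    selected = rng.sample(sorted(courses), n_courses)
    requests = {}
    for course in selected:
        roster = courses[course]
        k = min(students_per_course, len(roster))
        requests[course] = rng.sample(roster, k)
    return requests

# Hypothetical pool: 54 Gen Ed courses, each with a 25-student roster
pool = {f"GE{i:03d}": [f"s{i}-{j:02d}" for j in range(25)] for i in range(54)}
sample = draw_sample(pool, seed=1)
print(len(sample))  # 11 (about 20% of 54 courses)
```

Seeding the generator makes the draw reproducible, which is useful when the sampling plan has to be documented for an assessment report.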

Campus Example of Outcomes Assessment Using Rubric Data (from: UNC-Wilmington, Critical Thinking Rubric)

Dimension                            | % of students who scored 2 or higher | % of students who scored 3 or higher
Explanation of Issues                |                                      |
Interpreting & Analysis              |                                      |
Influence of Context and Assumptions |                                      |
Student's Position                   |                                      |
Conclusions and Related Outcomes     |                                      |

Building the Evidentiary Base: University of Kansas
[Chart: percent of ratings] Critical Thinking: Issues, Analysis, and Conclusions
Inter-rater reliability > .8
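The slide reports inter-rater reliability above .8 without naming the statistic. One common choice for two raters on a categorical scale is Cohen's kappa, which corrects raw agreement for agreement expected by chance; the implementation and the ten hypothetical ratings below are purely illustrative, not the Kansas data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: two-rater agreement corrected for chance.

    observed = fraction of artifacts where the raters agree;
    expected = agreement predicted from each rater's marginal
    score distribution if they rated independently.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in categories) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of ten artifacts on the 0-4 VALUE scale
a = [3, 3, 2, 4, 1, 3, 2, 2, 4, 3]
b = [3, 3, 2, 4, 1, 3, 2, 3, 4, 3]
print(round(cohens_kappa(a, b), 2))  # 0.86
```

A kappa above .8 is conventionally read as strong agreement, which is consistent with the threshold the slide reports.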

Building the Evidentiary Base: University of Kansas
[Chart: percent of ratings] Critical Thinking: Evaluation of Sources and Evidence

Building the Evidentiary Base: University of Kansas
[Chart: percent of ratings] "VALUE added" over 4 years: writing