Steps for Scoring Work: VALUE Rubric Calibration Event, Florida State University, March 18-19, 2015

From Creation to Capture: How to Gauge Impact

VALUE Project: 16 national rubrics, created to:
- Develop shared understanding of common learning outcomes
- Improve direct assessment of student learning (in text and non-text formats)
- Encourage transparency and student self-evaluation of learning

Rubric Development & Use
- National Advisory Panel (12 people)
- 16 interdisciplinary/inter-institutional teams of faculty and scholars (over 100)
- Reviewed existing rubrics to develop broad agreement on dimensions of outcomes (openedpractices.org)
- Tested in 2-4 waves on over 100 campuses
- National reliability studies
- To date accessed by over 5,661 institutions/organizations and 32,729 individuals
- Domestic and international, K-12, state university systems
- 3 consortia: RAILS, Connect2Learning, South Metropolitan Higher Education Consortium
- Approved for use in the Voluntary System of Accountability (VSA)

List of VALUE Rubrics

Knowledge of Human Cultures & the Physical & Natural Worlds (content areas): no rubrics

Intellectual and Practical Skills:
- Inquiry & Analysis
- Critical Thinking
- Creative Thinking
- Written Communication
- Oral Communication
- Reading
- Quantitative Literacy
- Information Literacy
- Teamwork
- Problem Solving

Personal & Social Responsibility:
- Civic Knowledge & Engagement
- Intercultural Knowledge & Competence
- Ethical Reasoning
- Foundations & Skills for Lifelong Learning
- Global Learning

Integrative & Applied Learning:
- Integrative & Applied Learning

Review of the VALUE Rubric

Anatomy of a rubric:
- Criteria
- Levels
- Performance Descriptors

The Ground Rules

1. We are not changing the rubric (today). Really.
2. This is not grading. Grading focuses on content and often a mixture of learning skills; this is also why the assignment prompt is not given to scorers.
3. Think globally about the student work and about the learning skill. Think beyond specific disciplinary lenses or content, and assume the content is correct.
4. For each dimension, connect specific places in the work sample with the assigned score. This is critical for helping colleagues understand your rationale.
5. Start with 4 and work backwards. The rubric was specifically designed to begin with the highest level of complexity of thinking and work down from there.
6. Pick one performance benchmark per dimension; avoid ".5". Discussion provides the opportunity to admit you were on the fence.
7. Zero does exist, and so does N/A, but they mean different things. Assign "0" if the work does not meet benchmark (cell one) performance. Assign "N/A" if the student work was not intended to meet a particular criterion.
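The scoring scheme in these ground rules can be summarized in a minimal sketch. The function name and structure below are illustrative, not from the slides; the sketch only encodes the stated rules: one whole-number benchmark (1-4) per dimension, "0" for work below benchmark, "N/A" for work not intended to meet the criterion, and no half scores.

```python
# Legal calibration scores for one rubric dimension, per the ground rules:
# whole-number benchmarks 1-4, 0 for below-benchmark work, and "N/A" when
# the work was not intended to meet the criterion.
ALLOWED_SCORES = {0, 1, 2, 3, 4, "N/A"}

def validate_score(score):
    """Return True if `score` is a legal single-dimension score."""
    return score in ALLOWED_SCORES

# The rules forbid split scores: pick one benchmark, avoid ".5".
assert validate_score(4)
assert validate_score("N/A")
assert not validate_score(2.5)
```

The key design point is that 0 and N/A are distinct values rather than being collapsed together, since they record different judgments about the work.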

Calibration Process

- Discuss the first page of the rubric: any questions regarding clarity or interpretation?
- Discuss each dimension of the rubric: any questions or issues regarding clarity or interpretation of a particular dimension?
- Read the work sample.
- Score all rubric dimensions.
- Report out scores, starting with those who assigned a "4", then "3", "2", "1", zero, and N/A.
- Discuss the assigned scores. Be sure to connect responses with specific instances in the work sample that help justify the score.
- Does anyone want to change their score?
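The report-out step above can be sketched as a small grouping routine. The function and the scorer names are hypothetical illustrations, assuming only what the slide states: scores are reported for one dimension at a time, starting with those who assigned a "4" and working down to zero and N/A.

```python
from collections import defaultdict

# Report-out order stated in the process: 4, 3, 2, 1, 0, then N/A.
REPORT_ORDER = [4, 3, 2, 1, 0, "N/A"]

def report_out_order(scores):
    """Group scorers by the score they assigned to one rubric dimension,
    returned in report-out order. `scores` maps scorer name -> score."""
    groups = defaultdict(list)
    for scorer, score in scores.items():
        groups[score].append(scorer)
    # Keep only score levels someone actually assigned.
    return [(s, sorted(groups[s])) for s in REPORT_ORDER if s in groups]

example = {"Pat": 4, "Lee": 3, "Kim": 3, "Ash": "N/A"}
# report_out_order(example) -> [(4, ['Pat']), (3, ['Kim', 'Lee']), ('N/A', ['Ash'])]
```

Grouping by score before discussion makes disagreement visible per dimension, which is the point of the calibration conversation.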

Questions/Issues to Raise?