Multi-State Collaborative To Advance Quality Student Learning

Similar presentations
Global Learning Outcomes at Pensacola State College (GLOs)

Students MAKE LEARNING VISIBLE. Part II A permanent record of their work that can be useful to them in later studies, jobs, etc. A record that allows.
Multi-State Collaborative to Advance Learning Outcome Assessment Pilot Project and PCC.
Pilot Study Introduction and Overview 1.  Julie Carnahan Senior Associate - SHEEO MSC Project Director  Terrel Rhodes Vice President.
A Quantitative Literacy Assessment Rubric Development & Lessons Learned - Stuart Boersma, Central Washington Univ. - Caren Diefenderfer, Hollins University.
How to Create a Rubric Presented by the ORIE Team How to Create a Rubric.
Educational Outcomes: The Role of Competencies and The Importance of Assessment.
Feeling LO? Making Connections Through Curriculum Mapping Assessment Workshop Fall 2015 Swarup Wood Professor of Chemistry California State University.
Criteria: Exemplary (4–5), Good (2–3), Needs Improvement (0–1). Identifying Problem and Main Objective, Initial Questions: Questions are probing and help.
January 29, 2010ART Beach Retreat ART Beach Retreat 2010 Assessment Rubric for Critical Thinking First Scoring Session Summary ART Beach Retreat.
Assessing General Education Workshop for College of the Redwoods Fred Trapp August 18, 2008.
Program-level Assessment With Faculty Learning Communities: The Wisdom Of Well-managed, Small Crowds Amy Liu, Mary Maguire, Lynn Tashiro Sacramento State.
Eloise Forster, Ed.D. Foundation for Educational Administration (FEA)
VALUE/Multi-State Collaborative (MSC) to Advance Learning Outcomes Assessment Pilot Year Study Findings and Summary These slides summarize results from.
Information Literacy Module for Majors Available to support any department Tony Penny, Research Librarian – Goddard Library Supporting the Architecture.
DQP in Context Eastern Oregon University. “You don’t bring about major system change by attacking the core. You build up the edges and show what the edge.
Elise Martin, Dean of Assessment, Middlesex Community College Cathy Pride, Assistant Professor, Psychology, Middlesex Community College Alice Frye, Lecturer,
Laboratory Science and Quantitative Core Requirements.
Quantitative Literacy at WOU. LEAP In November, Faculty Senate recommended that WOU adopt LEAP's (Liberal Education & America's Promise) learning outcomes.
Common Core.  Find your group assignment.  As a group, read over the descriptors for mastery of this standard. (The writing standards apply to more.
Critical Information Literacy
EVALUATING EPP-CREATED ASSESSMENTS
Quality Assurance processes
Evaluation Requirements for MSP and Characteristics of Designs to Estimate Impacts with Confidence Ellen Bobronnikov March 23, 2011.
CRITICAL CORE: Straight Talk.
How to Research Lynn W Zimmerman, PhD.
Beyond the “A” Word Assessment that Empowers Faculty to
Understanding Standards
Phyllis Lynch, PhD Director, Instruction, Assessment and Curriculum
Warm Up: How do you communicate?
Computational Reasoning in High School Science and Math
Director of Policy Analysis and Research
Lab Roles and Lab Report
Tool WE-1: Mathematics in the world of work
Internal assessment criteria
General Education Assessment
Critical Thinking.
Academic Rubric Slides
Demonstration Year Results ( ) Rev. 2/3/2017
Refinement Year Results ( ) – 1/19/2018
Accreditation Support for Teachers
Parent Involvement Committee EQAO Presentation
Multi-State Collaborative (MSC) to Advance Learning Outcomes Assessment Pilot Year Study Findings and Summary These slides summarize results from.
Purposeful Pathways: Designing Relevant and Transparent HIPs
Using VALUE Rubrics to Assess Almost Any Program Outcome
Aqua Training Webinar for Refinement Year Participants
LAW112 Assessment 3 Haley McEwen.
Quantitative Literacy at WOU
Jillian Kinzie, Indiana University Center for Postsecondary Research
How to Effectively and Efficiently Conduct and Use Annual Assessment
Refinement Year Results ( ) – 3/6/2018
Shazna Buksh, School of Social Sciences
A LEVEL Paper Three– Section A
Assessing Academic Programs at IPFW
Standard for Teachers’ Professional Development July 2016
Setting Writing Goals in Science The Living Environment
Deconstructing Standard 2a Dr. Julie Reffel Valdosta State University
All-Faculty General Education Meeting Jan. 10, 2019
Student Learning Outcomes at CSUDH
Tool WE-1: World of work tasks in mathematics
Assessment Spring 2016.
AP U.S. History Exam Details
Curriculum Coordinator: Patrick LaPierre February 3, 2017
S-STEM (NSF ) NSF Scholarships for Science, Technology, Engineering, & Mathematics Information Materials 6 Welcome! This is the seventh in a series.
Tia McNair.
Presentation transcript:

Multi-State Collaborative To Advance Quality Student Learning In partnership with:

Today’s Speakers & Moderator
Julie Carnahan
Terrel Rhodes
Catherine Wehlburg (Moderator)
This webinar is brought to you by

Purpose of Today
- Preview “30,000 foot” initial results from the Multi-State Collaborative Demonstration Year using the LEAP VALUE Rubrics to inform teaching and learning.
- Provide examples of the detailed results to be released with the full report in January 2017.

These slides summarize results from the demonstration study, in which 48 institutions in twelve states used common rubrics to assess more than 8,000 student work products. The sample of student work represented only near-graduation students at the participating institutions in the twelve states; the results are therefore not generalizable to all students in each participating state or nationwide.

VALUE Rubric Approach: Assumptions
- Learning is a process that occurs over time.
- Student work is the best representation of motivated learning.
- Focus on what the student does in key learning outcomes.
- Rely on faculty & educator expert judgment.
- Results are useful & actionable for learning (& accountability).

The Current VALUE Initiative
Minnesota Collaborative, Great Lakes Colleges Association, Multi-State Collaborative
Purpose:
- Sea change in assessment
- Reliability
- Validity
- Local value
- Policy debate = learning

The Multi-State Collaborative
- States committed to the importance of learning outcomes and the quality of a degree
- Mindful of students’ contributions to the states in which they live
- Respectful that teaching & learning are the prerogative of faculty
- Focus is on improving student learning, not on ranking states or institutions

The MSC Challenge: Scaling Direct Assessment (8,308 pieces of student work)

Demonstration Year: Taking the vision to scale from 9 to 12 states
- Steering Committee: a point person from each state, plus representatives from SHEEO & AAC&U
- Institution Point Persons: from each campus in each state
[Map: ME, MN, MA, OR, RI, CT, IN, UT, KY, MO, TX, HI]

Demonstration Year: Taking the vision to scale from 9 to 12 states
Goals:
- Root assessment of learning in authentic work & the expertise of faculty
- Establish benchmarks for essential learning outcomes
- Develop transparency of shared standards of learning to assist with transfer
[Map: ME, MN, MA, OR, RI, CT, IN, UT, KY, MO, TX, HI]

Multi-State Collaborative To Advance Learning Outcomes Assessment: Preview of Demonstration Year (2016) Results

MSC Demonstration Year by the Numbers
48 public institutions uploaded artifacts. By sector: 29 four-year (including 8 research institutions) and 19 two-year.
[Map: ME, MN, MA, OR, RI, CT, IN, UT, KY, MO, TX, HI]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

MSC Demonstration Year by the Numbers
8,308 pieces of student work were submitted; 1,886 assignments were submitted.*
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

MSC Demonstration Year, Profile of VALUE Scorers: Artifacts Scored Per Outcome [chart]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

MSC Demonstration Year, Profile of VALUE Scorers: Scorers by Discipline and/or Institutional Role [chart]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

Critical Thinking Rubric Dimensions (Capstone = 4; Milestones = 3, 2; Benchmark = 1)

Explanation of issues
- 4: Issue/problem to be considered critically is stated clearly and described comprehensively, delivering all relevant information necessary for full understanding.
- 3: Issue/problem is stated, described, and clarified so that understanding is not seriously impeded by omissions.
- 2: Issue/problem is stated, but description leaves some terms undefined, ambiguities unexplored, boundaries undetermined, and/or backgrounds unknown.
- 1: Issue/problem is stated without clarification or description.

Evidence (selecting and using information to investigate a point of view or conclusion)
- 4: Information is taken from source(s) with enough interpretation/evaluation to develop a comprehensive analysis or synthesis. Viewpoints of experts are questioned thoroughly.
- 3: Information is taken from source(s) with enough interpretation/evaluation to develop a coherent analysis or synthesis. Viewpoints of experts are subject to questioning.
- 2: Information is taken from source(s) with some interpretation/evaluation, but not enough to develop a coherent analysis or synthesis. Viewpoints of experts are taken as mostly fact, with little questioning.
- 1: Information is taken from source(s) without any interpretation/evaluation. Viewpoints of experts are taken as fact, without question.

Influence of context and assumptions
- 4: Thoroughly (systematically and methodically) analyzes own and others' assumptions and carefully evaluates the relevance of contexts when presenting a position.
- 3: Identifies own and others' assumptions and several relevant contexts when presenting a position.
- 2: Questions some assumptions. Identifies several relevant contexts when presenting a position. May be more aware of others' assumptions than one's own (or vice versa).
- 1: Shows an emerging awareness of present assumptions (sometimes labels assertions as assumptions). Begins to identify some contexts when presenting a position.

Student's position (perspective, thesis/hypothesis)
- 4: Specific position is imaginative, taking into account the complexities of an issue. Limits of position are acknowledged. Others' points of view are synthesized within position.
- 3: Specific position takes into account the complexities of an issue. Others' points of view are acknowledged within position.
- 2: Specific position acknowledges different sides of an issue.
- 1: Specific position is stated, but is simplistic and obvious.

Conclusions and related outcomes (implications and consequences)
- 4: Conclusions and related outcomes are logical and reflect the student's informed evaluation and ability to place evidence and perspectives discussed in priority order.
- 3: Conclusion is logically tied to a range of information, including opposing viewpoints; related outcomes are identified clearly.
- 2: Conclusion is logically tied to information (because information is chosen to fit the desired conclusion); some related outcomes are identified clearly.
- 1: Conclusion is inconsistently tied to some of the information discussed; related outcomes are oversimplified.
For full text of AAC&U VALUE Rubric for Critical Thinking, see: https://www.aacu.org/value/rubrics/critical-thinking.
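A rubric like the one above is essentially structured data: named dimensions, each scored at one of four levels. Below is a minimal sketch of one way to represent it and validate a scorer's ratings; the dimension names come from the rubric above, while the helper function and its summary are illustrative assumptions, not the MSC's actual scoring tooling.

```python
# A minimal sketch of representing a VALUE-style rubric and recording scores.
from statistics import mean

# Dimension names taken from the Critical Thinking rubric above.
CRITICAL_THINKING_DIMENSIONS = [
    "Explanation of issues",
    "Evidence",
    "Influence of context and assumptions",
    "Student's position",
    "Conclusions and related outcomes",
]

# Levels: 4 = Capstone, 3/2 = Milestones, 1 = Benchmark.
VALID_LEVELS = {1, 2, 3, 4}

def score_artifact(scores: dict[str, int]) -> dict[str, float]:
    """Validate one scorer's ratings for one artifact and summarize them."""
    for dim, level in scores.items():
        if dim not in CRITICAL_THINKING_DIMENSIONS:
            raise ValueError(f"Unknown rubric dimension: {dim}")
        if level not in VALID_LEVELS:
            raise ValueError(f"Score for {dim!r} must be 1-4, got {level}")
    return {"mean": mean(scores.values()), "n_dimensions": len(scores)}

# Example: hypothetical ratings for a single piece of student work.
ratings = {
    "Explanation of issues": 3,
    "Evidence": 2,
    "Influence of context and assumptions": 2,
    "Student's position": 3,
    "Conclusions and related outcomes": 2,
}
print(score_artifact(ratings))  # {'mean': 2.4, 'n_dimensions': 5}
```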

These results are not generalizable across participating states or the nation in any way. Please use appropriately.

Quantitative Literacy Rubric Dimensions (Capstone = 4; Milestones = 3, 2; Benchmark = 1)

Interpretation (ability to explain information presented in mathematical forms, e.g., equations, graphs, diagrams, tables, words)
- 4: Provides accurate explanations of information presented in mathematical forms. Makes appropriate inferences based on that information. For example, accurately explains the trend data shown in a graph and makes reasonable predictions regarding what the data suggest about future events.
- 3: Provides accurate explanations of information presented in mathematical forms. For instance, accurately explains the trend data shown in a graph.
- 2: Provides somewhat accurate explanations of information presented in mathematical forms, but occasionally makes minor errors related to computations or units. For instance, accurately explains trend data shown in a graph, but may miscalculate the slope of the trend line.
- 1: Attempts to explain information presented in mathematical forms, but draws incorrect conclusions about what the information means. For example, attempts to explain the trend data shown in a graph, but will frequently misinterpret the nature of that trend, perhaps by confusing positive and negative trends.

Representation (ability to convert relevant information into various mathematical forms, e.g., equations, graphs, diagrams, tables, words)
- 4: Skillfully converts relevant information into an insightful mathematical portrayal in a way that contributes to a further or deeper understanding.
- 3: Competently converts relevant information into an appropriate and desired mathematical portrayal.
- 2: Completes conversion of information but resulting mathematical portrayal is only partially appropriate or accurate.
- 1: Completes conversion of information but resulting mathematical portrayal is inappropriate or inaccurate.

Calculation
- 4: Calculations attempted are essentially all successful and sufficiently comprehensive to solve the problem. Calculations are also presented elegantly (clearly, concisely, etc.).
- 3: Calculations attempted are essentially all successful and sufficiently comprehensive to solve the problem.
- 2: Calculations attempted are either unsuccessful or represent only a portion of the calculations required to comprehensively solve the problem.
- 1: Calculations are attempted but are both unsuccessful and not comprehensive.

Application/Analysis (ability to make judgments and draw appropriate conclusions based on the quantitative analysis of data, while recognizing the limits of this analysis)
- 4: Uses the quantitative analysis of data as the basis for deep and thoughtful judgments, drawing insightful, carefully qualified conclusions from this work.
- 3: Uses the quantitative analysis of data as the basis for competent judgments, drawing reasonable and appropriately qualified conclusions from this work.
- 2: Uses the quantitative analysis of data as the basis for workmanlike (without inspiration or nuance, ordinary) judgments, drawing plausible conclusions from this work.
- 1: Uses the quantitative analysis of data as the basis for tentative, basic judgments, although is hesitant or uncertain about drawing conclusions from this work.

Assumptions (ability to make and evaluate important assumptions in estimation, modeling, and data analysis)
- 4: Explicitly describes assumptions and provides compelling rationale for why each assumption is appropriate. Shows awareness that confidence in final conclusions is limited by the accuracy of the assumptions.
- 3: Explicitly describes assumptions and provides compelling rationale for why assumptions are appropriate.
- 2: Explicitly describes assumptions.
- 1: Attempts to describe assumptions.

Communication (expressing quantitative evidence in support of the argument or purpose of the work, in terms of what evidence is used and how it is formatted, presented, and contextualized)
- 4: Uses quantitative information in connection with the argument or purpose of the work, presents it in an effective format, and explicates it with consistently high quality.
- 3: Uses quantitative information in connection with the argument or purpose of the work, though data may be presented in a less than completely effective format or some parts of the explication may be uneven.
- 2: Uses quantitative information, but does not effectively connect it to the argument or purpose of the work.
- 1: Presents an argument for which quantitative evidence is pertinent, but does not provide adequate explicit numerical support. (May use quasi-quantitative words such as "many," "few," "increasing," "small," and the like in place of actual quantities.)

For full text of AAC&U VALUE Rubric for Quantitative Literacy, see: https://www.aacu.org/value/rubrics/quantitative-literacy

These results are not generalizable across participating states or the nation in any way. Please use appropriately.

Questions?

Potential to Disaggregate by Demographic Characteristics

Critical Thinking Scores by Race [chart: 2-year and 4-year institutions; Asian, Black, Hispanic, White]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

Critical Thinking Scores by Pell Eligibility [chart: 2-year and 4-year institutions; Pell-eligible vs. not eligible]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.
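The two charts above disaggregate rubric scores by demographic group and sector. Below is a minimal sketch of that kind of breakdown, assuming a hypothetical flat file of scores; the column names (sector, race, pell_eligible, score) and the sample values are illustrative, not the MSC's actual data dictionary.

```python
# A minimal sketch of disaggregating rubric scores by demographic group.
import pandas as pd

# Hypothetical dimension-level scores with demographic fields.
scores = pd.DataFrame({
    "sector":        ["2-year", "2-year", "4-year", "4-year", "4-year"],
    "race":          ["Asian", "White", "Black", "Hispanic", "White"],
    "pell_eligible": [True, False, True, True, False],
    "score":         [2, 3, 2, 3, 4],   # 1-4 rubric levels
})

# Mean critical thinking score by sector and race.
print(scores.groupby(["sector", "race"])["score"].mean())

# Mean score by sector and Pell eligibility.
print(scores.groupby(["sector", "pell_eligible"])["score"].mean())
```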

State Level Results: Potential to Inform State Level Policy
- Transfer & articulation
- Equity
- Increase resources to support professional development
- Inform policy leaders about the learning outcomes students in the state are demonstrating

MSC Criterion State-Level Score Distribution [chart]
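A minimal sketch of computing a score distribution per state, of the kind the chart above presents, assuming a hypothetical table with one row per scored artifact; the state and level fields are illustrative stand-ins for the MSC's actual data.

```python
# A minimal sketch of a per-state rubric score distribution.
import pandas as pd

# Hypothetical scored artifacts: one row per (state, rubric level) pair.
rows = pd.DataFrame({
    "state": ["MA", "MA", "MN", "MN", "TX", "TX", "TX"],
    "level": [1, 3, 2, 2, 4, 3, 2],
})

# Share of artifacts at each rubric level, per state (rows sum to 1.0).
distribution = pd.crosstab(rows["state"], rows["level"], normalize="index")
print(distribution.round(2))
```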

Institution Level Results [charts]

Questions?

Inherent Challenge for VALUE: Navigating Methodological Complexity

Nature & Implications of Complexity: Establishing the validity & reliability of VALUE is a key priority.

Reality Check: There is no large-scale model for what we are doing. VALUE embraces the very variables that other assessment approaches “control” or “eliminate.”

Purpose: Discuss validity & reliability in relation to the inherent complexity of VALUE
- Scores (rubrics)
- Assignments
- Scorers

A Careful Balancing Act
Methodological:
- Strict parameters to be included as a scorer
- Required common prompts
- Stringent assignment design requirements
Pedagogical/Philosophical:
- Faculty development opportunities
- Curricular alignment
- Faculty autonomy/academic freedom
- Institutional autonomy

VALUE & Validity

Faculty & staff saw the VALUE rubrics as valid. [Chart: percent of scorers who reported “Strongly Agree” or “Agree” with each aspect of rubric use.]
These results are not generalizable across participating states or the nation in any way. Please use appropriately.

Lessons Learned
- Actionable data about student achievement and improvement on key dimensions of important learning outcomes can be generated via a common rubric-based assessment approach.
- Faculty can effectively use common rubrics to evaluate student work products, even those produced for courses outside their area of expertise.
- Following training, faculty members can produce reliable results using a rubric-based assessment approach.
- Faculty report that the VALUE Rubrics used in the study do encompass key elements of each learning outcome studied, and were very useful for assessing student work and for improving assignments.
- A web-based platform can provide an easily usable framework for uploading student work products and facilitating their assessment.
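On the reliability point above: rubric reliability is commonly summarized with exact and adjacent (within one level) inter-rater agreement. Below is a minimal sketch, assuming two hypothetical scorers rated the same artifacts on the 1-4 scale; this illustrates the general technique, not the MSC's published methodology.

```python
# A minimal sketch of inter-rater agreement for rubric scores.
def agreement(scorer_a: list[int], scorer_b: list[int]) -> dict[str, float]:
    """Percent exact and adjacent (within one level) agreement."""
    if len(scorer_a) != len(scorer_b):
        raise ValueError("Scorers must rate the same artifacts")
    n = len(scorer_a)
    exact = sum(a == b for a, b in zip(scorer_a, scorer_b)) / n
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(scorer_a, scorer_b)) / n
    return {"exact": exact, "adjacent": adjacent}

# Example with hypothetical ratings for five artifacts.
print(agreement([3, 2, 4, 1, 2], [3, 3, 4, 2, 2]))
# {'exact': 0.6, 'adjacent': 1.0}
```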

Next Steps: MSC Refinement Year (year three)
- 13 states, five with representative samples for the state
- 20,000 artifacts collected and uploaded
- Establishment of inter-state “SWAT” teams
- Increased focus on evaluation: a panel of data scientists
- Increased focus on equity
- Explore feasibility of a sub-study following students into the workforce

Virginia joins as the 13th state.
[Map: ME, MN, MA, OR, RI, CT, IN, VA, UT, KY, MO, TX, HI]

Questions?

Contact Us!
Julie Carnahan: JCarnahan@sheeo.org
Terrel Rhodes: rhodes@aacu.org
Taskstream.com/Aqua: events@taskstream.com