University of Wisconsin-Madison: Establishing Institution-wide Expectations for Student Learning. Spheres of Influence: Where do you fit? 2008 T&L Symposium.

Presentation transcript:

1 University of Wisconsin-Madison
Establishing Institution-wide Expectations for Student Learning
Spheres of Influence: Where do you fit?
2008 T&L Symposium

Session Overview
- Introduction
- Challenges of arriving at common expectations
- What is your sphere of influence?
- Where have we been? (Assessment Plan; Assessment Audit; Preface to the 2003 Assessment Plan)
- Where are we now? (Essential Learning Outcomes (and LEAP); Wisconsin Experience)
- Where are we going? (Convergence and integration)
- What do you think? Discussion about your roles and how expectations for student learning influence your work as an educator
- How do we know how we are doing?

Presenters
- Mo Noonan Bischof, Assistant to the Provost and Co-Chair of the Assessment Council
- Jocelyn Milner, Director of Academic Planning and Analysis, past Co-Chair of the Assessment Council

About Assessment at UW-Madison:
About the Wisconsin Experience:
About the Essential Learning Outcomes:

[Spheres-of-influence diagram: Courses, Co-Curricular Activities, Academic Programs, Institutional-level perspective, External stakeholders, arranged from high-resolution to more aggregated.]

JLM/APA April 2008

Where have we been?
- Early 1990s: General Education Assessment
- 1995: University Assessment Plan
- 2003: University Assessment Plan
  - Year-long process
  - Specified roles and responsibilities across levels
  - Evaluation is part of academic life; the system = assessment
  - Every academic program has to have a plan and do something (anything!) annually to evaluate student learning
  - Use assessment findings *locally*
  - Intentionally laid aside the specification of university-wide expectations for student learning

Where have we been?
2007 Assessment Audit
- Explicit institution-level learning outcomes are necessary: the basis for knowing how we are doing, and for doing better
- What expectations for student learning had been stated and agreed upon by the university community?
- Examined expectations for student learning from existing documents
- Our finding: existing statements aligned with the Essential Learning Outcomes (ELOs)
- Recommendation: adopt the ELOs as university-wide expectations for student learning
- To: Provost, VPT&L, UAC, UGEC, LEAPers

4 University of Wisconsin-Madison: CONVERGENCE around the Wisconsin Experience and the Essential Learning Outcomes

A number of campus groups have been independently discerning the distinctive nature of a UW-Madison education. These groups, units, and individuals have started to coordinate efforts in more intentional ways, and are converging on a shared understanding of the educational experience. Two intersecting sets of ideas have found strong resonance. The Wisconsin Experience at UW-Madison (WI-X) provides a description of the distinctive nature of the educational experience. The Essential Learning Outcomes (ELOs) align to cross-cutting values and expectations for student learning that are present in existing statements and descriptions of the curriculum.

Together, WI-X and the ELOs provide a framework for talking and writing about the educational experience. They are not prescriptive. Rather, they give us language to describe, to ourselves and to others, goals for student learning. And they provide a reference point for planning and evaluation.

About the Wisconsin Experience: leadership from Aaron Brower, Vice-Provost for Teaching and Learning, and Lori Berquam, Dean of Students.

About the Essential Learning Outcomes: a multi-year national research project led by the AAC&U, involving universities, students, business leaders, and the professions. UW-Madison leadership from Jolanda Vanderwal Taylor, Chair of German; Nancy Westphal-Johnson, L&S Associate Dean; Elaine Klein, L&S Assistant Dean.

[Diagram: campus units converging on the Essential Learning Outcomes: First-Year Experience/Office of New Student Programs, Wisconsin LEAP Project, Dean of Students, L&S Student Academic Affairs, University General Education Committee, Office of the Registrar, VP-Teaching and Learning, Study Abroad Office, School of Pharmacy, Council of Associate Deans, Reaccreditation Project, University Assessment Council, University Communications, Academic Planning and Analysis, Provost, ASM, Libraries, L&S TA Training, Morgridge Center, DoIT Academic Technologies, WAA, University Academic Planning Council. Faculty, staff, and students are represented across the various units.]

JLM/APA March 2008

Where have we been?
2008 Preface to the Assessment Plan
- Recognizes that expectations for student learning are implicit in many existing documents and policies
- Recommends the language of the Essential Learning Outcomes as university-wide expectations for student learning
- Discussed at three University Assessment Council meetings; adopted February 26, 2008 {March 12 Gen Ed Breadth Event}
- Presented to the University Academic Planning Council, March 28, 2008

Where are we now?
Essential Learning Outcomes (ELOs) + Wisconsin Experience (WI-X)
Together, these two sets of ideas provide a framework:
- for explicitly articulating students’ educational experiences and goals for student learning, curricularly and programmatically
- for integrating efforts more intentionally across campus

7 Institutional-level Perspective
- Serves as a reference point for planning and collecting evidence at the local and campus levels
- Provides a template for understanding where the “gaps” are and what improvements are needed
- Sets a foundation for building a campus-level assessment report

Institutional-level Challenges
- Documentation and communication: how do we communicate to internal and external audiences what we value, what we do, what we’ve learned, and what we hope to improve?

10 (This slide repeats slide 4: CONVERGENCE around the Wisconsin Experience and the Essential Learning Outcomes.)

11 Question 1. Thinking about your role in teaching and the educational experience, where do you see resonance with the Essential Learning Outcomes? To the extent that the ELOs resonate with you, how do they influence your work as an educator?

12 Question 2. How do we know if expectations for student learning are being met?

13 The following pages provide more information about doing assessment in your unit.

14 Figure 2. Examples of Assessment Strategies Implemented at UW-Madison

1. After administering prelims, a faculty committee uses a scale to rate each student’s performance on each of the identified learning goals. Those ratings are summarized annually as an indication of the program’s effectiveness in conveying information students need to meet program expectations.
2. A capstone course requires upper-level students to complete a final project. A faculty committee reviews these projects and rates the extent to which they reflect identified learning goals. Results are presented at a faculty meeting in a discussion of the program’s effectiveness in conveying information students need to meet program goals.
3. All students completing a course required for admission to the major take a final exam containing one or more questions targeting one of the learning goals. Results are compiled to establish students’ “before” scores; later, when students complete their final requirements for the major, they respond to the same question to evaluate their attainment of information related to that goal.
4. A department asks the quantitative assessment project to develop an examination to assess the math preparation of students taking a course as a prerequisite for entry into the major; results are used to improve communication with students about necessary quantitative skills, and online tutorials are developed to convey those skills.
5. Course directors meet with TAs and instructors on a regular basis to discuss various components of an introductory course sequence. Specific outcomes are identified for each stage of students’ progress through the curriculum; the directors design a project used both for evaluating individual student achievement (individual grades assigned by instructors) and for program evaluation (a sample of papers rated by all instructors using a common rubric).
6. A department publishes a list of problems that students should be able to solve on entering a course and another list of problems that students should be able to solve on completing the course. From time to time, an instructor in the course reports to the department on how the students are measuring up to these expectations.
7. The department’s curriculum committee establishes a regular sequence of course offerings to ensure that majors can fulfill degree requirements in a timely way; this sequence is consulted when the timetable is built, when sabbaticals are considered, and when other decisions are made that influence the regular scheduling of offerings. The arrival or departure of faculty may provoke a review of the course array or of the requirements for the major.

Source: 2003 University Assessment Plan
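Strategy 3 above is essentially a pre/post comparison on the same exam question. As a minimal sketch only, assuming hypothetical score lists rather than any real UW-Madison data, the summary step might look like this in Python:

```python
# Minimal sketch of strategy 3: the same exam question is scored when students
# enter the major ("before") and again when they finish ("after").
# All numbers here are hypothetical placeholders, not UW-Madison data.
from statistics import mean, stdev

before = [2.1, 2.8, 3.0, 2.5, 2.2]  # entry scores on the targeted learning goal
after = [3.4, 3.9, 3.1, 3.6, 3.3]   # completion scores for the same students

print(f"before: mean={mean(before):.2f}, sd={stdev(before):.2f}")
print(f"after:  mean={mean(after):.2f}, sd={stdev(after):.2f}")
print(f"average gain on this goal: {mean(after) - mean(before):+.2f}")
```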

15 Figure 3. Practices that Contribute to Successful Academic Assessment

1. Do not assess every learning goal every year. For example, a major with five discrete learning goals might evaluate each goal in turn, or one distinct methodology may be applied at any one time. Break the task into achievable units to maintain a manageable assessment program.
2. Use both direct and indirect measures to evaluate student achievement of learning goals. For example, evaluate a group of papers by graduating seniors against standardized expectations (a direct measure of student performance) or survey students about their learning behaviors and perceptions of learning (an indirect measure of student experience).
3. Use both formative and summative elements. For example, student performance on a goal might be evaluated upon conclusion of a course required for admission to the major (an early formative measure), and the same goal might be evaluated when those students complete their degree requirements (a final summative measure).
4. Employ the highest research standards possible within the limits posed by resources and expertise. The value of a measurement increases if it is taken repeatedly over time, especially over a period that spans a change in the program. Such trend analyses are likely to be sensitive to change over time.
5. Collect, retain, and summarize data in ways that facilitate its use. Use data to support academic judgment.
6. Collect data when it becomes available, even if the analysis of the data will take place later. Examples: course closeout information; course evaluation forms; collections of capstone papers; faculty evaluations of preliminary exams. Collect the same data at the same time each semester and/or year, since time-series data are essential to high-quality assessment.
7. When possible, make use of standard reports and tabulations of student curricular and budgetary data that are produced regularly for campus use. Examples: enrollment and degree reports; grade distribution reports; enrollment statistics by minority group and gender; Departmental Planning Profiles; Data Digest; Graduate Program Profiles. Department or program records need not replicate all of this information if the historical data can be retrieved from campus data resources (the UW Data Warehouse).
8. Those who undertake assessment projects that involve interaction with individuals should seek advice on whether human subjects review is necessary based on the most recent regulations and legislation (see the appropriate Graduate School web site).
9. Students who participate in assessment activities need to understand their role in the assessment activity, its purpose, and how results will be used. Students may come to the task with greater commitment if they understand that the goal is to improve the program.

Source: 2003 University Assessment Plan
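The trend analysis in practice 4 (and the time-series point in practice 6) can be as simple as summarizing the same measure year by year, so that a program change shows up as a change in the trend. A minimal sketch, with yearly scores invented for illustration and not drawn from the 2003 University Assessment Plan:

```python
# Minimal sketch of practices 4 and 6: keep the same measure over time and
# summarize it year by year so trends (and program changes) are visible.
# The scores below are hypothetical.
from statistics import mean

yearly_capstone_ratings = {
    2003: [2.9, 3.1, 2.8, 3.0],
    2004: [3.0, 3.2, 3.1, 2.9],
    2005: [3.3, 3.4, 3.2, 3.5],  # e.g., first cohort after a curriculum change
}

for year in sorted(yearly_capstone_ratings):
    scores = yearly_capstone_ratings[year]
    print(f"{year}: mean={mean(scores):.2f} (n={len(scores)})")
```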

16 The Basic Assessment Plan
Modified from Assessment Clear and Simple: A Practical Guide, Barbara Walvoord (2004).

1. Specify expectations for student learning (goals): "When students finish this (assignment, course, program) we expect them to ___________________________." Not too many; stick with what you agree on.
2. One direct measure: program faculty review senior work for progress toward goals, for example selected projects from a capstone course. Also works for graduate theses. Subjective evaluations are acceptable.
3. One indirect measure: a survey or focus group that asks students: i) How well did they meet expectations for student learning? ii) What aspects of this (assignment, course, program) helped their learning? Why? How? iii) What might be done differently to help them learn more effectively? Why? How? Placement rates and alumni surveys may also be useful.
4. Annual meeting to review results and findings, identify some actionable improvements, and decide on measures for the upcoming year.

17 The Basic Assessment Plan: Use Scoring Schemes to Organize Judgment (Rubrics, Primary Trait Analysis)

Expectation for learning                                       | Rating
Knowledge of human cultures and the physical and natural world | ____
Intellectual and practical skills                              | ____
Personal and social responsibility                             | ____
Integrative learning                                           | ____

Rating scale: 1 = Well below expectations; 2 = Somewhat below expectations; 3 = About meets expectations; 4 = Exceeds expectations; 5 = Substantially exceeds expectations (or a scale that works for the given purpose).
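As a rough illustration of how such a scoring scheme organizes judgment, the sketch below pools reviewer ratings by expectation and reports the mean on the 1-5 scale above. The reviewer ratings are hypothetical, invented purely for illustration.

```python
# Minimal sketch: pool rubric ratings from several reviewers by expectation
# and summarize each on the slide's 1-5 scale. All ratings are hypothetical.
from collections import defaultdict
from statistics import mean

SCALE = {1: "Well below", 2: "Somewhat below", 3: "About meets",
         4: "Exceeds", 5: "Substantially exceeds"}

# (expectation, rating) pairs gathered from all reviewers and sampled projects
ratings = [
    ("Knowledge of human cultures and the physical and natural world", 3),
    ("Knowledge of human cultures and the physical and natural world", 4),
    ("Intellectual and practical skills", 2),
    ("Intellectual and practical skills", 3),
    ("Personal and social responsibility", 4),
    ("Integrative learning", 3),
]

by_expectation = defaultdict(list)
for expectation, rating in ratings:
    by_expectation[expectation].append(rating)

for expectation, scores in by_expectation.items():
    avg = mean(scores)
    print(f"{expectation}: mean={avg:.1f} ({SCALE[round(avg)]} expectations)")
```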

18 Liberal Education Scorecard: Wick and Phillips, Liberal Education, 94(1). [Figure 2b. Sample Liberal Education Scorecard: “Scientific Reasoning” Emphasis]