EVIDENCE-BASED STORYTELLING: SHARING OUR NARRATIVES Natasha Jankowski, Ph.D., Associate Director National Institute for Learning Outcomes Assessment (NILOA)


NILOA: NILOA’s mission is to discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts. ● Surveys ● Web Scans ● Case Studies ● Focus Groups ● Occasional Papers ● Website ● Resources ● Newsletter ● Presentations ● Transparency Framework ● Featured Websites ● Accreditation Resources ● Assessment Event Calendar ● Assessment News ● Measuring Quality Inventory ● Policy Analysis ● Environmental Scan ● Degree Qualifications Profile ● Tuning

Using Assessment Results

Share: Does anyone have examples of the use of assessment results they would like to share? Where are you struggling on your campus?

Closing the Loop: Institutions have the greatest difficulty with the “closing the loop” stage of the assessment cycle (Banta, 2009): SLOs → Select measures → Gather data → Analyze the data → Use to improve.
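One way to picture the loop is as a checklist a program walks each cycle, with Banta’s observation showing up as the stage most programs never reach. This is an illustrative sketch, not part of the presentation; the stage names and the example program are invented.

```python
# Illustrative sketch: the five-stage assessment cycle as a checklist,
# used to spot where a program stalls. All names are hypothetical.

CYCLE = ["define SLOs", "select measures", "gather data",
         "analyze data", "use to improve"]

def stalled_stage(completed):
    """Return the first stage of the cycle a program has not completed."""
    for stage in CYCLE:
        if stage not in completed:
            return stage
    return None  # the loop was closed

# A program that analyzed its data but never acted on it -- the
# "closing the loop" failure the slide describes.
program = {"define SLOs", "select measures", "gather data", "analyze data"}
print(stalled_stage(program))  # -> "use to improve"
```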

Using Results: Most assessment literature states that the purpose of engaging with assessment in the first place is to USE the results to IMPROVE student learning (Banta, 2007; Ewell, 2010; Suskie, 2009; Walvoord, 2004). But what does that really mean?

Causal Statements: The ability to make causal claims about our impact on students and their learning. Institutional structures and support + student = enhanced learning.

Difficulty of Causal Statements: ● Mobility of students ● Untracked changes ● Changes in courses add up to program-level change ● Levels at which use occurs ● Cycles longer than a year ● Loosely coupled relationships ● Life

Theories of Change: Why do we think the changes we make will lead to better outcomes? What is assumed in the changes we select, as it relates to how students understand and navigate higher education?

Theory of Change (TOC): A process of outlining causal pathways by specifying what is needed for goals to be achieved (Carman, 2012). It requires articulating underlying assumptions which can be tested and measured (Jones, 2010). A Theory of Change provides a roadmap outlining how one thinks about, or conceptualizes, getting from point A to point B over time (Ployhart & Vandenberg, 2010).
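A minimal sketch of what a TOC might look like written down as data, assuming a simple change/assumption/measure structure; every field name and value here is invented for illustration.

```python
# Hypothetical sketch of a Theory of Change as data: each causal step
# from point A to point B carries a testable assumption and a measure
# for testing it. All content strings are illustrative only.

from dataclasses import dataclass

@dataclass
class Step:
    change: str        # what the institution will do
    assumption: str    # why we believe it will work (testable)
    measure: str       # evidence that would confirm or refute it

theory_of_change = [
    Step(change="add writing-intensive sections in year two",
         assumption="students underperform because they get too little structured practice",
         measure="rubric scores on a common writing assignment, before vs. after"),
    Step(change="embed librarian-led research workshops",
         assumption="weak sourcing, not weak prose, drives low scores",
         measure="citation-quality dimension of the same rubric"),
]

for step in theory_of_change:
    print(f"If we {step.change}, because {step.assumption}, check: {step.measure}")
```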

For instance… ● Coverage and content ● Opportunities and support ● Intentional, coherent, aligned pathways. Within each of these is a belief about a root cause: why students were not learning or not meeting the outcome, and the mechanism by which the institution can help them succeed.

Root Cause: A process of exploring not only that something happened, but why it happened the way that it did (Rooney & Vanden Heuvel, 2004). Moves beyond surface-level problem identification and examines assumptions in order to prevent recurrences (Taitz, Genn, Brooks, Ross, Ryan, Shumack, Burrell, & Kennedy, 2010).

Evidence-based Storytelling: The story of the lonely students.

Your turn: Can you think of other stories you would like to share? How often do you discuss assumptions about students and how and where they learn?

But… Toulmin (2003): Evidence supports a Claim only by way of a Warrant. Warrants are what turn evidence into arguments.

Evidence-based Storytelling: Evidence of student learning is used in support of claims or arguments about improvement and accountability, told through stories to persuade a specific audience (Jankowski, 2012).
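Toulmin’s structure can be made concrete with a small sketch; the claim, evidence, and warrant strings below are invented, and the check at the end simply flags a story told without a warrant.

```python
# Illustrative sketch of Toulmin's (2003) structure: an evidence-based
# story links Evidence to a Claim through a Warrant. Every string here
# is hypothetical.

from dataclasses import dataclass

@dataclass
class Argument:
    claim: str      # what we assert to our audience
    evidence: str   # the assessment data offered in support
    warrant: str    # why this evidence actually supports this claim

story = Argument(
    claim="the capstone redesign improved quantitative reasoning",
    evidence="mean rubric scores rose across two assessment cycles",
    warrant="the rubric was applied to comparable assignments by the same raters",
)

# A claim offered without a warrant gives the audience no reason to accept it.
if not story.warrant:
    print("Unwarranted claim -- the story will not persuade.")
```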

Discussion and Reflection: ● Shared meaning making across the institution (70%) ● Examine multiple data points ● Group data by theme, not method
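A hypothetical sketch of “group data by theme, not method”: findings from different instruments are pooled under the theme they speak to, so discussion starts from the question rather than the tool. The findings themselves are invented.

```python
# Illustrative sketch: pool findings by theme so a survey result, a
# rubric result, and a focus-group note about the same question land
# in the same conversation. All findings are hypothetical.

from collections import defaultdict

findings = [
    {"method": "senior survey", "theme": "writing",  "note": "students report little feedback on drafts"},
    {"method": "rubric",        "theme": "writing",  "note": "thesis statements are the weakest dimension"},
    {"method": "focus group",   "theme": "advising", "note": "students unsure which courses build writing"},
]

by_theme = defaultdict(list)
for finding in findings:
    by_theme[finding["theme"]].append(f"{finding['note']} ({finding['method']})")

for theme, notes in by_theme.items():
    print(theme, "->", notes)
```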

Sharing Practice: What does good assessment look like for us, here? Why do we think that what we are doing, for these students, will lead to enhanced learning, at this time?

The Brian Barton Story: A faculty chair in business examined the results of program outcomes for learners who completed the program capstone course and found that on one of the outcomes, learners were performing below what he regarded as the minimum threshold. Through the curriculum maps and alignments linking learning activities in individual courses to program outcomes in the capstone, he was able to identify, across the entire program, which courses had the strongest alignment to the outcome in question. From there, he was able to delve deeper into individual learning activities, combine that information with additional data including course evaluations, and from the combined data make detailed changes to specific courses and to specific learning activities or assignments within those courses. By the time participants in the revised courses and learning activities completed the capstone course, there was a measurable improvement in the outcome in question. The faculty chair stated, “The concept of having an outcomes-based approach and having a strong theory of alignment all the way down to individual learning activities helps facilitate the use of assessment data.”
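The curriculum-map step in this story could be sketched as a simple lookup, assuming alignment strengths are recorded per course and outcome; the course codes, scores, and threshold below are all hypothetical.

```python
# Hypothetical sketch: given course-to-outcome alignment strengths from a
# curriculum map, decide where to look first when a capstone outcome
# falls below threshold. All codes and numbers are invented.

alignment = {  # outcome -> {course: alignment strength, 0..3}
    "quantitative analysis": {"BUS 201": 3, "BUS 305": 2, "BUS 410": 1},
}

capstone_scores = {"quantitative analysis": 2.4}
THRESHOLD = 2.75  # the chair's minimum acceptable mean

for outcome, score in capstone_scores.items():
    if score < THRESHOLD:
        # Review the most strongly aligned courses first.
        courses = sorted(alignment[outcome], key=alignment[outcome].get, reverse=True)
        print(f"{outcome} below threshold ({score} < {THRESHOLD}); review first: {courses}")
```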

Veterinary Technology: Veterinary technology students did not score as well as needed in quantitative reasoning, for example, so veterinary technology faculty redesigned several key assignments to build and document that competency in students. Whereas previously students only read an article to learn about monitoring glucose levels in felines, the new assignment asked them to read the article, take a reading of a cat’s glucose level, and then use both sources to write an analytical report. This curriculum redesign created a more robust and discipline-specific quantitative reasoning experience for students and a richer set of documents to be collected and examined through ePortfolio. Addressing general education requirements throughout the program, according to the veterinary technology program director, means that “programs need to decide where they are addressing general education within the curriculum,” and using student artifacts collected through the ePortfolio “brings assessment to the forefront of the classroom.”

Writing Across the Curriculum: The religion department wanted to know if their students were writing at a desired level, so the faculty developed a writing rubric, gathered a random collection of student essays, and had a faculty panel rate them. A report was generated from the ratings that outlined where students demonstrated or fell short on the outcomes in question. Areas where students fell short were used to refocus teaching and to rethink the sequence of courses and assignments within courses so as to better reinforce the desired outcomes and help students improve. A faculty member involved in this effort remarked, “It seems so modest to state it now – we identified an intended learning outcome, made rubrics, looked at essays, and altered teaching – but that fairly modest process generated a holistic view of what students were doing well and what they were not doing so well, which allowed for minor adjustments. In a year or two these adjustments showed that students are doing better on a given outcome.”
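The department’s step from panel ratings to report could look something like the sketch below; the rubric dimensions, scores, and target are invented for illustration.

```python
# Hypothetical sketch: average faculty-panel scores per rubric dimension
# and flag the dimensions where students fall short. All dimension names,
# scores, and the target are invented.

from statistics import mean

ratings = {  # rubric dimension -> panel scores on sampled essays (1-4)
    "thesis":       [3, 2, 2, 3, 2],
    "evidence use": [3, 3, 4, 3, 3],
    "organization": [2, 2, 1, 2, 3],
}
TARGET = 2.5

for dimension, scores in ratings.items():
    avg = mean(scores)
    status = "meets target" if avg >= TARGET else "falls short -- refocus teaching"
    print(f"{dimension}: {avg:.2f} ({status})")
```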

Assumptions: How do we think about our students? What assumptions do we make about them and their ability to learn? Where do we think learning occurs? What is our theory of change? Why do we think the change we are proposing will lead to enhanced student learning?

Takeaways: ● There is not one “right” way to undertake assessment processes ● Institutions consider what works, for their students, at this moment in time ● Institutions develop shared understanding, through reflection and space for dialogue, of what “good” assessment looks like within their specific institutional context

If we engage in assessment for internal improvement, how do we share that externally for those interested in reports or compliance?

Thank you! Discussion