Practical Strategies for Assessing Student Learning in Departments and Programs: A Utilization-Focused Approach Jo Beld St. Olaf College.

Guiding questions
Who are the users of the evidence?
How might the evidence be used?
How will you gather the evidence?
How will you animate the evidence?

Theoretical perspectives Utilization-focused assessment (Patton, 2008): Focus on intended uses by intended users

Theoretical perspectives Backward design (Wiggins & McTighe, 2005): “Beginning with the end in mind”

Theoretical perspectives Traditional instructional design: Choose texts → Develop classroom activities → Make up tests

Theoretical perspectives Backward instructional design: Identify intended learning outcomes → Determine appropriate evidence → Plan instruction and practice

Theoretical perspectives Traditional assessment design: Choose an assessment instrument → Gather and summarize evidence → Send a report to someone

Theoretical perspectives Backward assessment design: Identify intended users and uses → Define and locate the learning → Choose assessment instrument

Who are the users? Faculty roles, commitments, and disciplinary identities offer both incentives and disincentives to engage in assessment

What are the uses? What kind of outcome do you want to investigate… –Knowledge (“Understanding of…”) –Proficiencies (“The ability to…”) –Practices (“The habit of…”) –Values or attitudes (“A concern for…”)

What are the uses? …and why?
–Affirming current practices
–Tweaking the content of key courses
–Extending a specific pedagogy
–Enhancing “scaffolding”
–Piloting innovations
–Supporting grant applications
–Setting future assessment agendas

How will you gather evidence? Assessment is like any other kind of investigation; use strategies that fit the questions you are trying to answer

How will you gather evidence? “Direct” assessment: evidence of what students actually know, can do, or care about. “Indirect” assessment: evidence of learning-related experiences or perceptions.

How will you gather evidence? Common direct assessment “artifacts”  Papers, essays, abstracts  Presentations and posters  Oral or written examination items  Analytic journals  Responses to survey or interview questions that ask for examples of knowledge, practice, or value

How will you gather evidence? Common indirect assessment “artifacts”  Course-taking patterns or transcript analysis  Responses to survey or interview questions about experiences, perceptions, self-reported progress, or impact of program experiences  Reflective journals

How will you gather evidence? But wait!! Aren’t we observing student work all the time anyway? What’s the difference between grading and assessment?

How will you gather evidence? Grading summarizes many outcomes for one student; assessment summarizes one outcome for many students

How will you gather evidence? The purpose of an assessment instrument is to provide systematic, summarized information about the extent to which a group of students has realized one or more intended learning outcomes

How will you gather evidence? Options to consider:  Use an instrument developed by someone else  Adapt an existing instrument  Add to something you’re already doing  Connect to institutional-level evidence  Invent something new

How will you gather evidence? Where possible, pair indirect observations of processes and perceptions with direct observations of outcomes.

How will you gather evidence? The dual goal of sampling: Representativeness and Manageability

How will you gather evidence? Examples involving comprehensive sampling:  Survey of all senior majors  Application of rubric to all research abstracts in all seminars  Application of rubric to all work submitted for senior art show

How will you gather evidence? Examples involving selective sampling:  Application of rubric to randomly-selected subset of final papers in capstone course  Pre/post administration of locally-developed quiz in required sophomore methods course  End-of-course survey in one introductory and one senior-level course  Aggregation of results on selected items in an evaluation form for student work

How will you animate the evidence? Evidence never speaks for itself; be intentional in summarizing, interpreting, and responding to the findings

How will you animate the evidence? Plan intentionally for summarizing and interpreting findings:
–Delegate and distribute
–Enlist help from staff and students
–Use technology to save time
–Balance individual and collective work
–Dedicate time for discussion

How will you animate the evidence? Plan intentionally for responding to the findings:
–Borrow strategies from past successes in collective departmental action
–Focus reporting on planned actions, not on the evidence itself
–Weight Watchers trumps The Biggest Loser
–Dedicate resources for action

A final thought…. Less really is more!