Motivating the Team to Focus on Assessment and Student Learning Outcomes Kimberly Allen-Stuck, Ph.D. Saint Joseph’s University Philadelphia, PA kallen@sju.edu

Presentation transcript:

Motivating the Team to Focus on Assessment and Student Learning Outcomes
Kimberly Allen-Stuck, Ph.D.
Saint Joseph’s University, Philadelphia, PA
kallen@sju.edu

Assessment History at Saint Joseph’s University
2001 – Team of 8 people attended an assessment conference
2001–2004 – Assessment Committee (grass roots)
2005 – Focus on writing student learning plans (via committee)
2006 – Program Review Process for the Division

Observations of our colleagues
Performers
Planners
Avoiders
The Unaffected

Qualities of Performers
Have relevant data on hand at all times
Have set assessment protocols
Know what they are tracking
Regularly share their results
Respond to the data they receive with continual improvement

Why the Performers succeed
Designated person for each assessment project
Combination of short automated surveys, online/telephone surveys, and focus groups
Programmatic changes to meet student needs
Data is integrated into every report, presentation, and the website

Qualities of Planners
Have lots of anecdotal data
Talk about surveys they have heard of or plan to develop
Find the data that others have interesting
Know they either need to get started or need to analyze the data they have
Have very limited data available when it is needed

Motivating Planners
Provide them with a baseline of data
Assist them in defining the goal of their assessment
Offer to assist with the administration of the instrument
Set a deadline for sharing the analysis

Qualities of Avoiders
Too busy for assessment
Designate the newest employee to oversee assessment
First to cite survey fatigue
Realize that there are no repercussions for not having data

Motivating Avoiders
Start with benchmarking
Focus on 2–3 areas where the department dedicates the most energy
Work with them to set a plan for who, how, and when
Discuss “What could be the best and worst outcomes of assessment?”

Qualities of the Unaffected
Have longitudinal data for the students they serve (headcount, gender, GPA)
Have data when asked, but it is not integrated into decision making
No mechanism for sharing data beyond the supervisor
Unlikely to have future plans

Motivating the Unaffected
Make connections between the Unaffected and other departments that could benefit from their data
Ask deeper questions about the data trend
Probe about learning outcomes
Develop a sharing mechanism for assessment

Assessment Basics
Buy-in from the top
Provide a starting point (old surveys, trend data, tracking data, etc.)
Offer training sessions / discussion groups
Teams working on the bigger projects

An Assessment Team
Pros
Sharing resources
Setting a calendar
Supportive environment
Bigger-picture opportunities
Cons
Varying levels of contribution
Lack of synergy
Lack of collaboration
Meetings are easy to skip

Individual Consulting
Pros
Focused discussion
Goal setting
Plans developed
Opportunity to challenge and support
Cons
Getting access
Time consuming
Boundary blurring

Required Assessment Sharing
Annual reports
Budget request justifications
Assessment sharing sessions (all staff meetings)
Populating an assessment website – department designated areas

What do you already have…
In-house data from the Registrar, etc.
Data already collected from national surveys (CIRP, CSS, NSSE, etc.)
Professional association assessment tools
Old surveys that can be revised
Tools from assessment companies

What assistance is available?
Professional association instruments
Contacting colleagues to view their instruments and outcomes
Attending conferences
Working with graduate assistants
Literature review

Getting on the Same Page
Learning Reconsidered
Frameworks for Assessing Learning and Developmental Outcomes (FALDOS)
CAS Standards

Learning Reconsidered 2
“A practical guide …”
Places student learning at the center
Clear explanation of student learning outcomes
Bloom’s Cognitive Development model
Implementing promising practices of collaboration

FALDOS
Provides an overview of the outcome
Theoretical context
Variables to investigate
Available instruments
Online resources
Bibliography

CAS Standards
A great starting point
Developed by professional associations
Provide an opportunity to look into a functional area and make comparisons to industry standards
The departmental “musts”

Cycle of Continuous Improvement

Student Learning Outcomes
As a result of [insert course, program, or involvement], students will [differentiate, compare, summarize] [the desired outcome].
Then state how you will know.
For example: As a result of attending the leadership retreat, students will summarize three models of conflict resolution, as measured by a post-retreat reflection.

Components of the Learning Plan (adapted from California State University, Northridge)
Program Mission Statement
Program Goal
Tie to University Goal
Program Objectives
Strategies
Student Learning Outcomes
Assessment Instrument and analysis

Creating the environment
Values Assessment
Sharing results
Continuous Improvement