Student Affairs Assessment Council Wednesday, October 28, 2015.

Benchmarking is the “continuous systematic process for evaluating products, services, and work processes of organizations that are recognized as representing best practices for the purposes of organizational improvement.” -Spendolini, 1992

- Justify programs/services within student affairs
- Improve quality
- Demonstrate affordability
- Develop strategic plans
- Formulate policy
- Aid in making decisions

- Internal: making comparisons within an organization
- Competitive: examining performance against peer or competitor organizations
- Functional: looking at high-performing processes across the industry
- Generic: looking at organizations outside of one's field/industry

Best practices are typically the finest examples of process, program delivery, or methods in a given area that produce the highest known quality outcomes. -Palomba & Banta, 1999

- Usually determined by those meeting and exceeding a list of criteria
- Also referred to as performance indicators, benchmarks, or standards
- Whatever term you choose for the benchmarking project you undertake, define it and be transparent.

- An indicator is "a relevant, easily calculable statistic that reflects the overall condition of an enterprise." -Ewell, 1997
- Easily calculable means easily calculable across institutions.
- Note that learning/developmental outcomes generally cannot be evaluated by performance indicators such as retention rates, graduation rates, and faculty-to-student ratio.
- Indicators do not tell anyone what caused the value found in the program, nor do they indicate how to improve.
- Use indicators as measures, but incorporate student learning/development outcomes as well (a rough illustration of two common indicators follows below).
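To make the idea of an "easily calculable statistic" concrete, here is a minimal Python sketch that computes two of the indicators named above, a first-to-second-year retention rate and a student-to-faculty ratio. The cohort sizes, counts, and variable names are all assumptions made for illustration; real values would come from institutional data, not from this presentation.

```python
# Hypothetical illustration of "easily calculable" performance indicators.
# All figures below are invented for illustration only.

entering_cohort = 1200       # first-year students who enrolled in fall 2014
returned_next_fall = 1032    # of those, students enrolled again in fall 2015
full_time_faculty = 310
enrolled_students = 5400

retention_rate = returned_next_fall / entering_cohort          # 0.86
students_per_faculty = enrolled_students / full_time_faculty   # about 17.4

print(f"First-to-second-year retention: {retention_rate:.1%}")
print(f"Student-to-faculty ratio: {students_per_faculty:.1f} to 1")
```

As the slide notes, numbers like these describe overall condition but say nothing about cause or about how to improve; they are a starting point for comparison, not an explanation.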

- Comparative
- Data intended to be public (for better or worse)
- Intended to help you learn what you can improve

- Are you going to share the information publicly? If so, with whom?
- Are you benchmarking services and processes, or student learning and development?
- Is there national data that can help make your benchmarking data more legitimate? Examples include:
  - NSSE
  - College & University Counseling Center Directors Data Bank
  - EBI surveys
  - American College Health Association survey
  - National Association of Colleges & Employers career services surveys
- Which institutions allow you to compare yourself in a meaningful manner? (A small sketch of one way to frame such a comparison follows below.)
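One simple way to judge whether a peer comparison is meaningful is to look at how far your own value sits from the peer group's average. The sketch below assumes a small set of invented peer institutions and scores (none of them drawn from this presentation) and reports the gap in standard deviations; it is a rough framing device, not a substitute for the careful peer selection discussed later.

```python
# Hypothetical sketch: situating one institution's survey result among peers.
# Peer names and scores are invented for illustration only.
from statistics import mean, stdev

peer_scores = {
    "Peer A": 52.1,
    "Peer B": 48.7,
    "Peer C": 55.3,
    "Peer D": 50.9,
}
our_score = 47.5  # e.g., an NSSE-style benchmark score for our institution

peer_mean = mean(peer_scores.values())
peer_sd = stdev(peer_scores.values())
z = (our_score - peer_mean) / peer_sd

print(f"Peer mean: {peer_mean:.1f} (SD {peer_sd:.1f})")
print(f"Our score: {our_score:.1f} ({z:+.2f} standard deviations from the peer mean)")
```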

- University of Connecticut
- University of Pittsburgh
- Syracuse University
- Temple University
- University of Chicago
- George Mason University
- University of Birmingham
- University of South Florida
- University of Houston

1. Define the problem
2. Make sure benchmarking is appropriate
3. Determine what to benchmark
4. Choose who should be involved in the benchmarking project
5. Select comparable organizations
6. Determine what information will be collected
7. Determine how the information will be collected
8. Analyze the data
9. Take action
10. Assess the action taken

In student affairs, problems can be identified through:
- Student use studies
- Student satisfaction studies
- Student needs studies
- Reviews
- Changes in resources
- Realignments/reorganizations
- Changes in law, policy, or procedure
- Crisis or emergency situations
- Feedback from various constituents

- Ask yourself: can information from other organizations help my organization and help me achieve my outcome?
- Do a preliminary investigation; you will need to find out whether other organizations have been successful in whatever you are researching.

- Remember: you can benchmark a product, service, or process.
- In this context, the product is the educational program.

- People directly affected by the process should be involved from the very beginning.
- The process should involve staff (or students) who deal directly with the problem.
- If staff (or students) are involved from the beginning, there will be greater ownership of the results and a greater likelihood that solutions will be implemented.

- In general, use peer institutions.
- You can also look for organizations with:
  - Programs, services, or processes similar to your own
  - Reputations for quality programs, services, or products
  - Valid information to offer (evidence rather than hearsay)
  - Reliable information to offer (you can't always just look at websites; you need to ask for data/assessment results)
  - Leadership which values benchmarking
- Don't forget to help other institutions out! You may be calling them in a year for information.
- It is important to note that, despite comparable characteristics (institution size, academic functions), student affairs divisions may be very different. Try, as much as possible, to compare oranges to oranges.

- Determine what information is needed to improve the program, service, or process.
- Develop a format/protocol which provides a framework within which information can be gathered.

- Telephone interviews
- Personal meetings/site visits
- Surveys
- Document/publication review
- Archival information

- Code the data, i.e., look for themes (a small sketch of tallying coded themes follows after this list).
- Be sure to ask yourself: "What didn't they tell us, and why?"
- Be sure to address the problem identified initially.
- Include specific recommendations AND solutions.
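As a rough illustration of the coding step, the sketch below assumes interview notes have already been tagged with themes (the institutions and theme labels are invented) and simply tallies how often each theme appears and who raised it. This is one way to surface patterns across institutions before writing recommendations; it stands in for, and does not replace, careful qualitative analysis.

```python
# Hypothetical sketch of the "code the data, look for themes" step:
# tally coded themes from benchmarking interviews and note which
# institutions raised each one. All labels are invented.
from collections import Counter, defaultdict

coded_notes = [
    ("Peer A", "staffing levels"),
    ("Peer A", "assessment cycle"),
    ("Peer B", "staffing levels"),
    ("Peer C", "technology"),
    ("Peer C", "staffing levels"),
    ("Peer B", "assessment cycle"),
]

theme_counts = Counter(theme for _, theme in coded_notes)
theme_sources = defaultdict(set)
for institution, theme in coded_notes:
    theme_sources[theme].add(institution)

for theme, count in theme_counts.most_common():
    institutions = ", ".join(sorted(theme_sources[theme]))
    print(f"{theme}: {count} mentions ({institutions})")
```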

- The easiest actions are those that save resources, or require little to no additional resources.
- Action is more difficult when significant resources are required to solve the problem.
- Discuss this with leadership PRIOR to the benchmarking process. If no resources are available, then what?

Did the action taken actually solve the problem?

- Benchmarking needs to be done correctly, or not at all.
- The process should involve those who are directly affected.
- The process should have the support and commitment of the organization's leadership.
- Use organizations that are comparable, willing to participate, and able to offer reliable/valid information.

- Benchmarking_Resources/CASE_Benchmarking_Toolkit/Alumni_Relations_Benchmarking_Template.html
- case-studies-of-practice-in-student-support.pdf?sfvrsn=18
- of-Benchmarking-Reports-in-Higher-Education-Membership.pdf

- Alstete, J. W. (1995). Benchmarking in higher education: Adapting best practices to improve quality. San Francisco: Jossey-Bass.
- Ewell, P. T. (1997). Identifying indicators of curricular quality. In J. G. Gaff, J. L. Ratcliff, and Associates (Eds.), Handbook of the undergraduate curriculum: A comprehensive guide to purposes, structures, practices, and change. San Francisco: Jossey-Bass.
- Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco: Jossey-Bass.
- Spendolini, M. J. (1992). The benchmarking book. New York: AMACOM.
- Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.