Multiple Perspectives on Directing Assessment in Student Affairs


1 Multiple Perspectives on Directing Assessment in Student Affairs
2008 International Assessment & Retention Conference
Scottsdale, AZ
June 13, 2008

Presented by:
A. Katherine Busby, PhD, Director of Student Affairs Assessment and Planning, University of Alabama
Ellen Meents-DeCaigny, PhD, Director, Student Affairs Assessment, Research and Communications, DePaul University

2 Overview
In assessment, one size does not fit all…
Describe the context in which we work
Present and discuss four scenarios related to assessment: culture, metrics, learning outcomes, and closing the loop
Offer insights based on our contexts and experiences

3 Student Affairs Assessment at the University of Alabama
Office formed in 2006
Staff includes the director and one graduate assistant
Modest, director-controlled budget
Director serves as “internal consultant” and reports to an Associate VP
Decentralized assessment across the division
Leadership appreciates, but does not dictate, assessment
Active Student Affairs Assessment Council
Emerging partnerships with other assessment professionals
SACS accrediting region
Flagship, research institution; 85% first-year retention rate, 63% graduation rate
Departments include: Housing & Residential Communities, University Recreation, Dean of Students, Judicial Affairs, Greek Affairs, Student Development, Women’s Resource Center, Crimson Care, Campus Activities, University Programs, Community Service Center, Ferguson Center, Counseling Center, Career Center

4 UA Model of Assessment

5 Student Affairs Assessment at DePaul University
Part-time coordinator position created in 2004
Full-time position created in 2007
One full-time staff member and two graduate assistants report to the director
Budget has not yet been established
Director reports to the VP
Division-wide approach to assessment
Student Affairs Assessment Committee
Strong partnerships with Institutional Planning and Research and with Teaching, Learning and Assessment
Centralized approach to assessment at the institution

6 DePaul’s Integrated Model of Assessment
Success factors: programs, services & collaborations in support of the mission
Department key activities (measures: cost, magnitude, satisfaction)
Learning outcomes (measures: how are students learning, engaged, or involved?)
Department assessment question
Division of Student Affairs mission

7 Culture: The Human Element
“Successful assessment is not primarily a question of technical skill but rather one of human will” (Angelo, 1999).
Establish a support structure
Build shared trust
Build shared motivation
Develop a shared language
Establish key partners in assessment
Develop shared guidelines and expectations

8 Building Culture at DePaul
“Assessment initiatives flounder because they’re headed up by people who lack the time and clout to accomplish what is necessary” (Cohen and Kotter, 2002).
Establish a support structure:
Chief Student Affairs Officer: Vice President of Student Affairs
Senior-level sponsor: Associate Vice President of Student Development
Assessment “champion”: Director of Assessment, Research and Communications

9 Building Culture at DePaul
Build shared trust:
Hire a knowledgeable, trusted, and well-respected “champion”
Spend time gathering buy-in: divisional meetings, department meetings, individual meetings
Build shared motivation:
Focus on the benefits of assessment
Meet individuals and departments at their level
Discuss ALL motivations related to assessment

10 Building Culture at DePaul
Establish key partners in assessment:
Find institutional partners involved in assessment and research
Select Assessment Committee members interested in the initiative
Develop shared guidelines and expectations:
Present the assessment model
Discuss the logic behind the assessment model
Distribute the model, timeline, and report templates in a timely fashion

11 Building Culture at DePaul
Develop a shared language: a challenging process!
Present terminology related to the model
Define important terms: learning outcomes; process vs. outcomes assessment
Develop data definitions

12 Culture: What I do can’t be assessed.
Although most student affairs professionals realize that assessment is part of their work life, we still hear the cry, “What I do can’t be assessed.” Sometimes it comes from someone who does not buy into assessment; other times it comes from those who fear the assessment unknown.

13 A few key points about culture
It helps to have a point person who is focused on assessment
It helps to have the VP and upper-level leadership supporting the process and serving as the “champion”
Make the process manageable and provide support for it
Demonstrate how the information will be used (and follow through!) and incorporate it into other areas of the division, such as annual reports
Determine which battles to fight
Assessment vision
Leadership support
Ownership within the division
Professional development
Dissemination of information
Web presence

14 Learning Outcomes: More than just a well-phrased sentence.
Developing and utilizing learning outcomes is easier said than done. Some student affairs professionals cling to their “operational outcomes,” indicators, and measures of satisfaction (which are important in their own way) but fall short when it comes to learning.

15 Developing Learning Outcomes at UA
Audience
Behavior
Condition
Degree of achievement (may be omitted)
Heinich, R., Molenda, M., Russell, J., & Smaldino, S. (2002). Instructional Media and Technologies for Learning (7th ed.). Englewood Cliffs, NJ: Prentice Hall.

16 Developing Learning Outcomes at UA
To write a learning outcome, follow the formula: Condition + Audience + Behavior + Degree.
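A hypothetical outcome built from this formula (our illustration, not from the original slides): “After attending a budget workshop (condition), student organization treasurers (audience) will be able to prepare an annual budget request (behavior) that meets all student government funding guidelines (degree).”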

17 DePaul’s Outcome Evaluation Tool
For each key activity (e.g., Key Activity #1) and its associated learning outcome, the tool asks:
Is the outcome related to the key activity?
Is the outcome meaningful? What does it tell you about your program?
How does the department influence this outcome?
Is it phrased as an outcome?
Is it measurable? How?
Suggestions / revised outcome
(Keeling and Associates, LLC, 2006)

18 A few key points about learning outcomes
Shoot, ready, aim OR ready, aim, shoot
Understand and utilize different types of outcomes
Embrace learning outcomes as an iterative process
Speaker notes (10 minutes): We wanted to get everyone started right away, so we asked departments to determine the three to five major activities they are responsible for in their areas and to define them based on cost, magnitude, outcomes, and satisfaction/evaluation tools. Many had been collecting metrics and satisfaction data and had formal or informal outcomes. Start with what you have and build from there. We then asked departments to conduct an assessment project related to one outcome in one activity. What we had to do later was go back and facilitate training to revise and improve our learning outcomes. Do you have them? Are you using them? How did you learn to write and assess them?

19 Metrics: What is the purpose?
Student affairs practitioners often want information particular to their program or office. However, a good assessment plan may call for that same information to be used at the unit/department or division level. Make the most of your data by using carefully designed metrics.
Purposes of metrics:
Sharing information
Collapsing information
Collecting apples-to-apples data across the division
Getting the right information to the right person for the right reason

20 DePaul’s Use of Metrics
Metrics are collected for departmental and divisional purposes
Metrics enhance data collected through assessment projects
Challenges:
Developing common terminology
Developing common methods of measurement
Reporting metrics in ways that are useful to external audiences

21 UA’s Use of Metrics
Utilizing existing student data
Beginning to identify metrics across units

22 A few key points about metrics
Developing common terminology is key to examining data across the division
Careful collection of data allows stories to be told at the programmatic, unit, and division levels
Appropriately used metrics afford the opportunity to get the right information to the right person for the right reason
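To make the apples-to-apples point concrete, here is a minimal Python sketch, entirely our own illustration (the field names and numbers are hypothetical, not from the presentation). Because each department reports the metric under one shared data definition, the same records roll up cleanly to tell the programmatic, unit, and division stories:

```python
from collections import defaultdict

# Hypothetical records gathered under a shared data definition:
# every department counts "students_served" the same way
# (unique students per term), so the numbers can be combined.
records = [
    {"unit": "Career Center", "program": "Resume Clinic", "students_served": 412},
    {"unit": "Career Center", "program": "Mock Interviews", "students_served": 168},
    {"unit": "Campus Activities", "program": "Welcome Week", "students_served": 2304},
]

def rollup(records, level):
    """Sum a commonly defined metric at the given reporting level."""
    totals = defaultdict(int)
    for record in records:
        totals[record[level]] += record["students_served"]
    return dict(totals)

print(rollup(records, "program"))  # the programmatic story
print(rollup(records, "unit"))     # the unit/department story
print(sum(r["students_served"] for r in records))  # the division-wide story
```

Without the shared definition (if one department counted visits and another counted unique students), the unit and division totals would be meaningless, which is why developing common terminology and methods of measurement comes first.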

23 Closing the Loop: How do we use the data?
A successful assessment does not mean that the results met or exceeded our expectations. Rather, it means that we used the results to inform our work and improve the college experience for our students.

24 Closing the loop at DePaul
Provide assistance with data analysis
Help make connections between departments and across the division
Help connect results to institutional and national data
Continue to find ways to share results: departmental and divisional annual reports, the Assessment Celebration, presentations in and outside the division
Share departmental changes that result from assessment

25 Closing the loop at UA
Consultation after the analysis
Distribution of reports to targeted audiences

26 A few key points about closing the loop
Assist staff in interpreting the results (even disappointing results)
Maintain focus on improvement
Demonstrate for staff how they can utilize results of national surveys
Share information in multiple formats

27 Contact us
Katie Busby
Director, Student Affairs Assessment and Planning
University of Alabama
Box Tuscaloosa, AL

Ellen Meents-DeCaigny
Director, Student Affairs Assessment, Research & Communications
DePaul University
25 E. Jackson Blvd, Suite 1400
Chicago, IL

