1
Framework for Assessment
Student Affairs and Enrollment Management 16 May 2016
2
Today’s Agenda
Part 1 – UMKC’s Assessment Framework
Part 2 – Intro to UMKC Box app for archiving assessment documents
Part 3 – Developing measurable SLOs that map to the SAEM learning goals and subgoals
3
Why we assess
Assessment helps us make sure that we are fulfilling the promises we make to our students and to society (Suskie, 2010; Finley, 2014).
Our promises to students:
Institutional mission and values statements
SAEM mission, goals, and subgoals
Unit’s/program’s mission statement
Unit’s/program’s student learning outcomes (mapped to division goals/subgoals)
Mission statement – the 10-second ad – the succinct promise to students and other stakeholders
Accountability; program improvement
4
Purposes of Assessment
Ensure that we are delivering on what we care about:
Students get the best possible education
Students have access to the best possible learning environments, co-curricular programs, and support systems
Ensure that learning inside and outside the classroom is of appropriate scope, depth, and rigor
Asking – and answering – questions we care about concerning student learning and development
Identify and preserve good practices in supporting student development
5
The Important Questions
Three questions to answer through assessment of SLOs:
What have our students learned?
Are we satisfied with what they’ve learned?
If not, what are we going to do about it? (Eder, 2010)
6
Things to keep in mind
There are no “one draft wonders.” Assessment is an iterative process; assessment plans will evolve.
Part of the assessment process is learning how to do assessment better: reflect on what went well in previous cycles and what improvements could be made.
Assessment is a shared responsibility, not the function of one person in the unit.
Assessment is NOT about individual job performance evaluation.
Don’t wait for perfection with the initial plan.
7
UMKC’s Assessment Framework - Planning
Review annually:
Mission statement
Goals
Student learning outcomes
Annual assessment plan (there’s a form for that): which outcomes will be assessed in the current cycle, when, and how (measures)
Handout on Assessment Framework
8
UMKC’s Assessment Framework – Annual Reporting
Weave app has been replaced by a report form developed by the UAC
Word-based form
Archived in UMKC Box app (more on this later)
Due Oct. 1
9
UMKC’s Assessment Framework – Annual Reporting
Three major sections on the form (available soon):
1) Reflection on previous cycle
Action Plan status report – for previous cycles
Use of feedback from previous year’s report
2) Assessment Findings – for current cycle
What have students learned? Are you satisfied with what they have learned? Evidence?
3) Action Plan
What are you going to do to improve student learning and/or the assessment process?
10
Program’s Mission Statement
Communicates the unit’s overriding purpose(s)
The 10-second commercial that summarizes the unit’s promises to students and other stakeholders
Link to UMKC’s and SAEM’s mission statements; student-centered
The mission of the Division of Student Affairs and Enrollment Management is to provide leadership in attracting and developing a diverse student population through a vibrant and engaging collegiate experience that supports students in defining and achieving their personal and educational goals.
11
SAEM learning goals and subgoals
Parallel the outcomes for the gen ed program
The broad knowledge, skills, abilities, and dispositions students should achieve
Each unit/program will develop one SLO that maps to one subgoal for each of the two SAEM goals to be assessed in 2016-17 – today’s task
12
Student Learning Outcomes
Student Learning Outcomes (SLOs) – specific accomplishments to be achieved:
What should students be able to do when they complete the program?
Map to SAEM subgoal
What are you looking for in student behavior to tell if they “get it”?
Cognitively appropriate: <one action verb> + <one something>
Must be measurable.
13
SAEM Division Goals
Informed Reasoning (2015-16) – report in Oct.
Well-Being ( ) – report in Oct.
Personal Responsibility (2016-17)
Culture and Diversity (2016-17)
Technology and Information Literacy ( )
Civic and Community Engagement ( )
Communication Skills ( )
Handout – SAEM Division SLOs
14
SAEM – Goals and subgoals – 2016-17 assessment
Goal: Personal Responsibility
Subgoals: Students will:
identify a challenge, formulate solutions, and select an appropriate outcome that meets their needs or resolves a conflict.
demonstrate knowledge of the UM System code of conduct.
recognize the effects of their behavior on themselves, on others, and on the community.
set life goals and identify and use specific campus and community resources to articulate the steps needed to reach these goals.
15
Personal Responsibility subgoals (contd.)
identify methods and resources to maintain their fiscal wellness.
use their core values to guide decision-making when faced with moral, ethical, or other dilemmas.
demonstrate leadership knowledge, skills, and abilities.
integrate concepts of honesty and integrity into self-understanding and interactions with others.
use appropriate resources for self-advocacy and problem solving.
16
SAEM – Goals and subgoals – 2016-17 assessment
Goal: Culture and Diversity
Subgoals: Students will:
demonstrate an increased awareness of, appreciation for, and engagement with the diverse aspects of society.
demonstrate an increased knowledge of diversity as a vehicle for inclusion, interaction, and respect, and to address systemic oppressions.
exhibit an understanding of diverse cultural histories.
demonstrate the ability to communicate and interact in a positive manner with people of different cultures, backgrounds, and abilities.
utilize empathy skills that facilitate active engagement in another’s life.
17
Student Learning Outcomes
What you tell students, parents, and other stakeholders concerning the primary reasons your unit exists.
What students will gain from involvement in your program.
Two types of outcomes: Student Learning Outcomes (SLOs) and Process Outcomes (aka program or operational) – not today’s focus.
The most difficult part is developing measurable outcomes.
(For example, a process-level issue: maybe appointments were being scheduled when a counselor was not in the office, or maybe multiple appointments were booked for a single time slot.)
18
Student Learning Outcomes
SLOs are measurable: <action verb> + <something>
Students will be able to <identify>, <define>, <evaluate>…
Students will be able to <describe>, <analyze>, <interpret>…
Example SLOs:
Students will implement a six-month personal fitness plan.
Students will utilize stress reduction strategies to manage test anxiety responses.
19
After identifying desired outcomes:
Identify experiences that will support development of the student learning outcome: seminars, information sessions, performances, mini-series
Identify data needed to determine if the outcome is met – assessment measures
20
Assessment Plan Annual Assessment Cycles (AY)
4-year cycle for SAEM learning goals
Overlapping – will be collecting data for one cycle while reflecting on and developing the action plan for previous cycles
What outcomes are assessed that cycle
How each is assessed (measures)
When and where data are collected
21
Indirect v. Direct Measures
Indirect assessment provides a snapshot of students’ perspectives on various programs.
Direct assessments are a means by which students can demonstrate the knowledge they have obtained.
For example: Indirect assessments might ask students whether they think they learned certain principles at a workshop. A direct assessment would ask them to demonstrate these principles.
22
Assessment Measures - Direct
Direct assessments gather information about what participants actually know or are able to do. Direct assessments require students/participants to demonstrate their learning.
Examples of direct assessment methods:
Quizzes / tests
Published tests
Rubrics applied to student artifacts, presentations
Portfolios
Observations
23
Assessment Measures - Indirect
Indirect measures ask for participants’ perceptions concerning:
Satisfaction with the program, services provided, event, etc.
Perceptions of their learning gains (e.g., “the program improved my time management skills”)
Examples of indirect measures:
Surveys – unit / institutional / published
Focus groups
Individual interviews
24
Selecting a measure
Appropriate to the outcome
Realistic in the context of: the experience of the unit’s staff, and available human, financial, and technological resources.
Don’t design a study that calls for complex statistical analyses if you don’t have someone experienced in stats available to conduct the analysis.
25
Sampling
Sampling is OK in assessment.
Ensure the sample is representative of the students participating in the program:
If the program is targeted to all undergraduates, aim for a sample representative of undergraduate students at UMKC.
If the program is targeted to resident students, aim for a representative sample of students living in residence halls.
What is an appropriate sample size? How much evidence is enough?
Large enough to allow you to feel reasonably confident that you have a representative sample of what your students can do, and that you can use the results with confidence to make decisions about the program.
Need to take into account what is practical / feasible given available resources. (A simple random-sampling sketch follows below.)
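The sampling advice above can be made concrete with a short script. Below is a minimal sketch (not part of UMKC’s actual process): it assumes a hypothetical participant list with a made-up residence-hall flag, draws a simple random sample with Python’s standard library, and compares the share of residence-hall students in the sample with the population as a rough representativeness check.

```python
# Minimal sketch: draw a simple random sample of program participants and
# check that one attribute -- residence-hall status -- appears in roughly the
# same proportion as in the full population. All data here are hypothetical.

import random

# Hypothetical participant records; in practice these might come from a
# registration list or survey export.
participants = [
    {"id": i, "lives_in_residence_hall": (i % 3 == 0)}  # fabricated attribute
    for i in range(1, 301)
]

SAMPLE_SIZE = 60      # illustrative; actual size depends on resources and confidence needed
random.seed(42)       # fixed seed so the illustration is reproducible

sample = random.sample(participants, SAMPLE_SIZE)

def share_in_halls(group):
    """Proportion of a group that lives in the residence halls."""
    return sum(p["lives_in_residence_hall"] for p in group) / len(group)

print(f"Population share in residence halls: {share_in_halls(participants):.1%}")
print(f"Sample share in residence halls:     {share_in_halls(sample):.1%}")
```

If the sample share drifts far from the population share, a stratified sample (drawing separately within each subgroup) is one way to keep the sample representative.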
26
Targets
Targets – what is “good enough”?
Standard by which you can say that the program is keeping its promise.
Targets probably won’t be consistent across outcomes – they should vary according to the magnitude of the outcome in relation to the program’s purpose.
Usually stated as X% of students will achieve Y:
95% of students will achieve a 3 or 4 on each criterion on the Leadership Rubric.
80% of students will score 3 or higher on ability to incorporate technology in an oral presentation.
85% of the respondents will “agree” or “strongly agree” with items 3, 4, 7, and 10 on the satisfaction survey.
If you don’t know where you are going, how do you know when you get there? (A worked example of checking a target against rubric scores follows below.)
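To show how a target of the form “X% of students will achieve Y” might be checked, here is a minimal sketch; the rubric scores are fabricated for illustration, and the 80% / score-of-3 threshold simply mirrors the example above rather than any official target.

```python
# Minimal sketch: check a target of the form "X% of students will score Y or
# higher" against a set of rubric scores. Scores below are made up.

rubric_scores = [4, 3, 2, 3, 4, 4, 1, 3, 3, 4, 2, 3]  # hypothetical scores, one per student

TARGET_PERCENT = 80   # "80% of students..."
PASSING_SCORE = 3     # "...will score 3 or higher"

met_count = sum(1 for score in rubric_scores if score >= PASSING_SCORE)
percent_met = 100 * met_count / len(rubric_scores)

print(f"{met_count} of {len(rubric_scores)} students ({percent_met:.0f}%) scored {PASSING_SCORE} or higher")
print("Target met" if percent_met >= TARGET_PERCENT else "Target not met -- consider an action plan")
```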
27
Assessment Findings
Summary of findings – aggregate data for each outcome assessed in the cycle
Evidence of student achievement
Substantiating evidence:
Summary data – tables, charts
Evidence of discussions among staff concerning the data
Was the target met? If not, what is the unit going to do about it? (Action Plan)
(A small sketch of aggregating rubric data into a summary table follows below.)
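As one possible way to produce the summary tables mentioned above, the following minimal sketch uses only Python’s standard library to aggregate rubric scores by criterion; the criterion names and scores are hypothetical placeholders, not actual SAEM data.

```python
# Minimal sketch: aggregate rubric results per criterion into a small summary
# table for an annual report. Criterion names and scores are hypothetical.

from collections import Counter

# (criterion, score) pairs as they might be recorded from scored artifacts
scored_artifacts = [
    ("Identifies challenge", 4), ("Identifies challenge", 3), ("Identifies challenge", 2),
    ("Formulates solutions", 3), ("Formulates solutions", 4), ("Formulates solutions", 3),
    ("Selects outcome", 2), ("Selects outcome", 3), ("Selects outcome", 4),
]

by_criterion = {}
for criterion, score in scored_artifacts:
    by_criterion.setdefault(criterion, Counter())[score] += 1

print(f"{'Criterion':<22} {'n':>3} {'mean':>5}  score counts")
for criterion, counts in by_criterion.items():
    n = sum(counts.values())
    mean = sum(score * count for score, count in counts.items()) / n
    print(f"{criterion:<22} {n:>3} {mean:>5.2f}  {dict(sorted(counts.items()))}")
```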
28
Action Plan
Action Plan – for each outcome assessed
What will the program do to improve student learning? E.g., add a component to student staff training on working in teams.
When? The component will be developed in summer 2016 and implemented in Fall 2016 during initial staff training.
Who is responsible? The supervisor conducting the training.
How will you determine that the action has been completed? The component is developed and used in training, the outcome is reassessed in Spring 2017, and results are compared to the previous cycle – do they indicate enhanced ability to work in teams?
The action plan may include strategies to improve assessment efforts.
29
Action Plan Status
Status of implementation of the action plans developed in previous years:
Completed
In progress
Not implemented
The evidence that the program has used the assessment results to enhance student achievement
Update status in annual reports until the plan is fully implemented (or abandoned).
30
Due Oct. 1
Findings and action plans for 2015-16
New form – archived in UMKC Box app
Assessment plan for the 2016-17 cycle