Support Program Assessment
November 18-19, 2014

Ryan J. McLawhon, Ed.D., Director, Institutional Assessment, Ryan.McLawhon@tamu.edu
Elizabeth C. Bledsoe, M.A., Program Coordinator, Institutional Assessment, ebledsoe@tamu.edu
Kimberlee Pottberg, Sr. Admin Coordinator, Institutional Assessment, K-pottberg@tamu.edu

assessment@tamu.edu | 979.862.2918 | assessment.tamu.edu
Agenda
Components of the WEAVEonline Assessment Plan & expectations of each
Assessment Review Process
Question and Answer Session
SACS Expectations
SACS Comprehensive Standard 3.3.1
3.3 Institutional Effectiveness
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: (Institutional Effectiveness)
3.3.1.1 educational programs, to include student learning outcomes
3.3.1.2 administrative support services
3.3.1.3 educational support services
3.3.1.4 research within its educational mission, if appropriate
3.3.1.5 community/public service within its educational mission, if appropriate
SACS Expectations (emphasis)
"…and provides evidence of improvement based on analysis of the results…"
The Assessment Circle
Develop Program Mission & Outcomes → Design an Assessment Plan → Implement the Plan & Gather Information → Interpret/Evaluate Information → Modify & Improve
Adapted from: Trudy Banta, IUPUI
Develop Program Mission & Outcomes
Mission Statement
The mission statement links the functions of your unit to the overall mission of the institution.
A few questions to consider in formulating the mission of your unit:
– What is the primary function of your unit?
– What should stakeholders interacting with your unit/program experience?
Characteristics of a Well-Defined Mission Statement
Brief, concise, distinctive
Clearly identifies the program’s purpose and larger impact
Clearly aligns with the mission of the division and the University
Clearly identifies the primary stakeholders of the program (e.g., students, faculty, parents)
Outcomes/Objectives should…
Be limited in number (manageable)
Be specific, measurable, and/or observable
Be meaningful
Outcomes/Objectives
There are two categories of outcomes:
Learning Outcomes
Program Objectives
Examples of Learning Outcomes
Students participating in service learning activities will articulate how the experience connects to their degree and understanding of their field.
Students will identify and discuss various aspects of architectural diversity in their design projects.
Program Objectives
Process statements
– Relate to what the unit intends to accomplish
– Level or volume of activity (participation rates, turnaround time, etc.)
– Compliance with external standards of “good practice in the field” or regulations (government standards, etc.)
Satisfaction statements
– Describe how those you serve rate their satisfaction with your program, services, or activities
Examples of Program Objectives
Process statement
– The Office of Safety and Security will prevent and resolve unsafe conditions.
Satisfaction statement
– Students who participate in Honors and Undergraduate Research core programs will express satisfaction with the format and content of the programs by acknowledging that these activities contributed toward their achieving learning outcomes for undergraduate studies.
Design an Assessment Plan
Measures should be…
Measurable and/or observable
– You can observe it, count it, quantify it, etc.
– Specifically defined, with enough context to understand how it is observable
Meaningful
– It captures enough of the essential components of the objective to represent it adequately
– It will yield vital information about your unit/program
Triangulated
– Multiple measures for each outcome
– Direct and Indirect Measures
Assessment Measures
Define and identify the sources of evidence you will use to determine whether you are achieving your outcomes and, if necessary, how that evidence will be analyzed/evaluated.
Identify or create measures that can inform decisions about your unit/program’s processes and services.
Types of Assessment Measures (Palomba and Banta, 1999)
There are two basic types of assessment measures:
Direct Measures
Indirect Measures
Direct Measures
Direct measures are those designed to directly measure what a stakeholder knows or is able to do (i.e., they require the stakeholder to actually demonstrate the skill or knowledge); OR
Direct measures are physical representations of the fulfillment of an outcome.
Indirect Measures
Indirect measures focus on:
– stakeholders’ perception of the performance of the unit
– stakeholders’ perception of the benefit of programming or intervention
– completion of requirements or activities
– stakeholders’ satisfaction with some aspect of the program or service
Common Indirect Measures
Surveys
Exit interviews
Retention/graduation data
Demographics
Focus groups
Choosing Assessment Measures
Some things to think about:
– How would you describe the end result of the outcome? OR How will you know if this outcome is being accomplished? What is the end product?
– Will the resulting data provide information that could lead to an improvement of your services or processes?
Achievement Targets
An achievement target is the result, target, benchmark, or value that will represent success at achieving a given outcome.
Achievement targets should be specific numbers or trends representing a reasonable level of success for the given measure/outcome relationship.
What does quality mean and/or look like?
Examples of Achievement Targets
95% of all radiation safety inspections assigned will be performed monthly, to include providing recommendations for correcting deficiencies. This target was established with departmental leadership based on previous years' performance and professional judgment.
A 5% increase in products and weights of EHS recycled materials (e.g., used oil, light bulbs) from the previous year will be realized.
Implement the Plan & Gather Information
Findings
Findings are the results of applying the measure to the collected data.
The language of the findings statement should parallel the corresponding achievement target.
Results should be described in enough detail to demonstrate whether you have met, partially met, or not met the achievement target.
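To make the met/partially met/not met comparison concrete, here is a minimal sketch (not part of the original presentation) in Python. The inspection counts, the 95% target, and the "partially met" band are hypothetical values chosen only for illustration; a real plan would define these in its achievement target.

```python
# Hypothetical illustration: comparing a finding against an achievement target.
# The counts, the 95% target, and the "partially met" band are invented for this sketch.

assigned = 120       # inspections assigned this month (hypothetical)
performed = 111      # inspections actually performed (hypothetical)
target_rate = 0.95   # achievement target stated in the assessment plan

finding_rate = performed / assigned  # 0.925, i.e., 92.5%

# Describe the result in language that parallels the target and
# classify it as Met / Partially Met / Not Met.
if finding_rate >= target_rate:
    status = "Met"
elif finding_rate >= 0.90 * target_rate:  # hypothetical "partially met" band
    status = "Partially Met"
else:
    status = "Not Met"

print(f"Finding: {finding_rate:.1%} of assigned inspections performed "
      f"(target: {target_rate:.0%}) -- {status}")
```

Running this prints "Finding: 92.5% of assigned inspections performed (target: 95%) -- Partially Met", which mirrors the kind of wording a Findings entry would use when paralleling its achievement target.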
Interpret/Evaluate Information
Analyzing Findings
Three key questions at the heart of the analysis:
– What did you find and learn?
– So What does that mean for your unit or program?
– Now What will you do as a result of the first two answers?
Analysis Question Responses should…
Demonstrate thorough analysis of the given findings
Provide additional context for the action plan (why this approach was selected, why it is expected to make a difference, etc.)
Update previous action plans with the results of their implementation
Modify & Improve
Action Plans
After reflecting on the findings, you and your colleagues should determine appropriate action to improve the services provided.
Actions outlined in the Action Plan should be specific and relate to the outcome and the results of assessment.
– Action Plans should not be related to the assessment process itself
An Action Plan will…
Clearly communicate how the collected evidence of efficiency, satisfaction, or other findings informs a change or improvement to processes and services.
This DOES NOT include:
– Changes to assessment processes
– Continued monitoring of information
– Changes to the program not informed by the data collected through the assessment process
Assessment Review
The review walks through each component of the plan:
Mission Statement
Outcomes/Objectives
Measures
Targets
Findings
Action Plans
Analysis Questions
Take-Home Messages
Assess what is important
Use your findings to inform actions
You do not have to assess everything every year
OIA Consultations
WEAVEonline support and training
Assessment plan design, clean-up, and re-design
– And we can come to you!
New Website: assessment.tamu.edu
Questions?
http://assessment.tamu.edu/conference
References
The Principles of Accreditation: Foundations for Quality Enhancement. SACS COC, 2008 Edition.
Palomba, C. A., & Banta, T. W. (1999). Assessment Essentials. San Francisco: Jossey-Bass.
Banta, T. W. (2004). Hallmarks of Effective Outcomes Assessment. San Francisco: John Wiley and Sons.
Walvoord, B. E. (2004). Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. San Francisco: Jossey-Bass.
Assessment manuals from Western Carolina University, Texas Christian University, and the University of Central Florida were very helpful in developing this presentation.
"Putting It All Together" examples adapted from Georgia State University, the University of North Texas, and the University of Central Florida's assessment plans.