1
Institutional Effectiveness at the University of North Alabama Dr. Andrew L. Luna Institutional Research, Planning, and Assessment IE Presentation, November 20, 2008
2
Connections to IE?
3
Hawthorne Works, Bell Laboratories: Telephones and IE? Walter Shewhart, W. Edwards Deming, Joseph Juran
4
Shewhart Cycle FACT... The Shewhart Cycle is the foundation for all quality and continuous improvement processes that we use today: Plan → Do → Check → Act, repeated for continuous improvement.
5
Points of Discussion: Similarities between the Shewhart Cycle and Institutional Effectiveness; Overview of Institutional Effectiveness at UNA; Review of Outcomes and Improvement Processes; Review of Assessment; Questions
6
More on the Shewhart Cycle Plan – Create a strategy as to what you want to do and how you will measure success. Do – Follow the plan and do what you say you will do. Check – Assess the effectiveness of the current plan by looking at the outcome measures of success. Act – Make changes to the strategies to improve the measured outcomes. Repeat the Cycle!
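The cycle is easiest to see as a loop. Below is a minimal, runnable Python sketch of one pass through it; the pass-rate measure, the target, and the fixed improvement step are invented for illustration and are not part of any UNA process.

```python
# Hypothetical PDCA loop: the measure, target, and improvement step are
# invented placeholders, not actual UNA data or procedures.
target_pass_rate = 0.85      # Plan: define the outcome measure and its goal
pass_rate = 0.70             # Do: run the plan and record the baseline result

cycle = 1
while pass_rate < target_pass_rate:           # Check: compare result to goal
    pass_rate = min(1.0, pass_rate + 0.05)    # Act: adjust strategy, then re-Do
    print(f"Cycle {cycle}: measured pass rate = {pass_rate:.2f}")
    cycle += 1
print("Target met; set a new or tighter goal and repeat the cycle.")
```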
7
Why is the Shewhart Cycle Important? If you can't measure something, you can't understand it… If you can't understand it, you can't control it… If you can't control it, you can't improve it… If you can't improve it…then why the heck are you doing it?
8
So, What is Institutional Effectiveness? A sharpened statement of institutional mission and objectives; identification of intended departmental/programmatic outcomes or results (Plan); establishment of effective means of assessing the accomplishment of those outcomes and results (Do, Check, Act). FACT... Institutional Effectiveness is primarily undertaken to improve what we do…not just to pass accreditation.
9
Shewhart Cycle and SACS Macro IE Core Requirement 2.5: The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that (1) incorporate a systematic review of institutional mission, goals, and outcomes; (2) result in continuing improvement in institutional quality; and (3) demonstrate the institution is effectively accomplishing its mission. In Shewhart terms: (1) is Plan, (2) is Check and Act, and (3) is Do.
10
Key Points to Core Requirement 2.5 Emphasizes an expectation that the institution is the primary focal point for compliance. Sets expectations for the description of planning and evaluation processes that are active and continuous rather than static or single occurrences. Points to a clear and strong expectation for documentation of the systematic review of institutional mission, goals, and accomplishments consistent with its mission. Sets expectations for the documented use of results of institutional planning and evaluation to achieve institutional improvements.
11
Shewhart and SACS, Cont. Micro IE Comprehensive Standard 3.3.1: The institution identifies expected outcomes for its education programs … and its administrative and educational support services; assesses whether it achieves those outcomes; and provides evidence of improvement based on analysis of those results. In Shewhart terms: identifying expected outcomes is Plan, assessing them is Check, and improvement based on results is Do and Act.
12
Key Points to Comprehensive Standard 3.3.1 Emphasizes the unit level of individual educational programs and support services The expected achievements of educational programs and support services should be articulated, and evidence presented concerning accomplishments Distinguishes between program outcomes and learning outcomes Sets expectations that improvement is guided by the establishment and evaluation of program and learning outcomes
13
Shewhart and SACS, Cont. General Education and IE Comprehensive Standard 3.5.1: The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies. In Shewhart terms: identifying the competencies is Plan; providing evidence of attainment spans Do, Check, and Act.
14
Key Points to Comprehensive Standard 3.5.1 General Education should be part of the institutional mission The expected achievements of the General Education program should be articulated, and evidence presented concerning accomplishments Improvement should be guided by the establishment and evaluation of learning outcomes
15
Overview of Institutional Effectiveness Focus on Assessment: Institutional Effectiveness (Mission, Strategic Goals) drives two parallel tracks. Comprehensive Dept./Program Review uses Program Outcomes (Quality Indicators, Productivity Indicators, Viability Indicators) and leads to Continuous Improvement of Programs and Departments. Evaluation of Learning uses Learning Outcomes (what graduates know, what graduates can do, what attitudes/values graduates possess) and leads to Continuous Improvement of Student Learning.
16
Institutional Effectiveness System at UNA: Annual Report – Annual Action Plan and Assessment Report (AAPAR); Comprehensive Program and Department Review – five-year review; Review of General Education – five-year cycle of General Education assessment.
17
Schematic of Institutional Effectiveness Process: Annual Reports for Years 1 through 5 feed a Review of Strategic Goals and a Five-Year Review for Selected Depts.; Five-Year Assessments of Areas I through IV feed an Overall Gen. Ed. Assessment. OIRPA, the IE Committee, and the Gen. Ed. Committee oversee the process, which loops back to Year 1 at the end of each five-year cycle.
18
Five-Year Program/Department Review Timeline (pending IE Committee approval), running September through the following September: last year's depts. that underwent five-year review submit outcomes of the review as AAPAR priority initiatives; OIRPA submits the Five-Year Enrollment report to academic departments; OIRPA conducts an assessment workshop for the UNA campus; OIRPA meets with Deans/VPs for an overview; Deans/VPs meet with departments to discuss the review; OIRPA meets with departments up for review; OIRPA initiates individual department meetings; Five-Year Reviews are completed and sent to the Dean/VP; OIRPA submits an overview of the Five-Year process to the IE Committee.
19
Annual Action Plan and Assessment Report Timeline (pending IE Committee approval), running September through the following September: President, VP, and Dean initiatives due; 1st part of the AAPAR due for the current fiscal year, with one Priority Initiative for the next FY; next-FY Priority Initiatives by Deans; next-FY Priority Initiatives by VPs; SPBS reviews next-FY Priority Initiatives; 2nd part of the AAPAR completed by depts.; OIRPA submits the AAPAR overview to the IE Committee; budget initiatives based on Priority Initiatives are established.
20
Outcomes. Operational Outcomes – measures of how well the institution/division/department is meeting/exceeding requirements. Learning Outcomes – statements of the knowledge, skills, and abilities the individual student possesses and can demonstrate upon completion of a learning experience or sequence of learning experiences (e.g., course, program, degree).
21
Problems with Outcomes: outcomes are too broad; outcomes do not address core requirements/competencies or mission; outcomes are not measurable.
22
Types of Measurement. Discrete or attribute data: binary data with only two values (Yes/No, Good/Bad, On/Off, Male/Female, Pass/Fail). Continuous or variable data: information that can be measured on a continuum or scale (Height/Weight, Temperature, Test Scores, Time, Distance).
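The distinction drives how results are summarized: attribute data support counts and proportions, while variable data support means and standard deviations. A small sketch, with all sample values invented:

```python
# Summarizing discrete (attribute) vs. continuous (variable) data.
# All values below are invented examples.
import statistics

pass_fail = ["Pass", "Pass", "Fail", "Pass", "Fail"]   # discrete / attribute
pass_rate = pass_fail.count("Pass") / len(pass_fail)
print(f"Pass rate: {pass_rate:.0%}")                   # counts -> proportion

test_scores = [72.0, 88.5, 91.0, 67.5, 80.0]           # continuous / variable
print(f"Mean score: {statistics.mean(test_scores):.1f}")
print(f"Std. dev.: {statistics.stdev(test_scores):.1f}")
```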
23
Bloom's Taxonomy of Learning Outcomes
Knowledge – recalling or remembering something without necessarily understanding, using, or changing it. Related behaviors: define, describe, identify, label, list, match, memorize, point to, recall, select, state.
Comprehension – understanding something that has been communicated without necessarily relating it to anything else. Related behaviors: alter, account for, annotate, calculate, change, convert, group, explain, generalize, give examples, infer, interpret, paraphrase, predict, review, summarize, translate.
Application – using a general concept to solve problems in a particular situation; using learned material in new and concrete situations. Related behaviors: apply, adopt, collect, construct, demonstrate, discover, illustrate, interview, make use of, manipulate, relate, show, solve, use.
24
Bloom's Taxonomy, Cont.
Analysis – breaking something down into its parts; may focus on identification of parts, analysis of relationships between parts, or recognition of organizational principles. Related behaviors: analyze, compare, contrast, diagram, differentiate, dissect, distinguish, identify, illustrate, infer, outline, point out, select, separate, sort, subdivide.
Synthesis – creating something new by putting parts of different ideas together to make a whole. Related behaviors: blend, build, change, combine, compile, compose, conceive, create, design, formulate, generate, hypothesize, plan, predict, produce, reorder, revise, tell, write.
Evaluation – judging the value of material or methods as they might be applied in a particular situation; judging with the use of definite criteria. Related behaviors: accept, appraise, assess, arbitrate, award, choose, conclude, criticize, grade, judge, prioritize, recommend, referee, select, support.
25
Forms of Measurement Longitudinal data are gathered over an extended period: Semester 1, Semester 2, Semester 3, … Semester t.
26
Forms of Measurement, Cont. Cross-sectional data represent a snapshot of one point in time
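A short sketch of the difference, using a hypothetical retention measure (all program names and rates invented): a longitudinal series follows one unit across semesters, while a cross-sectional snapshot compares many units at a single point in time.

```python
# Longitudinal: the same measure for one program tracked across semesters.
# All names and rates are invented examples.
retention_over_time = {"Fall 2007": 0.78, "Spring 2008": 0.81, "Fall 2008": 0.83}

# Cross-sectional: many programs measured in a single semester.
retention_fall_2008 = {"Biology": 0.83, "History": 0.79, "Nursing": 0.88}

for semester, rate in retention_over_time.items():
    print(f"{semester}: {rate:.0%}")   # a trend over time, not a snapshot
```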
27
What is Improvement? Measurable actions that increase learning, efficiency, effectiveness, and/or the bottom line: decrease the bad, increase the good, decrease variability.
28
Decrease Variability? What the heck is that? Class A scores: 100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42 (Mean = 75.5, STD = 21.93). Class B scores: 91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60 (Mean = 75.5, STD = 8.42). Both classes have the same mean, but Class B's scores vary far less around it.
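The slide's numbers can be checked directly; a short sketch using Python's statistics module (stdev is the sample standard deviation):

```python
import statistics

class_a = [100, 100, 99, 98, 88, 77, 72, 68, 67, 52, 43, 42]
class_b = [91, 85, 81, 79, 78, 77, 73, 75, 72, 70, 65, 60]

for name, scores in [("Class A", class_a), ("Class B", class_b)]:
    print(f"{name}: mean = {statistics.mean(scores):.1f}, "
          f"std dev = {statistics.stdev(scores):.2f}")
# Both classes have a mean of 75.5, but the standard deviations differ
# (roughly 21.9 vs. 8.4): same average, very different consistency.
```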
29
Inputs, Processes, and Outcomes: the inputs (Xs) – Measurement, Materials, Methods, Environment, People, Machines – feed the process that produces the Outcomes (Ys).
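This is the standard quality-engineering picture of a process, often written (an assumed convention, not text from the slide) as the outcomes being a function of the input and process variables:

```latex
% Outcomes (Ys) as a function of the input/process variables (Xs);
% \varepsilon stands for unexplained variation.
Y = f(X_1, X_2, \ldots, X_n) + \varepsilon
```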
30
Assessment Steps Develop learning/operational objectives. Check for alignment between the curriculum/business process and the objectives. Develop an assessment plan. Collect assessment data. Use results to improve programs/departments. Routinely examine the assessment process and correct as needed.
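One hypothetical way to keep these steps auditable is to record each objective as a structured entry that links the outcome to its measure, target, result, and improvement action; the field names and values below are invented for illustration, not a UNA template.

```python
# Hypothetical record for a single assessment-plan entry;
# all field names and values are invented examples.
assessment_entry = {
    "objective":   "Graduates can analyze primary sources",            # develop objective
    "measure":     "Embedded essay scored with a department rubric",   # assessment plan
    "target":      "80% of seniors score 3+ on a 4-point rubric",
    "result":      "71% of seniors scored 3+",                         # collected data
    "improvement": "Add a source-analysis unit to the junior seminar"  # use of results
}

for field, value in assessment_entry.items():
    print(f"{field:>12}: {value}")
```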
31
Types of Assessment – Direct Academic: Published Tests, Locally Developed Tests, Embedded Assignments and Course Activities, Competence Interviews, Portfolios.
32
Types of Assessment – Direct Educational Support/Administrative: people enrolled/participating/served, work accomplished, revenue generated, turnaround time, decrease in nonconformities.
33
Types of Assessment – Indirect: Surveys, Interviews, Focus Groups, Reflective Essays.
34
How Can OIRPA Assist? Create university-wide reports – five-year departmental reports. Analyze university-wide assessment data – NSSE, CAAP. Hold workshops on assessment and IE. Work with individual departments on annual reports, program review, and outcomes assessment. Provide ad hoc data reports for departments. Work with committees to develop assessment plans – IE Committee, Gen. Ed. Committee.
35
Questions or Comments?