Developing Administrative and Educational Support Outcomes and Methods of Assessment Lisa Garza Director, University Planning and Assessment Beth Wuest.


Developing Administrative and Educational Support Outcomes and Methods of Assessment

Lisa Garza, Director, University Planning and Assessment
Beth Wuest, Director, Academic Development and Assessment
February 13, 2007

Workshop Goals

To become:
- familiar with developing administrative and educational support outcomes in compliance with SACS requirements
- knowledgeable about outcomes assessment in relation to department activities
- aware of the importance of methods of assessment in relation to intended outcomes and continuous improvement
- knowledgeable about direct and indirect assessment methods
- competent at developing methods for assessing outcomes
- more adept at reviewing methods for assessing effectiveness and efficiency

Overview

Southern Association of Colleges and Schools (SACS) Core Requirement 3.3.1: “An institution is expected to identify expected outcomes for its educational programs and its administrative and educational support services; assess whether it achieves these outcomes; and provide evidence of improvement based on analysis of those results.”

Overview  For evidence of success and continuous improvement Directors for each administrative and educational support service as determined by each Division are requested to identify 3-5 measurable outcomes with two assessment methods for each outcome and submit by April 30, 2007 An assessment report of these outcomes will be due toward the end of the academic year

Outcomes Assessment: What It Is and What It’s Not

- Definition
- Outcomes versus inputs
- Formative versus summative
- Positive versus punitive
- Evaluating the service versus the individual
- Continuous improvement

Definitions  Outcomes Desired results expressed in general terms  Methods Tools or instruments used to gauge progress toward achieving outcomes  Measures Intended performance targets expressed in specific terms

Focus

At present we are focusing only on outcomes and methods. Although measures should be considered when developing these, they will not be specifically addressed until the first assessment cycle.

Linkages to Other University Assessment

- Program review
- Department, Division, and University strategic planning
- Program and University accreditations

Identify Department Mission and Goals

Based on the mission and stated goals of your department:
- What is your overall purpose or function?
- In what direction is your department headed?
- What needs to be accomplished in order to get where you are going?
- How will you know when you have accomplished these goals?

Developing Intended Outcomes

- What are your expectations regarding these goals?
- What is the end result you hope to see once department goals have been implemented?
- What are the intended outcomes you hope to accomplish?

Writing Intended Outcomes

- Do not join multiple outcomes in one statement.
  Avoid: “Customers will be highly satisfied with the service and requests for service will increase.”
- State the outcome so that (ideally) it can be assessed by more than one method.
  Example: “Advisors will provide high quality academic information to students.”
  - As evidenced by “very good” to “excellent” student ratings on a point-of-service questionnaire from 90% of the students served
  - As evidenced by a reduced number of follow-up phone calls from students served

Evaluating Quality of Outcomes

- Are the outcomes aligned with your mission and goals?
- Is it possible to collect accurate and reliable data for each outcome?
- Taken together, would the indicators associated with the outcomes accurately reflect the key results of the programs, operations, or services offered by your department?

Evaluating Quality of Outcomes

- Is anything missing?
- Are the outcomes stated so that it is possible to use a single method to assess each outcome?
- Are they stated so that more than one assessment method can be used?
- Can they be used to identify areas to improve?

Methods of Assessing Outcomes

- Should provide an objective means of supporting the outcomes, quality, efficiency, or productivity of programs, operations, activities, or services
- Should indicate how you will assess each of your outcomes
- Should indicate when you will assess each outcome
- Provide at least two ways to assess each outcome

Categories of Assessment Methods

- Student learning
  - Direct assessments evaluate the competence of students (e.g., exam scores, rated portfolios)
  - Indirect assessments evaluate perceived learning (e.g., student perceptions, employer perceptions)
- Program or unit processes
  - Direct assessments evaluate actual performance (e.g., error rates, time, cost, efficiency, productivity)
  - Indirect assessments evaluate perceived performance (e.g., perceived satisfaction, perceived timeliness, perceived capability)

Examples of Direct Methods

- Samples of work assignments
- Projects or presentations
- Project-embedded assessment
- Documented observation and analysis of behavior or performance
- Activity logs
- Case studies/problems
- Interviews (including videotaped)

Examples of Indirect Methods

- Questionnaires and surveys of:
  - Students (prospective, current, non-returning, alumni)
  - Customers
  - Employees

Describing Assessment Methods

- What are you going to use? (presentation, assignment, survey, observation, performance rating)
- Of and/or by whom? (student, employee, focus group, customers)
- In what context, e.g., where or when? (point-of-service, throughout the year, annually)
- For what purpose? (the desired intended outcome)

Example: Observe employees annually for their level of efficiency in performing XYZ.

Creating Assessment Methods

What          Who          Where/When           Outcomes
Presentation  Student      Point-of-service     Learning
Assignment    Alumni       On the job           Quality
Portfolio     Customer     Throughout the year  Timeliness
Records       Employee     End of year          Skills
Project       Mentor       End of program       Satisfaction
Performance   Focus group  Biannually           Preparation
Survey        Committee                         Efficiency
Observation   Employer

Locally Developed Surveys

- Institutional level
  - alumni survey
  - academic advising survey
  - student survey
  - image survey
  - customer satisfaction survey
- Program or department level
  - customer surveys
  - program-specific surveys
  - advisory board surveys
  - student surveys
  - graduating senior survey
  - employee exit interviews
  - employee surveys

Hints on Selecting Methods

- Match the assessment method with the intended outcome.
  Outcome: Maintenance will complete routine work orders in a timely manner.
  - Review completed work orders for the length of time from open to closure.
  - Review the number of repeat work order requests for the same service. (not related to the outcome)
- The assessment results should be usable.
  Outcome: Resident Assistant training effectively prepares RAs for their role.
  - RAs will be surveyed at the end of the academic year to determine the effectiveness of various aspects of the training.
  - RAs will complete the Resident Assistant Training program; completion of the program will be recorded. (not useful)

Hints on Selecting Methods

- Results should be easily interpreted and unambiguous.
- Data should not be difficult to collect or access.
- Information should be directly controllable by the unit or program.
- Identify multiple methods for assessing each outcome: direct and indirect, qualitative and quantitative, passive or active, conducted by different groups.
- Identify subcomponents where other methods may be used that allow deeper analysis.

Hints on Selecting Methods

- Use methods that can assess both the strengths and weaknesses of your department or initiative.
- When using surveys, target all stakeholders.
- Build on existing data collection (accreditation criteria, program review).

Exercise

Selecting the “Best” Assessment Methods

- Relationship to assessment: provides you with the information you need
- Reliability: yields consistent responses over time
- Validity: appropriate for what you want to measure
- Timeliness and cost: preparation, response, and analysis time; opportunity and tangible costs
- Motivation: provides value to the student; respondents are motivated to participate
- Other: results are easy to understand and interpret; changes in results can be attributed to changes in the service

After Identifying the Potential List of Assessment Methods You Need to…

- Select the “best” ones; try to identify at least two methods for assessing each outcome.
- Consider possible performance targets for the future, balancing stretch targets against achievable targets.
- Examples of methods:
  - Survey customers at the end of the year as to their satisfaction with services provided. (indirect method)
  - Customers will rate their likelihood of recommending the service to others on an evaluation form provided upon completion of service.

After Identifying the Potential List of Assessment Methods You Need to…

- Develop assessment instruments: surveys, evaluation forms, assignments, scoring rubrics. Ideally you want them to be reliable, valid, and inexpensive.
- Approaches: use external sources; seek help from internal sources (e.g., University Planning & Assessment, Academic Development & Assessment); or do it yourself.
- The instrument may need to be modified based on assessment results.

Example  Outcome: Clients will receive timely analyses of survey results. (Institutional Research) 95% of the results are properly analyzed and provided to the client within two weeks of survey administration as obtained by measuring the time it takes to deliver the survey results from the time of administration (direct measurement of timeliness). 95% of our clients are “satisfied” or “very satisfied” with the perceived timeliness obtained through a customer survey given at the point of service (indirect measurement of timeliness).

Example  Outcome: Increase the number of employers that participate in recruiting activities. (Career Services) Attendance will be logged noting overall employer attendance at all recruiting activities. (Direct method). Review acceptance responses to determine the number of different employers represented at recruitment activities. (Direct method).

Example  Outcome: Increase the total dollar amount of donations collected during the Capital Campaign. (University Advancement) Count of total dollars received (Direct method) Review of dollars pledged during open campaign. (Direct method)

Re-Cap of Process

Step 1: Define mission
Step 2: Define goals
Step 3: Define intended outcomes
Step 4: Inventory existing and needed assessment methods
Step 5: Identify assessment methods for each intended outcome

Challenges and Pitfalls

- One size does not fit all: some methods work well for one program but not others.
- Do not try to do the perfect assessment all at once; take a continuous improvement approach.
- Allow for ongoing feedback.
- Match the assessment method to the outcome, not vice versa.

When Is Assessment Successful?

- When people measure their performance, implement changes, and improve their performance
- When the program or service improves as a result of the assessment process

Questions and Comments

For additional assistance, contact:

Lisa Garza, Director, University Planning & Assessment, JCK
Beth Wuest, Director, Academic Development & Assessment, JCK