ASSESSMENT WORKSHOP: SESSION 1
ADMINISTRATIVE SUPPORT SERVICES | ACADEMIC AND STUDENT SUPPORT SERVICES
PRESENTED BY THE DIVISION OF INSTITUTIONAL EFFECTIVENESS


SESSION ONE OVERVIEW
- Gathering evidence (data) on Efficiency, Quality, and Satisfaction from the (academic year)
- Preparing the (academic year) Annual Planning and Assessment Report for your unit
- Crafting the assessment narrative

SESSION TWO OVERVIEW
- Planning to measure Efficiency, Quality, and Satisfaction in the (academic year)
- Choosing the right measures
- The “Annual Assessment, Planning and Budgeting” Cycle
- Writing and communicating the Assessment Plan for your unit

ASSESSMENT WORKSHOP: SESSION 1

A CULTURE OF EVIDENCE
“Accrediting agencies – both at the institutional level and the programmatic level – are now operating in a ‘culture of evidence’ that requires institutions to qualitatively and quantitatively demonstrate that they are meeting student learning goals and effectively marshalling human and fiscal resources toward that end.”
Michael Middaugh, author of Planning and Assessment in Higher Education

EVIDENCE
How do you know that your area was successful in the (academic year)?

ASSESSMENT PROCESS
1. State outcomes
2. Plan for how outcomes will be assessed (benchmark; process: what data and measures, when, how, and who?)
3. Implement the plan and collect data
4. Aggregate, disaggregate, and analyze the data
5. Articulate how changes and improvements were made

GATHERING EVIDENCE: QUESTIONS TO ASK FOR ASSESSMENT
- Do you have assessment data (counts, timeframes, percent of issues fixed, etc.)?
- Did anyone leave you any assessment data (computer or paper files)?
- Did you collect any surveys (feedback from students or external audiences, quality, etc.)?
- Did your unit keep any internal measures?
- Were there any Key Performance Indicators (KPIs) from the Strategic Plan for which you produced data?

Breakout Session: Get into groups. Take a few moments individually to answer the questions above. Then take 20 minutes to share ideas and brainstorm additional ways that information on record could be used as assessment data on efficiency, quality, and satisfaction during the academic year.

GATHERING EVIDENCE
Key Ideas:
- If you find that you do not have data on a specific item (outcome) you want to assess, still record the outcome on the form provided and write: No assessment data available.
- It’s better to do a few things well than to try to assess too much.
- Assessment is a process of continuous improvement. We should try to improve each year and will get better as we ascend to greatness.
- It’s more rewarding and fulfilling when decisions and actions are driven by data. However, it is likely that changes and improvements made in the (academic year) were not always linked to the actual data from that year.

ANNUAL PLANNING & ASSESSMENT REPORT
Guide | Template

ELEMENTS OF THE REPORT
1. Description of the Support Services in the Unit
2. Mission Statement
3. Resources Used
4. Support Specifics
5. Outcomes
6. Annual Planning and Assessment Grid
7. Assessment Plan Methods and Procedures
8. Analysis and Evaluation
9. Action Plan
10. Rubric for Assessing the Annual Assessment Report

ANALYZING EVIDENCE: ANNUAL PLANNING & ASSESSMENT GRID FOR THE (ACADEMIC YEAR)
Annual Planning and Assessment
Mission Statement:
Targeted Strategic Goal(s):
Targeted Division Key Performance Indicators:
Targeted Institutional Student Learning Outcome(s):

Grid columns: Outcomes | Expected Level of Achievement | Measure/Tool | Results | Use of Results (to include analysis and Action Plan) | Met / Not Met

1. List outcomes.
2. Set expected levels of achievement.
3. Describe the measure/tool.
4. Answer: What were the results of the items you assessed in the (academic year)? How did you use those results? Did you meet the expected level of achievement?

ANNUAL PLANNING & ASSESSMENT GRID FOR THE (ACADEMIC YEAR)
Annual Planning and Assessment
Mission Statement:
Targeted Strategic Goal(s):
Targeted Division Key Performance Indicators:
Institutional Student Learning Outcome(s) addressed:

Grid columns: Outcomes | Expected Level of Achievement | Measure/Tool

This is listed on the report as a section for planning toward the academic year. Basically, you are answering: What did you assess in the academic year?

Breakout Session: Brainstorm in groups to answer. (15-20 minutes)
1) List the new things put into place during the (academic year). How were they assessed?
2) What evidence was collected for the (academic year)?
3) Did your expected level of achievement change to ensure improvements for students?

ANNUAL PLANNING & ASSESSMENT REPORT
Report Due: Friday, June 5th
Report Due: Friday, October 16th

ASSESSMENT WORKSHOP: SESSION 2 ADMINISTRATIVE SUPPORT SERVICES ACADEMIC AND STUDENT SUPPORT SERVICES PRESENTED BY THE DIVISION OF INSTITUTIONAL EFFECTIVENESS

ASSESSMENT PROCESS
1. State outcomes
2. Plan for how outcomes will be assessed (benchmark; process: what data and measures, when, how, and who?)
3. Implement the plan and collect data
4. Aggregate, disaggregate, and analyze the data
5. Articulate how changes and improvements were made

THE QUESTION What are good unit outcomes?

UNIT OUTCOMES – GENERAL GUIDELINES
Administrative Support Services | Academic and Student Support Services
Satisfaction, Quality, Efficiency
Also: volume of activity, count, accuracy

GOOD ASSESSMENT Begins with the students in mind!

WRITING UNIT OUTCOMES
- Articulate levels of satisfaction with the unit.
- Articulate volume of activity, level of efficiency, and quality.
- Articulate external validation (audits: financial, IT, Public Health Inspector, Fire Marshal).
- Include action verbs describing observables.

CHARACTERISTICS OF OUTCOMES
SMART:
- Specific - The outcome is clear to anyone familiar with the program or project.
- Measurable - Concrete methods assess progress toward, and achievement of, the outcome (rubric, checklist, survey).
- Attainable - The outcome is reasonable given the program's resources and influence.
- Relevant - The outcome must be relevant to the program's mission, responsibilities, and all people affiliated with the program.
- Time-bound - The period of time for accomplishing the goal is reasonable (annual, or longer if strategic).

- Outcome statements should not be bundled.
- Outcome statements should be developed, agreed upon, and supported by members of the department.
- Outcome statements do not include “understand” or “know” because those words are not, by themselves, measurable.
- Outcome statements should be understood by anyone.

EXAMPLES OF ASSESSMENTS FOR STUDENT SUPPORT UNITS
- Satisfaction
  - General (institutional)
  - Specific (satisfaction with a particular area)
- Direct Measures
  - Volume of activity (# of persons served)
  - Level of efficiency (average response time)
  - Quality (average # of errors)
- External Validation
  - Audits (financial, IT)
  - Public Health Inspector
  - Fire Marshal

NON-ACADEMIC AREA EXAMPLE – OFFICE OF THE REGISTRAR
- Outcome: The Office of the Registrar will process transcripts in a timely manner.
- Benchmark: 90% of transcript requests will be processed within 2 business days.
- Measure/Tool: Log of transcript requests (monthly)
- Results: What do the data tell you?
- Use of results: How will the data be used to drive improvements?
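As a minimal sketch of how a benchmark like this could be checked, assuming a hypothetical transcript-request log of (requested, completed) date pairs. The business-day count here is weekdays only; holidays are ignored:

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count weekdays after `start`, up to and including `end`."""
    days = 0
    d = start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4
            days += 1
    return days

def benchmark_met(log, max_days=2, target=0.90):
    """Return (share of requests processed within max_days business days, met?)."""
    within = sum(
        1 for requested, completed in log
        if business_days_between(requested, completed) <= max_days
    )
    share = within / len(log)
    return share, share >= target

# Hypothetical monthly log (April 2016)
log = [
    (date(2016, 4, 4), date(2016, 4, 5)),   # 1 business day
    (date(2016, 4, 4), date(2016, 4, 6)),   # 2 business days
    (date(2016, 4, 8), date(2016, 4, 12)),  # Fri -> Tue = 2 business days
    (date(2016, 4, 4), date(2016, 4, 11)),  # 5 business days
]
share, met = benchmark_met(log)
print(f"{share:.0%} within 2 business days; benchmark met: {met}")
# -> 75% within 2 business days; benchmark met: False
```

A result like this would go in the grid as "Not Met," with the use-of-results column describing what the unit will change.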

NON-ACADEMIC AREA EXAMPLE – OFFICE OF THE REGISTRAR
- Outcome: The Office of the Registrar will be in compliance with FERPA regulations.
- Benchmark: 100% of employees will answer 90% of the questions correctly.
- Measure/Tool: Survey
- Results: What do the data tell you?
- Use of results: How will the data be used to drive improvements?
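A sketch of scoring this benchmark, using hypothetical survey results (employee names and scores are invented for illustration):

```python
# Hypothetical FERPA survey scores: employee -> fraction of questions correct
scores = {"A. Smith": 0.95, "B. Jones": 1.00, "C. Lee": 0.85}

PASS_THRESHOLD = 0.90  # each employee must answer 90% of questions correctly
TARGET = 1.00          # benchmark: 100% of employees reach the threshold

passed = sum(1 for s in scores.values() if s >= PASS_THRESHOLD)
share = passed / len(scores)
print(f"{share:.0%} of employees passed; benchmark met: {share >= TARGET}")
```

Here two of three employees pass, so the benchmark is not met; the use-of-results step might be targeted FERPA retraining for those below the threshold.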

NON-ACADEMIC AREA EXAMPLE – OFFICE OF SECURITY
- Outcome: The Office of Security will respond to calls for assistance in a timely and effective manner.
- Benchmark: 80% of calls will be answered in a timely and effective manner.
- Measure: Survey
- Results: What do the data tell you?
- Use of Results: How will the data be used to drive improvements?

CHARACTERISTICS OF OUTCOMES
SMART:
- Specific - The outcome is clear to anyone familiar with the program or project.
- Measurable - Concrete methods assess progress toward, and achievement of, the outcome (rubric, checklist, survey).
- Attainable - The outcome is reasonable given the program's resources and influence.
- Relevant - The outcome must be relevant to the program's mission, responsibilities, and all people affiliated with the program.
- Time-bound - The period of time for accomplishing the goal is reasonable (annual, or longer if strategic).

- Outcome statements should not be bundled.
- Outcome statements should be developed, agreed upon, and supported by members of the unit.
- Outcome statements do not include ‘understand’ or ‘know’ because the words are not, by themselves, measurable.
- Outcome statements should be understood by anyone.

Breakout Session: Brainstorm in groups to answer. (15-20 minutes)
1) Does each of your outcomes meet these criteria?

SHARING IDEAS Time to Share

STEPS OF THE PROCESS
- How will the assessments be collected?
- Who will collect the assessments?
- When will the assessments be collected?
- Who will aggregate and disaggregate the data?
- Who will analyze the data?
- How are data used to drive improvement?
- How do you want to share the information with stakeholders?
- Who will be involved in the process of reviewing and revising the assessments?

USING ASSESSMENT TO DRIVE PROGRAM IMPROVEMENT (CLOSING THE LOOP)
The most challenging part of any assessment process.
- The analysis of data and the changes implemented rest squarely with program faculty and staff in their respective areas.
- Building capacity and providing support rest with Assessment Coordinators.
Guiding Questions:
- What do the data tell you? (Trends and benchmarking are important.) [Results]
- How are data used to drive improvement? [Use of results]
- How do you want to share the information with stakeholders?
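Since trends and benchmarking matter when closing the loop, one way to look at a single outcome over time is to compare each year's result to the benchmark and to the prior year. The years and values below are hypothetical:

```python
# Hypothetical results for one outcome across reporting years
results = {"2013-14": 0.78, "2014-15": 0.84, "2015-16": 0.91}
benchmark = 0.90

for year, value in results.items():
    status = "Met" if value >= benchmark else "Not Met"
    print(f"{year}: {value:.0%} ({status})")

# Simple trend check: is each year at least as good as the last?
values = list(results.values())
improving = all(later >= earlier for earlier, later in zip(values, values[1:]))
print("Improving year over year:", improving)
```

A unit in this position could report that the benchmark was missed in the first two years but that the trend is upward and the benchmark was met in the most recent year.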

PLANNING/BUDGETING/ASSESSMENT CYCLE
Planning → Budget → Assessment (and back to Planning)