
1 ASSESSMENT WORKSHOP: SESSION 1 ADMINISTRATIVE SUPPORT SERVICES ACADEMIC AND STUDENT SUPPORT SERVICES PRESENTED BY THE DIVISION OF INSTITUTIONAL EFFECTIVENESS

2 SESSION ONE OVERVIEW

2013-14
- Gathering Evidence (data) on Efficiency, Quality, and Satisfaction from 2013-14
- Preparing the 2013-14 (academic year) Annual Planning and Assessment Report for your unit
- Crafting the assessment narrative

2014-15
- Preparing the 2014-15 (academic year) Annual Planning and Assessment Report for your unit

3 SESSION TWO OVERVIEW

2015-16
- Planning to measure Efficiency, Quality, and Satisfaction in 2015-16
- Choosing the right measures
- The “Annual Assessment, Planning and Budgeting” Cycle
- Writing and communicating the Assessment Plan for your unit

4 ASSESSMENT WORKSHOP: SESSION 1

5 A CULTURE OF EVIDENCE

“Accrediting agencies – both at the institutional level and the programmatic level – are now operating in a ‘culture of evidence’ that requires institutions to qualitatively and quantitatively demonstrate that they are meeting student learning goals and effectively marshalling human and fiscal resources toward that end.”

Michael Middaugh, author of Planning and Assessment in Higher Education

6 EVIDENCE How do you know that your area was successful in 2013-14?

7 ASSESSMENT PROCESS

1. State outcomes
2. Plan how outcomes will be assessed (benchmark; process: what data and measure, when, how, who)
3. Implement the plan and collect data
4. Aggregate, disaggregate, and analyze the data
5. Articulate how changes and improvements were made

8 GATHERING EVIDENCE: QUESTIONS TO ASK FOR 2013-14 ASSESSMENT

- Do you have assessment data? (counts, timeframes, percent of issues fixed, etc.)
- Did anyone leave you any assessment data? (computer or paper files)
- Did you collect any surveys (feedback from students or external audiences, quality, etc.)?
- Did your unit keep any internal measures?
- Were there any Key Performance Indicators (KPIs) from the Strategic Plan for which you produced data?

Breakout Session: Get into groups. Take a few moments individually to answer the questions above. Then take 20 minutes to share ideas and brainstorm additional ways that information on record could be used as assessment data on efficiency, quality, and satisfaction for the 2013-14 academic year.

9 GATHERING EVIDENCE

Key ideas:
- If you find that you do not have data on a specific item (outcome) you want to assess, still record the outcome on the form provided and write: No assessment data available.
- It’s better to do a few things well than to try to assess too much.
- Assessment is a process of continuous improvement. We should try to improve each year and will get better as we ascend to greatness.
- It’s more rewarding and fulfilling when decisions and actions are driven by data. However, it is likely that changes and improvements made in 2014-15 were not always linked to the actual data from 2013-14.

10 ANNUAL PLANNING & ASSESSMENT REPORT 2013-14

Guide | Template

11 ELEMENTS OF THE REPORT

1. Description of the Support Services in the Unit
2. Mission Statement
3. Resources Used
4. Support Specifics
5. Outcomes
6. Annual Planning and Assessment Grid
7. Assessment Plan Methods and Procedures
8. Analysis and Evaluation
9. Action Plan
10. Rubric for Assessing the Annual Assessment Report

12 ANALYZING EVIDENCE: ANNUAL PLANNING & ASSESSMENT GRID FOR 2013-14

Annual Planning and Assessment 2013-14
Mission Statement:
Targeted Strategic Goal(s):
Targeted Division Key Performance Indicators:
Targeted Institutional Student Learning Outcome(s):

Grid columns: Outcomes | Expected Level of Achievement | Measure/Tool | Results | Use of Results (to include analysis and Action Plan) | Met / Not Met

1. List outcomes.
2. Set expected levels of achievement.
3. Describe the measure/tool.
4. Answer: What were the results of the items you assessed in 2013-14? How did you use those results? Did you meet the expected level of achievement?

13 ANNUAL PLANNING & ASSESSMENT GRID FOR 2014-15

Annual Planning and Assessment 2014-15
Mission Statement:
Targeted Strategic Goal(s):
Targeted Division Key Performance Indicators:
Institutional Student Learning Outcome(s) addressed:

Grid columns: Outcomes | Expected Level of Achievement | Measure/Tool

This is listed on the 2013-14 report as a section for planning toward the 2014-15 academic year. Basically, you are answering: What did you assess in the 2014-15 academic year?

Breakout Session: Brainstorm in groups to answer. (15-20 minutes)
1) List the new things put into place during 2014-15. How were they assessed?
2) What evidence was collected for 2014-15?
3) Did your expected level of achievement change to ensure improvements for students?

14 ANNUAL PLANNING & ASSESSMENT REPORT

2013-14 Report Due: Friday, June 5th
2014-15 Report Due: Friday, October 16th

15 ASSESSMENT WORKSHOP: SESSION 2 ADMINISTRATIVE SUPPORT SERVICES ACADEMIC AND STUDENT SUPPORT SERVICES PRESENTED BY THE DIVISION OF INSTITUTIONAL EFFECTIVENESS

16 ASSESSMENT PROCESS

1. State outcomes
2. Plan how outcomes will be assessed (benchmark; process: what data and measure, when, how, who)
3. Implement the plan and collect data
4. Aggregate, disaggregate, and analyze the data
5. Articulate how changes and improvements were made

17 THE QUESTION What are good unit outcomes?

18 UNIT OUTCOMES – GENERAL GUIDELINES

Administrative Support Services / Academic and Student Support Services:
- Satisfaction
- Quality
- Efficiency
- Also: volume of activity, count, accuracy

19 GOOD ASSESSMENT Begins with the students in mind!

20 WRITING UNIT OUTCOMES

- Articulate levels of satisfaction with the unit.
- Articulate volume of activity, level of efficiency, and quality.
- Articulate external validation (audits – financial, IT; Public Health Inspector; Fire Marshal).
- Include action verbs describing observables.

21 CHARACTERISTICS OF OUTCOMES

SMART:
- Specific - The outcome is clear to anyone familiar with the program or project.
- Measurable - Concrete methods assess progress toward, and achievement of, the outcome (rubric, checklist, survey).
- Attainable - The outcome is reasonable given the program's resources and influence.
- Relevant - The outcome is relevant to the program's mission, responsibilities, and the people affiliated with the program.
- Time-bound - The period of time for accomplishing the outcome is reasonable (annual, or longer if strategic).

In addition:
- Outcome statements should not be bundled.
- Outcome statements should be developed, agreed upon, and supported by members of the department.
- Outcome statements do not include 'understand' or 'know' because those words are not, by themselves, measurable.
- Outcome statements should be understandable by anyone.

22 EXAMPLES OF ASSESSMENTS FOR STUDENT SUPPORT UNITS

- Satisfaction
  - General (institutional)
  - Specific (satisfaction with a particular area)
- Direct Measures
  - Volume of activity (# of persons served)
  - Level of efficiency (average response time)
  - Quality (average # of errors)
- External Validation
  - Audits (financial, IT)
  - Public Health Inspector
  - Fire Marshal

23 NON-ACADEMIC AREA EXAMPLE – OFFICE OF THE REGISTRAR

Outcome: The Office of the Registrar will process transcripts in a timely manner.
Benchmark: 90% of transcript requests will be processed within 2 business days.
Measure/Tool: Log of transcript requests (monthly)
Results: What do the data tell you?
Use of Results: How will the data be used to drive improvements?
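The Registrar benchmark above reduces to simple arithmetic on the request log: for each request, count business days from receipt to processing, then check what percentage met the 2-day target. A minimal sketch; the log format and sample dates are invented for illustration, not from the workshop materials:

```python
from datetime import date, timedelta

def business_days_between(start: date, end: date) -> int:
    """Count business days (Mon-Fri) after start, up to and including end."""
    days = 0
    d = start
    while d < end:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4; skip weekends
            days += 1
    return days

def percent_within_benchmark(log, max_business_days=2):
    """Percent of (requested, processed) date pairs meeting the benchmark."""
    met = sum(
        1 for requested, processed in log
        if business_days_between(requested, processed) <= max_business_days
    )
    return 100.0 * met / len(log)

# Hypothetical monthly log: (date requested, date processed).
log = [
    (date(2014, 3, 3), date(2014, 3, 4)),    # 1 business day
    (date(2014, 3, 7), date(2014, 3, 10)),   # Friday -> Monday: 1 business day
    (date(2014, 3, 10), date(2014, 3, 14)),  # 4 business days: misses target
]
print(f"{percent_within_benchmark(log):.1f}% within 2 business days")
# prints: 66.7% within 2 business days (below the 90% benchmark)
```

Note that a calendar-day count would wrongly flag the Friday-to-Monday request; counting weekdays keeps the measure fair to staff.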

24 NON-ACADEMIC AREA EXAMPLE – OFFICE OF THE REGISTRAR

Outcome: The Office of the Registrar will be in compliance with FERPA regulations.
Benchmark: 100% of employees will answer 90% of the questions correctly.
Measure/Tool: Survey
Results: What do the data tell you?
Use of Results: How will the data be used to drive improvements?
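The FERPA benchmark has two thresholds stacked together: each employee must score at least 90%, and 100% of employees must clear that bar. A small sketch of the tally; the employee names and scores are hypothetical:

```python
# Hypothetical survey scores: fraction of FERPA questions answered correctly.
scores = {"A. Smith": 0.95, "B. Jones": 1.00, "C. Lee": 0.85}

PASS_THRESHOLD = 0.90  # each employee must answer 90% correctly
BENCHMARK = 1.00       # 100% of employees must pass

passed = [name for name, s in scores.items() if s >= PASS_THRESHOLD]
pass_rate = len(passed) / len(scores)

status = "met" if pass_rate >= BENCHMARK else "not met"
print(f"Pass rate: {pass_rate:.0%}; benchmark {status}")
# prints: Pass rate: 67%; benchmark not met

# "Use of results": employees below the threshold are candidates for retraining.
retrain = [name for name in scores if name not in passed]
```

Keeping the two thresholds as named constants makes it easy to tighten or relax either one in a later assessment cycle.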

25 NON-ACADEMIC AREA EXAMPLE – OFFICE OF SECURITY

Outcome: The Office of Security will respond to calls for assistance in a timely and effective manner.
Benchmark: 80% of calls will be answered in a timely and effective manner.
Measure/Tool: Survey
Results: What do the data tell you?
Use of Results: How will the data be used to drive improvements?

26 CHARACTERISTICS OF OUTCOMES

SMART:
- Specific - The outcome is clear to anyone familiar with the program or project.
- Measurable - Concrete methods assess progress toward, and achievement of, the outcome (rubric, checklist, survey).
- Attainable - The outcome is reasonable given the program's resources and influence.
- Relevant - The outcome is relevant to the program's mission, responsibilities, and the people affiliated with the program.
- Time-bound - The period of time for accomplishing the outcome is reasonable (annual, or longer if strategic).

In addition:
- Outcome statements should not be bundled.
- Outcome statements should be developed, agreed upon, and supported by members of the unit.
- Outcome statements do not include ‘understand’ or ‘know’ because those words are not, by themselves, measurable.
- Outcome statements should be understandable by anyone.

Breakout Session: Brainstorm in groups to answer. (15-20 minutes)
1) Does each of your outcomes meet these criteria?

27 SHARING IDEAS Time to Share

28 STEPS OF THE PROCESS

- How will the assessments be collected?
- Who will collect the assessments?
- When will the assessments be collected?
- Who will aggregate and disaggregate the data?
- Who will analyze the data?
- How are data used to drive improvement?
- How do you want to share the information with stakeholders?
- Who will be involved in the process of reviewing and revising the assessments?

29 USING ASSESSMENT TO DRIVE PROGRAM IMPROVEMENT (CLOSING THE LOOP)

The most challenging part of any assessment process.
- The analysis of data and the changes implemented rest squarely with program faculty and staff in their respective areas.
- Building capacity and providing support rest with the Assessment Coordinators.

Guiding questions:
- What do the data tell you? (Trends and benchmarking are important.) - Results
- How are data used to drive improvement? - Use of results
- How do you want to share the information with stakeholders?
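The aggregate/disaggregate step named in the process above is worth seeing concretely: an overall average can look acceptable while hiding a subgroup that is not being served well. A small sketch with invented satisfaction ratings (the respondent groups and 1-5 scale are assumptions, not from the workshop):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: (respondent group, satisfaction rating, 1-5).
responses = [
    ("student", 4), ("student", 5), ("student", 3),
    ("faculty", 2), ("faculty", 3),
    ("staff", 5),
]

# Aggregate: one overall mean across all respondents.
overall = mean(rating for _, rating in responses)

# Disaggregate: mean per group, which can reveal what the overall hides.
by_group = defaultdict(list)
for group, rating in responses:
    by_group[group].append(rating)

for group, ratings in sorted(by_group.items()):
    print(f"{group}: {mean(ratings):.2f} (n={len(ratings)})")
print(f"overall: {overall:.2f}")
# prints:
# faculty: 2.50 (n=2)
# staff: 5.00 (n=1)
# student: 4.00 (n=3)
# overall: 3.67
```

Here the overall mean of 3.67 masks faculty satisfaction of 2.50, which is exactly the kind of finding that should drive the "use of results" column in the grid.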

30 PLANNING/BUDGETING/ASSESSMENT CYCLE

Planning → Budgeting → Assessment → (back to Planning)

