
1 Tips for Writing SACSCOC Non-Academic Program Assessment Reports Office of Planning, Institutional Research, and Assessment (PIRA) Fall 2015

2 Ideally you already evaluate your unit’s effectiveness

3 Don’t create a special data collection process for SACSCOC; just summarize existing processes. Save time and avoid unnecessary work by adapting your existing annual report to the SACSCOC Program Assessment Report template.

4 Study the resources and template before starting. Use existing assessments, available documentation, and your current reports whenever possible. Start with your existing assessments (measures) and then write outcomes to go with them.

5 Non-academic units often use survey data for their assessment. Surveys are indirect measures of student learning, but they are direct measures of customer (client, employee, patient, student) experience. Source: Mary Harrington, Univ of Mississippi

6 1) Administrative support services 2) Academic and student support services 3) Research 4) Community/public service

7

8 A division may prefer to submit a Program Assessment Report (PAR) for each office within the division, particularly if outcomes are not the same across those offices.

9 Show that your unit has defined its desired mission, program outcomes or objectives, and related measures; collected and evaluated results from ongoing assessment (multiple years); and undertaken actions for continuous improvement. Help reviewers find key components quickly and easily. Assessment cycle: Define Outcomes & Measures, Collect Findings, Evaluate Results, Implement Change (Improve).

10 Mission and program outcomes (objectives). Operational and/or student learning outcomes (2+) and related measures (2+ each; at least 1 should be a direct measure). Assessment findings: results of measures from multiple years (if feasible).

11 Discussion of results: review of findings, including whether performance meets expectations. Discussion of changes: initiatives to improve the program and whether continuous improvement has occurred. Clear narrative and organization to make compliance obvious (does everything make sense?).

12 Tie to the UM Mission (“The University of Miami’s mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world.”) and to your strategic plan, and describe program outcomes/objectives (e.g., purpose of the unit, type of support for students, including any research or service components).

13 Describe reasonable expectations in measurable terms (efficiency, accuracy, effectiveness, comprehensiveness, etc.). Include at least 2 outcomes. Make outcomes easy to identify (e.g., use bolding & numbering) and clearly stated (follow the expected structure).

14 Focus on a current service or process. Be under the control or responsibility of the unit. Be measurable. Lend itself to improvements. Be singular, not “bundled.” Be meaningful and not trivial. Not lead to a “yes/no” answer. Source: Mary Harrington, Univ of Mississippi

15 Efficiency: The Registrar’s Office processes transcript requests in a timely manner. Accuracy: Purchasing accurately processes purchase orders. Effectiveness: Human Resources provides effective new employee orientation services. Comprehensiveness: Financial Aid provides comprehensive customer service. Source: Mary Harrington, Univ of Mississippi

16 Library: Students will have basic information literacy skills. Career Services: Students will be able to create an effective resume. Information Technology: Staff will know how to use the student information system. Human Resources: New employees will be familiar with the benefit package. Source: Mary Harrington, Univ of Mississippi

17 Research: number of grants, total funding, number of peer-reviewed publications, conference presentations Administrative support: timeliness in processing orders, budget growth (or savings), complaint tracking/resolution, public safety improvements, audits Academic/student support: number of students counseled, job placements, scholarship awards, seminar participation, leadership training participation Community/public service: number of patients seen, community event participation, annual volunteer commitments

18 Ensure each measure has corresponding findings (and no findings without a corresponding measure). Insert the corresponding outcome/measure as a heading for each set of results. Include multi-year data, or explain if measures are new, e.g.: “As part of the major three-year continuous improvement update of our program assessment report in FY 2014, we decided to start using customer satisfaction surveys in conjunction with service requests. Because this is a new measure, we have data for only FY 2015, but we will continue to update the data in upcoming years to monitor continuous improvement.”

19 If a measure is a narrative rather than data, include a summary plus sample evaluations, or insert a statement. Ensure results are presented clearly (tables). Decide if an appendix of findings (e.g., the survey instrument) will be necessary (usually not). Common error: programs simply state that they evaluate outcomes, or omit measure(s). Solution: provide evidence of assessment activity (table/text summary of findings).

20 Provide a statement as to why these particular assessment instruments were used. Include an analysis of the assessment findings and evidence of improvement, both in general trends and specifically in response to improvement efforts.

21 Common error when describing initiatives to improve outcomes: the report simply lists initiatives. Solution: include brief commentary on which outcome will benefit. Common error when describing continuous improvement: the report does not include any evidence of improvement over time. Solution: at least discuss efforts to improve outcomes.

22 Add bold, indents, and/or underlines. Nest measures under related outcomes. Label/nest outcomes and measures in the Findings section. Remove yellow template instructions. Expand acronyms (e.g., RSMAS, PRISM, UMHC). Spell-check and fix typos.

23 Contact: Dr. Claudia Grigorescu, Compliance Specialist (Assessment), 305-284-4714; Dr. David E. Wiles, Executive Director, Assessment and Accreditation, Institutional Accreditation Liaison, 305-284-3276

