Tips for Writing SACSCOC Non-Academic Program Assessment Reports Office of Planning, Institutional Research, and Assessment (PIRA) Fall 2014.

Ideally you already evaluate your unit's effectiveness.
- Program Assessment Reports should describe these activities using SACSCOC guidelines and terminology.
- Data or other findings that measure operational and/or student learning outcomes should be included, as should interpretation of findings.
- Initiatives to improve should also be included.

But don't create a special data collection process for SACSCOC; just summarize existing processes. Save yourself time and unnecessary work by adapting your existing annual report to the SACSCOC Program Assessment Report template.

SACSCOC identifies four types of non-academic units:
1) Administrative support services
2) Academic and student support services
3) Research
4) Community/public service

A division may prefer to submit a Program Assessment Report (PAR) for each office within the division, particularly if outcomes differ across those offices.

Show that your unit has:
- defined its desired mission, program outcomes or objectives, and related measures,
- collected and evaluated results from ongoing assessment (multiple years), and
- undertaken actions to continuously improve outcomes.
Help reviewers find key components quickly and easily.
The assessment cycle: Define Outcomes & Measures → Collect Findings → Evaluate Results → Implement Change (Improve).

- mission and program outcomes (objectives)
- operational and/or student learning outcomes (2+) and related measures (2+ each, at least 1 a direct measure)
- assessment findings: results of measures from multiple years (if feasible)
- discussion of results: review of findings, including whether performance meets expectations
- discussion of changes: initiatives to improve the program and whether continuous improvement has occurred
- clear narrative and organization to make compliance obvious (does everything make sense?)

Tie it to the UM mission: "The University of Miami's mission is to educate and nurture students, to create knowledge, and to provide service to our community and beyond. Committed to excellence and proud of the diversity of our University family, we strive to develop future leaders of our nation and the world." Tie it also to your strategic plan. Describe program outcomes/objectives (e.g., purpose of the unit, type of support for students, including any research or service components).

- Describe reasonable expectations in measurable terms (efficiency, accuracy, effectiveness, comprehensiveness, etc.)
- Include at least 2 outcomes
- Make outcomes easy to identify (e.g., use bolding and numbering) and clearly stated (follow the expected structure)

A good outcome should:
- Focus on a current service or process
- Be under the control or responsibility of the unit
- Be measurable
- Lend itself to improvements
- Be singular, not "bundled"
- Be meaningful, not trivial
- Not lead to a "yes/no" answer
Source: Mary Harrington, Univ of Mississippi

- Efficiency: The Registrar's Office processes transcript requests in a timely manner.
- Accuracy: Purchasing accurately processes purchase orders.
- Effectiveness: Human Resources provides effective new employee orientation services.
- Comprehensiveness: Financial Aid provides comprehensive customer service.
Source: Mary Harrington, Univ of Mississippi

- Start with words like: Students… Graduates… We want students to…
- Include verbs or phrases like: will demonstrate… should have the ability to…
- Include words like: …mastery of… …a capacity for…
- Describe the expected competence (e.g., practical skills, communication, leadership, multicultural awareness)

- Library: Students will have basic information literacy skills.
- Career Services: Students will be able to create an effective resume.
- Information Technology: Staff will know how to use the student information system.
- Human Resources: New employees will be familiar with the benefit package.
Source: Mary Harrington, Univ of Mississippi

- Research: number of grants, total funding, number of peer-reviewed publications, conference presentations
- Administrative support: timeliness in processing orders, budget growth (or savings), complaint tracking/resolution, public safety improvements, audits
- Academic/student support: number of students counseled, job placements, scholarship awards, seminar participation, leadership training participation
- Community/public service: number of patients seen, community event participation, annual volunteer commitments

Non-academic units often use survey data for their assessment. Surveys are indirect measures of student learning, but they are direct measures of customer (client, employee, patient, student) experience. Source: Mary Harrington, Univ of Mississippi

- Ensure each measure has corresponding findings (and no findings without an earlier measure)
- Insert the corresponding outcome/measure as a heading for each set of results
- Ensure multiple years of data, or insert an explanation that data are not provided for a new program or revised measures: "As part of the major three-year continuous improvement update of our program assessment report in FY 2013, we decided to start using customer satisfaction surveys in conjunction with service requests. Because this is a new measure, we have data for only FY 2014, but we will continue to update the data in upcoming years to monitor continuous improvement."

- If a measure is a narrative rather than data, ensure a summary plus sample evaluations, or insert a statement
- Ensure results are presented clearly (tables)
- Decide if an appendix of findings, survey instrument, etc. will be necessary (usually not)
Common error: Programs simply state that they evaluate outcomes, or omit measure(s). Solution: Provide evidence of assessment activity (a table or text summary of findings).

Include:
- a statement as to why these particular assessment instruments were used
- analysis of the assessment findings: How are periodic reviews used for improvements? How does the use of assessment results improve your services? What changes have been implemented or will be developed to improve your operational and/or student learning outcomes?
- evidence of improvement: general trends, and gains specifically in response to improvement initiatives

Common error when describing initiatives to improve outcomes: the report simply lists initiatives. Solution: Include brief commentary on which outcome will benefit.
Common error when describing continuous improvement: the report does not include any evidence of improvement over time. Solution: At least discuss efforts to improve outcomes.

- Add bold, indents, and/or underlines to assist reviewers
- Nest measures under related outcomes
- Label/nest outcomes and measures in the Findings section
- Include discussion of improvements/changes in the Discussion section, not in the Outcomes or Findings sections
- Remove yellow template instructions
- Delete extraneous text and data (clarity is more important than length)
- Expand acronyms (e.g., RSMAS, PRISM, UMHC)
- Spell-check; fix typos

- Study the resources and template before starting
- Use existing assessments, available documentation, and your current reports whenever possible (saves time and effort)
- Consider starting with measures and then writing outcomes to go with them, instead of the traditional order

Contact: Dr. David E. Wiles, Executive Director, Assessment and Accreditation, Institutional Accreditation Liaison, (305)