Assessment at the TTUHSC Or… What does assessment have to do with me?


Assessment
The process of collecting and reviewing evidence about TTUHSC’s academic and administrative programs and services, and using that evidence to evaluate those programs and services in order to improve their quality.
[Cycle diagram: Plan – Assess – Improve, linked at each step by dialogue]

Institutional Effectiveness
The process by which TTUHSC demonstrates how well it is succeeding in accomplishing its mission and meeting its goals.

SACS (Southern Association of Colleges and Schools) Principles of Accreditation
2.5 The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.
The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves those outcomes; and provides evidence of improvement based on analysis of those results.

TTUHSC Mission
To improve the health of people by providing educational opportunities for students and health care professionals, advancing knowledge through scholarship and research, and providing patient care and service.

TTUHSC Goals
 Prepare competent health professionals and scientists
 Increase externally funded, peer-reviewed research, especially NIH-funded research and research focused on aging, cancer, and rural health
 Improve access to quality health care for TTUHSC’s target populations
 Prepare health professions students for an increasingly diverse workforce and patient population
 Provide leadership in the development of partnerships and collaborations to improve community health
 Operate TTUHSC as an efficient and effective institution

Relationship of Types of Planning at an Institution
(Adapted from Nichols, J.O. & Nichols, K.W., A Road Map for Improvement of Student Learning and Support Services Through Assessment)
[Diagram] Strategic planning is focused on process & action: the institutional mission, goals & strategies drive academic, administrative, budget, and facilities planning, answering the question “What actions should we take to support the mission & goals of the institution?” Institutional effectiveness is focused on results & improvement: each unit’s mission/functions lead to outcomes, means of assessment, assessment results, and use of results, answering the question “How well are our students learning & our administrative services (AES) functioning?” The results inform the planning process.

What is WEAVEonline?
 A data-based tool for tracking assessment at the unit or program level
 Tracks both student learning outcomes for degree programs and outcomes for administrative & educational support units
 Demonstrates links between units/programs and the TTUHSC mission, goals, and strategies
 Reporting capabilities

WEAVE

Write expected outcomes
Establish criteria for success
Assess performance
View assessment results
Effect improvements

What’s the Assessment Timeline?
During our “ramp up” year:
 Assessment plans entered during the Fall Semester (ideally by Oct. 31, 2006)
 Assessment activities September through June 2007
 Assessment reports due July 15, 2007
– Includes “findings” or results
– Includes a brief action plan on how findings will be used for improvement
 Revised assessment plans due Sept. 2, 2007

Who will use WEAVE?
 All academic degree programs (for student learning outcomes)
 Administrative and educational support programs/units, including:
– Academic program units, e.g., Dept. of Pediatrics; Office of the Dean for the SON
– Administrative units, e.g., Traffic & Parking; IT–El Paso
– Educational support units, e.g., Outreach Service; TTUHSC Libraries; SOAHS Office of Admissions & Student Affairs

Help!
Office of Institutional Planning & Effectiveness (OIPE)
[Add new office number]
Staff:
 Sharon Kohout, Director
 H.D. Stearman, Associate Director
 Te’Ree Reese, Senior Business Assistant

Assessment 101

Assessment  Assessment is a systematic process of gathering and interpreting information to discover if your program is meeting its outcomes and then using that information to enhance your program.  Assessment is a process designed to answer the question: “Are our efforts bringing forth the desired results?”  Assessment is a systematic process of gathering and interpreting information to discover if your program is meeting its outcomes and then using that information to enhance your program.  Assessment is a process designed to answer the question: “Are our efforts bringing forth the desired results?”

Who needs to “do” assessment?
 All academic degree programs must conduct ongoing assessment related to student learning outcomes
 Administrative & educational support programs or units. Possible criteria: does your program or unit…
– have a purpose (or mission) statement?
– provide a unique service?
– have a box on the organizational chart?
– have a separate budget?
 There are no “rules”
 At what level does assessment make sense if your focus is improvement of your program or services?

Where do you start?
 Mission or Purpose Statement
– A brief description of the unit or program’s mission and purpose, stated in broad terms
– Be sure to mention how the work of the unit supports some part of the University’s mission and/or goals
– Keep the mission statement to 2–3 sentences (for WEAVE)
– Should be revisited annually in case services are eliminated or added

Core Functions/Services
 Four to seven functions/areas of responsibility that ensure that the mission is being accomplished
 Helpful for determining outcomes/objectives and potential areas for improvement

Ask Yourself These Questions
 What decision did you make about your unit or program last year?
 What evidence did you use to inform that decision?
 What were you trying to influence or change about your unit/program when you made that decision?

Outcomes-Based Assessment
Most people…
 Do capitalize on their innate intellectual curiosity to find out what works
 Don’t articulate their intended end results (e.g., outcomes) ahead of time
 Don’t document the decisions made based on their results
 Don’t follow up later to see if their decisions made the intended improvement

The Assessment Cycle (Bresciani, 2003)
The key questions…
 What are we trying to do and why? (What is my unit or program supposed to accomplish?)
 How well are we doing it? How do we know?
 How do we use the information to improve or celebrate successes?
 Do the improvements we make work?
[Cycle diagram: Plan – Assess – Improve, linked by dialogue]

Assessment (Bresciani, 2006)
 Most importantly, it should be: understood, inclusive, meaningful, manageable, flexible, truth-seeking/objective/ethical, and systematic
 It should inform decisions for continuous improvement or provide evidence of proof
 It should promote a culture of accountability, of learning, and of improvement

The Purpose of Assessment
 Outcomes-based assessment does not exist for assessment’s sake
 It is taking what most of us already do and making it systematic
 It is NOT personnel evaluation
 Its purpose is to reflect on the end result of doing: the outcome. Are we accomplishing that which we say we are?
(Bresciani, 2002)

Purpose of Assessment (continued)
 Reinforce or emphasize the mission of your unit
 Improve programs and/or performance
 Inform planning
 Inform decision making
 Evaluate programs, not personnel
(Bresciani, 2002)

Purpose of Assessment (continued)
 Assist in the request for additional funds from the University and external community
 Assist in the re-allocation of resources
 Assist in meeting accreditation requirements, models of best practices, and national benchmarks
 Celebrate successes
 Create a culture of continuous improvement: a culture of accountability, of learning, and of improvement
(Bresciani, 2002)

Components of a TTUHSC Assessment Plan
 Program/Unit Name
 Program/Unit Mission or Purpose Statement
 Outcomes
– Student learning or administrative
– Link to the TTUHSC Strategic Plan
 Measure (Method of Assessment)
 Target Level (Criteria for Success)
 Findings (Assessment Results)
– Summarize the results for each outcome
– Summarize the process to verify/validate the results
 Action Plan (Use of Results)
– Summarize the decisions/recommendations made for each outcome
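For units that want to draft their plan in a structured form before entering it into WEAVE, the components above can be sketched as a simple record. This is a hypothetical illustration only; the field names and the example unit are ours, not WEAVEonline’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Outcome:
    statement: str          # student learning or administrative outcome
    strategic_links: list   # linked TTUHSC goal(s) and/or strategies
    measure: str            # method of assessment
    target: str             # criteria for success
    findings: str = ""      # assessment results, filled in after assessing
    action_plan: str = ""   # use of results

@dataclass
class AssessmentPlan:
    unit_name: str
    mission: str
    outcomes: list = field(default_factory=list)

# Example: a minimal plan with a single outcome (unit and text are invented)
plan = AssessmentPlan(
    unit_name="TTUHSC Libraries",
    mission="Support the TTUHSC mission by providing access to health "
            "sciences information resources.",
)
plan.outcomes.append(Outcome(
    statement="Customers will receive prompt assistance with reference requests.",
    strategic_links=["Operate TTUHSC as an efficient and effective institution"],
    measure="Tracking of request turnaround time",
    target="90% of requests answered within 24 hours",
))
print(len(plan.outcomes))  # → 1
```

Keeping the findings and action plan as fields of the same record mirrors the point made throughout this workshop: results and their use belong in the plan itself, not in a separate document.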

Mission Statement
 A mission statement needs to communicate the essence or purpose of your unit or program.
 Your mission statement answers the question: “What do we do, and how do we support the TTUHSC mission?”

Outcomes  Outcomes are statements that describe results that you expect to see…. in the way you deliver your services related to internal goals you might have set within your unit as you carry out your functions or responsibilities  Outcomes are specifically about what you want the end results of your activities to be.  Outcomes are statements that describe results that you expect to see…. in the way you deliver your services related to internal goals you might have set within your unit as you carry out your functions or responsibilities  Outcomes are specifically about what you want the end results of your activities to be.

Guidelines for Formulating Outcomes
 Must be related to something under the control of the unit
 Should be worded in terms of what the unit will accomplish, or what its clients should think, know, or do following the provision of services
 Should lead to improved service
 Must be linked to a TTUHSC strategy and/or goal (WEAVE will make the linking simple)
 YOU decide which strategies and/or goals your unit best links to (can be more than one, but must link to a goal at the very least – see Goal 5)

Ideas for Deciding on Outcomes
 Review your core services/functions/responsibilities
 Brainstorm with other staff about potential areas for improvement
 Look at data/trends to determine issues or areas that need time and attention
 What are areas in which you are held accountable by senior management or external audiences?
 Remember: outcomes-based assessment is about improvement, not just “doing”
 Start small, with just 2 or 3 outcomes

Examples of Intended Outcomes
 “All news media inquiries concerning the University will receive appropriate and timely responses.”
 “An increasing number of undergraduate students will engage in research with faculty and graduate students.”
 “Seminars and conferences offered by this program will provide enhanced opportunities for faculty to engage in interdisciplinary discussion and develop networks for future research collaborations.”
 “Customers will receive prompt assistance in effectively resolving technical problems related to systems, networks, and desktop applications.”

Questions to Ask Yourself About Outcomes
 Is it measurable/identifiable?
 Is it meaningful?
 Is it manageable?
 Who is the target audience of my outcome?
 Who would know if my outcome has been met?
 How will I know if it has been met?
 Will it provide me with evidence that will lead me to make a decision for continuous improvement?

Before Choosing a Method or Measure…
 Think about what meeting the outcome looks like. For example: “All news media inquiries concerning the University will receive appropriate and timely responses.”
– What does it look like when your outcome has been met?
– How do you currently deliver this outcome?
 There may be clues in the delivery of the outcome that help you determine how to assess it

Before Choosing a Method or Measure… (continued)
Think about collecting data from different sources (e.g., surveys, observations, self-assessment):
– to make more meaningful and informed decisions for continuous improvement
– that you believe will be useful in answering the important questions you have raised
– that will appeal to your primary audience or to those you are trying to influence

Measurement Methods
Evidence of learning: basically two types
 Direct – methods of collecting information that require customers/students to display their knowledge and skills
 Indirect – methods that ask customers/students, or someone else, to reflect on the customer/student learning rather than to demonstrate it
(Palomba and Banta, 1999)

Some Methods That Provide Direct Evidence
 Observations of customer/student behavior
 Tracking the use of a service (e.g., hits on a website)
 Tracking processing time
 Counts of participants/customers served
 Tracking complaints and how they are resolved

Some Methods That Provide Direct Evidence (continued)
 Document analysis (e.g., meeting minutes, policies, handbooks)
 Observations of work performance
 Web tracking program analysis
 Other document tracking analysis
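Two of the direct methods above, tracking processing time and counting customers served, amount to simple log analysis. A minimal sketch, assuming a hypothetical help-desk log of received/resolved timestamps (the dates are invented):

```python
from datetime import datetime

# Hypothetical help-desk log: (received, resolved) timestamps per request
requests = [
    (datetime(2006, 10, 2, 9, 0),  datetime(2006, 10, 2, 11, 30)),
    (datetime(2006, 10, 2, 13, 15), datetime(2006, 10, 3, 9, 0)),
    (datetime(2006, 10, 3, 8, 45),  datetime(2006, 10, 3, 10, 0)),
]

# Turnaround time in hours for each request
turnaround_hours = [(resolved - received).total_seconds() / 3600
                    for received, resolved in requests]

# Count how many requests met a 24-hour turnaround criterion
within_24h = sum(t <= 24 for t in turnaround_hours)
print(f"{within_24h} of {len(requests)} requests resolved within 24 hours")
# → 3 of 3 requests resolved within 24 hours
```

The same tally, run over a real log for a semester, is exactly the kind of direct evidence a turnaround-time target calls for.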

Some Methods That Provide Indirect Evidence
 Client satisfaction surveys (students, staff, faculty, alumni, other customers)
 Interviews and focus groups with participants, customers, or other stakeholders
 Benchmarks set by national, state, or peer organizations or institutions
 Retention rates
 Enrollment trends

Examples of Measures
 Surveys of customer satisfaction
 Counts of program participants; growth in participation
 Focus groups, individual interviews, phone surveys
 Feedback from advisory groups or committees
 Analysis of service usage

Choosing an Instrument
 What outcome(s) are you measuring?
 Who is being assessed? How often do I have access to them? Do I know who they are?
 What is my budget?
 What is my timeline?
 What type of data is most meaningful to me: direct/indirect, qualitative/quantitative?
 Who will have responsibility for data collection?

Target Levels (Criteria for Success)
 What target will you set for yourself during the coming year in order to accomplish your outcome?
 What level of accomplishment do you hope to see?
 Target levels should be specific, measurable, and attainable.
 You are encouraged to set more than one target level or criterion for success for each outcome.

Examples: Outcomes with Criteria for Success/Target Level
 Students will be satisfied with the health care provider on their campus.
Target: On the Student Satisfaction Survey, 90% will respond that they are satisfied or very satisfied with the health care provider on their campus.
 Instructional support services will be provided to meet the needs of TechLink faculty.
Target: By Dec. 2006, faculty using TechLink will be surveyed to determine the three most critical areas of need.
Target: By June 2007, a workplan for addressing the major areas of need will be presented to IT leadership.

Conduct Assessment Activities
 Be sure you have a plan for conducting the activities that will lead to accomplishment of your target level.

Findings (Results of Assessments Conducted)
Summarize briefly the major findings from your assessments:
 “70% of respondents indicated that they had received a response to their request for service within 24 hours.”
 “Customers expressed frustration with the wait time for help desk requests.”
 “Feedback gathered from our advisory group indicated that department staff desire additional assistance with web design.”

Closing the Loop
 Briefly report the method of assessment for each outcome
 Document where the customers/students are meeting the intended outcome
 Document where they are not meeting the outcome
 Document decisions made to improve the program and assessment plan
 Refine the assessment method and repeat the process after proper time for implementation

Action Plans (Use of Results for Improvement)
 How do you plan to use the results of your assessments to improve the unit’s programs or services?
 Briefly describe any improvements, changes, or plans that you will undertake
 If you meet your target levels, review your outcomes to see if other improvements or activities are needed

Examples of Improvement Statements
 “A new campaign emphasizing to new graduates the benefits of annual giving to the TTUHSC was developed and implemented in July. We continue to monitor our progress in encouraging this group to contribute.”
 “Since our service demands have increased, additional staff are needed to provide adequate coverage, and will be requested in the next fiscal year.”
 “Our focus group interviews with users indicated a preference for electronic reports, so the hard copy version was discontinued.”

Demonstrate & Document Use of Results for Improvements
 Most important step = data-based decision making
 Data collected from doing assessment often will inform Strategic Planning

Relationship of Types of Planning at an Institution
[Diagram] Institutional Mission, Goals & Strategies drive Academic, Administrative, Budget, and Facilities Planning, which shape each Unit Mission/Functions; units then define Outcomes, Means of Assessment, Assessment Results, and Use of Results, which in turn inform the planning process.
 Strategic Planning is focused on Process & Action. It answers the question: What actions should we take to support the mission & goals of the institution?
 Institutional Effectiveness is focused on Results & Improvement. It answers the question: How well are our students learning & administrative services (AES) functioning?
Adapted from Nichols, J.O. & Nichols, K.W., A Road Map for Improvement of Student Learning and Support Services Through Assessment

Take-Home Messages
 You do not have to assess everything you do every year.
 You don’t have to do everything at once; start with a reasonable number of intended outcomes.
 Think baby steps.
 Be flexible.
 Acknowledge and use what you have already done.

More Take-Home Messages
 Assessment expertise is available to help you evaluate your program better – not to evaluate your program for you.
 Borrow examples from other institutions to modify as appropriate.
 Time for this must be re-allocated.
 We allocate time according to our values and priorities.

Resources
 Each Other
 Web Resources
 htm#hbooks
 x.htm

Help!
Office of Institutional Planning & Effectiveness (OIPE) Staff:
Sharon Kohout, Director
H.D. Stearman, Associate Director
Te’Ree Reese, Senior Business Assistant