Specific outcomes can be compared to resources expended, and successful programs can be highlighted;

Programs that have not accomplished their intended outcomes can be identified; the extent to which the planned outcomes were met can be documented;

The adequacy of the program’s outcomes, relative to the problem the program was developed and implemented to address, can be documented;

The strengths and weaknesses of each activity that affected the Effectiveness and Adequacy of the programmatic inputs ($$$, human resources, space, etc.) in meeting the activity’s planned outcomes can be identified.

The institution can determine the impact of the Title III funds awarded to its various discrete activities and programs.

The process not only determines the success or failure of the program but also provides explanations for the actual outcomes.

As a result of the MANDATE of funding agencies and institutions to demonstrate programmatic accountability, one of the services requested most by institutions of higher education is program ASSESSMENT/EVALUATION.

NCCU, chartered in 1909, opened its doors to students in 1910 as the National Religious Training School and Chautauqua.

North Carolina Central University is part of the University of North Carolina System, which is made up of 16 individual campuses (17 when we count the School of the Arts), headed by a single president, and governed by the University of North Carolina Board of Governors.

North Carolina Central University is a comprehensive university offering bachelor’s degrees in more than 100 disciplines, master’s degrees in more than 40 programs and three professional degrees in law.

Enrollment at North Carolina Central University increased from 5,753 in Fall 2001 to 8,619 in Fall 2006; that gain of 2,866 students is an increase of roughly 50% over the six-year period. We currently enroll 8,500 students. The major emphasis for the current planning period is “Enhancing Retention, Graduation, and Placement Rates through Increased Student Learning.”

Our goals: to provide programs, activities, and services that will assist students in becoming successful during their tenure at the University, so that they persist, graduate, and are successful in the career markets of the future; to put in place systems designed to make matriculation more seamless and easier to negotiate; to provide training that upgrades the skills of our faculty to work with the underserved students from diverse backgrounds whom we enroll; and to promote a more efficient, effective, and accountable environment as it relates to student learning and student success.

Impetus for conducting an Impact Study – Chancellor Charlie Nelms: a directive he gave me as Title III Director, which I passed on to the Activity Coordinators in our planning meeting, and his absolute need for concrete documentation of the impact that our Title III dollars had had over the years.

North Carolina Central University (NCCU) contracted with Associates for Institutional Development, Inc. (AID, Inc.) to conduct an Impact Study of select Title III activities encompassing the grant years, to document the extent to which each activity had accomplished the outcomes for which it was funded.

PROVIDE ACCESS TO ALL RECORDS PERTINENT TO EACH ACTIVITY
WORK CLOSELY WITH THE TITLE III STAFF TO COLLECT THE DATA

A data collection process MUST be implemented that includes:
1. Onsite visits;
2. Interviews with key staff;
3. Participant self-assessments;
4. Interviews with faculty, students, and external stakeholders;
5. The administration of surveys and questionnaires, as needed.

To guarantee objectivity, the process relies upon multiple sources of data encompassing multiple points of view. Finally, to be adaptable, the assessment’s design must meet the assessment needs of the institution.

PROGRAM PLANS (RATIONALE; TARGET AUDIENCE; BASELINE DATA; MEASURABLE OBJECTIVES; MEASURABLE AND TIME-SPECIFIC PERFORMANCE INDICATORS; IMPLEMENTATION STRATEGIES IN CHRONOLOGICAL ORDER)

DOCUMENTATION ON ACTUAL PROGRAMMATIC OUTCOMES (DESCRIPTIVE AND STATISTICAL)
EXTERNAL EVALUATION REPORTS
PROGRESS REPORTS (MONTHLY OR QUARTERLY)

PROGRAMMATIC OUTCOMES (QUANTIFIABLE AND DESCRIPTIVE)

ANNUAL REPORTS
RESULTS OF SURVEYS AND QUESTIONNAIRES
FUNDS AWARDED EACH ACTIVITY DURING THE YEARS INCLUDED IN THE STUDY

ACTIVITY BUDGET BURN RATES (SEE THE SKETCH BELOW)
ANECDOTAL INFORMATION THAT WILL ADD VALUE TO THE OUTCOMES
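
Budget burn rate is not defined in the slides; it is conventionally the share of the awarded budget expended to date, read against the share of the budget period elapsed. A minimal sketch in Python, using hypothetical figures rather than data from the NCCU study:

def burn_rate(expended, awarded):
    """Fraction of the awarded budget spent to date."""
    return expended / awarded

# Hypothetical Title III activity: $250,000 awarded for a 12-month budget
# period, with $175,000 expended by the end of month 9.
awarded = 250_000.00
expended = 175_000.00
elapsed = 9 / 12  # fraction of the budget period elapsed

print(f"Burn rate: {burn_rate(expended, awarded):.0%} spent; {elapsed:.0%} of period elapsed")
# Prints: Burn rate: 70% spent; 75% of period elapsed. A burn rate far below
# the elapsed fraction can signal under-implementation; far above it, a risk
# of exhausting funds before the period ends.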

OBJECTIVES THAT ARE NOT MEASURABLE
TOO MANY TASKS MASQUERADING AS OBJECTIVES
PERFORMANCE INDICATORS THAT ARE NOT MEASURABLE AND NOT TIME-SPECIFIC
(For illustration: “improve student retention” is neither measurable nor time-specific, while “increase first-to-second-year retention by five percentage points by Fall 2010” is both.)

THE ABSENCE OF BASELINE DATA FOR THE FIRST YEAR OF IMPLEMENTATION AND EACH YEAR OF FUNDING (WHERE APPROPRIATE)
LACK OF ANNUAL EVALUATION REPORTS, OR SOME OTHER TYPE OF ASSESSMENT, TO VERIFY THE ATTAINMENT OF THE OBJECTIVES IN EACH YEAR OF FUNDING

ACTIVITY DIRECTORS NOT HAVING A CLEAR VISION OF THE PRIMARY ANTICIPATED/PLANNED OUTCOMES OF THE ACTIVITY AND HOW THE ACTIVITY WILL STRENGTHEN THE INSTITUTION

Identified the over-arching goal or planned outcome for each program; Identified the actual outcome based on data;

Analyzed the gap between the desired outcome and the actual outcome by utilizing the “Effectiveness Estimate (EE)” technique:

THE EE TECHNIQUE DOCUMENTS THE EFFECTIVENESS OF THE PROGRAM IN AMELIORATING OR STRENGTHENING THE ENTITY FOR WHICH IT WAS FUNDED

EFFECTIVENESS ESTIMATE (EE) = (R - C) / (P - C)

ADEQUACY RATIO = (R - C) / (1 - C)

where R = Actual Results, C = Results Without the Program, and P = Planned Impact; the 1 - C denominator presumes outcomes expressed as proportions.
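
A worked illustration of the two ratios, as a minimal sketch in Python; the graduation-rate figures are hypothetical, not data from the NCCU study:

def effectiveness_estimate(r, c, p):
    """Share of the planned gain actually achieved: (R - C) / (P - C)."""
    return (r - c) / (p - c)

def adequacy_ratio(r, c):
    """Share of the total possible gain achieved: (R - C) / (1 - C)."""
    return (r - c) / (1 - c)

# Hypothetical activity: without the program the graduation rate would have
# been 40% (C); the funded plan aimed for 60% (P); the observed rate was 55% (R).
R, C, P = 0.55, 0.40, 0.60

print(f"Effectiveness Estimate: {effectiveness_estimate(R, C, P):.2f}")  # 0.75
print(f"Adequacy Ratio: {adequacy_ratio(R, C):.2f}")  # 0.25
# Read: the activity achieved 75% of its planned impact and closed 25% of
# the total remaining gap (from 40% up to a perfect 100%).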

1. Institutional enhancements as a result of implementing the program;
2. Enhancements to student and faculty performance, where applicable;
3. Effect on administrative personnel effectiveness, where applicable;
4. Overall academic and/or administrative program effectiveness enhancements;
5. Problems encountered that adversely affected the implementation of the program during the specified time period;

Recommendations to strengthen the continued implementation of each program regardless of funding status;

Information garnered through the impact study process, combined with other sources of data, will give institutional leadership the opportunity to incorporate the findings into programmatic and budgetary decision-making.

Dr. Gloria P. James:
Dr. Brenda R. Shaw:
Mr. Samuel T. Rhoades: