1 Chapter 3 - SAEOPP Priority 1 Training SHOW & TELL: TRIO PROJECT EVALUATION Presenter: Elizabeth C. Retamozo.


1 Chapter 3 - SAEOPP Priority 1 Training SHOW & TELL: TRIO PROJECT EVALUATION Presenter: Elizabeth C. Retamozo

2 Presenter
Elizabeth C. Retamozo
Executive Director
NOSOTROS Education Center
Phone:

3 Session Summary
This session will focus on providing TRIO personnel with an overview of:
Program Evaluation
Types of Program Evaluations
Methods & Types of Data Collection
Quantitative & Qualitative Data
Formative & Summative Methods

4 Evaluation
Evaluation - Systematic method for collecting, analyzing, and using information to answer basic questions about your program.
Helps to identify effective & ineffective services, practices, & approaches.
Evaluation Plan - A written document describing the overall approach or design you anticipate using to guide your evaluation.
Includes what you plan to do, how you plan to do it, who will do it, when it will be done, & why the evaluation is being conducted.
The evaluation plan serves as a guide for the evaluation.

5 Evaluation Plan
Comprehensive plan to determine the effectiveness of the program in meeting stated objectives.
Includes the criteria that will determine effectiveness & the documentation that objectives have been met.
Describes the methodology.
Provides information about the delivery of services & use of resources.
Informs & assists program staff to make changes to improve program effectiveness.

6 The GPRA Connection
Government Performance and Results Act of 1993 (GPRA)
DOE Response to GPRA Requirements: Goals 1-6
TRIO Programs Performance Indicator: Goal 5
TRIO Programs’ Specific Program Measures:
 Upward Bound, Talent Search, Educational Opportunity Centers
 Student Support Services & Ronald E. McNair
 Prior Experience Criteria & Mandatory Objectives
 Project Evaluation Plan
 Activities & Services Specific to Each Project

7 Types of TRIO Evaluation Plans
Goals-Based Evaluation
Goal-based evaluations assess the extent to which programs are meeting predetermined goals or objectives.
Outcomes-Based Evaluation
Outcomes evaluation looks at impacts/benefits to clients during & after participation in your programs.
Process-Based Evaluation
Process-based evaluations are geared to fully understanding how a program works -- how it produces the results that it does.

8 TRIO Program Evaluation
Shapes the development of the project from the beginning of the grant period.
Comprehensive plan to determine the effectiveness of the program in meeting stated objectives.
Includes criteria that will determine effectiveness & document that objectives have been met.
Provides information about the delivery of services & their impact & effectiveness.

9 TRIO Program Evaluation
Provides a mechanism for evaluating & implementing programmatic changes to improve program effectiveness.
Provides accountability information about the success at the initial site & effective strategies for replication in other settings.
Relates your project to GPRA & the TRIO Programs Performance Indicators.

10 UB Proposal Evaluation Criteria
The Secretary evaluates the quality of the evaluation plan for the project on the basis of the extent to which the applicant’s methods of evaluation --
(1) Are appropriate to the project and include both quantitative and qualitative evaluation measures; and
(2) Examine in specific and measurable ways the success of the project in making progress toward achieving its process and outcome objectives.

11 TS/EOC Proposal Evaluation Criteria
The Secretary evaluates the quality of the evaluation plan for the project on the basis of the extent to which the applicant’s methods of evaluation --
(1) Are appropriate to the project’s objectives;
(2) Provide for the applicant to determine, using specific and quantifiable measures, the success of the project in -
(i) Making progress toward achieving its objectives (a formative evaluation); and
(ii) Achieving its objectives at the end of the project period (a summative evaluation); and
(3) Provide for the disclosure of unanticipated project outcomes, using quantifiable measures if appropriate.

12 SSS Proposal Evaluation Criteria
The Secretary evaluates the quality of the evaluation plan for the project on the basis of the extent to which --
(1) The applicant’s methods of evaluation --
(i) Are appropriate to the project and include both quantitative and qualitative evaluation measures; and
(ii) Examine in specific and measurable ways, using appropriate baseline data, the success of the project in improving academic achievement, retention and graduation of project participants; and
(2) The applicant intends to use the results of an evaluation to make programmatic changes based upon the results of project evaluation.

13 McNair Proposal Evaluation Criteria
The Secretary evaluates the quality of the evaluation plan for the project on the basis of the extent to which the applicant's methods of evaluation --
(1) Are appropriate to the project's objectives;
(2) Provide for the applicant to determine, in specific and measurable ways, the success of the project in --
(i) Making progress toward achieving its objectives (a formative evaluation); and
(ii) Achieving its objectives at the end of the project period (a summative evaluation); and
(3) Provide for a description of other project outcomes, including the use of quantifiable measures, if appropriate.

14 Common Words & Terminology
Appropriate to Project’s Objectives
Specific & Measurable
Qualitative & Quantitative
Formative & Summative

15 Appropriate to Project’s Objectives
Goals-Based Evaluation - Internal
Goal-based evaluations assess the extent to which programs are meeting predetermined goals or objectives.
Outcomes-Based Evaluation - Internal/External
Outcomes evaluation looks at impacts/benefits to clients during & after participation in your programs.
Process-Based Evaluation - External
Process-based evaluations are geared to fully understanding how a program works -- how it produces the results that it does.

16 Specific & Measurable
Quantify the description of program effectiveness for each objective.
Be specific & use quantifiable measures.
Measure outcomes related to your mandatory objectives.
Measurable: %, time lines, aggregate data, descriptive numbers.
Data examples: Demographics, GPAs, Year-to-Year Retention, Graduation Rates, College Enrollment & Graduation Rates, & Enrollment in Continuing Education.
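As a minimal sketch of how such a quantifiable measure might be computed (the GPA objective, the 60% target, and the cohort values here are hypothetical, not from the slides):

```python
def percent_meeting_target(values, threshold):
    """Percent of participants at or above a threshold (e.g., a GPA floor)."""
    if not values:
        return 0.0
    met = sum(1 for v in values if v >= threshold)
    return round(100 * met / len(values), 1)

# Illustrative only: a hypothetical objective "60% of participants maintain
# a 2.5 GPA" checked against fabricated cohort data.
cohort_gpas = [3.1, 2.4, 2.8, 3.6, 2.5, 1.9, 3.0, 2.7]
achieved = percent_meeting_target(cohort_gpas, 2.5)
objective_met = achieved >= 60.0
```

The same pattern works for retention, graduation, or enrollment rates: count who met the criterion, divide by who was served.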

17 Quantitative Data
Information that can be expressed in numerical terms, counted, or compared on a scale. For example, improvement in a student’s test scores as measured by pre- and post-test scores.
Quantifiable - specifically measured (GPA increase, retention & graduation rates, academic progress, credit hours earned).
Methods: data collection, test results & documents.
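The pre/post-test example above can be sketched as a simple paired-gain calculation (a hypothetical helper, not a prescribed method):

```python
def mean_gain(pre_scores, post_scores):
    """Mean pre-to-post gain for paired test scores (a quantitative measure)."""
    if len(pre_scores) != len(post_scores):
        raise ValueError("scores must be paired per student")
    if not pre_scores:
        return 0.0
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)
```

A positive mean gain is the kind of number that can be reported against an objective; a paired design assumes each student took both tests.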

18 Quantitative Data
Grade Point Averages
Project Retention Rates
Grade Reports & Test Scores
Secondary/Postsecondary Persistence Status
Secondary/Postsecondary Graduation Rates
Credit Hour Completion
Good Standing Status

19 Qualitative Methods
May include “meaning”, experience, opinions.
Ways to describe program effectiveness that are measurable, but may not aggregate in traditional, quantifiable methods.
Examples:
- Focus groups
- Case studies
- Open-ended interviews
- Surveys

20 Qualitative Data
Information that is difficult to measure, count, or express in numerical terms. For example, a participant's impression about the fairness of a program rule/requirement is qualitative data.
Interviews & Focus Groups
Used to fully understand someone's impressions or experiences, or learn more about their answers to questionnaires.

21 Qualitative Data
Observations
Used to gather accurate information about how a program actually operates, particularly about processes.
Questionnaires, Surveys, Checklists
Used to quickly and/or easily get lots of information from people in a non-threatening way.
Examples: Sample Evaluation Forms; Pre & Post Tests to Measure Learning.

22 Formative Evaluation
Type of process evaluation of programs or services that focuses on collecting data on program operations so that needed changes or modifications can be made to the program in its early stages.
Used to provide feedback to staff about the program components that are working & those that need to be changed.
Example: Changes to Delivery of Services.

23 Formative Evaluation
Continuous (e.g., monthly) assessment of program effectiveness & progress towards achieving objectives.
Allows for changes in activities so final objectives/outcomes will be met - changes in delivery of services.
Examples: Monthly report of student grades; student/client progress reports; monthly student/client contact reports; client/student evaluation of program; monthly staff meeting progress reporting; student/client attendance & feedback; student pre-post test results.
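Monthly grade reports like those listed above can feed a simple mid-year check; as a hypothetical sketch (the names, data, and 2.0 floor are invented for illustration):

```python
def needs_followup(monthly_gpas, floor=2.0):
    """Participants whose most recent monthly GPA report fell below the floor,
    so services can be adjusted before year's end (formative use of data)."""
    return sorted(name for name, gpas in monthly_gpas.items()
                  if gpas and gpas[-1] < floor)

# Fabricated monthly report data for the sketch.
reports = {"Ana": [2.4, 1.8], "Ben": [3.1, 3.0], "Cho": [2.1]}
watch_list = needs_followup(reports)
```

The point of a formative check is exactly this loop: look at the latest data while there is still time to change the delivery of services.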

24 Summative Evaluation
A type of outcome evaluation that assesses the results or outcomes of a program.
 Concerned with a program's overall effectiveness
 Measurement of objectives or outcomes at the end of a pre-defined period
 Quantitative data is analyzed for a given period
 Qualitative evaluation supplements the quantitative data

25 Summative Evaluation Questions
Did you meet program objectives?
How did you accomplish them?
What is your documentation?
What changes do you need to make to the activities or objectives?
Examples: Student year-end program evaluations; annual performance report.

26 Goals-Based Evaluation - Internal
State Mandatory Objectives
State Evaluation Criteria
Summative & Formative Evaluation Methods
Qualitative & Quantitative Measures
Assign Staff Responsibility
State Time Delineation
Analyze & Report Findings

27 Goals-Based Evaluation - Internal
Data Collection Chart
(1) What types of data will be collected;
(2) When various types of data will be collected;
(3) What methods will be used;
(4) What instruments will be developed & when;
(5) How the data will be analyzed;
(6) When reports & outcomes will be available; and
(7) How the grantee will use the information collected through evaluation to monitor progress of the funded project & to provide information about success at the initial site & effective strategies for replication in other settings.
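The seven chart questions above map naturally onto a structured record, one row per data type. This is a hypothetical sketch of that structure; the field names and sample values are illustrative, not prescribed by the slides:

```python
from dataclasses import dataclass

@dataclass
class DataCollectionRow:
    # One row of a data collection chart; fields follow the seven questions.
    data_type: str       # (1) what data will be collected
    when_collected: str  # (2) when it will be collected
    method: str          # (3) what method will be used
    instrument: str      # (4) what instrument will be developed & when
    analysis: str        # (5) how the data will be analyzed
    report_date: str     # (6) when reports & outcomes will be available
    use: str             # (7) how results monitor progress & inform replication

# Illustrative row with fabricated values.
row = DataCollectionRow(
    data_type="Grade point averages",
    when_collected="End of each semester",
    method="School records request",
    instrument="GPA tracking form (developed fall, year 1)",
    analysis="Year-to-year comparison by cohort",
    report_date="Annual performance report",
    use="Flag participants below target for added tutoring",
)
```

Keeping one row per data type makes it easy to verify the chart answers all seven questions for every measure the plan names.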

28 Goals-Based Evaluation - Internal
Sample Data Collection Chart
Talent Search
Educational Opportunity Centers
Upward Bound
Student Support Services
McNair

29 Making Progress Toward Achieving its Objectives - Formative Evaluation
Project Director will conduct the following formative evaluation of all project activities and services to keep the project running smoothly during the year.
Project Director will evaluate the results of all daily contacts & services provided, weekly and monthly, as they relate to progress toward meeting project objectives.

30 Making Progress Toward Achieving its Objectives - Formative Evaluation
Progress towards the accomplishment of all objectives will be tracked through quantitative reports generated by the project database on a daily, weekly, & monthly basis.
Quantitative reports collect specific information documenting which participants have been served, the types of services provided, the frequency of services provided, & by which project staff member.

31 Making Progress Toward Achieving its Objectives - Formative Evaluation
Project database generates weekly, monthly, & quarterly performance reports & generates individual participant outcome & contact reports documenting services provided & the impact of these services on each project participant.
Participants & parents will be asked to evaluate specific group activities, such as workshops and field trips to colleges and career sites, to rate their level of satisfaction with the quality of services they receive - qualitative.

32 Achieving Objectives at the End of the Project Period - Summative Evaluation
Project database will combine all monthly quantitative reports to produce a report documenting & confirming the achievement of project objectives.
Participants & parents will complete end-of-year surveys to measure their level of satisfaction with the overall quality of project services.
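The "combine all monthly quantitative reports" step can be sketched as a simple roll-up; a hypothetical example assuming each monthly report is a mapping from service category to contact count:

```python
from collections import Counter

def summative_totals(monthly_reports):
    """Roll monthly {service: count} reports into one year-end total,
    mirroring the database step that combines monthly quantitative reports."""
    totals = Counter()
    for report in monthly_reports:
        totals.update(report)
    return dict(totals)
```

For example, `summative_totals([{"tutoring": 10}, {"tutoring": 5, "advising": 3}])` yields a single year-end report that can be checked against each objective's target.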

33 Achieving Objectives at the End of the Project Period - Summative Evaluation
Target school personnel & project staff will be asked to evaluate specific activities & services on a numeric rating system to measure their level of satisfaction with the overall quality of project services - qualitative.
Project Director will address the results of the summative evaluation to implement any corrective measures to eliminate potential problems by appropriately modifying & improving project activities & services for the next project year.

34 Sample TRIO Evaluation Plans
Upward Bound
Talent Search
Educational Opportunity Centers
Student Support Services
McNair

35 External Evaluation
Recommended in the last grant proposal competitions
Must be conducted if included in grant proposals
Allowable cost for the project
May be paid by the grantee agency/institution
May be provided free of charge - in-kind/donation

36 External Evaluation
May be conducted by a person working/employed by the agency/institution BUT not for the project
Must have experience conducting evaluations for educational opportunity programs & TRIO-like programs
Must state what type of evaluation will be conducted by the evaluator.

37 External Evaluation
Process-Based Evaluation
Process-based evaluations are geared to fully understanding how a program works -- how it produces the results that it does.
Useful if programs are long-standing and have changed over the years.
Useful for accurately portraying to outside parties how a program truly operates (e.g., for replication elsewhere).

38 Process-Based Evaluation: Examines the extent to which a program is operating as intended by assessing ongoing program operations and whether the targeted population is being served. Helps program staff identify needed interventions and/or change program components to improve service delivery.

39 Process-Based Evaluation: Involves collecting data that describes program operations in detail, including types & levels of services provided, staffing & location of service delivery, sociodemographic characteristics of participants, the community in which services are provided, and linkages with collaborating agencies.
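The kinds of process data listed on the slide can be captured in a simple structured record. The sketch below is illustrative only: the field names and sample values are assumptions for demonstration, not prescribed by TRIO regulations or by this presentation.

```python
# Illustrative record layout for process-evaluation data collection.
# Field names and sample values are assumptions, not a prescribed TRIO format.
from dataclasses import dataclass, field

@dataclass
class ServiceRecord:
    service_type: str                  # type of service provided, e.g. "tutoring"
    hours_delivered: float             # level of service
    staff_role: str                    # staffing: who delivered the service
    location: str                      # where the service was delivered
    participant_id: str                # links to sociodemographic data kept elsewhere
    partner_agencies: list = field(default_factory=list)  # collaborating-agency linkages

# Example entry for one tutoring session
rec = ServiceRecord("tutoring", 1.5, "academic coordinator",
                    "target high school", "P-0042",
                    ["community college writing center"])
print(rec.service_type, rec.hours_delivered)
```

Keeping each service contact as one record like this makes it straightforward to aggregate services by type, site, or participant group when describing program operations.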

40 External Evaluation Plan: Outcomes-Based Evaluation. Outcomes-based evaluation helps determine whether your organization is really doing the right program activities to bring about the outcomes you believe your clients/students need. Outcomes are benefits to clients from participation in the program, e.g., enhanced learning (academic knowledge, perceptions/attitudes, skills, etc.).

41 External Evaluation Plan: Outcome-Based Evaluation. 1. Identify the major outcomes that you want to examine or verify for the program under evaluation. 2. Choose and prioritize the outcomes you want to examine. 3. For each outcome, specify what observable measures, or indicators, will suggest that you're achieving that key outcome with your clients. 4. Specify a "target" goal, i.e., the number or percent of clients for whom you commit to achieving specific outcomes.

42 External Evaluation Plan: Outcome-Based Evaluation (continued). 5. Identify what information is needed to show these indicators. 6. Decide how that information can be efficiently and realistically gathered and which methods to use (e.g., program documentation, observation of program personnel & clients, questionnaires & interviews about clients' perceived benefits, case studies of program failures & successes). 7. Analyze & report the findings.
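Steps 4 through 7 above boil down to tallying indicator data against target percentages. The sketch below shows one minimal way that comparison could look; the outcome names, target figures, and sample client data are invented for illustration and are not from the presentation.

```python
# Minimal sketch of steps 4-7: compare measured outcome indicators against
# target percentages. Outcome names, targets, and data are illustrative only.

def outcome_report(records, targets):
    """records: one dict per client mapping outcome name -> True/False.
    targets: outcome name -> target percent of clients (step 4)."""
    report = {}
    total = len(records)
    for outcome, target_pct in targets.items():
        # Steps 5-6: the indicator data itself would come from program
        # documentation, questionnaires, interviews, etc.
        achieved = sum(1 for r in records if r.get(outcome))
        pct = 100.0 * achieved / total if total else 0.0
        # Step 7: report the finding against the target
        report[outcome] = {"percent": round(pct, 1),
                           "target": target_pct,
                           "met": pct >= target_pct}
    return report

clients = [
    {"improved_gpa": True,  "college_enrolled": True},
    {"improved_gpa": True,  "college_enrolled": False},
    {"improved_gpa": False, "college_enrolled": True},
    {"improved_gpa": True,  "college_enrolled": True},
]
print(outcome_report(clients, {"improved_gpa": 70, "college_enrolled": 80}))
# Here 3 of 4 clients (75%) achieved each outcome, so the 70% GPA target
# is met and the 80% enrollment target is not.
```

A real evaluation would of course weigh data quality and context, not just the percentages, but structuring the targets up front (step 4) is what makes the final comparison in step 7 mechanical.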

43 The Council for the Advancement of Standards in Higher Education (CAS): Has promoted standards in student affairs, student services, and student development programs since its inception in 1979. Develops a Book of Professional Standards and Guidelines and Self-Assessment Guides (SAGs) designed to lead to a host of quality-controlled programs and services. The CAS TRIO/EOP Standards & Guidelines & Self-Assessment Guide (SAG) for TRIO & EOP Programs provides a vehicle and opportunity for projects to plan for formative & summative evaluation, to participate in project self-assessment, and to plan for project improvements.

44 Benefits of Program Evaluation: Understand, verify, or increase the impact of services on participants or clients. Improve delivery mechanisms to be more efficient and less costly. Produce data or verify results that can be used for public relations and promoting services in the community. Fully examine & describe effective programs for duplication elsewhere.

45 Bibliography & Resources: Basic Guide to Program Evaluation. Written by Carter McNamara, MBA, PhD, Authenticity Consulting, LLC; adapted from the Field Guide to Nonprofit Program Design, Marketing and Evaluation. Online Guide: de%20to%20Program%20Evaluation.pdf

46 Bibliography & Resources: Basic Guide to Outcomes-Based Evaluation for Nonprofit Organizations with Very Limited Resources. Written by Carter McNamara, MBA, PhD, Authenticity Consulting, LLC; adapted from the Field Guide to Nonprofit Program Design, Marketing and Evaluation.

47 Bibliography & Resources: US Department of Health & Human Services, Administration for Children & Families, Office of Planning, Research & Development. The Program Manager's Guide to Evaluation: eports/pmguide/pmguide_toc.html. This guide explains program evaluation: what it is, how to understand it, and how to do it. It answers your questions about evaluation and explains how to use evaluation to improve programs and benefit students, families, and staff.

48 Bibliography & Resources: The Council for the Advancement of Standards in Higher Education (CAS). CAS TRIO/EOP Standards & Guidelines & Self-Assessment Guide (SAG) for TRIO & EOP Programs. Provides a vehicle and opportunity for projects to plan for formative & summative evaluation, to participate in project self-assessment, and to plan for project improvements.

49 Questions & Answers