1 PROJECT EVALUATION IT’S ALL ABOUT STUDENTS

2 In partnership, we help America's students stay in school and graduate by:
– Reducing gaps in college access and completion among differing student populations
– Improving academic attainment
– Strengthening institutions
– Strengthening accountability

3 We do this by:
– Expanding and enhancing institutional:
  – Academic quality
  – Management
  – Financial stability
– Using Federal program grant funds
– Implementing specific projects based on established and innovative practices
– Following sound management and oversight processes
– Measuring and reporting the results of the projects

4 Success is evaluated by:
– Students
– Project staff
– Institutions
– Department of Education
– Office of Management and Budget
– Congress
– America's taxpayers

5 Success is measured, in part, by:
– An institution's evaluation results:
  – Project performance data
  – Annual Performance Report (APR) data
– Percentage of project goals met or exceeded
– The Department's program evaluation results:
  – Strategic Plan program data
  – Achieving a high rating on OMB's Program Assessment and Rating Tool (PART)
– How well we use this information for:
  – Strategic planning, decision-making, funding, and improving student success nationwide

6 Program and Project Processes for Achieving and Measuring Success:
Project:
  I. Development plan
  II. Implementation strategy & activity objectives
  III. Project management plan (key personnel & budget)
  IV. Evaluation Plan
Program:
  I. Purpose
  II. Strategic plan with national goals and measures
  III. Program Management
  IV. Program Results

7 Program and Project Results
– Results are outcomes based on activities (interventions) and costs
– Outcomes are quantifiable measures that:
  – Capture annual and long-term performance improvements; and
  – Result from the funded activities
– Outcomes affect future policy and funding decisions

8 Defining Annual Outcomes
Program Measure: Students at grantee institutions persist at a higher rate than other similar students who previously attended these institutions or currently attend other institutions.
Project Measure: Students who receive tutoring pass courses and improve GPAs at a higher rate than similar students at the school who were not tutored.
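To make the project measure concrete, here is a minimal Python sketch of how a pass-rate comparison like this might be computed; the record layout, field names, and sample values are hypothetical illustrations, not something defined in the slides.

# Minimal sketch: compare annual pass rates for tutored vs. non-tutored students.
# Field names and the sample records are hypothetical.

def pass_rate(records, tutored):
    """Proportion of students in the given group who passed their courses."""
    group = [r for r in records if r["tutored"] == tutored]
    if not group:
        return 0.0
    return sum(r["passed"] for r in group) / len(group)

records = [
    {"student_id": "A1", "tutored": True,  "passed": True},
    {"student_id": "A2", "tutored": True,  "passed": True},
    {"student_id": "B1", "tutored": False, "passed": False},
    {"student_id": "B2", "tutored": False, "passed": True},
]

tutored_rate = pass_rate(records, tutored=True)
comparison_rate = pass_rate(records, tutored=False)
print(f"Tutored pass rate: {tutored_rate:.0%}, comparison: {comparison_rate:.0%}")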

9 Defining Long-Term Outcomes
Program Measure: Students at grantee institutions graduate at a higher rate than similar students who previously attended these institutions or attend other institutions.
Project Measure: Students who receive tutoring graduate at a higher rate than similar students not tutored.

10 Outcomes are not:
– Output data: activities necessary to achieve impact and outcome goals, such as:
  – Number of tutoring sessions held
  – Number of students receiving tutoring
  – Staff hours and costs per activity
– Descriptive data: general information about the project and participants, such as:
  – Funding amount
  – Length of project
  – Socio-economic data
  – Race/ethnicity

11 Quantitative Outcome Measure
– Is a consistent measure over time
– Is a true and accurate measure of the intervention outcome
– Yields easily interpreted data (e.g., rates and proportions)
– Is not based on qualitative (e.g., surveys, focus groups) or descriptive data
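As an illustration of keeping a quantitative measure consistent over time, the sketch below applies one fixed persistence-rate definition to every term; the term labels and enrollment counts are made-up examples.

# Sketch: apply the same rate definition each term so the measure stays consistent over time.
# Term names and enrollment counts are hypothetical.

def persistence_rate(enrolled_at_start, still_enrolled_at_end):
    """Proportion of students enrolled at the start of a term who remain enrolled at its end."""
    return still_enrolled_at_end / enrolled_at_start if enrolled_at_start else 0.0

terms = {"Term 1": (200, 174), "Term 2": (190, 169), "Term 3": (185, 168)}
for term, (start, end) in terms.items():
    print(f"{term}: persistence rate = {persistence_rate(start, end):.1%}")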

12 Planning Before Grant Writing Is Key
1. Purpose – defines the problem
2. Goals – establish objectives/outcomes that are:
   – Measurable and logically related to the intervention and anticipated results
   – Based on historical data and benchmarking
3. Evaluation Plan:
   – Shows how the goals/objectives will be met
   – Establishes processes for measuring whether the goals/objectives were met and at what cost
   – Identifies an approach for clearly reporting results (outcomes based on activities and costs) in a timely manner

13 Benchmark & Benchmarking
Benchmark – a standard by which something can be measured or judged.
Benchmarking – measuring (e.g., a rival's product) according to specified standards in order to compare it with and improve one's own product.
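A benchmark comparison of this kind can be reduced to a few lines; the sketch below, with invented rates, simply checks an observed project rate against the chosen standard.

# Sketch: compare a project's observed rate against a benchmark (a fixed standard).
# Both rates below are hypothetical.

benchmark_rate = 0.45   # standard by which the project is judged
project_rate = 0.51     # observed rate for project participants

gap = project_rate - benchmark_rate
print(f"Difference from benchmark: {gap:+.1%}")
print("Meets or exceeds the benchmark" if gap >= 0 else "Falls below the benchmark")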

14 Data Timing
The evaluation plan must ensure data are useful for strategic, operational, policy, and budget decision-making:
– School – APR
– Department – performance accountability report (October); integrated performance and budget (November)
– OMB – President's integrated performance and budget plan (early in the calendar year)

15 Evaluation Process for Measuring Outcomes
Classic experimental approach:
1. Similar students randomly assigned to intervention and control groups
2. Pre- and post-intervention performance measurement
3. Multiple interventions and measurements over time
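As a rough illustration of step 1, the sketch below randomly splits a pool of eligible students into intervention and control groups; the student IDs and fixed seed are illustrative assumptions.

# Sketch of step 1: randomly assign similar (eligible) students to intervention and control groups.
# Student IDs are hypothetical; a real evaluation would document the actual assignment procedure.
import random

eligible_students = [f"S{i:03d}" for i in range(1, 41)]
random.seed(2004)                     # fixed seed so the assignment can be reproduced and audited
random.shuffle(eligible_students)

midpoint = len(eligible_students) // 2
intervention_group = eligible_students[:midpoint]
control_group = eligible_students[midpoint:]
print(len(intervention_group), "intervention students;", len(control_group), "control students")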

16 Difficulties in Implementing Random Assignment
– Students generally self-select
– Identifying a valid comparison group
Options to help address these:
– Draw the comparison group from the waiting list or from those who applied but were not selected.
– Use historical data and benchmarking on similar students' performance.
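If random assignment is not feasible, the waiting-list option above amounts to something like the following sketch; the applicant IDs and selection set are invented.

# Sketch: draw a comparison group from applicants who were not selected for the program.
# Applicant IDs are hypothetical.

applicants = ["S001", "S002", "S003", "S004", "S005", "S006"]
selected = {"S001", "S003", "S005"}        # students actually admitted to the intervention

comparison_group = [s for s in applicants if s not in selected]
print("Comparison group (non-selected applicants):", comparison_group)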

17 Difficulties Implementing Pre- and Post-Testing
– Identifying valid baseline and post-activity data measures
Options to help address this:
– Use institutional data measures already collected:
  – Institutional records (services offered)
  – Course grades
  – Graduation rates
– Identify Federal and other data sources already collected, such as the National Center for Education Statistics' IPEDS database.

18 Difficulties Implementing Multiple Interventions and Data Collections
– Students may not choose to participate annually or consistently.
Options to help address this:
– Collect student-specific information.
– Measure outcomes each semester.
– Conduct the same activity and measurement with similar students over time.
– Conduct the activity and measurement at multiple institutions.
– Look for relationships between varying usage of the activity and the performance of participants.
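One way to look for a relationship between usage of the activity and participant performance, as the last option suggests, is a simple dose-response comparison; the sketch below uses invented (sessions attended, GPA change) pairs and an arbitrary low/high usage cutoff.

# Sketch: compare outcomes for low- vs. high-usage participants (a simple dose-response check).
# The (sessions attended, GPA change) pairs and the usage cutoff are hypothetical.
from statistics import mean

usage_and_gpa_change = [(0, -0.10), (1, 0.05), (2, 0.10), (4, 0.20), (6, 0.25), (8, 0.30)]

low_use = [change for sessions, change in usage_and_gpa_change if sessions <= 2]
high_use = [change for sessions, change in usage_and_gpa_change if sessions > 2]
print("Mean GPA change, low usage: ", round(mean(low_use), 2))
print("Mean GPA change, high usage:", round(mean(high_use), 2))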

19 Difficulties in Implementing the Evaluation Plan
– Limited resources (staff and money)
– Implementation staff also conduct the analysis
Options to help address these:
– Request evaluation monies in the grant application (5–10%).
– Hire an independent evaluator.
– Use academic staff and students.
– Use measures and student identifiers already collected.

20 Difficulties in Collecting and Analyzing Data
– Data are self-reported
– Inconsistent or missing data
Options to help address these:
– Document clear data definitions.
– Require routine, systematic collection and provide training/guidance.
– Develop simple data collection tools (Excel or Access databases).
– Hold project staff meetings to regularly review data collection processes, data quality, and progress in meeting performance outcomes.
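A simple data collection tool paired with clear data definitions can be as modest as a CSV template plus a completeness check, roughly as sketched below; the field names and sample rows are assumptions for illustration.

# Sketch: a documented CSV layout plus a check that flags missing values before analysis.
# Field names and sample rows are hypothetical.
import csv
import io

REQUIRED_FIELDS = ["student_id", "term", "sessions_attended", "course_grade"]

sample_csv = io.StringIO(
    "student_id,term,sessions_attended,course_grade\n"
    "S001,Fall,4,B\n"
    "S002,Fall,,A\n"       # sessions_attended missing; flagged below
)

for row in csv.DictReader(sample_csv):
    missing = [field for field in REQUIRED_FIELDS if not row.get(field)]
    if missing:
        print(f"Record {row['student_id']}: missing {', '.join(missing)}")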

21 Analysis of Data and Reporting Your Results
– Flows from the evaluation plan and data collection processes in the grant application
– Includes descriptive, quantitative, and qualitative data collected
– Results: outcomes based on activities and costs
– Identification of "best practices"
– Conclusions and recommendations
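To show what "outcomes based on activities and costs" can look like in a report, here is a small worked example with invented figures: cost per additional course passed relative to a benchmark expectation.

# Sketch: relate an outcome back to the activity's cost (cost per additional course passed).
# All figures are hypothetical.

tutoring_cost = 25_000.00            # annual cost of the tutoring activity
courses_passed_by_tutored = 180      # courses passed by tutored students
courses_expected_without = 150       # benchmark expectation for similar, untutored students

additional_passes = courses_passed_by_tutored - courses_expected_without
print(f"Cost per additional course passed: ${tutoring_cost / additional_passes:,.2f}")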

22 Project Evaluation Plan
– Must be fully documented
– Addresses the three components of an experimental design
– Specifies data definitions and the collection and analysis processes
– Includes the outcome and cost data needed to measure results
– Is completed for the grant application

23 Resources
– ED: offices/list/ope/fipse/evaluate
– NSF: FY 2004 Annual Plan:
  – an/plan2004.doc
  – e-institutionaldev.html