CAMP Evaluation: Measurement of Strategy Implementation
Office of Migrant Education, United States Department of Education
CAMP Technical Assistance Webinar

Presentation transcript:

CAMP Evaluation: Measurement of Strategy Implementation
Office of Migrant Education, United States Department of Education
CAMP Technical Assistance Webinar, 5 October

Background for the Webinar
We have some variation in our results.

Background for the Webinar
- 2010 GPRA Measure 1 Target: Not Met
  - National Target: 86% of students complete their first year in college
  - National Results: 85%
  - Performance results excluded first-year (2009) projects
- 2010 GPRA Measure 2 Target: Met!
  - National Target: 85% of first-year completers continue into a second year
  - National Results: 88%
  - Performance results excluded first-year (2009) projects
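As a rough, hedged illustration (the counts below are invented, not actual CAMP data), a project could track its own versions of these two measures from its enrollment records, for example:

```python
# Minimal sketch with made-up counts (not actual CAMP data): how a project
# might compute its own GPRA Measure 1 and Measure 2 rates for comparison
# against the national targets above.
enrolled = 40                  # CAMP participants who began their first year
completed_first_year = 34      # of those, how many completed the first year
continued_second_year = 30     # of the completers, how many continued to year two

gpra1 = completed_first_year / enrolled                # Measure 1: first-year completion rate
gpra2 = continued_second_year / completed_first_year   # Measure 2: continuation rate

print(f"GPRA Measure 1: {gpra1:.0%} (national target: 86%)")
print(f"GPRA Measure 2: {gpra2:.0%} (national target: 85%)")
```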

Purpose of the Webinar
- SHARE PRACTICES from improving or high-performing projects, so that we can…
- IMPROVE CAMP results nationwide, so that we may effectively and efficiently assist migrant students to successfully complete their first year and continue into their second year, and…
- CONTROL VARIATION so that all of our CAMP projects are successful.

Today’s Objectives
- Review the meaning of program evaluation.
- Review the four components of program evaluation.
- Individual CAMP project directors present effective ways to measure strategy implementation.
- Questions and answers for our presenters.
- Leave our webinar with ideas to improve measurement of strategy implementation!

The Meaning of Program Evaluation
Evaluation means systematically and methodically collecting information about a program in order to improve it or to make decisions about its merit or worth.

Components of Program Evaluation
1. To Determine Overall Effectiveness in Meeting Program Goals/Objectives (Are we on track?)
2. To Determine the Level at Which Program Activities Are Being Implemented (Did we really deliver what we agreed to deliver in the application? How do we know?)
We need both! Now let's hear from the audience why we need to measure program implementation…

Components of Program Evaluation (cont.)
3. To Identify Strengths and Weaknesses in Program Implementation and Program Effectiveness (Which strategies are working, and which ones are not?)
4. To Provide Recommendations for Program Changes, in Order to Improve Results (What can we do differently to change the results?)

Components of Program Evaluation
1. To Determine Overall Effectiveness in Meeting Program Goals/Objectives
- Are program activities effective, leading to strong GPRA 1 and GPRA 2 results and meeting those targets?
- What evidence is available to support the conclusions?
- Are program activities cost-efficient?

Components of Program Evaluation
2. To Determine the Level at Which Program Activities Are Being Implemented (a prerequisite to all evaluation practices)
- "With a level of quality, and all of the time!"
- "With a level of quality, most of the time."
- "With a level of quality, some of the time…"
- "We really didn't have the time."
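One informal way to move from these self-ratings toward numbers is to compare services delivered against what was planned in the application; a minimal sketch with hypothetical service names and counts:

```python
# Rough illustration (hypothetical service names and counts): one way to put a
# number on "level of implementation" is to compare services delivered against
# what was planned in the application.
planned = {
    "tutoring_sessions": 120,
    "counseling_meetings": 60,
    "mentor_check_ins": 90,
}
delivered = {
    "tutoring_sessions": 104,
    "counseling_meetings": 61,
    "mentor_check_ins": 45,
}

for service, planned_count in planned.items():
    delivered_count = delivered.get(service, 0)
    rate = delivered_count / planned_count if planned_count else 0.0
    print(f"{service}: {delivered_count}/{planned_count} delivered ({rate:.0%})")
```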

Components of Program Evaluation
Why study level of implementation? It can help answer two questions:
1. Did our project fulfill the responsibilities described in our application?
2. Did our project's strategies make a difference in meeting our GPRA targets?

Components of Program Evaluation
3. To Identify Strengths and Weaknesses in Program Implementation and Program Effectiveness
- Through customer satisfaction surveys (plus and delta, all services)
- Through observations (e.g., tutoring, instruction, placement)
- Through documentation (e.g., logs of recruitment, counseling, tutoring)
- Through exit interviews
- Through research, by finding correlations between practices and results (using student-level data to discover relationships between practices and achievement; see the sketch below)
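A minimal sketch of what such a correlation check might look like, using invented student records and field names (requires Python 3.10+ for statistics.correlation):

```python
# Illustrative sketch only (invented student records and field names): using
# student-level data to look for a relationship between a practice (tutoring
# hours) and a result (first-year completion).
import statistics

students = [
    {"tutoring_hours": 5,  "completed_first_year": 0},
    {"tutoring_hours": 12, "completed_first_year": 1},
    {"tutoring_hours": 20, "completed_first_year": 1},
    {"tutoring_hours": 3,  "completed_first_year": 0},
    {"tutoring_hours": 15, "completed_first_year": 1},
]

hours = [s["tutoring_hours"] for s in students]
outcomes = [float(s["completed_first_year"]) for s in students]

# Pearson correlation between tutoring hours and completion (1 = completed).
r = statistics.correlation(hours, outcomes)
print(f"Correlation between tutoring hours and first-year completion: {r:.2f}")
```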

Components of Program Evaluation
4. To Provide Recommendations for Program Changes, in Order to Improve Results. Examples:
- Tutoring targeted to specific skills (teachers/mentors/students)
- Mentoring with regular, structured meetings and specific supports
- Individualized Education Plans with specific learning objectives and learning plans
- A structured recruiting plan

Measurement of Strategy Implementation: How Do We Know We Implemented Each Service?
CAMP directors present their ideas:
- Viridiana Diaz, CSU Sacramento
- Carolina Gonzalez-Lujan, CSU Monterey Bay
- Gypsy Hall, Boise State University
- Doris Roundtree, ABAC
- Minerva Gonzalez, CSU San Marcos
- Josue Estrada, Washington State University

Measurement of Strategy Implementation: CSU San Marcos, Minerva Gonzales
- Mid-Semester Evaluations: Communication with Instructors/Advisor
- Appropriate Interventions
- Recruitment
- Communication with Students: Support
- Counseling

Measurement of Strategy Implementation
Questions for the presenters, from the audience…

Summary
Good evaluation will tell you:
- Whether program activities are delivering results.
- Whether program activities are being implemented at a high level.
- Program strengths and weaknesses.
- Suggestions for program improvement.

Program Evaluation: THANK YOU!
So that OME can better serve your needs, please provide an evaluation of today's webinar.
Also, if you'd like to discuss your program's evaluation process…
Ed Monaghan