
Page 1: GEAR UP Evaluation 101
NCCEP/GEAR UP Capacity-Building Workshop, Caesars Palace, Las Vegas, February 4, 2013
Chrissy Y. Tillery, NCCEP Director of Evaluation

Page 2: National GEAR UP Objectives
- National Objective 1: Increase the academic performance and preparation for postsecondary education for GEAR UP students.
- National Objective 2: Increase the rate of high school graduation and participation in postsecondary education for GEAR UP students.
- National Objective 3: Increase GEAR UP students' and their families' knowledge of postsecondary education options, preparation, and financing.

Page 3: Evaluation Terminology – Qualitative Analyses
Analysis that involves description and narrative; data are observed rather than measured. Qualitative work can draw on several traditions, including interpretive and narrative inquiry, critical theory, participatory action research, and phenomenology. Some examples include:
- Focus groups
- Case studies
- Interviews
- Ethnography

Page 4: Evaluation Terminology – Quantitative Analyses
Analysis that involves numbers and inferential statistics; data are measured for growth or significance. Embedding quantitative analyses into specific research studies within the overall evaluation is one way to measure more specific outcomes. Some examples include (see the sketch below):
- Descriptive statistics: frequencies, averages, percentages
- t-tests
- ANOVA
- Regression
- Propensity score matching
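To make the slide's terminology concrete, here is a minimal sketch of one listed technique, an independent-samples t-test, using Python's scipy library. The GPA values and group labels are hypothetical, not taken from any GEAR UP dataset.

```python
# Minimal sketch: compare mean GPA of served vs. comparison students.
# All values below are hypothetical, for illustration only.
from scipy import stats

gearup_gpa = [2.8, 3.1, 3.4, 2.9, 3.6, 3.2, 3.0]      # hypothetical GEAR UP students
comparison_gpa = [2.5, 2.9, 3.0, 2.7, 3.1, 2.6, 2.8]  # hypothetical comparison group

t_stat, p_value = stats.ttest_ind(gearup_gpa, comparison_gpa)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # p < .05 suggests a significant difference
```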

Page 5: Evaluation Terminology – Formative Evaluation
Evaluation conducted and reported on an ongoing basis throughout the project to continuously assess it. Formative evaluation gives program staff knowledge of how the quality and impact of project activities can be improved, and allows ongoing data-driven decisions to be made.

Page 6: Evaluation Terminology – Summative Evaluation
Evaluation conducted at the conclusion of the project to assess its overall impact: whether goals were met and whether resources were used efficiently. Used to report final program outcomes.

Page 7: Evaluation Terminology – Objectives and Measures
- National GEAR UP Objectives: the three national objectives listed on Page 2.
- Project Objectives: GPRA (Government Performance and Results Act) performance indicators, individualized by grant. Each project objective should fall under one of the three National GEAR UP Objectives.
- Performance Measures: should include baseline data, target benchmarks, and performance indicators (a minimal illustration follows).
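As an entirely hypothetical illustration of how one performance measure bundles baseline data, a target benchmark, and a current indicator value, an evaluation team might represent it as below. The field names and numbers are illustrative, not a required GEAR UP format.

```python
# Hypothetical representation of a single performance measure.
from dataclasses import dataclass

@dataclass
class PerformanceMeasure:
    indicator: str    # what is measured
    baseline: float   # pre-intervention value
    target: float     # benchmark the grant committed to
    current: float    # most recent measured value

    def on_track(self) -> bool:
        # Met or exceeded the target benchmark?
        return self.current >= self.target

fafsa = PerformanceMeasure("FAFSA completion rate", baseline=0.42, target=0.60, current=0.51)
print(fafsa.on_track())  # False: above baseline, but short of the 60% benchmark
```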

Page 8: Types of Data
- Baseline/pre-intervention data: data collected on students in target schools prior to the GEAR UP intervention.
- Intervention data: data collected on students in target schools receiving the GEAR UP intervention.
- Post-intervention data: data collected on students in target schools after the GEAR UP intervention.

Page 9: A Model for Program Evaluation
A continuous cycle linking: Continuous Data Collection, Formative Data Analyses, Program Implementation and Revisions, Summative Data Analyses, and Policy Recommendations.

Page 10: Data Collection Partners
- State Education Agency
- Local Education Agencies
- University System
- Community College System
- Private/Independent Colleges and Universities
- State Education Assistance Authority
- Business Partners
- Standardized Testing Agencies (ACT/College Board)
- National Student Clearinghouse

Page 11: Evaluation 101 – Worksheet 1

Page 12: Characteristics of Effective Data Collection
- A relational database linked by a unique identifier (see the sketch below).
- A data system that defines all variables consistently, allowing for comparisons.
- A data system that allows for customization related to grant activities.
- A data system that supports formative and summative evaluation and longitudinal data tracking.
- A data system compliant with FERPA regulations.
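Here is a minimal sketch of the first characteristic, a relational database whose tables are linked by a unique student identifier, using Python's built-in sqlite3 module. Table and column names are illustrative only; a production system would add the FERPA-compliant access controls discussed later in the deck.

```python
# Illustrative relational structure: every table keys off one student_id.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE students (
        student_id  TEXT PRIMARY KEY,  -- the unique identifier linking all tables
        grade_level INTEGER,
        school_code TEXT
    );
    CREATE TABLE services (
        service_id   INTEGER PRIMARY KEY,
        student_id   TEXT REFERENCES students(student_id),
        service_type TEXT,             -- e.g., tutoring, college visit
        service_date TEXT
    );
""")
# Longitudinal queries join on the identifier rather than on student names,
# which keeps comparisons consistent across years and data sources.
```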

Page 13: Levels of Data Collection
- Student-level data
- School-level data
- State-level data
- National data

Page 14: Student-Level Data
- GEAR UP student services
- GEAR UP parent/family services
- GEAR UP professional development services
- Student-level demographic data
- Student-level attendance and discipline data
- Student-level academic data, including GPA, state assessment scores, and course data
- Student-level dropout and promotion data
- Standardized assessment data
- Survey data
- FAFSA data
- National Student Clearinghouse data for enrollment, persistence, and graduation
- Postsecondary data (e.g., remediation data)
*Link all data using a unique identifier.

Page 15: School-Level Data
- Percentage of students receiving free and reduced-price lunch
- Percentage of advanced college-preparatory courses
- Cohort graduation rate
- Average daily attendance
- Percentage of fully licensed teachers
- Percentage of highly qualified teachers
- Teacher turnover rate
- Percentage of GEAR UP dollars spent relative to each school's allocation
- College-going culture data

Page 16: Evaluation 101 – Worksheet 2

Page 17: Setting Up Your Data
Non-technical steps: build relationships; define legal agreements (MOA); define data elements; test and validate data; train staff and document.
Technical steps: data system; linking tables of data; web interface; data entry; data loading; reporting.

Page 18: Data Exchange Considerations
- Define file layouts. There are various layout options (CSV, XML, etc.); clearly define the layout you will use. Insist on precision from the data provider, i.e., files that require no manual manipulation on your end. Insist on consistency across data feeds, i.e., the file layout does not change (see the sketch below). Ensure clarity in communication.
- Define the data exchange protocol: secure FTP, direct access to the partner's system to extract data, a secure website, etc.
*Define a data change process, i.e., how changes to the data layout will be addressed.
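One illustrative way to enforce the "layout does not change" rule is a small Python check that rejects an incoming CSV feed whose header deviates from the agreed layout. The expected column list and file name below are hypothetical.

```python
# Reject a data feed up front rather than loading a silently changed layout.
import csv

EXPECTED_COLUMNS = ["student_id", "school_code", "gpa", "attendance_rate"]  # hypothetical

def validate_feed(path: str) -> None:
    with open(path, newline="") as f:
        header = next(csv.reader(f))  # first row of the feed
    if header != EXPECTED_COLUMNS:
        raise ValueError(f"Layout changed: expected {EXPECTED_COLUMNS}, got {header}")

validate_feed("district_feed.csv")  # hypothetical file; fails loudly on a bad layout
```

Failing loudly at intake is usually preferable to discovering a layout change after it has corrupted a longitudinal database.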

Page 19: Data Inputs and Outputs

Page 20: Legal Considerations
- Guidance from legal counsel
- Institutional Review Board (IRB) review
- Family Educational Rights and Privacy Act (FERPA)
- Confidentiality agreements for GEAR UP personnel (GEAR UP staff, coordinators, etc.)
- Confidentiality agreements for external consultants (consultants, external evaluators, etc.)

Page 21: Security Considerations
- Encryption: make sure steps are taken to encrypt sensitive data elements (one illustrative approach follows).
- Efficiency: monitor databases to ensure data are cleaned and linked.
- Security: keep the number of users with direct database access to a minimum, and have those users sign a confidentiality agreement.
- Disaster recovery: make sure your databases are backed up nightly and that a clear plan for restoration and recovery is outlined. Decide now how long you intend to store data and put measures in place to ensure that can happen.
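As one illustrative approach to the encryption point, the sketch below encrypts a sensitive field with the third-party cryptography package (pip install cryptography). This is a sketch of symmetric encryption at rest under stated assumptions, not a complete key-management plan; in practice the key must be stored separately from the data it protects.

```python
# Sketch: encrypt a sensitive data element before storing it.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store securely, never alongside the encrypted data
cipher = Fernet(key)

token = cipher.encrypt(b"123-45-6789")   # hypothetical sensitive field (e.g., an SSN)
print(cipher.decrypt(token).decode())    # round-trips only with the same key
```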

Page 22: National Student Clearinghouse – Postsecondary Data Tracking
StudentTracker for High Schools answers the following questions:
- Which of your high school graduates enrolled in college?
- Where did they enroll?
- Did they enroll where they applied? Was it their first choice?
- Did they graduate within six years?
The National Student Clearinghouse's database is the only nationwide collection of collegiate enrollment and degree data. These are actual student records, provided to the Clearinghouse on a regular cycle by more than 3,300 participating postsecondary institutions, which enroll over 92% of all U.S. higher education students. After StudentTracker matches your records against the Clearinghouse database, you receive a comprehensive report containing the information you need to assess the college attendance, persistence, and achievement of your graduates.

Page 23: National Student Clearinghouse
Interpreting National Student Clearinghouse data and setting up files with a unique identifier (a linking sketch follows).
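Here is a brief sketch of the linking step: merging a Clearinghouse return file back onto a local GEAR UP roster by the unique identifier, using pandas. The file names and column names are assumptions for illustration; the actual StudentTracker detail file layout differs.

```python
# Sketch: link a hypothetical Clearinghouse return file to a local roster.
import pandas as pd

roster = pd.read_csv("gearup_roster.csv")        # assumed to include student_id
nsc = pd.read_csv("studenttracker_detail.csv")   # assumed: student_id, enrolled (bool), college

linked = roster.merge(nsc, on="student_id", how="left")  # keep every roster student
enrolled = linked["enrolled"].fillna(False).astype(bool)  # unmatched = not found enrolled
print(f"Postsecondary enrollment rate: {enrolled.mean():.1%}")
```

A left merge keeps students with no Clearinghouse match visible, which matters because an unmatched record may mean non-enrollment or an identifier mismatch to investigate.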

Page 24: Internal and External Evaluation
GEAR UP grants must include "implementation of a mechanism to continuously assess progress toward achieving objectives and outcomes, and to obtain feedback on program services and provisions that may need to be altered."
Internal evaluator(s) are important for:
- Continuously assessing the program.
- Maintaining a complete understanding of, and connection to, the program.
- Training GEAR UP coordinators and staff in the schools.
- Continuously managing the data to preserve data integrity.
- Day-to-day oversight of evaluation activities.
External evaluator(s) are important for:
- Assessing the program from an outside perspective.
- Conducting parallel or independent analyses, separate from the internal evaluator(s), to protect the integrity of results.
- Bringing knowledge of one or more of the following: (1) GEAR UP; (2) long-term program evaluation; (3) best practices in research methodologies for accurate analysis; and (4) longitudinal analysis.

Page 25: Evaluation Points to Consider
- The research design should match, and be appropriate for, the data collection and analysis.
- The evaluation framework should be built around already-known local, state, and national college-access data.
- Use prior GEAR UP data to build on what was successful and to strengthen what was not.
- Research projects embedded within the overall evaluation can strengthen your proposal and program outcomes.

Page 26: Evaluation Resources
- The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users (3rd Edition), published by the Joint Committee on Standards for Educational Evaluation (2011)
- The Institute of Education Sciences (IES) Practice Guides
- The What Works Clearinghouse
- American Educational Research Association (AERA)
- American Evaluation Association (AEA)

Page 27: Thank You
Thank you for attending the NCCEP/GEAR UP Capacity-Building Workshop. For additional information regarding the Evaluation 101 session, please contact Chrissy Tillery, extension 108.