Ohio 21st CCLC Evaluation: Presentation to Ohio Grantees, September 10, 2015. Copyright © 20XX American Institutes for Research. All rights reserved.


Ohio 21st CCLC Evaluation
Presentation to Ohio Grantees
September 10, 2015
Matthew Vinson and Neil Naftzger

Agenda
- Who is AIR?
- Evaluation Overview
- Work to Date
  - Project Director Survey: some results
  - Evaluation Advisory Group
- Upcoming: Individual Student Data Collection
- Questions?

Introductions
AIR:
- Matt Vinson
- Samantha Sniegowski
- Neil Naftzger
- Nicole Adams

American Institutes for Research
AIR has:
- Undertaken a number of state and local research and evaluation projects, including:
  - Statewide evaluations of 21st CCLC and related afterschool programs in New Jersey, Oregon, Rhode Island, South Carolina, Texas, Washington, and Wisconsin
  - Local evaluations of 21st CCLC and related afterschool programs, most recently in Chicago, Nashville, and Palm Beach
  - Research grants funded by both the William T. Grant and Charles Stewart Mott Foundations
- Held a number of federal contracts related to the 21st CCLC program
- Become known for Beyond the Bell, our comprehensive toolkit on starting and running high-quality afterschool programs

Conceptual Framework

Evaluation Overview

Evaluation Overview
- The evaluation will cover Year 1 and Year 2 21st CCLC grantees following Paths A, B, and C
- It will last approximately three years (through June 2018)
- Particular focus on literacy, career readiness, and school attachment
- Comprises implementation and outcome components

Evaluation Overview: Implementation Research Questions
- RQ1. How does implementation of the 21st CCLC program vary across the three primary programming paths associated with the 2015 and 2016 RFPs, particularly in relation to supporting literacy, career readiness, and school attachment?
- RQ2. In what ways are 21st CCLC programs taking steps to ensure the quality of the programming they deliver?
- RQ3. What kinds of experiences are youth having in programming? How do these experiences vary across the three primary programming paths associated with the 2015 and 2016 RFPs?

Evaluation Overview: Outcomes Research Questions
- RQ4. To what extent do youth who participate in 21st CCLC-funded programming more frequently (high-attending youth) demonstrate greater annual growth on short-term measures of targeted skills, beliefs, and knowledge compared with youth who participate less frequently (low-attending youth)?
- RQ5. To what extent do youth who participate in 21st CCLC-funded programming frequently for multiple years perform better on school-related outcomes compared with similar youth who attend the same schools but do not participate in programming?
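RQ4 hinges on splitting participants into high- and low-attending groups and comparing their outcomes. The grouping step can be sketched as below; note that the 30-day cutoff, the field names, and all records are hypothetical illustrations, not the evaluation's actual specification.

```python
# Illustrative sketch only: cutoff, field names, and data are hypothetical.

def split_by_attendance(records, cutoff_days=30):
    """Partition youth records into high- and low-attending groups."""
    high = [r for r in records if r["days_attended"] >= cutoff_days]
    low = [r for r in records if r["days_attended"] < cutoff_days]
    return high, low

def mean_outcome(group, key="reading_score"):
    """Average a hypothetical outcome measure over a group."""
    return sum(r[key] for r in group) / len(group) if group else None

# Made-up participant records
records = [
    {"ssid": "A1", "days_attended": 45, "reading_score": 82},
    {"ssid": "B2", "days_attended": 12, "reading_score": 74},
    {"ssid": "C3", "days_attended": 60, "reading_score": 88},
    {"ssid": "D4", "days_attended": 8,  "reading_score": 70},
]

high, low = split_by_attendance(records)
gap = mean_outcome(high) - mean_outcome(low)  # 85.0 - 72.0 = 13.0
```

A raw gap like this is only descriptive; the evaluation's actual comparisons would control for how the groups differ (see the PSM discussion under Methods).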

Evaluation Overview: Data Collection
- Individual student data collection
- 21st CCLC data (federal, once established)
- Case study visits (to collect best practices, etc.)
- Surveys (project director, teacher, staff, youth)
- Literacy activity information
- Local evaluation reports
- ODE youth demographic data
- ODE youth performance data (school-related outcomes, assessments, etc.)

Evaluation Overview: Methods
- Basic descriptive analysis (sums, averages, etc.)
- Case study write-ups
- Hierarchical linear modeling (correlational analysis): links program characteristics with survey responses, etc.
- Propensity score matching (PSM): creates a comparison group by matching the participant population of interest with other youth on key variables (e.g., demographics, prior-year assessment scores); next best to random assignment; has shown interesting results in other statewide evaluations
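To make the PSM step concrete, here is a minimal nearest-neighbor matching sketch. It assumes propensity scores have already been estimated (e.g., from a logistic regression on demographics and prior-year scores); the IDs and scores are hypothetical, and real evaluations typically add refinements such as caliper limits on match distance.

```python
# Minimal propensity score matching sketch: nearest-neighbor, without
# replacement. Scores are assumed pre-estimated; all data are hypothetical.

def nearest_neighbor_match(participants, comparisons):
    """Match each participant (id, score) to the unmatched comparison
    youth with the closest propensity score."""
    pool = list(comparisons)
    matches = []
    for pid, p_score in participants:
        # Closest remaining comparison score
        cid, c_score = min(pool, key=lambda c: abs(c[1] - p_score))
        matches.append((pid, cid))
        pool.remove((cid, c_score))  # match without replacement
    return matches

participants = [("P1", 0.62), ("P2", 0.35)]
comparisons = [("C1", 0.60), ("C2", 0.40), ("C3", 0.33)]

pairs = nearest_neighbor_match(participants, comparisons)
# P1 (0.62) pairs with C1 (0.60); P2 (0.35) pairs with C3 (0.33)
```

The design intuition: once participants and comparison youth are balanced on the variables behind the score, outcome differences are more plausibly attributable to program participation.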

Evaluation Overview: Near-Term Research Efforts
- Project Director Survey (finished; ~80% return rate)
- Establish an Evaluation Advisory Group
- Individual student data collection (opens today)
- Arrange initial site visits
- Review local evaluation reports
- Surveys: expect more information soon!

Work to Date: Project Director Surveys
- Administered during June and July
- About an 80% return rate: 58 programs responded and completed the survey
- Covered Paths A, B, and C

Work to Date: Project Director Surveys

To what extent do you see it as a goal of the program to impact youth in the following ways related to the development of literacy skills?

| Item | Not a goal | Minor goal | Moderate goal | Major goal | Not sure |
| Development and practice of basic literacy skills | 0% | 7% | 3% | 88% | 0% |
| Development of literacy-related learning strategies | 0% | 9% | 10% | 79% | 0% |
| Enhance student confidence as readers | 0% | 3% | 10% | 84% | 0% |
| Cultivation of a positive mindset related to reading (e.g., growth mindset: if I put forth the effort, I can succeed) | 0% | 2% | 19% | 78% | 0% |
| Cultivation of interest in reading | 0% | 5% | 7% | 86% | 0% |

Work to Date: Project Director Surveys (additional questions asked)
- Please select the various types of staff that provide literacy activities within your program. (Select all that apply.)
- For activities that are especially meant to support student growth and development in literacy, what is the typical staff-to-student ratio?
- Are you using any published or externally developed curriculum selected specifically to support literacy activities delivered in the afterschool program?
- In the typical week, how many hours, if any, are dedicated to providing direct instruction in literacy to at least some participating youth? (Please enter a value greater than or equal to zero.)
- Approximately what percentage of youth served by your program participate in direct instruction-related literacy activities each week? (Please enter a value greater than zero.)
- For those students participating in direct instruction-related literacy activities, approximately what percentage of their total weekly participation in the program do they spend in direct instruction activities?
- Does your program include activities that are meant to get parents and other adult family members more involved in supporting the literacy development of students enrolled in the program?

Work to Date: Evaluation Advisory Group (EAG)
- About 12 non-AIR, non-ODE advisors: grant directors and local evaluators
- Conference call on September 2 (introduction and individual student data collection)
- Conference call on September 7 (important concerns of local evaluators)
- The group will continue to meet concerning evaluation tasks, data collection instruments, grantee concerns, and so on

Upcoming: Individual Student Data Collection
Purpose: collects data not otherwise available:
- 21st CCLC participant names
- Participant attendance (number of days for summer 2014, fall 2014, and spring 2015)
- Estimated hours of literacy instruction per week
- SSIDs (state student identifiers)
This provides AIR with youth participation data and enables linkage to ODE state data.
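The linkage the SSID enables is essentially a join of program attendance records to state outcome records on a shared identifier. A sketch of that step, with hypothetical records and field names:

```python
# Sketch of SSID-based record linkage: join program attendance data to
# state (ODE) outcome data on the shared identifier. Data are hypothetical.

def link_on_ssid(program_records, state_records):
    """Inner-join two lists of dicts on the 'ssid' field."""
    state_by_id = {r["ssid"]: r for r in state_records}
    linked = []
    for rec in program_records:
        state = state_by_id.get(rec["ssid"])
        if state is not None:  # keep only youth present in both files
            linked.append({**rec, **state})
    return linked

program = [{"ssid": "X9", "days_attended": 41},
           {"ssid": "Y7", "days_attended": 15}]
state = [{"ssid": "X9", "ela_score": 710}]

linked = link_on_ssid(program, state)
# Only X9 appears in both files, so one linked record results
```

In practice this kind of join is done inside a secure data environment; the point here is only that without the SSID, attendance and outcome records cannot be connected at the student level.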

Upcoming: Individual Student Data Collection
What about student privacy?
- All data collected in the system, along with all other data collected as part of the evaluation, are strictly for use by AIR to assess Ohio 21st CCLC program impact. The data will be put to no other use.
- AIR has strict, industry-standard security measures in place to ensure all data are kept absolutely private.
- At no time will any youth-identifying information be released in any report (or otherwise).
- AIR is working with ODE to ensure all data are kept secure and all legal aspects are covered.


Upcoming: Individual Student Data Collection
- The Individual Student Data Collection System is launching in two stages.
- The "delegation" component (for district staff) is forthcoming:
  - We want to make sure it works properly.
  - We want to incorporate the feedback we received so the process flows as smoothly as possible.
- You will receive an email from AIR when the delegation function is ready.

Upcoming: Individual Student Data Collection
Next steps:
- You will receive an email with the URL for the individual student data collection system.
- The email will include login instructions.
- Begin entering data for summer 2014 and the school year.
- Send questions and feedback to AIR.

Questions?
Matthew Vinson
Neil Naftzger