Washington 21st CCLC Evaluation: Data Collection Activities
Samantha Sniegowski, Researcher
February 2016
Copyright © 2016 American Institutes for Research. All rights reserved.
Agenda 2
Statewide Evaluation Report Findings
Leading Indicator Report Updates
Data Dashboard Updates
Case Studies
APR Data Submission Updates
Other Spring Data Collection Activities
Statewide Evaluation Report Findings 3
Report Background 4
Covers two program years: 2012–13 and 2013–14
Grantee & Center Characteristics: covered in both years
Program Quality: covered in both years
2012–13 focus: Impact Analysis on Youth Outcomes
2013–14 focus: Youth Motivation, Engagement, and Beliefs Survey
Evaluation Questions 5
1. What were the primary characteristics associated with the grants and centers funded by 21st CCLC and the student population served by the program?
2. To what extent was there evidence that centers funded by 21st CCLC implement research-supported practices related to quality afterschool programming?
3. To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on youth outcomes as compared with similar students not participating in the program?
4. What does youth completion of the Youth Motivation, Engagement, and Beliefs Survey indicate both about youth experiences in programming and youth functioning on social and emotional learning and noncognitive areas?
Evaluation Questions 6
Question 1 (addressed next): What were the primary characteristics associated with the grants and centers funded by 21st CCLC and the student population served by the program?
Key Findings – Evaluation Q1 7
Table 2. Grants by Maturity, 2012–13: Washington grants compared with all grants nationwide, broken out by grant maturity (new, mature, sustaining). Source: PPICS.
Table 3. Grants by Maturity, 2013–14: the same breakdown for the following program year. Source: PPICS.
Key Findings – Evaluation Q1 8 Figure 8. Percentage of Centers per Grade-Level Cluster per Year, 2008–2014
Key Findings – Evaluation Q1 9 Figure 9. Attendees and Regular Attendees in Washington State by APR Year, 2006–2014
Key Findings – Evaluation Q1 10
More than 90 percent of centers were school-based in both programming periods.
On average, 21st CCLC regular participants attended 61 days of programming during 2012–13 and 63 days during 2013–14.
Centers averaged approximately 73 regular attendees and 123 total attendees during 2012–13, and approximately 70 regular attendees and 114 total attendees during 2013–14.
Evaluation Questions 11
Question 2 (addressed next): To what extent was there evidence that centers funded by 21st CCLC implement research-supported practices related to quality afterschool programming?
Key Findings – Evaluation Q2 12
Organizational Practices
Strengths:
– Staff reported supportive, collaborative program climates (Staff Survey)
– Consistent meetings to discuss program improvement efforts (higher frequency reported among staff than among site coordinators)
Areas for improvement:
– Opportunities for staff to observe peers delivering programming and provide feedback on practice (Staff Survey)
– Using data to set program improvement goals with other staff (Staff Survey)
Key Findings – Evaluation Q2 13
Instructional Practices
Strengths:
– Site coordinators and staff report frequent delivery of practices associated with program design.
– Most programs were considered high functioning as defined by the PQA Form A.
Areas for Improvement:
– Staff report struggling to find adequate time to plan activity lessons and offerings.
– Most programs operate at the moderate level as defined by the PQA Form B.
Key Findings – Evaluation Q2 14
Partnership Practices
Strengths:
– Programs typically communicate with families once or twice a semester.
o Most common strategies: communicating about program events, collaborating to enhance student success, and providing family literacy or social events.
– Programs adopt strategies to establish meaningful linkages to the school day.
o Most common strategy: hiring regular school-day teachers.
Areas for Improvement:
– Family communication: the least common strategies were sending information home about student progress and asking family members for input on what activities are provided and how.
– School-day linkages: the least common strategy was ensuring activities are aligned with schoolwide improvement targets related to student performance.
Evaluation Questions 15
Question 3 (addressed next): To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on youth outcomes as compared with similar students not participating in the program?
Key Findings – Evaluation Q3 16
Table 12. Impact of 21st CCLC on Achievement Pooled Across Grades, 2011–12 and 2012–13 Program Years. For each program year, the table reports effect sizes, standard errors, and p values for reading (Grades 4–8 and 10), mathematics (Grades 4–8), cumulative GPA (Grades 9–12), and percentage of credits earned (Grades 9–12), estimated separately for two attendance-level treatment groups (30+ days and a higher threshold). Note. SE, standard error.
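The effect sizes in Table 12 are standardized mean differences. As a rough illustration only (not the report's actual estimation, which compared participants with matched similar nonparticipants), here is a minimal Python sketch of Hedges' g and its standard error for a simple treatment/comparison contrast; the simulated data are hypothetical.

```python
# Minimal sketch of a standardized effect size (Hedges' g) with its SE.
import numpy as np

def hedges_g(treatment, comparison):
    """Standardized mean difference with small-sample correction and SE."""
    t, c = np.asarray(treatment, float), np.asarray(comparison, float)
    nt, nc = len(t), len(c)
    # Pooled standard deviation across the two groups
    sp = np.sqrt(((nt - 1) * t.var(ddof=1) + (nc - 1) * c.var(ddof=1)) / (nt + nc - 2))
    d = (t.mean() - c.mean()) / sp
    j = 1 - 3 / (4 * (nt + nc) - 9)   # small-sample correction factor
    g = j * d
    se = np.sqrt((nt + nc) / (nt * nc) + g**2 / (2 * (nt + nc)))
    return g, se

rng = np.random.default_rng(0)
g, se = hedges_g(rng.normal(0.1, 1, 500), rng.normal(0.0, 1, 500))
print(f"effect size = {g:.3f}, SE = {se:.3f}")
```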
Key Findings – Evaluation Q3 17
Table 13. Impact of 21st CCLC on Number of Unexcused Absences and Number of Disciplinary Incidents Pooled Across Grades, 2011–12 and 2012–13 Program Years. For each program year, the table reports effects, standard errors, p values, and the weighted mean ratio (treatment/comparison) for unexcused absences (Grades 6–12) and disciplinary incidents (Grades 3–12), by attendance-level treatment group; estimated effects on both outcomes were negative, with treatment youth showing fewer absences and incidents than comparison youth. Note. NA, not applicable; SE, standard error.
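The "weighted mean ratio" column is the treatment group's weighted mean count divided by the comparison group's, so a value below 1.0 favors treatment youth. A minimal sketch under hypothetical count data and stand-in weights (the evaluation's actual weighting scheme is not shown here):

```python
# Sketch of a weighted mean ratio for count outcomes such as absences.
import numpy as np

def weighted_mean_ratio(y_treat, y_comp, w_treat=None, w_comp=None):
    """Ratio of weighted mean counts, treatment over comparison."""
    mt = np.average(y_treat, weights=w_treat)
    mc = np.average(y_comp, weights=w_comp)
    return mt / mc

# Hypothetical data: treated youth average fewer unexcused absences,
# so the ratio falls below 1.0, consistent with the direction in Table 13.
rng = np.random.default_rng(1)
absences_t = rng.poisson(3.0, 400)   # hypothetical treatment counts
absences_c = rng.poisson(4.0, 400)   # hypothetical comparison counts
print(f"ratio = {weighted_mean_ratio(absences_t, absences_c):.2f}")
```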
Evaluation Questions 18
Question 4 (addressed next): What does youth completion of the Youth Motivation, Engagement, and Beliefs Survey indicate both about youth experiences in programming and youth functioning on social and emotional learning and noncognitive areas?
Key Findings – Evaluation Q4 19
Pilot year of the Motivation, Engagement, and Beliefs Survey (in 38 centers)
Total of 1,199 surveys completed; average of 32 surveys per center
Grades 4–12
Students who were likely to meet the definition of a regular attendee
Key Findings – Evaluation Q4 20
Major Scales:
Sense of Belonging and Engagement in the Program
Program Impact on Student Social and Emotional Development
The majority of youth fell within the positive end of the response scale.
Relationship with School-Related Outcomes
Key Findings – Evaluation Q4 21
Table 19. Summary of HLM Results by Survey Subscale and School Outcome, 2014 (coefficients and standard errors appear in the full table; direction and significance summarized here)
Academic Identity: positively related to reading assessment scores (**) and mathematics assessment scores (***); negatively related to unexcused absences (*), disciplinary incidents (**), and intervention days (***); reading and mathematics growth percentiles not significant.
Mindset: positively related to mathematics assessment scores (*); negatively related to unexcused absences (*); reading assessment, growth percentiles, disciplinary incidents, and intervention days not significant.
Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis. ***p < .001, **p < .01, *p < .05, +p < .10.
Key Findings – Evaluation Q4 22
Table 19 (continued). Summary of HLM Results by Survey Subscale and School Outcome, 2014
Self-Management: negatively related to disciplinary incidents (*) and intervention days (*); assessment scores, growth percentiles, and unexcused absences not significant.
Interpersonal Skills: positively related to reading assessment scores (*); negatively related to disciplinary incidents (*) and intervention days (**); other outcomes not significant.
Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis. ***p < .001, **p < .01, *p < .05, +p < .10.
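Table 19's estimates come from hierarchical linear models with youth nested within centers. A minimal sketch of such a model in Python using statsmodels, assuming a hypothetical linked data file and column names; the report's actual covariate set is not reproduced here.

```python
# Illustrative two-level HLM: youth (level 1) nested in centers (level 2),
# with a survey subscale score predicting a school outcome.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: center_id, math_score, academic_identity, grade
df = pd.read_csv("survey_linked_outcomes.csv")  # hypothetical linked file

# The random intercept for each center absorbs between-center variation;
# the subscale coefficient is the fixed effect of the kind shown in Table 19.
model = smf.mixedlm("math_score ~ academic_identity + grade",
                    data=df, groups=df["center_id"])
result = model.fit()
print(result.summary())
```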
Full Report Available At: 23
Washington 21st Century Community Learning Centers Program Evaluation: 2012–13 and 2013–14 (StatewideEvaluationReport.pdf)
21st CCLC Evaluation and Accountability webpage on the OSPI website
Leading Indicator Report Updates 24
Population of Outcome Data 25
State Assessment Data
School-Day Absences & Disciplinary Incidents
Timeline (end of February)
Data Dashboard Updates 26
Data Dashboard History 27
Purpose
Specifications
Impact of new federal reporting
Data Dashboard – Where we are now 28
Purpose
Specifications
Intended Use
Case Studies 29
Case Study Overview 30
Purpose
Who is involved?
What activities will take place?
Timeline (February to April 2016)
APR Data Submission Updates 31
APR Data Submission Timeline 32
Now through Feb. 11: Enter your Activity and Staffing information for Spring 2015
Feb. 12 through Feb. 19: Make any adjustments to Summer 2014 data
Feb. 20 through Feb. 26: Make any adjustments to Fall 2014 data
Feb. 27 through March 4: Enter Participation and Outcome data for Spring 2015; make any adjustments to Activity and Staffing information for Spring 2015
Spring 2015 Participation & Outcomes 33
AIR will provide support for entering these data. Each grantee will receive center-level reports of attendance broken down by the required categories (see the sketch after this list):
– Participation
o Grade Level
o Regular Attendee status
o Race/Ethnicity
o Gender
o Population Specifics
– Outcomes
o State Assessment data
Expect to receive these reports by February 25th.
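For illustration, a center-level breakdown like the one described above could be produced along these lines; the file and column names are hypothetical, not AIR's actual reporting pipeline.

```python
# Sketch: count attendees per center within each required reporting category.
import pandas as pd

df = pd.read_csv("spring2015_attendance.csv")  # hypothetical per-student file

# One cross-tab per category: rows are centers, columns are category values,
# cells are attendee counts
for category in ["grade_level", "regular_attendee", "race_ethnicity", "gender"]:
    report = df.groupby(["center_id", category]).size().unstack(fill_value=0)
    print(report.head())
```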
Further Guidance 34
What do I do if I have already entered Participation and/or Outcome data in the system for Spring 2015?
What do I do if the numbers I have entered do not match the reports that AIR has given me?
Other Spring Data Collection Activities 35
Leading Indicator Surveys 36
Used to populate the leading indicator reports
Data obtained from: federal reporting systems, surveys, PQA data, and youth outcome data
Purpose is to inform quality improvement efforts
Leading indicator surveys collect data from two groups:
– Site coordinators
– Staff working directly with youth in the delivery of programming
Leading Indicator Survey Timeline 37
Survey Administration:
A recorded webinar from AIR will be posted to help project directors navigate the online survey system – available February 12th
Emails to project directors containing a link to the survey management system and a username and password will be sent during the week of February 15th
Completed surveys from site coordinators and afterschool program staff are due March 31st
Student Survey – Theory of Change 38
Student Survey – 2015 Preliminary Results 39
Administered the Student Engagement, Motivation, & Beliefs Survey in all centers during Spring 2015
Intended for grades 4–12 and for students who were, or were likely to be, regular attendees by the end of the programming period
Surveyed as many students as possible
A total of 4,952 surveys were collected from 21st CCLC participants
Student Survey – 2015 Preliminary Results 40
Current version comprises the following scales (47 items):
– Academic identity
– Mindsets
– Self-management
– School belonging
– Interpersonal skills
– Retrospective program impact on academic behaviors
– Retrospective program impact on self-management
– Program belonging and engagement
Student Survey – 2015 Preliminary Results 41
Percentage of responses by category: Not at all true / Somewhat true / Mostly true / Completely true
Youth Skills and Beliefs
– Academic Identity: 1% / 9% / 37% / 53%
– Positive Mindsets: 1% / 12% / 46% / 41%
– Self-Management: 3% / 18% / 47% / 32%
– School Belonging (dropped for 2016): NA
– Interpersonal Skills: 1% / 11% / 50% / 38%
Program Experiences and Impact
– Program Belonging and Engagement: 3% / 13% / 30% / 54%
– Academic Behaviors (retrospective): 4% / 16% / 37% / 43%
– Self-Management (retrospective): 8% / 16% / 39% / 37%
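A minimal sketch of how distributions like those above could be tabulated from item-level responses: average each youth's items within a scale, bucket the mean back onto the four response categories, and take percentages. The file, item names, and scale composition are hypothetical.

```python
# Sketch: response-category distribution for one survey scale.
import pandas as pd

labels = ["Not at all true", "Somewhat true", "Mostly true", "Completely true"]
df = pd.read_csv("student_survey_2015.csv")  # hypothetical item-level file

scale_items = ["ai_1", "ai_2", "ai_3"]       # hypothetical Academic Identity items, coded 1-4
scale_mean = df[scale_items].mean(axis=1)

# Round scale means into the 1-4 response metric, then express as percentages
buckets = pd.cut(scale_mean, bins=[0.5, 1.5, 2.5, 3.5, 4.5], labels=labels)
print(buckets.value_counts(normalize=True).mul(100).round(0))
```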
Student Survey – 2016 Administration 42
All centers serving youth in grades 4–12 will be asked to collect youth survey data in the spring of 2016
Survey is intended for students who are regular attendees, or are likely to be regular attendees by the end of the program year
Survey as many students as possible – at least 25 if possible
Preference is to collect data online; let us know if this is going to be an issue by sending us an email
Student Survey – 2016 Administration 43
The statewide student identifier will be collected for youth taking the survey (or a similar method will be employed) to connect survey data with the state data warehouses
Grantees will have access to a webinar on how to administer the survey on April 1
Online collection of youth survey data will begin on April 4 and continue through May 29
Results from the youth survey will be made available in the leading indicator reports, and grantees will have access to raw de-identified data
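The identifier-based linkage described above amounts to a join between survey records and state warehouse records on the statewide student ID. A hedged sketch with hypothetical file and column names:

```python
# Sketch: join survey responses to state outcome records on the student ID.
import pandas as pd

survey = pd.read_csv("student_survey_2016.csv")    # assumes an 'ssid' column
state = pd.read_csv("state_outcomes_extract.csv")  # hypothetical warehouse extract

# Left join keeps every survey respondent, attaching state outcomes where
# an SSID match exists
linked = survey.merge(state, on="ssid", how="left")
print(f"{linked['math_score'].notna().mean():.0%} of respondents matched")  # 'math_score' is a hypothetical outcome column
```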
Timeline Recap 44
Date | Item
February 5 (Friday), 12:00–1:30pm | State Evaluation Update & Spring Data Collection Process
February 11 (Thursday) | Spring 2015 APR Federal Data Due
February 12–19 | Update Summer 2014 APR Data
February 12 (Friday) | LI Support Webinar Recording Available
February 15 | Leading Indicator Surveys Launched
February 20–26 | Update Fall 2014 APR Data
February 27 – March 4 | Update Spring 2015 APR Data
February 28 | Outcomes populated in LI Reports
February–April | Case Studies
March 31 | AIR Leading Indicator Staff and Site Coordinator Surveys Due
April 1 | Student Survey Administration Webinar
April 4 (Monday) | Student Survey Opens
May 29 | Student Survey Closes
*These are just the items covered today. A full revised calendar of events will be provided by OSPI.
Samantha Sniegowski
American Institutes for Research
S. River Plaza, Suite 600
Chicago, IL
General Information: