Washington 21st CCLC Evaluation
2016 Data Collection Activities

Samantha Sniegowski, Researcher
February 2016

Copyright © 2016 American Institutes for Research. All rights reserved.
Agenda

– 2012–14 Statewide Evaluation Report Findings
– Leading Indicator Report Updates
– Data Dashboard Updates
– Case Studies
– APR Data Submission Updates
– Other Spring Data Collection Activities
2012–14 Statewide Evaluation Report Findings
Report Background

– Covers two program years: 2012–13 & 2013–14
– Grantee & center characteristics covered in both years
– Program quality covered in both years
– 2012–13 focus: impact analysis on youth outcomes
– 2013–14 focus: Youth Motivation, Engagement, and Beliefs Survey
Evaluation Questions

1. What were the primary characteristics associated with the grants and centers funded by 21st CCLC and the student population served by the program?
2. To what extent was there evidence that centers funded by 21st CCLC implemented research-supported practices related to quality afterschool programming?
3. To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on youth outcomes as compared with similar students not participating in the program? (2012–13)
4. What does youth completion of the Youth Motivation, Engagement, and Beliefs Survey indicate both about youth experiences in programming and about youth functioning on social and emotional learning and noncognitive areas? (2013–14)
Evaluation Question 1: What were the primary characteristics associated with the grants and centers funded by 21st CCLC and the student population served by the program?
Key Findings – Evaluation Q1

Table 2. Grants by Maturity, 2012–13

                   Washington Grants        All Grants Nationwide
Grant Maturity     N Grants    % Grants     N Grants    % Grants
New                12          21.4%        621         16.3%
Mature             32          57.1%        1,709       44.8%
Sustaining         12          21.4%        1,483       38.9%
Total grantees     56          100.0%       3,813       100.0%
Source: PPICS.

Table 3. Grants by Maturity, 2013–14

                   Washington Grants        All Grants Nationwide
Grant Maturity     N Grants    % Grants     N Grants    % Grants
New                10          18.2%        129         3.4%
Mature             13          23.6%        1,766       47.1%
Sustaining         32          58.2%        1,852       49.4%
Total grantees     55          100.0%       3,747       100.0%
Source: PPICS.
Key Findings – Evaluation Q1

[Figure 8. Percentage of Centers per Grade-Level Cluster per Year, 2008–2014]
Key Findings – Evaluation Q1

[Figure 9. Attendees and Regular Attendees in Washington State by APR Year, 2006–2014]
Key Findings – Evaluation Q1

– More than 90 percent of centers were school-based in both programming periods.
– On average, 21st CCLC regular participants attended 61 days of programming during 2012–13 and 63 days during 2013–14.
– Overall, centers had approximately 73 regular attendees and 123 total attendees during the 2012–13 programming period, and approximately 70 regular attendees and 114 total attendees during 2013–14.
Evaluation Question 2: To what extent was there evidence that centers funded by 21st CCLC implemented research-supported practices related to quality afterschool programming?
Key Findings – Evaluation Q2

Organizational practices

– Strengths:
  o Staff reported supportive, collaborative program climates (Staff Survey)
  o Consistent meetings to discuss program improvement efforts (higher frequency reported among staff than among site coordinators)
– Areas for improvement:
  o Opportunities for staff to observe peers delivering programming and provide feedback on practice (Staff Survey)
  o Using data to set program improvement goals with other staff (Staff Survey)
Key Findings – Evaluation Q2

Instructional practices

– Strengths:
  o Site coordinators and staff report frequent delivery of practices associated with program design.
  o Most programs were considered high functioning as defined by the PQA Form A.
– Areas for improvement:
  o Staff report struggling to find adequate time to plan activity lessons and offerings.
  o Most programs operate at the moderate level as defined by the PQA Form B.
Key Findings – Evaluation Q2

Partnership practices

– Strengths:
  o Programs typically communicate with families once or twice a semester. Most common strategies: communicating about program events, collaborating to enhance student success, and providing family literacy or social events.
  o Programs adopt strategies to establish meaningful linkages to the school day. Most common strategy: hiring regular school-day teachers.
– Areas for improvement:
  o Least common family communication strategies: sending information home about student progress and asking family members for input about what activities are provided and how.
  o Least common school-linkage strategy: ensuring activities are aligned with schoolwide improvement targets related to student performance.
Evaluation Question 3: To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on youth outcomes as compared with similar students not participating in the program? (2012–13)
Key Findings – Evaluation Q3

Table 12. Impact of 21st CCLC on Achievement Pooled Across Grades, 2011–2013

2011–12 Program Year
Subject                            Treatment   Effect Size   SE of Effect Size   p
Reading (a)                        30+ day        0.027           0.008          0.001
                                   60+ day        0.033           0.011          0.004
Mathematics (b)                    30+ day        0.044           0.008         <0.001
                                   60+ day        0.035           0.011          0.002
Cumulative GPA (c)                 30+ day       −0.022           0.026          0.399
                                   60+ day        0.195           0.049         <0.001
Percentage of credits earned (c)   30+ day        0.034           0.027          0.212
                                   60+ day        0.144           0.048          0.003

2012–13 Program Year
Subject                            Treatment   Effect Size   SE of Effect Size   p
Reading (a)                        30+ day       −0.001           0.012          0.459
                                   60+ day        0.017           0.016          0.142
Mathematics (b)                    30+ day        0.003           0.011          0.380
                                   60+ day        0.021           0.015          0.079
Cumulative GPA (c)                 30+ day       −0.072           0.029          0.006
                                   60+ day        0.082           0.046          0.037
Percentage of credits earned (c)   30+ day        0.124           0.022         <0.001
                                   60+ day        0.063           0.010         <0.001

Note. SE = standard error. (a) Includes Grades 4–8 and 10. (b) Includes Grades 4–8. (c) Includes Grades 9–12.
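To make the effect-size columns concrete, here is a minimal sketch of how a regression-adjusted effect size might be computed, using hypothetical data and column names (the report's actual matching and modeling procedure is more involved and is not reproduced here):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000

# Hypothetical analysis file: one row per student, with a z-scored
# reading outcome, a 30+ day attendance flag, and prior achievement.
df = pd.DataFrame({
    "reading_z": rng.normal(size=n),
    "treated_30": rng.integers(0, 2, size=n),
    "prior_reading_z": rng.normal(size=n),
})

# With a standardized (z-scored) outcome, the treatment coefficient
# can be read as an effect size in standard deviation units.
model = smf.ols("reading_z ~ treated_30 + prior_reading_z", data=df).fit()
print(model.params["treated_30"],    # effect size
      model.bse["treated_30"],       # standard error of the effect size
      model.pvalues["treated_30"])   # p value
```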
Key Findings – Evaluation Q3

Table 13. Impact of 21st CCLC on Number of Unexcused Absences and Number of Disciplinary Incidents Pooled Across Grades, 2011–2013

2011–12 Program Year
Outcome                          Treatment   Effect    SE      p        Weighted Mean Ratio (Treatment/Comparison)
Unexcused Absences (a)           30+ days    −0.312    0.009   <0.001   0.657
                                 60+ days    −0.638    0.017   <0.001   0.393
Disciplinary Incidents (b)       30+ days    NA
                                 60+ days    NA

2012–13 Program Year
Outcome                          Treatment   Effect    SE      p        Weighted Mean Ratio (Treatment/Comparison)
Unexcused Absences (a)           30+ days    −0.075    0.005   <0.001   0.856
                                 60+ days    −0.140    0.044   <0.001   0.666
Disciplinary Incidents (b)       30+ days    −0.042    0.027   0.063    0.934
                                 60+ days    −0.185    0.009   <0.001   0.840

Note. NA = not applicable; SE = standard error. (a) Includes Grades 6–12. (b) Includes Grades 3–12.
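The weighted mean ratio column divides the weighted average count for treated students by the weighted average for comparison students. A minimal sketch of that computation, assuming hypothetical absence counts and matching weights, could look like:

```python
import numpy as np

# Hypothetical absence counts, treatment flags, and matching weights.
absences = np.array([3, 0, 2, 5, 1, 4])
treated = np.array([1, 1, 1, 0, 0, 0])
weights = np.array([1.0, 0.8, 1.2, 1.0, 1.1, 0.9])

t_mean = np.average(absences[treated == 1], weights=weights[treated == 1])
c_mean = np.average(absences[treated == 0], weights=weights[treated == 0])

# Ratios below 1 mean treated students averaged fewer absences
# than comparison students (e.g., 0.657 in Table 13).
print(t_mean / c_mean)
```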
Evaluation Question 4: What does youth completion of the Youth Motivation, Engagement, and Beliefs Survey indicate both about youth experiences in programming and about youth functioning on social and emotional learning and noncognitive areas? (2013–14)
Key Findings – Evaluation Q4

– Pilot year of the Motivation, Engagement, and Beliefs Survey (in 38 centers)
– Total of 1,199 surveys completed; average of 32 surveys per center
– Grades 4–12
– Students who were likely to meet the definition of a regular attendee
Key Findings – Evaluation Q4

Major scales:
– Sense of Belonging and Engagement in the Program
– Program Impact on Student Social and Emotional Development
– Relationship with School-Related Outcomes

The majority of youth fell within the positive end of the response scale.
Key Findings – Evaluation Q4

Table 19. Summary of HLM Results by Survey Subscale and School Outcome, 2014

                                   Coefficient   Standard Error   p Value
Academic Identity
  Reading assessment                  0.269          0.083        0.005**
  Reading growth percentile           0.087          0.103        0.409
  Mathematics assessment              0.335          0.073        0.000***
  Mathematics growth percentile       0.209          0.100        0.050+
  Unexcused absences                 −0.280          0.133        0.042*
  Disciplinary incidents             −0.707          0.226        0.004**
  Intervention days                  −0.898          0.179        0.000***
Mindset
  Reading assessment                  0.144          0.089        0.123
  Reading growth percentile           0.058          0.106        0.590
  Mathematics assessment              0.176          0.082        0.047*
  Mathematics growth percentile       0.193          0.118
  Unexcused absences                 −0.313          0.138        0.030*
  Disciplinary incidents             −0.270          0.230        0.249
  Intervention days                  −0.553          0.282        0.058+

Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis.
***p < .001, **p < .01, *p < .05, +p < .10.
Key Findings – Evaluation Q4

Table 19 (continued). Summary of HLM Results by Survey Subscale and School Outcome, 2014

                                   Coefficient   Standard Error   p Value
Self-Management
  Reading assessment                  0.163          0.099        0.117
  Reading growth percentile           0.135          0.128        0.307
  Mathematics assessment              0.101                       0.329
  Mathematics growth percentile      −0.011          0.165        0.947
  Unexcused absences                 −0.239          0.181        0.195
  Disciplinary incidents             −0.770          0.324        0.023*
  Intervention days                  −0.880          0.378        0.026*
Interpersonal Skills
  Reading assessment                  0.179          0.083        0.047*
  Reading growth percentile           0.116          0.098        0.253
  Mathematics assessment              0.074          0.080        0.365
  Mathematics growth percentile       0.029          0.121        0.815
  Unexcused absences                 −0.221          0.109        0.050+
  Disciplinary incidents             −0.636          0.254        0.017*
  Intervention days                  −1.090          0.325        0.002**

Note. N = 867 youth in Grades 4–8 with complete survey data; actual n varies by analysis.
***p < .001, **p < .01, *p < .05, +p < .10.
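As an illustration of the kind of two-level model behind these estimates, here is a minimal sketch of a hierarchical linear model with students nested in centers, using hypothetical data and variable names (not the report's actual specification):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, n_centers = 867, 38

# Hypothetical student-level file: a z-scored mathematics outcome,
# a survey subscale score, and the 21st CCLC center attended.
df = pd.DataFrame({
    "math_z": rng.normal(size=n),
    "academic_identity": rng.normal(size=n),
    "center_id": rng.integers(0, n_centers, size=n),
})

# Two-level HLM: students (level 1) nested in centers (level 2),
# with a random intercept for each center.
hlm = smf.mixedlm("math_z ~ academic_identity",
                  data=df, groups=df["center_id"]).fit()
print(hlm.summary())
```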
Full Report Available At:

Washington 21st Century Community Learning Centers Program Evaluation: 2012–13 and 2013–14
http://www.k12.wa.us/21stCenturyLearning/pubdocs/Final2012-14StatewideEvaluationReport.pdf

Also available via the 21st CCLC Evaluation and Accountability webpage on the OSPI website.
Leading Indicator Report Updates
Population of Outcome Data

– State assessment data
– School-day absences & disciplinary incidents
– Timeline: end of February
Data Dashboard Updates
Data Dashboard History

– Purpose
– Specifications
– Impact of new federal reporting
Data Dashboard – Where We Are Now

– Purpose
– Specifications
– Intended use
Case Studies
Case Study Overview

– Purpose
– Who is involved?
– What activities will take place?
– Timeline: February to April 2016
APR Data Submission Updates
APR Data Submission Timeline

– Now through Feb. 11: Enter your Activity and Staffing information for Spring 2015
– Feb. 12 through Feb. 19: Make any adjustments to Summer 2014 data
– Feb. 20 through Feb. 26: Make any adjustments to Fall 2014 data
– Feb. 27 through March 4: Enter Participation and Outcome data for Spring 2015; make any adjustments to Activity and Staffing information for Spring 2015
Spring 2015 Participation & Outcomes

AIR will provide support for entering these data. Each grantee will receive center-level reports of attendance broken down by the required categories:
– Participation
  o Grade level
  o Regular attendee status
  o Race/ethnicity
  o Gender
  o Population specifics
– Outcomes
  o State assessment data
Expect to receive these reports by February 25.
Further Guidance

– What do I do if I have already entered Participation and/or Outcome data in the system for Spring 2015?
– What do I do if the numbers I have entered do not match the reports that AIR has given me?
Other Spring Data Collection Activities
Leading Indicator Surveys

– Used to populate the leading indicator reports
  o Data obtained from federal reporting systems, surveys, PQA data, and youth outcome data
– Purpose is to inform quality improvement efforts
– Leading indicator surveys collect data from two groups:
  o Site coordinators
  o Staff working directly with youth in the delivery of programming
Leading Indicator Survey Timeline

– February 12: A recorded webinar from AIR will be posted to help project directors navigate the online survey system
– Week of February 15: Emails containing a link to the survey management system, a username, and a password will be sent to project directors
– March 31: Completed surveys from site coordinators and afterschool program staff are due
Student Survey – Theory of Change
Student Survey – 2015 Preliminary Results

– Administered the Student Engagement, Motivation, & Beliefs Survey in all centers during spring 2015
– Intended for grades 4–12 and for students who were, or were likely to become, regular attendees by the end of the programming period
– Surveyed as many students as possible
– A total of 4,952 surveys were collected from 21st CCLC participants
Student Survey – 2015 Preliminary Results

The current version comprises the following scales (47 items):
– Academic identity
– Mindsets
– Self-management
– School belonging
– Interpersonal skills
– Retrospective program impact on academic behaviors
– Retrospective program impact on self-management
– Program belonging and engagement
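As a sketch of how a multi-item scale like these might be scored (hypothetical item names; the survey's actual scoring rules are not given in this deck), each student's item responses on a 1-4 scale could be averaged into a scale score:

```python
import pandas as pd

# Hypothetical responses for three mindsets items, coded 1-4
# (1 = Not at all true ... 4 = Completely true).
survey = pd.DataFrame({
    "mindsets_1": [4, 3, 2],
    "mindsets_2": [3, 3, 4],
    "mindsets_3": [4, 2, 3],
})

# One common convention: the scale score is the mean of its items.
items = ["mindsets_1", "mindsets_2", "mindsets_3"]
survey["mindsets_scale"] = survey[items].mean(axis=1)
print(survey)
```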
Student Survey – 2015 Preliminary Results

Scale                                       Not at all true   Somewhat true   Mostly true   Completely true
Youth Skills and Beliefs
  Academic Identity                               1%                9%            37%            53%
  Positive Mindsets                               1%               12%            46%            41%
  Self-Management                                 3%               18%            47%            32%
  School Belonging (dropped for 2014–15)          NA
  Interpersonal Skills                            1%               11%            50%            38%
Program Experiences and Impact
  Program Belonging and Engagement                3%               13%            30%            54%
  Academic Behaviors (retrospective)              4%               16%            37%            43%
  Self-Management (retrospective)                 8%               16%            39%            37%
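One plausible way a distribution like the table above might be tabulated from raw responses, assuming hypothetical item coding and column names:

```python
import pandas as pd

# Hypothetical raw responses, one row per student, coded 1-4.
responses = pd.DataFrame({
    "academic_identity": [4, 3, 4, 2, 3],
    "self_management": [3, 2, 4, 3, 1],
})

labels = {1: "Not at all true", 2: "Somewhat true",
          3: "Mostly true", 4: "Completely true"}

# Percentage of students giving each response, per scale.
for scale in responses.columns:
    pct = (responses[scale].map(labels)
           .value_counts(normalize=True)
           .reindex(list(labels.values()), fill_value=0) * 100)
    print(scale, pct.round(0).to_dict())
```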
Student Survey – 2016 Administration

– All centers serving youth in grades 4–12 will be asked to collect youth survey data in the spring of 2016
– The survey is intended for students who are regular attendees or are likely to become regular attendees by the end of the program year
– Survey as many students as possible; at least 25 if possible
– Preference is to collect data online
  o Let us know if this is going to be an issue by sending an email to WA21stcclc@air.org
Student Survey – 2016 Administration (continued)

– The statewide student identifier will be collected in relation to youth taking the survey, or a similar method will be employed to connect survey data with the state data warehouses (see the sketch below)
– Grantees will have access to a webinar on how to administer the survey on April 1
– Online collection of youth survey data will begin on April 4 and continue through May 29
– Results from the youth survey will be made available in the leading indicator reports, and grantees will have access to the raw de-identified data
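Since survey records are to be connected to the state data warehouses via the statewide student identifier, the linkage step might look like this minimal sketch (hypothetical column names and extracts):

```python
import pandas as pd

# Hypothetical extracts keyed on the statewide student identifier (SSID).
survey = pd.DataFrame({"ssid": ["A1", "A2", "A3"],
                       "program_belonging": [3.5, 2.8, 3.9]})
warehouse = pd.DataFrame({"ssid": ["A1", "A2", "A4"],
                          "unexcused_absences": [2, 5, 0]})

# Inner join keeps only students present in both files.
linked = survey.merge(warehouse, on="ssid", how="inner")
print(linked)
```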
Timeline Recap

Date                                  Item
February 5 (Friday), 12:00–1:30 pm    State Evaluation Update; Spring Data Collection Process
February 11 (Thursday)                Spring 2015 APR federal data due
February 12–19                        Update Summer 2014 APR data
February 12 (Friday)                  LI Support Webinar recording available
February 15                           Leading Indicator Surveys launched
February 20–26                        Update Fall 2014 APR data
February 27–March 4                   Update Spring 2015 APR data
February 28                           Outcomes populated in LI Reports
February–April                        Case studies
March 31                              AIR Leading Indicator staff and site coordinator surveys due
April 1                               Student Survey administration webinar
April 4 (Monday)                      Student Survey opens
May 29                                Student Survey closes

*These are just the items covered today. A full revised calendar of events will be provided by OSPI.
Samantha Sniegowski
312-690-7371
ssniegowski@air.org

10 S. Riverside Plaza, Suite 600
Chicago, IL 60606-2901
General Information: 312-288-7600
www.air.org