Boston Summer Learning Community Evaluation Results. Katie Tosh, Director of Measurement & Outreach, Boston After School & Beyond
Contents
- Enrollment & Attendance Findings
- ACT Framework Update
- Measurement Findings: Program Practices, Youth Social-Emotional Skill Growth
- Trends over Time
- Summary & Recommendations
Reference the report in your folders for more information on program and participant characteristics.
Focus on Best Practices and Quality Improvement
[Diagram: Program Practices and Youth Social-Emotional Skill Development, viewed from the youth, program staff, and observer perspectives, with Best Practices at the intersection.]
You're getting data from us about your programs from the observer and youth perspectives. You also bring your own perspective and experience to this. Where the data and your experience intersect is where we find best practices. BASB has worked to curate these best practices and post them on the Summer Insight Center for you to reference and learn from. The focus here is on program practice: identifying what's working and what might need improvement, so that we can help youth develop social-emotional skills to the best of our ability. As I go through the presentation, keep in mind that I'm giving you the broader community context, meaning data from all 127 programs in the BSLC. After today, you'll need to consider what this means for your program, how your program's data relates to the city-wide trends, and what your next steps for improvement will be.
Enrollment and Attendance
We suspect the no-show rate is actually higher than this, because of how data is entered into the system. Students who signed up for your program but never came, and who were never entered into YouthServices, aren't included in our no-show rate. What this doesn't tell us is whether programs are operating at capacity, which is an important consideration in the access question.
We learned in the RAND presentation this morning that we need to look beyond average rates of attendance and consider the dosage of programming students are receiving. Our programs averaged 6.5 weeks long, ranging from 1 to 10 weeks. What we're looking at here is the percent of students by the number of program days they attended. You'll see our drop-outs at one end of the spectrum, the 6% who attended five or fewer days, and students who attended up to 54 days at the other. The majority of students fall into the 11-to-30-day range, though. Importantly, 60% of students attended 20 or more days of programming, regardless of program duration; the graph above covers all BSLC students. Among students who were scheduled to attend at least 20 days, 70% ended up attending 20 or more days. So two factors are at play here: program duration and student retention. The RAND study on summer learning found significant advantages in math, ELA, and social-emotional skills for high attenders (≥ 20 days). We're not suggesting that shorter programs change their model to be longer. The point is to use the time you have with students wisely, so they have the best chance of benefiting from your program, and to employ strategies that make sure students show up and keep coming throughout the summer.
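For readers who want to reproduce this kind of dosage analysis on their own attendance exports, here is a minimal sketch in Python. The table layout, column names, and sample values are illustrative assumptions, not the actual YouthServices schema; only the 20-day high-attender threshold comes from the RAND findings above.

```python
import pandas as pd

# Hypothetical per-student attendance records; column names are illustrative,
# not the actual YouthServices export format.
records = pd.DataFrame({
    "student_id":     [1, 2, 3, 4, 5],
    "days_attended":  [4, 18, 25, 31, 54],
    "days_scheduled": [20, 25, 30, 35, 54],
})

# Share of all students reaching the 20-day high-attender threshold,
# regardless of how long their program ran.
high_attenders = (records["days_attended"] >= 20).mean()
print(f"Attended 20+ days (all students): {high_attenders:.0%}")

# Same benchmark, restricted to students scheduled for at least 20 days,
# which separates program duration from student retention.
eligible = records[records["days_scheduled"] >= 20]
retained = (eligible["days_attended"] >= 20).mean()
print(f"Attended 20+ days (scheduled 20+): {retained:.0%}")

# Dosage distribution: percent of students by attendance band.
bands = pd.cut(records["days_attended"],
               bins=[0, 5, 10, 20, 30, 40, 54],
               labels=["0-5", "6-10", "11-20", "21-30", "31-40", "41-54"])
print(bands.value_counts(normalize=True).sort_index())
```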
Avg: 84.8%; Range: 58% to 100%. Here we can see how the average rate of attendance varies across programs. Each bar represents a program, and attendance ranges from 58% on the low end to 100% on the high end (those are overnight camps). Our average rate of attendance sits at 84.8%. The majority of programs fall fairly close to the average, but there is a subset of programs whose attendance rate falls below 80%, so attendance and retention strategies could be of particular importance for those programs in the upcoming year. Note that the average rate of attendance for summer learning programs has held consistent since 2012, at around 80% with slight fluctuations in between, reaching 84.8% in 2016.
BSLC By-Date Average Rate of Attendance, 2015 and 2016
[Chart: 2015 and 2016 by-date average rates of attendance, with the 2015 and 2016 program dates on the x-axis; July 3, 2015 and July 4, 2016 are marked.]
Lastly on attendance, we can look at the by-date average rate of attendance. In last year's data, we saw that attendance was slow to pick up for programs that started at the end of June, and that it trailed off toward the end of summer. We also saw a drop in attendance on the Friday of July 4th weekend, which was July 3rd. Overlaying the 2016 data, we can see some differences. First, the line is higher overall, reflecting better attendance and student retention in the programs. The slow uptick in attendance at the start of the summer wasn't as prominent, though programs also started a few days later this year. The drop-off in attendance at the end of the program is similar to last year: although our overall attendance rates are higher, the slopes of the two lines are similar, so the trend persists even with higher attendance on those days. We also see a much bigger drop in attendance over the holiday weekend because July 4th fell on a Monday this year. For attendance, keep in mind: strong recruitment efforts so that students who sign up actually come and don't drop out, strategies to turn students into high attenders, and strategies to keep students attending throughout the entire summer.
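The by-date curve charted here is a simple grouped average. Here is a minimal sketch of that computation, assuming a hypothetical long-format daily attendance log (one row per student per scheduled program day); the column names and sample data are illustrative, not the real export layout.

```python
import pandas as pd

# Hypothetical daily attendance log: one row per student per scheduled day.
log = pd.DataFrame({
    "date":     pd.to_datetime(["2016-07-05", "2016-07-05",
                                "2016-07-06", "2016-07-06"]),
    "student":  [1, 2, 1, 2],
    "attended": [True, False, True, True],
})

# By-date average rate of attendance: of students scheduled on each date,
# what share actually attended?
by_date = log.groupby("date")["attended"].mean()
print(by_date)
```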
ACT Framework Update
Before we move on with the data findings, we're going to switch gears and discuss some updates to the ACT Framework, because we'll be reviewing the rest of the data through its lens. This tees up how we're approaching professional development this year, and we'll be working this year to create program-level reports so you can view your own data the same way.
Potential Revision to the ACT Framework (draft)
ACT Skills Framework (draft):
ACHIEVING: Critical Thinking, Creativity, Perseverance
CONNECTING: Social Awareness & Relationships, Communication, Teamwork
THRIVING: Growth Mindset, Efficacy, Self-Regulation
Goals of the revision:
- Update to reflect recent research and youth development work nationally
- Align terminology of the ACT Framework and measurement tools
- Establish shared definitions of skills
You're all familiar with the ACT Framework, which was created in 2010 to unite the city's out-of-school time programs around a common youth agenda. It outlines the skills that research and practice tell us youth need in order to be successful in school, careers, and life. We've been working over the past year to update the ACT Framework, and this work will continue for about another year. You wouldn't necessarily think a framework would need revision after only six years, but since it was created there has been a great deal of national research and work on youth development, and on social-emotional learning in particular. So we want to update the ACT Framework to reflect that research and remove redundant outcomes, align the terminology of the framework and measurement tools so that the links between the framework and the data are clearer, and establish shared definitions of skills to guide our collective work. We've been doing this in close collaboration with our research partners, NIOST and PEAR, with the RAND Corporation, and with valuable feedback from a group of summer and school-year partners who have graciously given their time to serve on our Partner Advisory Group on Measurement. Here is the latest draft of a potential revised framework. We've piloted new measures in collaboration with NIOST and PEAR to capture more of these outcomes and their associated program practices, but gaps remain in our ability to measure creativity, growth mindset, and self-efficacy, and this list and arrangement of skills is not final or set in stone. It is, however, how we'll be framing the results of this summer's data collection work.
Associated Outcomes from Research Review
Outcomes Associated with ACT Skills: Achievement (academics, life) and Behavioral (personal, social)
ACHIEVING
- Critical Thinking: College success and workforce readiness; employee selection by management
- Creativity: Improved academic achievement
- Perseverance: Higher education attainment; higher undergraduate GPAs; fewer career changes; work habits and task persistence; math achievement
CONNECTING
- Social Awareness & Relationships: Improved academic test scores; better job performance; improved grades; healthier relationships; improved social skills; improved self-efficacy and problem-solving
- Communication: Healthier social relationships; improved self-efficacy
- Teamwork: Improved academic achievement; improved self-efficacy
THRIVING
- Growth Mindset: Improved performance and academic achievement; lower stress levels
- Efficacy: Improved academic achievement; academic persistence; college GPA and persistence; improved work performance; improved perseverance
- Self-Regulation: Positive academic outcomes (grade promotion, test scores, course grades); college GPA and retention; positive social outcomes (better impulse control, more stable relationships, increased empathy and perspective-taking, more constructive responses to anger)
Briefly, I wanted to highlight the research base on these skills, which provides evidence of their importance for achievement in school, college, and the workplace, as well as for personal and social outcomes. You'll notice that the Achieve domain is supported primarily by evidence of gains in school, college, and careers, while the evidence base for the Connect and Thrive domains includes both achievement and behavioral outcomes.
Measurement Results
So, with that in mind, let's proceed to the results on program practice and youth skill development.
What Questions Are We Answering?
Program Practices: To what extent did BSLC programs deliver practices that build ACT skills "most of the time"?
Youth Social-Emotional Skill Growth: What percent of summer program participants demonstrated skills "usually/often" or "always" at the start and end of the summer program? What percent of summer program participants reported improvements in their skills over the summer?
As we look at our data from summer 2016, it's important to keep in mind the questions guiding this work and its underlying philosophy; it's easy to lose sight of that when looking at data. Youth are influenced by experiences in a variety of settings: home, school, out-of-school time programs, informal experiences, and the media. A summer learning program is just one of the many organized activities in which youth could be engaged. We have a short window of time with students in the summer, and it is small compared to all the other time they spend learning in various settings throughout the rest of the year. However, we just learned from RAND that summer programs meeting a specific set of criteria can produce significant and meaningful advantages in both academics and social-emotional skills. We learned that it is critical not only that students show up to the program in order to benefit (we've just reviewed how our programs are doing on attendance), but also that staff be intentional with their activities while youth are present at the program. This was true for ELA instruction in the RAND study. We hypothesize that intentionality in practice also matters a great deal for youth social-emotional skill development, and that is the guiding theme of the rest of this presentation. Having such a large group of summer programs all using the same tools to capture program practice and youth skill outcomes lets us dig deeper into this hypothesis, and RAND will be examining it correlationally with our data over the next year. For today, we're looking at: to what extent did BSLC programs deliver practices that build ACT skills "most of the time"? (that's the benchmark we'll look for); what percent of participants demonstrated skills "usually" or "always" at the start and end of the summer, as rated by program staff?; and what percent of participants self-reported improvements in their skills over the summer? Here's the cheat sheet for the presentation: anything in solid color is from the adult perspective, and anything in an outlined box is from the youth perspective. The colors correspond to the Achieve, Connect, and Thrive domains. In your folders there is a handout that looks like this; it has the definitions of each skill and examples of program practices that intentionally build that skill, and the reverse side has the results so you can follow along.
Presentation Legend:
= Adult rating (observer for program practice data; teacher/staff for youth skill data)
= Youth rating (of either program practice or self-report on skills)
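To make the "most of the time" benchmark concrete, here is one way such a cutoff check could work: average the rating items per program, then compare the average against the cutoff. This is a sketch under assumed scoring rules (a 1-4 scale where 3 corresponds to "most of the time"); the actual measurement tools may score differently.

```python
from statistics import mean

# Hypothetical ratings per program on a 1-4 scale
# (1 = never, 2 = sometimes, 3 = most of the time, 4 = almost always).
# Both the scale and the cutoff are assumptions for illustration.
program_scores = {
    "Program A": [3, 4, 3, 3],
    "Program B": [2, 3, 2, 2],
    "Program C": [4, 4, 3, 4],
}

BENCHMARK = 3.0  # "most of the time"

# Per-program benchmark status, then the share of programs meeting it.
meets = {name: mean(items) >= BENCHMARK for name, items in program_scores.items()}
share = sum(meets.values()) / len(meets)

print(meets)
print(f"{share:.0%} of programs met the benchmark")
```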
Achieve: Program Practices
On average, observers and youth rate programs as meeting the benchmark in Achieve practices. [Chart scale: Never, Sometimes, Most of the Time, Almost Always.]
% of programs meeting the benchmark in practices that build (observer perspective): Perseverance 75%, Critical Thinking 62%.
Counts by perspective: perseverance, youth 67/94 (72%), adult 89/119 (75%), both perspectives 54/90 (60%); critical thinking, youth 9/15 (60%), adult 74/119 (62%), both perspectives 8/14 (57%). Perseverance contains items related to whether or not youth felt challenged, which seems like an acceptable proxy for opportunities to persevere.
Starting with the Achieve domain and program practice: what we see here is the frequency with which programs delivered practices that develop perseverance and critical thinking, and the benchmark we want to hit is "most of the time." On average, observers and youth rate programs as meeting that benchmark. Again, the observer is the solid bar and youth is the outlined bar. Going a layer deeper, we can see how many programs met the benchmark; we know we're hitting it on average as a cohort, but that doesn't tell us about the distribution of scores across programs. So: 75% of programs met the benchmark in perseverance and 62% in critical thinking. This is good, but 25%-38% of programs aren't there yet. These figures are also from the observer perspective, and youth rate fewer programs as hitting the benchmark.
= Observer Rating, BSLC 2016 (n=119 programs)
= Youth Rating, BSLC 2016 (n=94 programs; *pilot scale, n=15 programs)
Achieve: Youth Skill Growth
% of youth reporting improvements at the end of the program: Perseverance 82%, Critical Thinking 88% (n=1,279 students, 25 programs).
Next, we can look at youth skill growth over the summer. We'll start with the percent of youth meeting the benchmark of skill demonstration at the start and end of the summer, as rated by program staff; the benchmark for youth is to demonstrate these skills "usually" or "always." The first thing we notice is that staff noted improvements from pre to post. But we also see that by the end of the program only about 40% of students were meeting the benchmark, so we have some work to do. We know from years of data from summer and school-year programs that the Achieve domain is the most challenging for both students and programs. If we look at the percent of youth self-reporting improvements, though, 82% reported improvements in perseverance and 88% in critical thinking. So although the staff ratings show that a majority of students aren't where we want them to be in these skills, the vast majority of students nevertheless felt they improved, which is encouraging.
= Pre Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
= Post Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
Connect: Program Practices
% of programs meeting the benchmark in practices that build (observer perspective): Relationships with Adults 91%, Relationships with Peers 87%, Communication 66%, Teamwork 77%.
On average, observers and youth rate programs as meeting the benchmark in Connect practices. [Chart scale: Never, Sometimes, Most of the Time, Almost Always.]
Counts by perspective: relationships with adults, youth 91/94 (97%), adult 108/119 (91%), both perspectives 83/90 (92%); relationships with peers, youth 68/94 (72%), adult 104/119 (87%), both perspectives 61/90 (68%); teamwork, youth 11/15 (73%), adult 92/119 (77%), both perspectives 9/14 (64%); communication, adult 78/119 (66%).
Similarly, we'll look at program practices in the Connect domain. On average, programs are rated as hitting the benchmark in peer relationships, adult relationships, communication, and teamwork. You'll notice we only have observer data for communication. You'll also notice youth rated programs lower in peer relationships. When reviewing your own data, it's always important to look for differences and similarities in how adults and youth rate your program and to consider why the program environment might be perceived that way from each perspective. Youth rated 97% of programs as meeting the benchmark in relationships with adults. You're all doing strong work in this domain, with some room for improvement in practices that promote communication skills.
= Observer Rating, BSLC 2016 (n=119 programs)
= Youth Rating, BSLC 2016 (n=94 programs; *pilot scale, n=15 programs)
Connect: Youth Skill Growth
% of youth reporting improvements at the end of the program: Relationships with Peers 83%, Relationships with Adults 78%, Empathy 94% (n=1,279 students, 25 programs).
The first thing we notice, again, is improvement from pre to post. We can also see that many more students are meeting the benchmark in these skills. Youth reported improvement across the board, with 94% making gains in empathy, a skill critical for positive relationships, good communication, and effective teamwork.
= Pre Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
= Post Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
Thrive: Program Practices
On average, youth rate programs lower than observers in Thrive practices. [Chart scale: Never, Sometimes, Most of the Time, Almost Always.]
% of programs meeting the benchmark in practices that build Self-Regulation (observer perspective): 91%. Counts by perspective: youth 5/15 (33%), adult 108/119 (91%), both perspectives 3/14 (21%).
We see that on average youth rate programs lower than observers do in practices that promote self-regulation. The observer is looking at how program staff respond to youth when a student is experiencing a problem or perhaps exhibiting poor self-regulation, and how staff help youth resolve conflicts. It does not reflect practices that teach youth about self-regulation or emotional control before there is an issue, which is exactly what the youth were asked about. The main difference is intentionality in teaching self-regulation as a concept versus reactively responding to instances of poor self-regulation. If you look at your handout, you can see more clearly the differences in practice that observers and youth are rating. That said, 91% of programs meet the benchmark from the observer perspective, so we're doing well in how staff respond to youth, resolve conflicts, and redirect emotional outbursts. But we see from the youth perspective that there needs to be a greater degree of intentionality in teaching youth about the concept itself: how to handle stress, control emotions, and deal with conflict before, during, and after situations arise.
= Observer Rating, BSLC 2016 (n=119 programs)
= Youth Rating, BSLC 2016 (n=94 programs; *pilot scale, n=15 programs)
Thrive: Youth Skill Growth
% of youth reporting improvements at the end of the program in Self-Regulation: 54% (n=1,279 students, 25 programs).
These results are not as high as the relationship skills, but also not as low as the Achieve skills. There is growth from pre to post, and just over half of students meet the benchmark at the end of the program. However, only 54% of students felt they improved in this skill. Self-regulation is hard, even for adults, but we can be more intentional about how we model and teach the concept so students can improve.
= Pre Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
= Post Staff/Teacher Rating of Youth Skills (n=1,478 students, 41 programs)
Cross-cutting Practices help build all ACT Skills
[Chart scale: Never, Sometimes, Most of the Time, Almost Always.]
Counts, both perspectives: Engagement 66/90 (73%); Leadership & Choice 4/90 (4%); Program Organization and Structure 94%.
These practices all create an environment conducive to learning all the ACT skills. We know leadership and choice is hard, but it's something we'll continue to focus on because we know how important it is for positive youth development.
= Observer Rating, BSLC (n=119 programs)
= Youth Rating, BSLC 2016 (n=94 programs; *pilot scale, n=15 programs)
Program Practice, Observer Perspective: BSLC 2013-2016
[Chart: average observer ratings for Program Organization and Structure, Supportive Environment, and Engagement in Activities and Learning; scale: Never, Sometimes, Most of the Time, Almost Always.]
On average, the BSLC met or exceeded the benchmark in all areas of program quality, with scores very similar to the previous year.
= 2013, 2014, 2015
= 2016
Program Practice, Youth Perspective: BSLC 2014-2016
[Chart: average youth ratings for Supportive Environment and Engagement in Activities and Learning.]
The lower rating on helping youth academically could be because not all programs focus on academics; we're not sure we should keep measuring this item for all programs.
Youth rated programs very similarly on aspects of program quality in summer 2016 as compared to the previous two summers. Program strengths were maintained from one year to the next, and room for improvement still exists in identifying and incorporating best practices in areas of challenge.
= 2014, 2015
= 2016
How did returning programs fare?
There was no difference for returning sites between 2015 and 2016, in youth or observer ratings, in 18 of 23 areas of program quality. Observers rated returning sites lower* in 2016 than in 2015 in: Organization of Activity; Space Adequacy; Social-Emotional Environment; Staff Build Relationships and Support Individual Youth; and Youth Relationships with Peers. However, returning sites exceeded the benchmark in all 23 domains of program quality. (*paired-samples t-test, p<0.05)
We're looking at an overall very high-performing group of programs, with year-to-year fluctuations that could be due to changes in program structure or staffing, or to having a different group of students each year, which influences the feel of the program. Furthermore, programs all received a written summary of their observation visit within a few days of the observation taking place, so the data does not tell us whether programs made improvements or alterations to their practices after receiving the summary. If you've been doing this work with us for several years, you'll have noticed that your scores vary year to year and that it is unrealistic to expect continual upward trends. The important thing is that we are collectively performing well, and our collective goal should be to perform at or above the benchmark. For each individual program, the focus should be on maintaining your strengths and working to improve areas of challenge that align with your program's goals.
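For reference, the footnoted comparison is a paired-samples t-test, which compares each returning site's 2016 score against its own 2015 score on a given quality scale. A minimal sketch with made-up scores (the real analysis uses the actual site ratings for each of the 23 quality areas):

```python
from scipy import stats

# Hypothetical 2015 and 2016 observer ratings for the same returning sites
# on one program-quality scale; values are illustrative only.
scores_2015 = [3.6, 3.8, 3.5, 3.9, 3.7, 3.4]
scores_2016 = [3.4, 3.6, 3.3, 3.8, 3.5, 3.2]

# Paired-samples t-test: each site is compared against itself across years.
t_stat, p_value = stats.ttest_rel(scores_2015, scores_2016)

# A p-value below 0.05 would flag a statistically significant year-over-year
# difference for this scale, as in the slide's footnote.
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```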
Summary & Recommendations
Summary: Reach and Dosage
Reach: 127 programs serving 10,084 youth
Average program duration: 6.5 weeks (range 1-10 weeks)
Average rate of attendance: 84.8% (range 58%-100%)
Summary: Program Practice
On average, observers and youth rated programs at or above the benchmark in 9 out of 10 areas of practice (ACT practices and cross-cutting practices).
66%-91% of programs hit the benchmark of quality practice, depending on which skill the practices are meant to build.
Room for improvement remains in:
- Achieve and Thrive practices (critical thinking, perseverance, self-regulation)
- Opportunities for Leadership & Choice
Over time, the growing BSLC has maintained strong positive trends in above-benchmark performance.
Summary: Youth Skill Growth
On average, youth participants achieved statistically significant growth in all ACT skills, as rated by program staff and by youth self-report.
Areas of lowest skill performance or growth:
- Student self-report: self-regulation
- Program staff rating: perseverance, critical thinking, communication, and self-regulation
Areas of highest skill performance or growth:
- Student self-report: empathy
- Program staff rating: peer relationships, adult relationships
Areas for improvement in program practice mirror the lower areas of skill growth, suggesting a focus for program development. Programs and youth continue to show the strongest results in the Connect domain, building on several years of trend data showing programs are strong in building a supportive social environment.
Recommendations & Next Steps
Review your PRISM Report with staff:
- What are your program's goals for youth? How does your program practice support those goals? Staff hiring and training?
- What does the data reveal about your strengths? Your challenges?
- How does your staff's experience align with or differ from your data?
- What are 1-2 areas to focus on maintaining? Improving?
Attend upcoming professional development sessions.
Read up on best practices: Summer Insight Center, Measurement Tools.
Potential areas of focus for BSLC partners:
- Strategies to maintain high program attendance or improve low attendance
- Maintain and/or strengthen program practices that relate to your program's content and goals
- Incorporate best practices for academic instructional quality if your program has an academic focus
- Create opportunities to give youth leadership, autonomy, and choice
(Refer programs to the handout of next steps and the Summer Insight Center; contact me with questions.)
2016 SUMMER LEARNING SEMINAR
From Research to Practice: Maximizing the Potential of Summer Learning
Workshop 1 - National Study RCT Findings: Implications for Program Leaders (first floor)
Workshop 2 - Unpacking PRISM Data: Quality of Program Practice (downstairs)
Workshop 3 - Supporting Social-Emotional Development (downstairs)