Slide 1
Session 5 – Where do we go from here? Data and Tools from The PEAR Institute (Partnerships in Education and Resilience)
Slide 2
Speakers
- Patricia J. Allen, Ph.D., Senior Manager of Research & Evaluation
- Tracy Callahan, Ph.D., STEM Training, Coaching & Network Manager
- Gil Noam, Ed.D., Founder and Director
- Kristin Lewis-Warner, Ed.M., Project Manager of STEM Research
Slide 3
Data, tools, and resources to promote STEM quality and learning in afterschool
Virtual STEM Summit, September 17, 2019
Slide 4
Presenters
- Gil Noam, Ed.D., Ph.D. (Habil), Founder and Director, The PEAR Institute
- Patty Allen, Ph.D., Senior Manager of Research
- Kristin Lewis-Warner, M.Ed., Manager of STEM Research
- Tracy Callahan, Ph.D., STEM Training, Coaching & Network Manager
Slide 5
The PEAR Institute
Slide 6
Data and Technical Assistance Center
Slide 7
How PEAR Supports State Networks & Overall Mott-STEM Next Initiatives
- Measures
- Trainings/TA
- Dashboarding
- Continuous Improvement
Slide 8
Dimensions of Success (DoS)
Slide 9
Example DoS Rubric (rated on a 4-point scale)
1. Evidence Absent – There is minimal evidence that the students are engaged with hands-on activities where they can explore STEM content.
2. Inconsistent Evidence – There is weak evidence that the students are engaged with hands-on activities where they can explore STEM content.
3. Reasonable Evidence – There is clear evidence that the students are engaged with hands-on activities where they can explore STEM content.
4. Compelling Evidence – There is consistent and meaningful evidence that students are engaged with hands-on activities where they can explore STEM content.
Slide 10
Common Instrument Suite for Students (CIS-S)
- STEM-related constructs: STEM Engagement, STEM Career Interest, STEM Career Knowledge, STEM Activity Participation, STEM Identity
- SEL/D constructs: Relationships with Peers, Relationships with Adults, Critical Thinking, Perseverance
- STEM/SEL Interface
Slide 11
Common Instrument Suite for Educators (CIS-E)
- Identity/Background
- Qualities of School/Program
- Experience Leading STEM
- PD Received and Desired
- Interest/Confidence in STEM
- Perceptions of Youth Learning
- Strengths/Challenges
Slide 12
PEAR Data Dashboard
Slide 13
What We Have Found
Slide 14
National Mott/STEM Next Studies: 2016 and 2018
- 2016: 11 state afterschool networks, 160 informal STEM programs, nearly 1,600 youth, 145+ educators
- 2018: 10 state afterschool networks, 110 informal STEM programs, nearly 1,700 youth, 200+ educators
Slide 15
Sources of Information
- Program Quality (planning and observations)
- Youth Self-Report (student survey)
- Educator Self-Report (educator survey)
- Network Ratings* (level of program support)
*New in 2018
Slide 16
Quality: “Double-Dip”
[Chart: mean evidence ratings; N = 103 observations (2018)]
Slide 17
Quality: Persistent Strengths and Challenges
[Chart: comparison of 2016 and 2018 DoS program quality data, mean evidence rating; 2016 in darker colors (N = 250), 2018 in lighter colors (N = 103)]
Slide 18
Quality: Linking with Youth Outcomes
[Chart: youth STEM attitudes by observed STEM program quality]
Higher quality = better outcomes (in 2016 and 2018)
Slide 19
Dosage: Linking with Youth Outcomes
[Chart: youth STEM attitudes by STEM activity dosage]
More participation = better outcomes (in 2016 and 2018)
Slide 20
Network Support: Influence on Youth Outcomes
[Chart: youth STEM attitudes by network support to program]
More support for programs = better youth outcomes
Slide 21
Educator: Self-Beliefs
Educators’ ratings of their comfort, interest, confidence, and capability leading STEM activities
Slide 22
Educator: Beliefs about Youth Growth
Educators’ ratings of their students’ growth in confidence and skills in five STEM-related areas
Slide 23
Where we’re going
Slide 24
2019 Initiative – At No Cost to Afterschool State Networks (System-Building and Select Planning States)
- Common Measures
- DoS Trainings & Tools
- Data Collection System
- Technical Assistance
- Data Reports
- Data Dashboards
- Virtual DoS*
- Quality Training Modules*
*Pilot opportunities
Slide 25
DoS Trainings and Tools
- Learning – DoS Framework Overview: an overview training on the DoS framework and the promotion of high-quality STEM programming
- Observing – Observation Certification/Recertification: a multi-step certification process for using the DoS quality observation tool for data collection; includes a feedback coaching guide
- Planning – Program Planning Tool: a training and tool that aligns with the DoS framework and guides practitioners in planning and implementing high-quality STEM activities
To learn more about DoS training opportunities and tools, please visit:
Slide 26
Leveraging Data
- Demonstrating the importance of afterschool for workforce development
- Engaging partners such as businesses and policymakers
Slide 27
Questions?
Slide 28
Recent Research
- Allen, P. J., Chang, R., Gorrall, B. K., Waggenspack, L., Fukuda, E., Little, T. D., & Noam, G. G. (in press). From quality to outcomes: A national study of afterschool STEM programming. International Journal of STEM Education.
- Allen, P. J., Lewis-Warner, K. M., & Noam, G. G. (accepted). Partnerships to transform STEM learning: A case study of Tulsa, Oklahoma’s STEM Learning Ecosystem. Afterschool Matters.
- Peabody, L., Browne, R. K., Triggs, B., Allen, P. J., & Noam, G. G. (2019). Planning for quality: A research-based approach to developing strong STEM programming. Connected Science Learning.
- Shah, A. M., Wylie, C., Gitomer, D., & Noam, G. G. (2018). Improving STEM program quality in out-of-school-time: Tool development and validation. Science Education, 102(2).
- Sneider, C., & Noam, G. G. (2019). The Common Instrument Suite: A means for assessing student attitudes in STEM classrooms and out-of-school environments. Connected Science Learning, (11). Retrieved from csl.nsta.org/2019/07/the-common-instrument-suite
Want a copy? Submit a request to PEAR at:
Slide 29
DoS Certification Steps
1. Training Video
2. Calibration Exercises
3. One-Hour Calibration Call (to establish rater reliability)
4. Field Exercises (2)
5. Certification & Ongoing Technical Assistance
Slide 30
DoS Framework
- Designing high-quality STEM activities: DoS Program Planning Tool
- Collecting data about program quality: DoS Observation Tool
- Using data to improve quality: DoS Feedback Report and Coaching Guide
Slide 31
Continuous Improvement Cycle
- PLAN (planning tool)
- DO (STEM programming)
- CHECK (observations, surveys)
- ACT (adjust methods/curriculum)
Slide 32
Map of Statewide Afterschool Networks
Slide 33
Timeline for 2019 (4 months)
- Recruitment
- Distribution of data collection materials
- Start of STEM activities
- DoS observation(s), student survey (CIS-S), educator survey (CIS-E), *network support rating
- End of STEM activities
- Data analysis and development of dynamic dashboards
Slide 34
Criteria for Program Participation
- Delivers STEM, aiming for 1-2 hrs/wk for 4+ weeks
- Serves 15+ youth in STEM in grades 4+
- Staff willing to be observed 1-2 times (30-90 minutes per observation)
- Staff willing to administer one online student survey (10 min) to students in grade 4 and above
- Staff who are running STEM willing to complete one online educator survey (10 min)
Slide 35
Quality: Persistent Strengths and Challenges
[Chart: % of STEM activities exhibiting reasonable-to-compelling evidence of quality by year; 2016 in darker colors (N = 250), 2018 in lighter colors (N = 103)]
Slide 36
Contact Information
- Patricia J. Allen –
- Tracy Callahan –
- Gil Noam –
- Kristin Lewis-Warner –