Do After-school Programs Affect Important Youth Outcomes? If So, Do We Know Why?
Robert C. Granger, Ed.D.
Remarks prepared for "Making a Difference in After-school - Measuring and Improving Program Quality"
Sacramento, CA / March 17, 2009

Two questions
- Do after-school programs improve academic performance?
- Do we know why some programs make a difference while others do not?

Two answers
- Yes*
- Starting to…
*Yes, but…

The review
Background: Policymakers and practitioners want to know if after-school programs affect academic achievement.
Goal: Review strong evidence regarding the effects of after-school programs and examine the practices of effective programs.
Method: Summarize the results from three rigorous reviews of over 90 evaluations of after-school programs.

Granger, R. C. (2008, April). After-school programs and academics: Implications for policy, practice, and research. Social Policy Report, Vol. XXII, No. 2. Ann Arbor, MI: Society for Research in Child Development.
Society for Research in Child Development. (2008, April). Improving after-school programs in a climate of accountability. Social Policy Report Brief, Vol. XXII, No. 2. Ann Arbor, MI.

The findings
- On average, after-school programs improve important academic outcomes like test scores and grades.
- A subset of the evaluated programs that achieved outstanding results accounts for the overall positive picture.
- The most effective programs had explicit goals, aligned their activities with those goals, and got youth actively involved in their own learning.

The two most important questions facing policymakers and practitioners in education and youth programs:
- What do effective teachers, youth workers, or mentors do differently from their less effective colleagues?
- Can you make teachers, youth workers, or mentors more effective?

Sources of useful information about both questions
- Practitioner consensus on best practices (Forum for Youth Investment, 2003)
- In-depth studies of program practices (Halpern, Larson, Hirsch)
- Practitioner efforts to improve program effectiveness (many)
- Measures of program quality (Forum for Youth Investment, 2009)

Measuring what matters
- Importance of the point-of-service.
- Good measures have clear, unambiguous items.
- The best measures also teach.

Making a Difference in After School: Measuring and Improving After School Quality
Nicole Yohalem, Forum for Youth Investment
Sacramento, CA / March 17, 2009

Quality assessment tools
- Assessing Afterschool Program Practices Tool (APT): National Institute on Out-of-School Time and the MA Department of Education
- CORAL Observation Tool (CORAL): Public/Private Ventures
- Out-of-School Time Observation Instrument (OST): Policy Studies Associates
- Program Observation Tool (POT): National Afterschool Association
- Program Quality Observation (PQO): Deborah Vandell and Kim Pierce
- Promising Practices Rating Scale (PPRS): WI Center for Education Research and Policy Studies Associates, Inc.
- Quality Assurance System (QAS): Foundations Inc.
- Program Quality Self-Assessment Tool (QSA): New York State Afterschool Network
- School-Age Care Environment Rating Scale (SACERS): Frank Porter Graham Child Development Center, UNC
- Youth Program Quality Assessment (YPQA): High/Scope Educational Research Foundation
Source: Measuring Youth Program Quality: A Guide to Quality Assessment Tools (updated January 2009)

Quality assessment tools
There is a lot of similarity in how quality practice is defined. All tools assess:
- Relationships
- Environment
- Engagement
- Social/Behavioral Norms
- Skill Building Opportunities
- Routine/Structure
Note: The CA self-assessment tool includes items that address these areas.

Measuring what matters
- Importance of the point-of-service.
- Good measures have clear, unambiguous items.
- The best measures also teach.

Emphasis on point-of-service
- CA tool: 16 of 77 items focus on POS
- SACERS & NAA: fewer than half of items focus on POS
- APT & YPQA: more than half of items focus on POS

Clear and unambiguous?
Examples from the CA tool:
High inference: "Ensures staff & volunteers have respectful interactions with participants & families."
Low inference: "Regularly provides families with program information in multiple languages and literacy levels."

Measures that teach?
Examples from the CA tool:
Diagnostic: "Provides opportunities & support for participants to take on leadership roles."
Diagnostic and prescriptive: "Regularly provides collaborative partners with program information, such as program progress and evaluation reports and information about program events, in a variety of formats and in multiple languages if appropriate."

Quality improvement
Key components of quality improvement systems:
- Quality standards that include what should happen at the point of service
- Ongoing assessment of how well services compare to the standards
- Targeted plans for how to improve
- Training and coaching that fit the improvement plans

Emerging examples and lessons
- Afterschool Program Assessment System (APAS): National Institute on Out-of-School Time
- Youth Program Quality Intervention (YPQI): Weikart Center for Youth Program Quality

APAS pilot
- Conducted by NIOST, Wellesley College
- October 2006 to July 2008
- Atlanta, Boston, Charlotte, Middlesex County, NJ
- 65 individuals, 28 programs, 3 intermediaries
- Well-established K-8 after-school programs
- Low stakes
- Emphasis on continuous improvement and flexibility

Core APAS tools and supports
Tools:
- Survey of Afterschool Youth Outcomes Tool (SAYO)
- Assessing Afterschool Program Practices Tool (APT)
- Web-based data management system
Supports:
- Training (2 days up front, ongoing online training)
- 1-day site visit
- Local coach

Findings from the APAS pilot
- APAS helped programs identify areas for improvement and staff development; most sites said they made program changes as a result
- Coaches are key to implementation and useful to sites
- Engagement across staff levels is important
- Engaging funders is important (even with low stakes)
Based on follow-up phone interviews with sites and coaches.
For more on APAS:

Youth Program Quality Intervention
Systemic quality improvement systems (QIS) anchored by the YPQA are being developed in:
- Statewide strategies: MI, ME, RI, KY, NM, AR, MN, IA, WA, NY
- Cities and counties: Austin, Chicago, Rochester, Detroit, Grand Rapids, Palm Beach County, Baltimore, Nashville, St. Louis, Louisville, Georgetown Divide/Sacramento, Columbus IN, Indianapolis IN, Tulsa OK
(Slide shows a U.S. map of the participating states and localities.)

YPQI focus: POS quality in context
- POS (Point-of-Service): engagement, interaction, support, safety (assessed with Youth PQA Form A)
- PLC (Professional Learning Community)
- SAE (System Accountability Environment): org policies/practices, management values, performance feedback, continuity/staffing, standards and metrics, staff development (assessed with Youth PQA Form B)

The Providence AfterSchool Alliance (PASA) Quality Improvement Strategy
Quality Standards
- What exists
- What we know
- What works
- Based on national examples
Quality Indicators
- Measure of standards
- Promising practices
- Provider/community input
Self-Assessment Tool
- Partnership with High/Scope
- Rhode Island Program Quality Assessment Tool (RIPQA)
- Adopted by 21st CCLC initiative and in use statewide
Tracking Tool
- Youthservices.net
- Participation & retention data
- Citywide data management system
Capacity Building/Professional Development
- Staffing & Prof. Dev. Survey
- Workshop series tied to RIPQA
- BEST Youth Worker Training
- Standards workshops aligning academics with enrichment
Improvement Efforts
- Learning communities
- Site visits
- Model curricula
- School alignment

Incentivizing participation
PASA "endorsed" programs must:
- Maintain certain enrollment and retention benchmarks
- Have a written curriculum
- Undergo annual self-assessment using the RIPQA
In exchange for:
- A streamlined grant application process
- A small administrative funding supplement

Requiring participation
Excerpt from the Rhode Island 21st CCLC RFP:
"Applicants must participate in the 21st CCLC Rhode Island Youth Program Quality Assessment Process (RIPQA), which includes the use of a self-assessment tool, outside observations, development and implementation of action plans to strengthen the program over time, working with a Technical Advisor, including designation of staff to coordinate the process."

Rhode Island 21st CCLC pilot
Assessment & planning:
1. Kick-off, 2-day training on the RIPQA
2. Quality Advisor (QA) meets with programs individually to orient them
3. Observation visits (3-8 programs per site)
4. QA develops progress report; teams meet with instructors to share reports and develop action plans
5. ED and other key staff complete Form B individually
6. QA summarizes, meets with team to discuss scores and improvement strategies
7. QA generates overall report on strengths and improvement steps
Training & technical assistance:
- Series of 2-hour workshops focused on RIPQA content
- Additional training on behavior management
- AYD training (32 hours) offered twice annually
- 4-session supervisor training
- 5 hours of on-site coaching per site from the QA

RI 21st CCLC pilot: lessons
Lessons learned:
- Programs liked the tool and found the process worthwhile
- The initial data collection model was time consuming
- Timing is important to ensure changes get implemented
- Needs across sites are very similar
- Strong desire for on-site TA/coaching
Adjustments for Cohort 2:
- Smaller observation teams, fewer observations per site
- One program report rather than individualized reports
- Additional TA/training
- Start with Form B, then observations (Form A)
For more information:

Palm Beach County QIS pilot
- Centerpiece of the Prime Time Initiative
- 38 providers in the pilot; now working with 90
- January 2006 to fall 2007
- Based on the PBC-PQA
- Financial incentives for programs

Findings from the Palm Beach pilot
- Most programs completed all phases of the QIS
- Quality improved
- Quality improvement is a long-term process
- On-site TA is a very important component
- Clarity of purpose is critical
(Spielberger & Lockaby)

Coaching
Characteristics:
- Willing to listen
- Experienced
- Accessible
- Flexible
- Responsive
- Creative
- Resourceful
Roles/functions:
- Keep programs engaged
- Deliver training
- Answer questions on tools and process
- Participate in observations
- Generate reports
- Facilitate improvement planning
- Provide on-site feedback and modeling
Key considerations:
- Program- vs. system-level coaching; role of intermediaries
- Dosage

Purposes and methods
Lower stakes:
- Methods: Site-based self-assessment teams
- Purposes: Rough data to get staff thinking about and discussing program quality in the context of best practice
- Resources: Less time, lower cost
- Audience: Impact internal audiences
Hybrid approaches:
- Methods: Trained, reliable assessors recruit site-based self-assessment teams to co-produce quality scores
- Purposes: Rough & precise data co-mingled; supports planning & staff development but not appropriate for evaluation or accountability
- Resources: Most expensive, potentially highest learning impact
- Audience: Impact internal & external audiences
Higher stakes:
- Methods: Trained, reliable assessors not connected to the program
- Purposes: Precise data for internal & external audiences for evaluation, monitoring, accountability, improvement, reporting
- Resources: More time, higher cost
(Smith, Devaney, Akiva & Sugar, forthcoming in New Directions)

Lessons for California
1. Have well-defined purposes for the system.
2. Focus on the point of service.
3. Anchor quality improvement efforts with data about the POS.
4. Create incentives for continuous improvement.
5. Build in on-site, ongoing technical assistance/coaching.
6. Be intentional about pilot participation.
7. Build learning communities.
8. Recognize that management is a key lever.
9. Worry about the quality of your measures and data.

For more information:
Nicole Yohalem, Program Director
Forum for Youth Investment