Copyright © 2011 American Institutes for Research. All rights reserved.
Oregon 21st Century Community Learning Centers Program Evaluation
Neil Naftzger & Deborah Moroney
April 16, 2012

2 Agenda
Part 1: Evaluation and Leading Indicators, 1:30-2:15
- Welcome and Introductions
- Sharing Roles
- Purpose of the Meeting
- Quality Framework Overview
  - Discussion of the quality framework in relation to the evaluation, defining what quality programming is, and the desired outcomes of the 21st Century program
- Leading Indicators (LI)

3 Key Questions
- How do LI relate to program quality?
- In our various roles supporting the academic and social success of children and their families, how can we best work together and use the information we have here?
- What should be our next steps in thinking about this work?

4 The Evaluation Team
- American Institutes for Research
- Gibson Consulting Group, Inc.
- Oregon Department of Education and the Leading Indicator Advisory Group (LIAG)

5 Quality Framework

6 Children and youth bring their own contribution to the afterschool setting and to their own success in that setting.

7 Youth Characteristics
1. Academic performance and skills
2. Demographics
3. Communication, relationship and collaboration, critical thinking and decision making, and initiative and self-direction
4. Quality of the learning experience at the center's feeder school
5. Access to other key external developmental assets (e.g., family support and involvement, caring neighborhood)

8 Quality Framework
The resources and characteristics of the local and school community support the development of program goals and program design and allow for meaningful partnerships and program guidance.

9 Community Context
1. Locale (urban, suburban, rural)
2. School-based or center-based
3. Financial resources
4. Administrator support/role of influential stakeholders and decision makers
5. School status relative to AYP/status relative to other accountability measures
6. Program maturity
7. Grade level of youth
8. Community stability/safety

10 Quality Framework Program quality is based both on observable dimensions of quality and processes that are foundational to program quality, including Organizational Processes, Quality at the Point-of-Service, and Opportunities for Engagement.

11 Program Quality
Organizational Processes
1. Definition of service population/enrollment
2. Recruitment approaches
3. Staffing (hiring, orientation, development, and evaluation)
4. Access to and use of youth data
5. Establishing linkages to the school day
6. Selection of key partners and partner engagement
7. Program improvement/evaluation processes
8. Approaches to parent/family engagement
9. Provision of developmentally appropriate opportunities for youth choice, voice, ownership, and program leadership
10. Alignment of youth needs, program objectives, and programming approach/theory of change (in relation to both academic and non-academic outcomes)
11. Selection and utilization of quality frameworks

Instructional/Point-of-Service Quality
1. Safe, supportive, interactive, and engaging settings
2. Activities are sequenced, active, and focused
3. Intentional activity and session design/embedding content
4. Evidence of emotional support, activity organization, and instructional support

12 Quality Framework Children and youth are more likely to experience benefits from afterschool program participation if they attend consistently, over time, and in a variety of types of activities.

13 Program Participation
1. Duration of participation
2. Intensity of participation
3. Breadth of participation
4. Degree of interaction with a consistent set of staff

14 Quality Framework
Afterschool program participants are most likely to reap positive youth outcomes if we take into account:
1. what participants bring to the program (Youth Characteristics);
2. how well the program reflects and involves the resources in the community (Community Context);
3. the extent to which they participate in the program (Participation); and
4. the quality of the program (Program Quality).

15 Positive Youth Outcomes
1. Improved communication, relationship and collaboration, critical thinking and decision making, and initiative and self-direction skills
2. Enhanced bonding to school
3. Decrease in problematic, at-risk behaviors/disciplinary incidents
4. Improved school day attendance
5. Improved reading and mathematics achievement
6. Improved grade promotion
7. Improved college and career readiness/ACT-SAT scores

16 Goals of the Leading Indicator System
- Provide information about how well an individual center and the state as a whole are doing in implementing programming that is likely to achieve the goals and objectives specified for the program
- Help establish a standard of quality that grantees should be striving toward in the implementation of their program
- Influence grantee behavior by detailing service delivery expectations and grantees' performance relative to these expectations
- Help inform state staff about what steps need to be taken on the training, technical assistance, and policy development front to support grantees in the achievement of program improvement goals

17 Leading Indicators: Collaboration & Partnership
- LI: Partners associated with the center are actively involved in planning, decision making, evaluating, and supporting the operations of the afterschool program.
- LI: Staff from partner organizations are meaningfully involved in the provision of activities at the center.
- LI: Staff at the center will be engaged in intentional efforts to collaborate and communicate frequently about ways to improve program quality.
- LI: Steps are taken by the center to establish linkages to the school day and use data on student academic achievement to inform programming.

18 Leading Indicators: Staff
- LI: Staff at the center are provided with training and/or professional development.
- LI: Staff at the center complete one or more self-assessments during the programming period.
- LI: Staff at the center are periodically evaluated/assessed during the program period.

19 Leading Indicators: Intentional Activities (Students)
- LI: There is evidence of alignment between (a) program objectives relative to supporting youth development, (b) student needs, and (c) program philosophy/model AND the frequency/extent to which key opportunities and supports are provided to youth.
- LI: There is evidence of alignment between (a) program objectives relative to the academic development of students, (b) student needs, and (c) program philosophy/model AND the activities being provided at the center.
- LI: Intentionality in activity and session design among staff responsible for the delivery of activities meant to support student growth and development in mathematics and reading/language arts.

20 Leading Indicators: Intentional Activities (Families)
- LI: Steps are taken by the center to reach out and communicate with parents and adult family members of participating students.
- LI: There is evidence of alignment between (a) program objectives relative to supporting family literacy and related development, (b) family needs, and (c) program philosophy/model AND activities being provided at the center.

21 Leading Indicator Reports
- Goal is to embed leading indicator reports into PPICS in the interest of supporting program improvement efforts
- Provide a snapshot of center status
  - Needs to be understandable and interpretable
  - Needs to convey meaningful information
  - Needs to support discussions and conversations with 21st CCLC staff
- Facilitate an advisory group to guide and support the leading indicator development process

22 Part 2: Discussion of Methods, Data Collection, and Analysis, 2:30 - 4:00
- Presentation of Evaluation Design
- Research Questions
- Primary Evaluation Design and Deliverables
- Role of the Leading Indicator Advisory Group
- Preliminary Findings
- Proposed Method(s) of Analysis
- Feedback on the Evaluation

23 Key Questions
- Are there particular ways we should look at the data we have presently?
- How should this work look in relation to evaluations in other states and in relation to other evaluative efforts to understand quality and impact?

24 Implementation Questions
- What is the spectrum of program quality across the programs under consideration?
  - Organizational Processes
  - Instructional/Point-of-Service Quality
- What organizational processes are found to be drivers of instructional/point-of-service quality at high-performing centers?
- What instructional approaches are associated with high levels of student engagement at the point of service?
- What is the relationship between (1) the characteristics of individual youth, (2) program context, and (3) center quality, and levels of student participation in 21st CCLC programming?

25 Evaluation Questions: Program Outcomes
- To what extent is there evidence that students participating in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest as compared with similar students not participating in the program?
- To what extent is there evidence that students participating more frequently in services and activities funded by 21st CCLC demonstrated better performance on the outcomes of interest?
- To what extent is there evidence of a relationship between center and student characteristics and the likelihood that students demonstrated better performance on desired program outcomes?

26 New Data Collection Activities: Youth Outcomes
- Modified PPICS to allow for the collection of student-identifiable information
- Data will be used to run queries against the state assessment data warehouse to obtain reading and mathematics scores and other relevant outcome data for 21st CCLC participants and non-participating students attending the same schools
- Data will be used to support impact analyses predicated on comparing 21st CCLC program participants with non-participants
- Method of analysis allows us to sort out preexisting differences between students who attend and those who do not
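The slides describe this comparison-group approach only at a high level. As an illustration, below is a minimal sketch of one common way such an impact comparison is set up, assuming a propensity-score nearest-neighbor match between participants and non-participants; the slides do not name the specific matching method, and the file and column names here are hypothetical placeholders, not actual PPICS or data warehouse fields.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical extract combining 21st CCLC participants and non-participants
# from the same schools; all column names are illustrative only.
students = pd.read_csv("student_records.csv")

covariates = ["prior_reading", "prior_math", "school_day_attendance"]
X = students[covariates]
y = students["participant"]  # 1 = 21st CCLC participant, 0 = non-participant

# Step 1: estimate each student's probability of participating (the propensity score).
students["pscore"] = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

treated = students[students["participant"] == 1]
controls = students[students["participant"] == 0]

# Step 2: match each participant to the non-participant with the closest propensity score.
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = controls.iloc[idx.ravel()]

# Step 3: compare a hypothetical post-program outcome across the two groups.
diff = treated["reading_score"].mean() - matched["reading_score"].mean()
print(f"Participant vs. matched comparison difference in reading scores: {diff:.2f}")
```

In practice, the covariate list would be driven by the preexisting differences the evaluation needs to adjust for, such as prior achievement and school-day attendance.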

27 Data Collection Activities: Program Quality
Site Coordinator Survey
Focus on practices, policies, and procedures adopted by 21st CCLC-funded programs:
- Collaboration & Partnership
- Intentionality in activity and session design
- Linkages to the school day
- Use of data on student academic achievement to inform programming
- Practices supportive of positive youth development
- Practices supportive of family engagement

Site Visits
Highlight promising activity delivery practices:
- Visit a small number of programs that have reported adopting especially high-quality practices
- Conduct program observations employing the CLASS observation tool

28 Report Functionality
- Goal is to ensure reports can support meaningful comparisons:
  - Against statewide averages
  - Over time
  - By key center characteristics: grade level, recruitment and retention policies, staffing model, activity model, and maturity
- May attempt to include recommendations and action planning tools as well
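As a rough sketch of the kinds of comparisons described above, the following example contrasts one center's leading-indicator value with statewide averages over time and by a center characteristic. The table layout and all names (leading_indicators.csv, center_id, year, grade_level, indicator_value) are assumptions made for illustration, not the actual report or PPICS schema.

```python
import pandas as pd

# Hypothetical table of leading-indicator values, one row per center per year.
li = pd.read_csv("leading_indicators.csv")

center = li[li["center_id"] == "OR-0042"]  # hypothetical center ID

# Comparison against statewide averages, over time.
statewide = li.groupby("year")["indicator_value"].mean().rename("statewide_avg")
comparison = (
    center.set_index("year")["indicator_value"].rename("center_value").to_frame()
    .join(statewide)
)
comparison["difference"] = comparison["center_value"] - comparison["statewide_avg"]
print(comparison)

# Comparison by a key center characteristic, e.g., grade level served.
print(li.groupby("grade_level")["indicator_value"].mean())
```

The same grouping logic would extend to the other characteristics listed above (staffing model, activity model, maturity) once those fields are available in the report data.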

29 Notable Implementation Findings - Other States
- Typically, a fair degree of variation is found across both programs and staff within programs in the adoption of practices and approaches associated with quality implementation
- Documentation of a relationship between instructional practices theoretically associated with supporting youth engagement and student reports of engagement
- Importance of intentionality and youth ownership in activity session design and delivery
- Positive program climate and engaging settings tended to be predicated on relationships defined by knowledge of the students' needs, interests, and personal lives
- Ongoing challenges demonstrated by programs in using student data to inform and drive the design of programming, even in programs where the data were accessible to program staff

30 Contact
Oregon Evaluation (general):
Neil Naftzger, P:
Deborah Moroney, P:
American Institutes for Research General Information: