QRIS Standards Learning Table

Presentation transcript:

QRIS Standards Learning Table Session #4: Efficiency: Streamlining QRIS using your State Knowledge and Data-based Experience

Introductions and Updates (AL, CA, CT, GA, HI, NV, OR, VI)
Introduce the state team (name, title, agency).
Update us on what your state team has been working on in the development of your QRIS since our last call. If a certain resource or idea has been particularly helpful, tell us about that.
What is your current, most pressing challenge?

Homework Discussion (AL, CA, CT, GA, HI, NV, OR, VI)
What did your state consider in the development of QRIS standards?
What type of data are you collecting to inform future revisions?
How is your state using research to inform your selection of standards?

Overview of Today's Presentation
Data systems and standards
Using data for decision-making in QRIS design and revision
Oregon's experience using data
NAEYC's experience using data
National data efforts
KY: slides and notes at the end as a resource

QRIS Data Systems Support Implementation
Online application (provider portals for uploading documents and connecting to relevant resources)
Data import from other systems (regulation, registry, onsite assessment reports, etc.)
Calculating ratings; analyzing the relationship between standards/policies and program participation and levels of quality
Supporting the QI/TA functions
…Data!
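As an editorial illustration of the data-import point above, here is a minimal sketch in Python of how records pulled from licensing, a workforce registry, and onsite assessments might be merged into one provider record that rating calculations and QI/TA supports can draw on. All field names and structures are hypothetical, not taken from any actual state system.

```python
from dataclasses import dataclass, field


@dataclass
class ProviderRecord:
    provider_id: str
    license_status: str                                      # imported from the licensing system
    staff_credentials: dict = field(default_factory=dict)    # from the workforce registry
    assessment_scores: dict = field(default_factory=dict)    # from onsite assessment reports


def merge_sources(licensing_row, registry_rows, assessment_rows):
    """Combine rows pulled from separate source systems into one provider record."""
    record = ProviderRecord(
        provider_id=licensing_row["provider_id"],
        license_status=licensing_row["status"],
    )
    for row in registry_rows:
        record.staff_credentials[row["staff_id"]] = row["credential"]
    for row in assessment_rows:
        record.assessment_scores[row["tool"]] = row["score"]
    return record


# Hypothetical rows, as they might arrive from three separate systems.
print(merge_sources(
    {"provider_id": "P-001", "status": "licensed"},
    [{"staff_id": "S-10", "credential": "CDA"}],
    [{"tool": "ECERS-R", "score": 5.2}],
))
```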

Use Data to Eliminate Criteria
If your state data show that all or most providers meet a criterion (no variation by level), consider dropping it.
Or move the criterion to Level 1.
Or, if it is an essential element defining quality, keep it but do not use it to determine ratings.

Use Data to Move/Revise Criteria
Suppose your state data show that very few or no providers meet a criterion.
If it is not an essential element of quality, consider dropping it completely.
If it is an essential element of quality, consider moving the criterion to the top level or into the CQI section of your QRIS: focus TA and PD on improving it, and do not include it in ratings until practice has advanced.
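A minimal sketch of the analysis behind this slide and the previous one: given criterion-level data (one 0/1 column per criterion), flag criteria that nearly all providers meet (candidates to drop or move to Level 1) and criteria that almost none meet (candidates for the top level or the CQI section). The column names, thresholds, and DataFrame layout are assumptions for illustration.

```python
import pandas as pd


def flag_criteria(ratings: pd.DataFrame, high: float = 0.95, low: float = 0.05) -> pd.DataFrame:
    """Classify each criterion by the share of providers meeting it."""
    meet_rate = ratings.mean()  # proportion of providers meeting each criterion
    flag = meet_rate.apply(
        lambda p: "nearly universal: drop or move to Level 1" if p >= high
        else "rarely met: top level or CQI section" if p <= low
        else "keep in ratings"
    )
    return pd.DataFrame({"meet_rate": meet_rate, "flag": flag})


# Hypothetical data: three providers, three criteria (1 = criterion met).
ratings = pd.DataFrame({
    "posted_license": [1, 1, 1],
    "annual_self_assessment": [1, 0, 1],
    "masters_degree_director": [0, 0, 0],
})
print(flag_criteria(ratings))
```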

Use Data to Find 'Predictor' Criteria
With research partners, explore the relationships among criteria. Is there a set of items that are consistently met together?
It is possible to determine statistically whether one of them is a "predictor": if it is met, it is very likely that the others are also met.
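A minimal sketch of one way to quantify a "predictor": among providers that meet a candidate criterion, how often are all of the related criteria also met? A real analysis with research partners would use more robust methods; the data layout and names here are hypothetical.

```python
import pandas as pd


def predictor_strength(ratings: pd.DataFrame, candidate: str, others: list) -> float:
    """Among providers meeting `candidate`, the share that also meet all of `others`."""
    met_candidate = ratings[ratings[candidate] == 1]
    if met_candidate.empty:
        return float("nan")
    return met_candidate[others].all(axis=1).mean()


# Hypothetical data: does meeting "curriculum_aligned" predict the other two?
ratings = pd.DataFrame({
    "curriculum_aligned": [1, 1, 1, 0, 0],
    "lesson_plans_posted": [1, 1, 1, 1, 0],
    "child_assessment_used": [1, 1, 1, 0, 0],
})
print(predictor_strength(ratings, "curriculum_aligned",
                         ["lesson_plans_posted", "child_assessment_used"]))  # 1.0
```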

Use Data to Revise QRIS
Suppose the data show that programs in your state QRIS are meeting many criteria (but not all) in the block above where they are now.
Use criterion-level data from the programs currently participating in the QRIS to model how programs might score under alternative rating structures (points or hybrid).
KY has done this (see the resource slides at the end).
OR will tell us about Oregon's use of research to inform QRIS development.
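A minimal sketch of re-scoring existing criterion-level data under a hypothetical points structure, so the resulting distribution of levels can be compared with current block ratings. Point values and level cut-offs here are invented placeholders; a state would substitute its own proposed weights and thresholds.

```python
import pandas as pd


def points_rating(ratings: pd.DataFrame, point_values: dict, cutoffs: list) -> pd.Series:
    """Score each provider under a points model and bin the totals into levels."""
    points = (ratings[list(point_values)] * pd.Series(point_values)).sum(axis=1)
    return pd.cut(points,
                  bins=[-1] + cutoffs + [float("inf")],
                  labels=list(range(1, len(cutoffs) + 2)))


# Hypothetical criterion data and weights for three currently rated providers.
ratings = pd.DataFrame({
    "posted_license": [1, 1, 1],
    "annual_self_assessment": [1, 0, 1],
    "masters_degree_director": [0, 0, 1],
})
point_values = {"posted_license": 1, "annual_self_assessment": 2, "masters_degree_director": 3}
print(points_rating(ratings, point_values, cutoffs=[1, 3, 5]))
# Compare this simulated distribution of levels with the current block ratings.
```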

Oregon’s Process to Streamline QRIS Standards

Brought Together Two Groups
Structural Indicators of Quality (QI)
Environmental Indicators of Quality (OPQ)
Vision, Mission, and Guiding Principles

Workgroup Charge
Merged the indicators of quality with intensive input from the Standards Workgroup (Dec-March).
Reviewed input to the standards (Jan-Aug).
Provided final recommendations based on input (May-Sept).

Input to the Standards
Gather input from as many interested parties as possible.
Give interested groups both access and time to provide input.
Seek input in a variety of ways.
Work for a balance between achievability and perfection; remember, a TQRIS isn't a silver bullet.
Consider recommendations in the larger context of the whole system.
Goals of the state: few and powerful; understandable, relevant, and intuitive; measurable and feasible to monitor; progressive/distinct among the levels.

Sources of Input to the Standards Development
Standards Workgroup of statewide partners
Research from Oregon's Quality Indicators
Research from Oregon Program of Quality field test monitoring
Learning Labs with North Carolina
Early Learning Guidelines, including the Head Start Child Development and Early Learning Framework and the Birth to Three Early Learning Guidelines
Race to the Top grant feedback
Cost modeling from national TQRIS experts
Cultural and linguistic competency technical assistance from the BUILD Initiative
Oregon's licensing regulations

Focus Group Input to the Standards Development
Focus groups of 250 child care and early education providers and programs across Oregon
Focus groups of 13 Child Care Resource and Referral agencies
Focus groups of Oregon's licensing specialists
Focus groups of health and nutrition specialists across Oregon
Focus groups of child care union members
Focus groups of Oregon's Professional Development Committee

NAEYC Accreditation Reliability and Validity Study
Why NAEYC Accreditation is important and can inform QRIS development. Findings of note regarding QRIS and accreditation:
Validity: Meaningful and significant differences in the percent of criteria met in several standards (Teaching, Relationships, Assessment of Child Progress) between programs that achieve accreditation and those that do not.
Content: A strong positive relationship between meeting lead teacher qualifications and meeting a higher proportion of criteria in Relationships.
Content: On overall diversity and cultural competence criteria, a significant difference between programs that achieve accreditation (91% met) and those that do not (77% met).

NAEYC Accreditation as a Mark of Program Quality
Kyle Snow, Ph.D., Senior Scholar and Director, Center for Applied Research, National Association for the Education of Young Children
Research | Policy | Practice

Goals
Short overview of NAEYC Accreditation
What do we know about accreditation?
NAEYC Accreditation and QRIS congruence

About NAEYC Accreditation
NAEYC Accreditation is a meaningful tool for quality improvement in programs serving children from birth through kindergarten.
Developed in the early 1980s.
A comprehensive system review and reinvention was fully implemented in fall 2006.
In 2010, an independent review of the site visit and decision protocols was completed, validating these processes.

A Portrait of Accredited Programs
As of 11/24/12, there are 6,748 accredited programs serving 592,675 children.
Corporate structure: Non-profit 60.3%; Public agency 19.0%; For-profit 19.0%; Not stated 1.6%
Special populations served: None 47.6%; Migrant workers 4.8%; Teen parents 23%; Homeless families 17.5%; Other 19.0% (incl. 13.5% low income)
Program affiliations: College/University 5.6%; Employer-sponsored 7.1%; Faith-based institution 9.5%; Head Start 31.7%; Hospital 2.4%; Migrant services 1.6%; Military installation 2.4%; Public school 19.8%; US Government facility 3.2%; Parent cooperative 11.1%; Indian Tribe 0.8%; Alaskan Native Village 0.8%

About NAEYC Accreditation: 4-Step Process
1. Enrollment in Self-Study
2. Becoming an Applicant
3. Becoming a Candidate
4. Meet and Maintain Standards
Key activities along the way: Self-Assessment, Site Visit, Quality Improvement

NAEYC Program Standards and Criteria
1 – Relationships
2 – Curriculum
3 – Teaching
4 – Assessment of Child Progress
5 – Health
6 – Teachers
7 – Families
8 – Community Relationships
9 – Physical Environment
10 – Leadership and Management
Structure: Standard → Topic → Criteria → Indicator(s) → Sources of Evidence

NAEYC Program Standards and Criteria
Possible outcomes: Accredited, Deferred, Denied
To be accredited, a program must meet:
80% of all assessed criteria in each standard
70% of all criteria assessed in each group
All required criteria
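A minimal sketch of the decision rule as summarized on this slide: at least 80% of assessed criteria met within each standard, at least 70% within each assessed group, and all required criteria met. The data structures are hypothetical, and NAEYC's actual scoring protocol contains more detail than shown here.

```python
def accreditation_decision(standard_scores, group_scores, required_met):
    """standard_scores / group_scores map names to the fraction of assessed criteria met."""
    if not all(required_met):
        return "Deferred or Denied"
    if any(score < 0.80 for score in standard_scores.values()):
        return "Deferred or Denied"
    if any(score < 0.70 for score in group_scores.values()):
        return "Deferred or Denied"
    return "Accredited"


# Hypothetical scores for one program.
print(accreditation_decision(
    standard_scores={"Relationships": 0.92, "Teaching": 0.85},
    group_scores={"Infant room": 0.78, "Preschool room": 0.81},
    required_met=[True, True, True],
))  # Accredited
```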

NAEYC Accreditation: Recap
Programs strive to meet NAEYC program standards.
Programs self-assess.
Programs are assessed against 10 research-based standards.
Performance is based on multiple indicators and multiple sources of evidence.
The process allows for self-assessment and NAEYC performance feedback.
The process includes quality indicator and improvement systems.
But: does it really define quality, can programs attain it, can they maintain it, and can it be monitored?

What do we know about Accreditation?
Reinvention and criteria validation: During field tests for the reinvention, NAEYC (2005) reported significant correlations between criteria (at the standard level) and Early Childhood Environment Rating Scale (ECERS) scores among 70 early childhood programs. The strongest relationships were found between overall quality and the program standards for Relationships, Curriculum, and Teaching.
Validation studies: Sachs and Weiland (2010) found that schools engaged in accreditation scored higher on subscales of the ECERS-R, and children had higher scores on the Peabody Picture Vocabulary Test (PPVT-III) than peers in non-accredited programs, even after controlling for initial PPVT scores.
State-level data within QRIS systems: PA's Keystone STARS program (OCDEL, 2010) showed significant correlations between accreditation and environmental ratings of program quality (ECERS, ITERS, SACERS).

What do we know about Accreditation?
Trend Briefs (http://www.naeyc.org/academy/primary/trendbriefs) are communications intended to share data on programs seeking accreditation and to connect the findings to early childhood research trends.
Releases to date:
Teaching: Accreditation of Programs for Young Children, Standard 3
Assessment of Child Progress: Accreditation of Programs for Young Children, Standard 4
Relationships: Accreditation of Programs for Young Children, Standard 1
Supporting Cultural Competence: Accreditation of Programs for Young Children, Cross-Cutting Theme in Program Standards
Upcoming:
Family Engagement: Accreditation of Programs for Young Children, Cross-Cutting Theme in Program Standards

What do we know about Accreditation?
Trend Briefs data source:
The sample included 130 programs receiving accreditation site visits between September 2009 and July 2010.
Data were captured on all 417 NAEYC criteria.
Comparisons were made between accredited and non-accredited programs' performance on all criteria.

What do we know about Accreditation?
Trend Briefs, selected findings:
Relationships (NAEYC Standard 1): Differences are noted in how programs deal with challenging behavior, but even more so in the degree to which programs provide a "predictable, consistent, and harmonious" classroom.
Teaching (NAEYC Standard 3): Programs differ primarily on criteria that assess the use of scaffolding strategies in the classroom.
Assessment of Child Progress (NAEYC Standard 4): Programs accredited by NAEYC demonstrate a planned, intentional use of child assessment and communication of assessment results: using assessments to improve instruction and program design, and effectively communicating assessment results to other teachers and families.

What do we know about Accreditation?
Trend Briefs, selected findings: Supporting Cultural Competence (cross-standard)
Many of the same criteria that prove the most challenging overall also differentiate between programs that became accredited and those that did not:
Differences in how programs connect with diverse families and engage them in the child's program.
Differences in programs' ability to understand and respect diversity in family values, especially when those values differ from the teacher's.
Differences in hiring diverse staff and ensuring staff receive training that includes working with diverse families.
Differences in providing children with varied and deep experiences to support their own cultural competence.

What do we know about Accreditation?
Some data suggest accreditation is a valid indicator of quality, but more validation studies and data are needed.
Analyses of accreditation data show differentiation between accredited and non-accredited programs, even when all attempt to meet the same criteria.
Future analyses can identify performance clusters and possibly examine program performance from pre-self-study through the site visit, to assess the potential of quality improvement processes.

Accreditation and QRIS Congruence
State recognition of accreditation within QRIS ratings
Some states use NAEYC standards for specific areas
Alignment of program standards
Streamlining for programs that meet accreditation standards
Accreditation Facilitation (Program Quality Improvement) Project models

Accreditation and QRIS Congruence
State QRIS systems include accreditation in various ways:
Not recognized
Awarding additional points toward the rating (overall or in specific areas, varying by system)
Entry at the top (or near-top) rating
Some combine accreditation with ERS visits
Some differentiate among accrediting bodies

Accreditation and QRIS Congruence
In what ways can states benefit from NAEYC's accreditation experience in designing and implementing QRIS systems for program quality recognition and improvement, and in communicating with families?

Data Can Facilitate Cross-State Sharing and Comparison
What data elements does your system need?
Are there common definitions of data elements?
National data efforts to be aware of…

Common Education Data Standards
Early Learning is one domain in the overall P-20 data model.
https://ceds.ed.gov/Default.aspx

Quality Initiatives Research and Evaluation Consortium (INQUIRE)
INQUIRE supports high-quality, policy-relevant research and evaluation on quality rating and improvement systems (QRIS) and other quality initiatives by providing a learning community and resources to support researchers. The INQUIRE Consortium also provides input and information to state administrators and other policymakers and practitioners on evaluation strategies, new research, interpretation of research results, and implications of new research for practice.
Child Trends helps to facilitate INQUIRE activities.

INQUIRE and Data
The QRIS/QI Data Elements workgroup of INQUIRE worked with the U.S. Department of Education group focusing on Common Education Data Standards (CEDS) to create a recommended list of data elements, which is now out for public comment.
The workgroup is developing a list of recommended data elements for QRIS and quality improvement purposes, and will be developing a set of data elements especially for child care state administrators and CCDF reporting.

Questions, Reflections, Comments?

Homework for January 17, 2013
Effective Cross-Sector QRIS: Challenges and Opportunities
A cross-sector QRIS is one that aims for participation by most group early care and education providers, regardless of funding stream or auspice. At a minimum, this includes child care centers and family child care homes, Pre-K, and Head Start, i.e., all publicly supported and licensed settings, but not informal caregivers.
A SurveyMonkey link will be emailed to you for use in completing the homework questions. Due January 4th (for the January 17, 2013 webinar).

Homework Questions for the 1.17.13 Session
Do you have a plan to include a cross-sector approach in the QRIS? Why did you make that decision? Identify the phase-in plan for different sectors (i.e., are you beginning with 'all in' or phasing in over a few years).
What challenges have you experienced in your efforts to develop and/or implement a cross-sector QRIS? What successes have you had with cross-sector QRIS?
How do license-exempt centers (e.g., pre-K programs located in public or private schools) participate in your QRIS? Have you created an 'equivalent' standard for licensing?
What have you learned about strategies for effectively engaging the support systems of other sectors (e.g., the Head Start T/TA system or early intervention training) in QRIS supports?
Have you tried to engage monitoring or accountability systems from other sectors (such as collaborating with Head Start or pre-K monitoring)?
Have you worked with systems like early intervention, child welfare, and others to ensure that they understand QRIS and prioritize child placements in higher-quality settings?

Thank You
National Center on Child Care Quality Improvement
NCCCQI does not endorse any non-Federal organization, publication, or resource.
Follow-up contacts:
OCCQualityCenter@icfi.com
dmathias@buildinitiative.org
tcamillo@Brightstars.org
anne.walsh.mitchell@gmail.com
louise.stoney@gmail.com
dawn.a.woods@state.or.us
ksnow@naeyc.org
www.qrisnetwork.org

Presented with permission from Child Trends (2012)
