What Data Can Tell Us – and What It Can’t

What Data Can Tell Us – and What It Can’t
2015 Leadership Conference “All In: Achieving Results Together”
How to be sure we know what our data mean
Dana Manning, University of Kentucky HDI
Tom Munk, IDC & Westat
Debbie Cate, IDC at FPG, UNC Chapel Hill
Siobhan Colgan, IDC at FPG, UNC Chapel Hill

Outline of Session
Overview of critical issues in interpreting data
Indicator examples
Part C: C3, C4
Part B: B6, B7, B9 & B10
Discussion

The components of our measurement system determine the meaning of our results

Defining the Question

Designing a Measurement Strategy

Sampling Frames & Sampling

Data Collection & Data Entry

Data Preparation & Analysis

Limits of Interpretation

Indicator Examples

Part C Indicator 4: Family Data

Part C Indicator 4: Description
Early intervention programs are required to report the percent of families participating in Part C who report that early intervention services have helped their family: know their rights; effectively communicate their children's needs; and help their children develop and learn.

Data Collection System
Program helpfulness vs. family outcomes
Nationally, all states use surveys
Most states use one of three common surveys; others use unique surveys
Survey questions vary across and within survey types
The scoring metric used (i.e., cut points) varies across and within survey types

Data Collection System, continued
Variations in approaches nationally:
Distribution and return methodologies
Timing of survey administration
Family population included
Sampling and census models are both used
Comparison data (i.e., to analyze representativeness)

Data Quality Issues: C4
Survey methodology issues:
Response rates
Representative results
Response bias
Reliability and validity of the survey tool
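
Below is a minimal, hypothetical sketch (in Python, with made-up family counts and group names) of the kind of response-rate and representativeness check these issues call for: compare each group's share of respondents with its share of the families actually served.

```python
# Hypothetical family survey counts; substitute your state's actual figures.
served = {"White": 1200, "Black": 400, "Hispanic": 500, "Other": 100}    # families served
responded = {"White": 480, "Black": 90, "Hispanic": 110, "Other": 30}    # surveys returned

total_served = sum(served.values())
total_responded = sum(responded.values())
print(f"Overall response rate: {total_responded / total_served:.1%}")

for group in served:
    rate = responded[group] / served[group]
    pop_share = served[group] / total_served          # group's share of families served
    resp_share = responded[group] / total_responded   # group's share of returned surveys
    # A large gap between the two shares suggests the results may not
    # represent the families who did not respond.
    print(f"{group:9} response rate {rate:6.1%}  "
          f"share served {pop_share:6.1%}  share responded {resp_share:6.1%}")
```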

What can these data tell us?
Can illustrate comparative differences among subgroups within the state
Can generalize within the population surveyed
Can generalize within the response pool (e.g., region, race/ethnicity)
They report the helpfulness of the program, not family capacity or true outcomes

Cautions in Interpretation: C4
Helpfulness to the family ≠ family outcomes
Consider methodological differences when making comparisons to other states
Consider who is missing from your data: who do your data really represent?
Consider your scoring metric when interpreting percentages

Solutions & Suggestions: C4
Use comparison data matching your population exactly
Use data analysis techniques like weighting
Use your other survey data (beyond Indicator 4)
Use other family data collection modalities to inform program improvement
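
As a hedged illustration of the weighting suggestion (hypothetical counts, same made-up groups as above), the sketch below post-stratifies the "percent positive" result so each group counts in proportion to its share of families served rather than its share of respondents.

```python
# Hypothetical counts: families served, surveys returned, and returned surveys
# reporting that early intervention helped the family ("positive").
served    = {"White": 1200, "Black": 400, "Hispanic": 500, "Other": 100}
responded = {"White": 480,  "Black": 90,  "Hispanic": 110, "Other": 30}
positive  = {"White": 430,  "Black": 70,  "Hispanic": 95,  "Other": 27}

total_served = sum(served.values())
total_responded = sum(responded.values())

# Unweighted: every returned survey counts equally.
unweighted = sum(positive.values()) / total_responded

# Weighted: each group's result counts in proportion to its share of families served.
weighted = sum(
    (served[g] / total_served) * (positive[g] / responded[g]) for g in served
)

print(f"Unweighted percent positive: {unweighted:.1%}")
print(f"Weighted percent positive:   {weighted:.1%}")
```

If a group with a lower response rate also reports lower helpfulness, the weighted figure will differ noticeably from the unweighted one, which is exactly the situation weighting is meant to expose.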

Early Childhood Outcomes: Indicators C3 & B7

Indicator C3: Percent of infants and toddlers with improved outcomes
Indicator B7: Percent of preschool children with improved outcomes

Which Questions?
How well do our children perform on an assessment?
How much progress are our children making? Over what time period?
Are instructors doing a good job teaching our children?
Are programs doing a good job teaching our children?
Is our state doing a good job teaching our children?
How well is our state doing compared to another state?

Comparing Programs
(chart: programs arrayed from low child outcomes to high child outcomes)

Preschool Settings: Part B Indicator 6

Indicator B6: Educational Environments, Ages 3-5
Where do children ages 3-5 attend and receive IDEA services?
Snapshot of environment
All children ages 3-5
Point in time (October 1–December 1)
Includes children aged 5 in kindergarten

Coding: Computer vs. People
Computer uses IEP fields and determines the appropriate code
People use their knowledge and/or tools to determine the appropriate code
What might the differences be if using varied collection systems?

Indicator B6: Educational Environments, Ages 3-5
Where do children ages 3-5 attend and receive IDEA services?
Preschool or kindergarten? Which program?

Indicator B6: Educational Environments, Ages 3-5
Data can NOT be disaggregated by preschool or kindergarten setting in the 618 data collection
Data can NOT identify the specific program (state pre-k, Head Start, child care, other) in the 618 data collection
Requires additional state data

Indicator B6: Educational Environments, Ages 3-5
Can be disaggregated by:
Age
Race/ethnicity
Disability category
English Language Learner status

Educational Environments, Ages 3-5, December 1, 2013
(chart: Indicator 6 percent of children, all children ages 3-5, including 5-year-olds in kindergarten)

Educational Environments, Ages 3-5, December 1, 2013
(chart: percent of children, all children ages 3-5 excluding children in kindergarten, compared with kindergarten only)

Looking Across Settings

Disproportionality: Part B Indicators 9 & 10

States and Territories Must Report
B9: Percent of districts with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification
B10: Percent of districts with disproportionate representation of racial and ethnic groups in specific disability categories that is the result of inappropriate identification

Question 1: Disproportionality?
For each district in your state…
Are students from some racial/ethnic groups more (or less) likely to be identified for special education services than other students?
…for services in any of these particular categories? ED, SLI, ID, Autism, OHI, SLD

Question 2: Inappropriate Identification? If so, is it the result of inappropriate policies, practices, and procedures?

How do States Measure Disproportionality? Differently.
Mostly starts with risk:
risk = (# of students with disabilities in a racial/ethnic group) / (# of enrolled children in that racial/ethnic group)
Risk ratio, alternate risk ratio, weighted risk ratio… OR e-formula, composition difference, # affected… OR some combination
In 2012-13, 45 of 52 states used the risk ratio, 7 of these in combination with another method
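
A minimal sketch of the most common starting calculation, using hypothetical district counts: each group's risk of identification and its risk ratio relative to all other students in the district.

```python
# Hypothetical district counts: (students identified for special education, total enrolled).
counts = {
    "Black":    (60, 300),
    "Hispanic": (40, 400),
    "White":    (90, 1100),
}

total_swd = sum(swd for swd, _ in counts.values())
total_enrolled = sum(enrolled for _, enrolled in counts.values())

for group, (swd, enrolled) in counts.items():
    risk = swd / enrolled                              # risk for this group
    other_swd = total_swd - swd
    other_enrolled = total_enrolled - enrolled
    comparison_risk = other_swd / other_enrolled       # risk for all other students
    risk_ratio = risk / comparison_risk
    print(f"{group:9} risk {risk:6.1%}  risk ratio {risk_ratio:.2f}")
```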

Who Counts? Sampling Frames at the District Level
No sampling; all students included; many smaller groups excluded.

Data Collection and Data Entry
Fairly solid: these are official counts of students with and without disabilities.

Data Preparation and Analysis: Check Your Work!
Are your formulas accurate? Do the numbers look reasonable?
For example, remember that an overrepresentation of one group usually has to be balanced by an underrepresentation of another group.
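
One quick reasonableness check of that balance point, under the same hypothetical counts as the sketch above: the overall risk is the enrollment-weighted average of the group risks, so a group sitting above the overall risk must be offset by a group sitting below it.

```python
# Hypothetical counts: (students identified for special education, total enrolled).
counts = {"Black": (60, 300), "Hispanic": (40, 400), "White": (90, 1100)}

total_swd = sum(swd for swd, _ in counts.values())
total_enrolled = sum(enrolled for _, enrolled in counts.values())
overall_risk = total_swd / total_enrolled

for group, (swd, enrolled) in counts.items():
    position = "above" if swd / enrolled > overall_risk else "at or below"
    print(f"{group:9} risk {swd / enrolled:6.1%} is {position} "
          f"the overall risk of {overall_risk:.1%}")
```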

Data Preparation and Analysis: Thresholds
Disproportionate representation is defined as a risk ratio that exceeds the state threshold
Thresholds vary from state to state!
3.0 for 16 states
2.0 for 10 states
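
A hedged sketch of why the threshold matters so much (hypothetical district risk ratios): the same data flag different numbers of districts at a threshold of 2.0 than at 3.0.

```python
# Hypothetical risk ratios for one racial/ethnic group across four districts.
district_risk_ratios = {"District A": 1.4, "District B": 2.3,
                        "District C": 3.5, "District D": 2.8}

for threshold in (2.0, 3.0):   # example thresholds cited on the slide
    flagged = sorted(d for d, rr in district_risk_ratios.items() if rr > threshold)
    print(f"Threshold {threshold}: {len(flagged)} district(s) flagged -> {flagged}")
```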

Percent of Districts with Disproportionate Representation

Interpretation
Is zero a sign of no problem in the state? Or is it the result of state analysis choices?

Data Preparation and Analysis: Inappropriate Identification
Some states look very narrowly at what constitutes inappropriate policies, practices, and procedures: was there an obvious, documented lack of compliance?
A broader conception is possible; see, for example, IDC's Success Gaps documents.
Appropriate identification begins in general education!

Number of States Reporting Various Percentages of Districts with Disproportionate Representation That Was the Result of Inappropriate Identification for B9: 2005–06 Through 2012–13

Number of States Reporting Various Percentages of Districts with Disproportionate Representation That Was the Result of Inappropriate Identification for B10: 2005–06 Through 2012–13

Interpretation
Is zero a sign of no problem in the state? Or is it the result of state analysis choices?

Elements of an Appropriate Identification
Did the overrepresented group have access to high-quality:
Data-based decision making
Cultural responsiveness
Strong core instructional program
Universal screening and progress monitoring
Multi-tiered interventions and supports
…with strong parent involvement throughout?

What Can We Say?
Districts identified with disproportionate representation should look very closely at the reasons behind the numbers, using a tool like IDC's Success Gaps tool.

What Can’t We Say: Limits of Interpretation
That states with fewer districts identified have less disproportionality.
That states with fewer districts identified have less inappropriate identification.
Until the methods are standardized, don’t compare Indicator B9 and B10 results across states.

Discussion Questions
What’s the difference between child outcomes and program performance?
Can your data tell you which states, districts, or programs are performing better?
What policy questions would you like to answer that your data cannot currently answer?

Comments or Questions?

For More Information
Visit the IDC website: http://ideadata.org/
Follow us on Twitter: https://twitter.com/ideadatacenter

Grant Information
The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Y130002. However, the contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government.
Project Officers: Richelle Davis and Meredith Miceli