IDEA Assessment Data Anne Rainey, IDEA Part B Data Manager, Montana

Presentation transcript:

IDEA Assessment Data
Anne Rainey, IDEA Part B Data Manager, Montana
Nick Easter, Ed.D., IDEA Part B Data Manager, Nevada
Meredith Miceli, Research to Practice Division Data Team Lead, OSEP

Welcome and Introductions
Introduction of speakers

Objectives
- What is IDEA assessment data?
- Where are the data reported?
- Assessment Metadata Survey
- OSEP review and feedback: How do they review? Data Quality Reports; Data Notes
- Ensuring data quality: challenges
- Discussion

What is IDEA Assessment Data?
- Statewide assessment data
- Alternate assessment data (a subset of all assessment data)

Where are these data reported?
- EDFacts files
- SPP/APR
- CSPR

Where is the Assessment Metadata reported?
- Assessment Metadata Survey, submitted via EMAPS
  - Open/reopen periods align with the assessment data open/reopen periods
  - Respondents: State Assessment Director; read-only access for Part B Data Managers and EDFacts Coordinators
- How is it used?
  - To cross-validate the State's assessment data submission
  - In OSEP's evaluation of completeness
  - To determine which counts are considered proficient and to calculate percent proficient
- Resource: Assessment Metadata Survey User Guide, https://www2.ed.gov/about/inits/ed/edfacts/index.html
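As a rough illustration of how the metadata can drive the percent-proficient calculation, the sketch below assumes a hypothetical set of performance-level counts and a metadata response naming which levels count as proficient; the field names and values are made up for the example, not the actual EMAPS schema.

```python
# Hypothetical sketch: field names and values are illustrative, not the EMAPS schema.

# Achievement counts for one subject/grade/assessment type (as reported in C175/C178).
achievement_counts = {"PERFLEVEL1": 40, "PERFLEVEL2": 60, "PERFLEVEL3": 70, "PERFLEVEL4": 30}

# Metadata survey response identifying which performance levels are considered proficient.
proficient_levels = {"PERFLEVEL3", "PERFLEVEL4"}

# Participation count for the same subject/grade/assessment type (as reported in C185/C188).
participation_count = 200

proficient = sum(count for level, count in achievement_counts.items() if level in proficient_levels)
percent_proficient = 100 * proficient / participation_count
print(f"{proficient} of {participation_count} proficient ({percent_proficient:.1f}%)")
```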

When are the data submitted/resubmitted?
Assessment data have three open/reopen periods:
- December due date: data used for OSEP's evaluation of the timeliness, completeness, and accuracy of the data submission
- February/March reopen period: opportunity for States to resubmit assessment data or to submit data notes that respond to data quality inquiries
- April reopen period: final resubmission period; assessment data in the system as of the end of this period are used for public reporting and associated data products

How does OSEP review the Assessment Data?
- Timeliness: Are the data in the appropriate EDFacts system by the due date?
- Completeness: Are data for all relevant file specifications submitted? Are data for all category sets, subtotals, and totals submitted? Do data match responses in the metadata sources?
- Accuracy: Do data meet our edit checks?
- LEA roll-up comparison: Are there large differences between the sum of the counts reported at the LEA level and the counts reported at the SEA level?
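A minimal sketch of the LEA roll-up comparison, assuming counts keyed by subject and grade; the 5 percent threshold for a "large difference" is an assumption for illustration, not OSEP's actual rule.

```python
# Illustrative LEA roll-up check; keys, counts, and the 5% threshold are assumptions.
sea_counts = {("MATH", "03"): 1250, ("MATH", "04"): 1180}
lea_counts = {
    ("LEA1", "MATH", "03"): 700, ("LEA2", "MATH", "03"): 440,  # sums to 1140
    ("LEA1", "MATH", "04"): 600, ("LEA2", "MATH", "04"): 580,  # sums to 1180
}

for (subject, grade), sea_total in sea_counts.items():
    lea_sum = sum(count for (lea, subj, grd), count in lea_counts.items()
                  if (subj, grd) == (subject, grade))
    if abs(sea_total - lea_sum) > 0.05 * sea_total:  # illustrative "large difference" threshold
        print(f"Flag {subject} grade {grade}: SEA count {sea_total} vs. LEA sum {lea_sum}")
```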

Data Quality Review - Timeliness
SEA-level assessment data on children with disabilities (IDEA) are submitted to the EDFacts submission system (ESS) by the December due date and time:
- C175: Academic Achievement in Mathematics
- C178: Academic Achievement in Reading (Language Arts)
- C185: Assessment Participation in Mathematics
- C188: Assessment Participation in Reading/Language Arts

Data Quality Review - Completeness
- Achievement data reported in ESS must align with data reported in EMAPS by subject (M, RLA), assessment type, grade, and performance level. Data are flagged if not aligned.
- Participation data reported in ESS must align with data reported in EMAPS by subject (M, RLA), assessment type, and grade.
- Note: Zero counts are required at the SEA level.

Data Quality Review - Accuracy
- For mathematics/reading, by grade and assessment type, the number of students with disabilities (IDEA) who took an assessment and received a valid score (reported in C175/C178) should equal the number of students with disabilities (IDEA) who participated in an assessment (reported in C185/C188). Data are flagged if the counts do not match or data are blank.
- By assessment type, the number of students with disabilities (IDEA) who took an assessment and received a valid score (reported in C175/C178) should be reported at the same grade level as the number of students with disabilities (IDEA) who participated in an assessment (reported in C185/C188). Data are flagged if grade levels are misaligned (limited to high school grade misalignment).
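A sketch of what an automated version of this accuracy check might look like; the dictionary shapes and the assessment-type codes shown are illustrative, not the exact file specification layout.

```python
# Illustrative accuracy check; dictionary shapes and codes are assumptions for the example.
# Counts keyed by (assessment type, grade) for one subject.
c175_math_achievement = {("REGASSWOACC", "04"): 312, ("ALTASSALTACH", "04"): 18}
c185_math_participation = {("REGASSWOACC", "04"): 310, ("ALTASSALTACH", "04"): 18,
                           ("REGASSWACC", "04"): 25}

for key in sorted(set(c175_math_achievement) | set(c185_math_participation)):
    scored = c175_math_achievement.get(key)
    participated = c185_math_participation.get(key)
    if scored is None or participated is None:
        print(f"Flag (blank) {key}: valid scores={scored}, participation={participated}")
    elif scored != participated:
        print(f"Flag (mismatch) {key}: valid scores={scored}, participation={participated}")
```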

Communication of Data Quality Inquiries
OSEP's comments/data quality inquiries are communicated via the Data Quality Report (DQR).
- During February: the Partner Support Center (PSC) sends an email to Assessment Directors, EDFacts Coordinators, and Part B Data Managers with the comments on the assessment data submissions from OESE/OSS and OSEP. OSEP also posts the DQR for the IDEA assessment data on OMB Max.
- During March: if data quality inquiries remain, PSC sends another email to the same group of recipients.

Data Quality Reports
OSEP's communication tool regarding the State's data submission. Provides the State with:
- OSEP's evaluation of the timeliness, completeness, and accuracy of the State's data submission
- OSEP's data quality inquiries
- OSEP's expectation to either resubmit data/metadata or provide an explanation in the form of a data note
Locations:
- Posted on OMB Max
- Emailed with the CSPR comments to Part B Data Managers (as well as the Assessment Director and EDFacts Coordinator)

Data Notes
- Opportunity for the State to clarify situation(s) associated with the submission of the assessment data and metadata. Possible situations:
  - Data collection or reporting anomalies
  - Relevant changes in assessments or data collection/reporting processes from the previous year
  - Implementation of new assessments, initiatives, or processes that may impact data or metadata
- Opportunity for the State to respond to OSEP's data quality inquiries
- Data notes submitted by States may be provided to the public to accompany the public release data file

What does OSEP do with the Data?
- APR: pre-populate the APR for Part B Indicator 3
- Evaluation of the data submission
- Part B Results Matrix
- Data Display
- Public Release Data File
- Static Tables
- Ad hoc requests (i.e., targeted analyses of previously collected data)

Public reporting of IDEA assessment data
- If the difference between the participation and performance counts results in more than a 1 percentage point increase or decrease in the percent proficient, the number of students with disabilities who scored at or above proficient on the assessment and the number of students with disabilities who took that type of assessment are suppressed from the public file.
- High school data are reported as HS: all high school grades are rolled up into a single total count for grades 9-12.
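The suppression rule can be read as comparing the percent proficient computed against the two counts; the sketch below is one possible interpretation with made-up numbers, not OSEP's published algorithm.

```python
# One possible reading of the 1-percentage-point suppression rule; numbers are made up.
proficient = 95           # students scoring at or above proficient
valid_scores = 180        # performance count (C175/C178)
participation = 200       # participation count (C185/C188)

pct_vs_scores = 100 * proficient / valid_scores
pct_vs_participation = 100 * proficient / participation

suppress = abs(pct_vs_scores - pct_vs_participation) > 1.0
print(f"{pct_vs_scores:.1f}% vs {pct_vs_participation:.1f}% -> suppress: {suppress}")
```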

How does that translate into what happens at the SEA Level? Lessons Learned from Nevada

Validating Assessment Data
What to validate for:
- Accuracy: the total number of participants equals the total number of performance scores.
- Completeness: the EDFacts data match the Assessment Metadata Survey.

Across-file comparison: Language Arts
The number of students with disabilities (IDEA) who took a reading assessment and received a valid score (FS 178) for regular assessments based on grade-level achievement standards must equal the number of students with disabilities (IDEA) who participated in the assessment (FS 188).

Across-file comparison: Math
The number of students with disabilities (IDEA) who took a math assessment and received a valid score (FS 175) for regular assessments based on grade-level achievement standards must equal the number of students with disabilities (IDEA) who participated in the assessment (FS 185).

Achievement metadata matching
Achievement data (FS 175) submitted for regular assessments based on grade-level achievement standards, with and without accommodations, must include data for all performance levels (e.g., PERF LEVEL 1-4). The SEA data file must include all performance levels for all data categories, including zero counts.
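A small sketch of how a state might check for missing performance levels before submission, assuming level labels like PERFLEVEL1 through PERFLEVEL4; the labels and counts are placeholders.

```python
# Illustrative performance-level completeness check; labels and counts are placeholders.
expected_levels = {"PERFLEVEL1", "PERFLEVEL2", "PERFLEVEL3", "PERFLEVEL4"}

# Achievement rows for one grade/assessment type as they might appear in an SEA file;
# PERFLEVEL2 is correctly present as a zero count, PERFLEVEL4 is missing.
submitted = {"PERFLEVEL1": 12, "PERFLEVEL2": 0, "PERFLEVEL3": 25}

missing = expected_levels - submitted.keys()
if missing:
    print(f"Flag: performance levels missing (report them even as zero counts): {sorted(missing)}")
```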

Participation metadata
Participation data (FS 185) submitted for assessments based on grade-level achievement standards, with and without accommodations, must match the grades offered in the EMAPS Assessment Metadata Survey.
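Similarly, a rough sketch of the grade-alignment check against the metadata survey; the grade lists are invented for the example.

```python
# Illustrative grade-alignment check; grade lists are invented for the example.
emaps_grades_offered = {"03", "04", "05", "06", "07", "08", "HS"}   # from the metadata survey
fs185_grades_reported = {"03", "04", "05", "06", "07", "08"}        # grades present in FS 185

not_reported = emaps_grades_offered - fs185_grades_reported
unexpected = fs185_grades_reported - emaps_grades_offered
if not_reported:
    print(f"Flag: grades offered per EMAPS but missing from FS 185: {sorted(not_reported)}")
if unexpected:
    print(f"Flag: grades in FS 185 not listed in EMAPS: {sorted(unexpected)}")
```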

Comparison between C185 and CSPR

Comparison between APR and CSPR

Open Discussion
- What are the challenges you face with submitting and updating assessment data?
- What would make the process easier?

Questions?
Any remaining questions?
Contact Us:
- Anne Rainey, Part B Data Manager, MT (arainey@mt.gov)
- Nick Easter, Ed.D., Part B Data Manager, NV (neaster@doe.nv.gov)
- Meredith Miceli, Research to Practice Division, OSEP (Meredith.Miceli@ed.gov)