Identifying Students in Need of Modified Achievement Standards and Developing Valid Assessments.


Who are the students needing modified achievement standards? …and thoughts for developing eligibility criteria
Sue Bechard and Judy Snow
January 16, 2008, Washington, D.C.

Slide 3: Who are the students needing Modified Achievement Standards (MAS)?
Can include:
□ 2% of the total student population, who can be counted as proficient on MAS
□ students with disabilities, from any of the 13 disability categories
□ students who are addressing grade-level content standards on their IEPs but are not expected to meet grade-level achievement standards in the current year
□ students who need less difficult test items covering the same breadth of content
(US Dept. of Ed., 2007)

Slide 4: Determination of eligibility for MAS
Must consider:
□ objective evidence
□ multiple measures of progress over time
□ IEP goals that are based on grade-level content standards
□ providing students the opportunity to show what they know and can do on an assessment based on grade-level academic achievement standards
Must not consider:
□ a specific disability category
□ racial or economic background
(US Dept. of Ed., 2007)

Slide 5: Population Identification Issues
□ On average, students with disabilities comprise approximately 10% of the total student population.
□ If 10% - 1% = 9%, what characteristics should be used to distinguish the students appropriate for the 2% option within this group? (The arithmetic is sketched below.)
□ Are there test performance distinctions?
□ Are there specific learning characteristics?
□ Are there specific learning needs for access to the general curriculum?
□ What are the "modified" expectations relative to grade-level content that distinguish this population?
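To make the percentages above concrete, here is a minimal back-of-the-envelope sketch in Python. The state enrollment figure is an invented illustration, not a number from the presentation.

```python
# Back-of-the-envelope sketch of the population percentages discussed above.
# The enrollment figure is illustrative, not from the presentation.

total_enrollment = 100_000          # hypothetical state enrollment
swd_rate = 0.10                     # ~10% of students have disabilities (slide estimate)
aa_aas_cap = 0.01                   # 1% cap: proficient scores on alternate achievement standards
aa_mas_cap = 0.02                   # 2% cap: proficient scores on modified achievement standards

students_with_disabilities = total_enrollment * swd_rate          # ~10,000
aa_aas_students = total_enrollment * aa_aas_cap                   # ~1,000 (the "1%")
remaining_swd_pool = students_with_disabilities - aa_aas_students # ~9,000 (the "10% - 1% = 9%")
aa_mas_countable = total_enrollment * aa_mas_cap                  # ~2,000 proficient scores may count

print(f"SWD pool after the 1% option: {remaining_swd_pool:,.0f}")
print(f"Proficient AA-MAS scores countable toward AYP: {aa_mas_countable:,.0f}")
```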

Slide 6: Population Identification Issues (cont.)
□ What is the purpose of the AA-MAS?
  □ To allow students to do better (AYP improvement)?
  □ To provide better information for instructional planning (data on what students know)?
  □ To increase student self-esteem?
  □ To provide greater alignment between cognition, expectations, instruction, and assessment?
□ Is an AA-MAS needed at every grade level/content area?

Slide 7: Selected results from prior research
□ Colorado: Report from the HB Study Committee, December 2005
□ New England Compact (RI) Enhanced Assessment Grant,
□ Montana General Supervision Enhancement Grant,

Slide 8: CO report: Low performers who score in the bottom 1/3 of scale scores
□ Students who score in the bottom one-third of scale scores on CSAP are almost twice as likely to be Black or Hispanic as students of other ethnicities.
□ Only 60% of students with IEPs scoring at the lowest possible scale scores could be matched with a test the following year (the matching step is sketched below); thus, they may be more mobile than their counterparts who score at higher levels.
□ For students scoring in the bottom one-third of scale scores who could be matched the following year, these students did make substantial longitudinal growth.
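The 60% figure above comes from linking students' records across two test administrations. A hedged sketch of that matching step, using pandas with invented student IDs, scale scores, and column names:

```python
# Hypothetical sketch of matching students across two test administrations to
# study longitudinal growth. Column names (student_id, scale_score) are assumptions.
import pandas as pd

year1 = pd.DataFrame({"student_id": [1, 2, 3, 4], "scale_score": [310, 295, 450, 305]})
year2 = pd.DataFrame({"student_id": [1, 3, 4],    "scale_score": [340, 470, 330]})

# Flag the lowest-scoring third in year 1 (the group examined in the CO report).
cutoff = year1["scale_score"].quantile(1 / 3)
low_third = year1[year1["scale_score"] <= cutoff]

# An inner merge keeps only students who could be matched the following year.
matched = low_third.merge(year2, on="student_id", suffixes=("_y1", "_y2"))
match_rate = len(matched) / len(low_third)
matched["growth"] = matched["scale_score_y2"] - matched["scale_score_y1"]

print(f"Matched {match_rate:.0%} of low-scoring year 1 students across years")
print(matched[["student_id", "growth"]])
```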

Slide 9: CO report: Students with IEPs who do not make longitudinal growth
□ On the Colorado CSAP Reading Test, 250 students (of 444,407) across grade levels were determined to be "Students in the Gap."
□ On the CSAP Math Test, 658 students (of 444,910) were determined to be "Students in the Gap."
□ The CSAP as currently administered may not reflect their academic achievements; however, if appropriate accommodations and more intensive instruction were provided, these students too may make greater gains.

Slide 10: Georgia EAG
Also looked at snapshot vs. longitudinal growth:
□ Low performing: lowest performance level in at least one assessment
□ Persistently low: lowest performance level for three consecutive years
(Melissa Fincher, July 2007)
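A minimal sketch of how these two flags might be computed from a longitudinal file. The level coding (1 = lowest) and the data are assumptions, and "persistently low" is simplified here to "lowest level in every year on record," assuming exactly three years of records per student.

```python
# Hypothetical sketch of the two Georgia-style flags: "low performing" (lowest
# level at least once) vs. "persistently low" (lowest level every recorded year,
# here three years per student). Data and level codes are illustrative.
import pandas as pd

LOWEST = 1  # assume performance levels are coded 1 (lowest) .. 4 (highest)

scores = pd.DataFrame({
    "student_id": ["a", "a", "a", "b", "b", "b", "c", "c", "c"],
    "year":       [2005, 2006, 2007] * 3,
    "level":      [1, 1, 1,   1, 2, 3,   2, 3, 3],
})

by_student = scores.groupby("student_id")["level"]
low_performing   = by_student.min() == LOWEST                        # lowest level at least once
persistently_low = by_student.apply(lambda s: (s == LOWEST).all())   # lowest level every year

print(pd.DataFrame({"low_performing": low_performing, "persistently_low": persistently_low}))
```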

Slide 11: Rhode Island (New England Compact, NEC) EAG
Teacher judgments of class work were compared to test performances and revealed two gaps among students performing below proficient:
Performance gap
□ The test may not reflect classroom performance. Teachers see students performing proficiently in class, but test results are below proficient.
Information gap
□ The test may not be helpful for instructional planning. Teachers rate students' class work as low as possible and test results are at "chance" level. No information is generated on what students can do.
(Parker & Saxon, 2007; Bechard & Godin, 2007)
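A hedged sketch of how these two gaps could be operationalized from teacher judgments and test scores. The cut score, the "chance" threshold (number of items divided by answer options), and the field names are illustrative assumptions, not the NEC EAG's actual decision rules.

```python
# Hypothetical classification of below-proficient students into the two gaps
# described above. Thresholds and field names are assumptions for illustration.

def classify_gap(teacher_rating: str, test_score: float,
                 proficient_cut: float, chance_score: float) -> str:
    """teacher_rating: 'proficient' or 'below', based on judgments of classroom work."""
    if test_score >= proficient_cut:
        return "not in a gap"                        # proficient on the test
    if teacher_rating == "proficient":
        return "performance gap"                     # test does not reflect classroom performance
    if test_score <= chance_score:
        return "information gap"                     # test yields no information on what the student can do
    return "below proficient, not gap-classified"

# A 40-item multiple-choice test with 4 options: guessing yields about 10 correct.
chance = 40 / 4
print(classify_gap("proficient", 22, proficient_cut=28, chance_score=chance))  # performance gap
print(classify_gap("below", 9, proficient_cut=28, chance_score=chance))        # information gap
```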

Slide 12: NEC EAG data sources
□ State assessment data: grade 8 mathematics results from two systems
  □ General large-scale test results
  □ Demographics (special programs, ethnicity, gender)
  □ Student questionnaires completed at time of test
  □ Accommodations used at time of test
□ State special education data
  □ Disability classification
  □ Free/reduced lunch
  □ Attendance
□ Classroom teacher data
  □ Individual interviews
  □ Judgments of all students' classroom work

Slide 13: NEC EAG findings
The Information Gap in grade 8 mathematics comprised % of the total population and included non-disabled students.
Test performance:
□ Students mostly guessed on the test items.
□ Most used multiple accommodations.
Teacher perceptions:
□ These students operate below grade level in class.
□ Teachers are not surprised by their low test results.
□ There is a disconnect between what is tested and what is taught.
□ These students need more supports in the classroom.
Student perceptions:
□ They think the test is harder than their classroom work.
□ They try hard on the test.

Slide 14: NEC EAG: Special program status of students in the Information Gap
Non-gap comparison = students who performed at "chance" on the test but higher in the classroom. The majority of students performing at "chance" were students with IEPs.

Slide 15: NEC EAG: Disability designations of students in the Information Gap
Learning disabilities:
□ Lower percentages of students with SLD were in the information gap than in the general population.
Other disabilities:
□ deaf/blind
□ multiple disabilities
□ hearing impairments
□ mild to moderate cognitive disabilities
□ combinations of disabilities

Slide 16: Montana GSEG, 2005: Students in the sample
□ Grade 5 students statewide
□ Census sample of students with an IEP who took the Spring 2006 Grade 4 math CRT
□ A few students (13) who scored well on the Alternate Assessment were also included, selected by scores and recommendations from IEP teams
□ CRT-M = 672 students; CRT = 199 students
(Montana Office of Public Instruction and Measured Progress)

Slide 17: MT GSEG, 2005 data sources
□ Pilot test results
□ Student survey
□ Test administrator survey
□ Standard setting results
□ Recorded discussions of standard-setting panelists
□ Interviews with standard-setting panelists

Slide 18: MT GSEG 05: Test Information Functions
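For context, a test information function sums item information across items; under a two-parameter logistic (2PL) IRT model, the information an item contributes at ability theta is a^2 * p(theta) * (1 - p(theta)). The sketch below uses invented item parameters and is not the MT GSEG analysis itself.

```python
# Minimal sketch of a 2PL test information function. Item parameters are made up;
# the actual MT GSEG psychometric details are not shown in the transcript.
import math

items = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5)]   # (discrimination a, difficulty b)

def item_information(theta: float, a: float, b: float) -> float:
    p = 1.0 / (1.0 + math.exp(-a * (theta - b)))  # 2PL probability of a correct response
    return a * a * p * (1.0 - p)

def test_information(theta: float) -> float:
    return sum(item_information(theta, a, b) for a, b in items)

for theta in (-2, -1, 0, 1, 2):
    print(f"theta={theta:+d}  info={test_information(theta):.3f}")
```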

Slide 19: MT GSEG 05: Performance Level Comparisons, CRT-M vs. CRT

  Change             N     %      Notes
  Up Three Levels    7     1%     All 7 went to Advanced
  Up Two Levels            %      All 120 went from below proficient to proficient or advanced
  Up One Level             %      88 went from nearing to proficient, 96 went to advanced
  No Change                %
  Down One           23    3%     11 went from proficient to nearing proficiency
  Down Two           5     .7%
  Down Three         1     .1%
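A small sketch of how level-change tallies like those above could be produced from matched CRT and CRT-M performance levels. The level codes and student data are invented for illustration.

```python
# Hypothetical tally of performance-level changes between the regular CRT and the
# CRT-M for the same students (levels coded 1-4; data illustrative).
from collections import Counter

crt_levels   = [1, 2, 2, 3, 1, 4, 2]   # regular CRT performance levels
crt_m_levels = [2, 2, 4, 3, 3, 3, 3]   # CRT-M performance levels for the same students

changes = Counter(m - r for r, m in zip(crt_levels, crt_m_levels))
labels = {3: "Up Three Levels", 2: "Up Two Levels", 1: "Up One Level", 0: "No Change",
          -1: "Down One", -2: "Down Two", -3: "Down Three"}

total = len(crt_levels)
for delta in sorted(changes, reverse=True):
    print(f"{labels[delta]:>15}: {changes[delta]} ({changes[delta] / total:.0%})")
```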

Slide 20: MT GSEG 05 student and teacher surveys: Difficulty
Students taking the CRT-M rated the test as slightly less difficult, relative to their classroom content, than students taking the regular CRT did.
Teachers felt the modified test should be modified further to reach the students with the greatest challenges.

Slide 21: MT GSEG 05 feasibility question: Is the CRT-M a better measure?
□ Students answered more items correctly
□ Student scores went up
□ Students moved up to another proficiency level
□ Validity indicators improved
□ More data analysis and study are needed to determine which CRT-M students benefited most

Slide 22: Selected considerations from current research
□ Montana (+ NEC) EAG, : Adapting Test Items to Increase Validity of Alternate Assessments Based on Modified Achievement Standards
□ Montana GSEG, : Identifying Students in Need of Modified Achievement Standards and Developing Valid Assessments

Slide 23: MT EAG, 2007
Focus on high school reading comprehension to:
□ determine the processing requirements of test passages and items (using a coding strategy)
□ describe the cognitive abilities and challenges of the target population
□ conduct cognitive labs
□ develop item modifications based on cognitive variables

Slide 24: Preliminary feedback from Expert Panel ( )
Some noted cognitive variables:
□ Abstract reasoning that relies on information from the entire passage
□ Long passages that require sustained attention
□ Limited experience with multiple meanings of vocabulary words
□ Dense passages that require large amounts of working memory
□ Location in the passage where the information needed to answer the question is found
□ Irrelevant information in the passage that makes mapping and sorting difficult
□ Emotional content that is difficult for students with ED to process
□ Answers to questions not found in the passage (e.g., reliance on prior knowledge)

Slide 25: Montana GSEG, 2007
Focus on middle school reading and mathematics to:
□ identify students in need of modified achievement standards (MAS)
□ determine what content knowledge the student is lacking to achieve proficiency
□ develop a dynamic online assessment that provides scaffolding based on distractor selection (sketched below)
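A hypothetical sketch of the distractor-driven scaffolding idea in the last bullet: each wrong option is mapped to a hint that targets the misconception it represents, and the item is re-presented. The item, options, and hints are invented and are not the actual MT GSEG item design.

```python
# Hypothetical sketch of a dynamic item: the scaffold shown depends on which
# distractor the student selected. Content is invented for illustration.

item = {
    "stem": "3/4 + 1/8 = ?",
    "options": {"A": "7/8", "B": "4/12", "C": "4/8", "D": "3/32"},
    "key": "A",
    "scaffolds": {
        "B": "Hint: the denominators are not the same, so they cannot be added directly.",
        "C": "Hint: rewrite 3/4 with a denominator of 8 before adding.",
        "D": "Hint: adding fractions does not multiply the denominators.",
    },
}

def respond(item: dict, choice: str) -> str:
    if choice == item["key"]:
        return "Correct."
    # The scaffold is selected from the specific distractor, then the item is re-presented.
    return item["scaffolds"].get(choice, "Try again.") + " Re-presenting the item."

print(respond(item, "C"))
```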

Slide 26: Students who will be included in the MT GSEG study samples
Middle school reading and mathematics:
□ Sample for analyzing items and distractors: all students who took the tests of interest, disaggregated
□ Sample for cognitive labs: convenience sample of 48 students (24 per content area)
□ Sample for pilot test: approximately 5% of the total population

Slide 27: MT CRT Grade 10 Reading Example, 2007

Slide 28: Implications of research for identification of the target population
Use of performance data:
□ Longitudinal performance data
□ Students who perform so low that nothing is known about them
□ Match between classroom performance and test performance
□ Distractor analyses (a sketch follows this slide)
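A minimal sketch of a distractor analysis like the one listed above: for one item, tabulate the share of students choosing each option, split by overall score group, so a distractor that draws mainly low scorers points to a specific misconception. The data and column names are invented.

```python
# Hypothetical distractor analysis for a single item, split by overall score group.
import pandas as pd

responses = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s4", "s5", "s6"],
    "total_score": [38, 12, 35, 10, 22, 9],
    "item_7_choice": ["A", "C", "A", "C", "B", "C"],   # assume the key for item 7 is "A"
})

# Split students into low/high groups at the median total score.
responses["group"] = (responses["total_score"] >= responses["total_score"].median()).map(
    {True: "high", False: "low"})

# Share of each group choosing each option; a distractor chosen mostly by the
# low group flags a misconception worth targeting with modified items or scaffolds.
table = pd.crosstab(responses["group"], responses["item_7_choice"], normalize="index")
print(table.round(2))
```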

Slide 29: Implications of research for identification of the target population (cont.)
Use of other data:
□ Teacher judgment data
□ Opportunity-to-learn variables
  □ Mobility
  □ Attendance
  □ Program placement
□ Performance data analyzed by cognitive modeling information
□ Data from standards-driven IEPs

Slide 30: Data collected for "The Whole IEP Process" (C. Massanari)
□ What is the desired outcome for this student?
  □ Three to four years from now
  □ Student's desired post-school outcome
□ What are the skills and knowledge essential to meeting the desired outcome?
□ What are the expectations of the general curriculum relative to the student's age/grade?
  □ Content
  □ Expectations for learning and demonstration of learning
  □ Extracurricular activities or events available

Slide 31: The Whole IEP Process (cont.)
□ How do the skills and knowledge essential to meeting the desired outcome compare with the general curriculum, including content and expectations for learning?
  □ Where are the similarities/connections? Where are the differences?
□ Where within the general curriculum, including extracurricular activities, are the opportunities for learning the needed skills and knowledge?
□ What are the student's present levels of performance?
  □ What skills and knowledge does the student already possess?
  □ What other strengths does the student present?
  □ What are the areas of challenge?
  □ What accommodations, modifications, or other supports have proven beneficial for this student?
□ Given all the information we have discussed thus far, what do we think are reasonable goals for this year?
  □ What are the objectives for each goal?
  □ What instructional accommodations are needed?
  □ What modifications to the general curriculum are needed?
  □ How will progress be reported, and how often?
□ Given the information we have discussed thus far, how will the student participate in state and district-wide assessments?
  □ With peers as given
  □ With peers and with accommodations or modifications
  □ Alternate assessment

Slide 32: Also consider how the model of 1% eligibility guidelines might apply
For example, Montana's eligibility questions for the CRT-Alt. The student MUST:
□ Program: have an active IEP
□ Learning characteristics: have cognitive abilities and adaptive behaviors that require substantial adjustments to the general curriculum
□ Learning objectives and expected outcomes: focus on functional application of skills
□ Delivery of instruction: require direct and extensive instruction

Slide 33: Consider how the model of 1% eligibility guidelines might apply (cont.)
Montana's eligibility questions for the CRT-Alt. Decisions must NOT be based on:
□ Excessive or extended absence
□ Disability category
□ Social, cultural, or economic differences
□ Amount of time receiving special education services
□ Expectation of failure on the general test

Slide 34: So… 2% eligibility considerations might address:
Learning characteristics:
□ What are the cognitive abilities and adaptive behaviors of the target population?
□ What adjustments are needed for the student to participate in the general curriculum (e.g., accommodations/modifications)?
Learning objectives and expected outcomes:
□ How does the student demonstrate application of learned knowledge, skills, and abilities?

Slide 35: 2% eligibility considerations (cont.)
Delivery of instruction:
□ What deconstructions of constructs are necessary for the student to master the grade-level content?
□ What adjustments must be made to simplify the materials used in instruction?
Academic achievement:
□ How is the progress of this student different from the pattern of progress typical for all students at the targeted grade level?

Slide 36: References
□ Bechard, S., & Godin, K. (2007). Identifying and describing students in the gaps in large-scale assessment systems. Paper submitted for publication.
□ Colorado Department of Education. (2005, December). Assessing "students in the gap" in Colorado: Report from the HB Study Committee. Denver, CO: Author.
□ Montana Office of Public Instruction and Measured Progress. (2007, May). Determining the feasibility of an alternate assessment based on modified achievement standards: A planning project and pilot test (Final report for Montana's General Supervision Enhancement Grant, CFDA X, Priority B). Helena, MT: Author.
□ Parker, C. E., & Saxon, S. (2007). "They come to the test, and there is nothing to fold": Teacher views of students in the gaps and large-scale assessments. Paper submitted for publication.
□ Title I—Improving the Academic Achievement of the Disadvantaged; Individuals With Disabilities Education Act (IDEA). Final rule. 72 Fed. Reg. –17781, pts. 200 and 300 (2007, April 9).