Technology Assisted Reading Assessment National Accessible Reading Assessment Projects General Advisory Committee December 7, 2007 Overview of TARA project.

Similar presentations
National Accessible Reading Assessment Projects Defining Reading Proficiency for Accessible Large Scale Assessments Principles and Issues Paper American.

Designing Accessible Reading Assessments National Accessible Reading Assessment Projects General Advisory Committee December 7, 2007 Overview of DARA Project.
National Accessible Reading Assessment Projects Goals of Project NARAP Collaboration General Advisory Committee Project Details (ETS and PARA) Plans for.
Copyright © 2004 Educational Testing Service Listening. Learning. Leading. Using Differential Item Functioning to Analyze a State English-language Arts.
Copyright © 2006 Educational Testing Service Listening. Learning. Leading. Using Differential Item Functioning to Investigate the Impact of Accommodations.
National Accessible Reading Assessment Projects Defining Reading Proficiency for Accessible Large Scale Assessments Discussion of the Principles and Issues.
Copyright © 2004 Educational Testing Service Listening. Learning. Leading. Using DIF to Examine the Validity and Fairness of Assessments for Students With.
PARA Project Overview, Results, Next Steps Martha Thurlow and Deborah Dillon National Accessible Reading Assessment Projects.
Designing Accessible Reading Assessments Examining Test Items for Differential Distractor Functioning Among Students with Learning Disabilities Kyndra.
National Accessible Reading Assessment Projects National Accessible Reading Assessment Projects General Advisory Committee December 8, 2006 Overview of.
Principal Investigators: Martha Thurlow & Deborah Dillon Introduction Assumptions & Research Questions Acknowledgments 1. What characteristics of current.
1 What Is The Next Step? - A review of the alignment results Liru Zhang, Katia Forêt & Darlene Bolig Delaware Department of Education 2004 CCSSO Large-Scale.
Targeted Assistance & Schoolwide Programs NCLB Technical Assistance Audio April 18, :30 PM April 19, :30 AM Alaska Department of Education.
SBA to GLE: The Road Les Morse, Director Assessment & Accountability Alaska Department of Education & Early Development No Child Left Behind Winter Conference.
Module 2 Sessions 10 & 11 Report Writing.
Making the Connection to Assessment. Three components: Common Core State Standards Excellent Matches to State Curriculum Essential Skills and Knowledge.
1 Quality Indicators for Device Demonstrations April 21, 2009 Lisa Kosh Diana Carl.
Selecting and Assigning Accessibility Features and Accommodated Test Forms in PearsonAccess 1 Accessibility Features and Accommodations.
Test Accommodations Students with Disabilities 2013 Presented by Janice Koblick, Curriculum Supervisor Exceptional Student Education 1.
Understanding the ELA/Literacy Evidence Tables. The tables contain the Reading, Writing and Vocabulary Major claims and the evidences to be measured on.
Improving Practitioner Assessment Participation Decisions for English Language Learners with Disabilities Laurene Christensen, Ph.D. Linda Goldstone, M.S.
Evaluation Orientation Meeting Teacher Evaluation System
1 Developing Tests for Departmental Assessment Deborah Moore, Assessment Specialist Institutional Research, Planning, & Effectiveness University of Kentucky.
What’s New with PARCC for ELA? January 30, 2014 Vincent Segalini.
Web Design Principles 5th Edition
Copyright © 2014 by Educational Testing Service. ETS, the ETS logo, LISTENING. LEARNING. LEADING. and GRE are registered trademarks of Educational Testing.
RTI Implementer Webinar Series: Establishing a Screening Process
1 Phase III: Planning Action Developing Improvement Plans.
Conducting a Comprehensive Needs Assessment. Objectives Identify the components of a comprehensive needs assessment Classify the types of data collected.
Goals for Tonight Present the new Georgia Milestones Assessment System Discuss Our Instructional Strategies Discuss Your Role in supporting your child.
East Meets West Conference October 30, Georgia Milestones Comprehensive – single program, not series of tests (e.g., CRCT; EOCT; WA); formative.
Data, Now What? Skills for Analyzing and Interpreting Data
Designing Accessible Reading Assessments Reading Aloud Tests of Reading Review of Research from the Designing Accessible Reading Assessments Projects Cara.
Smarter Balanced Accessibility and Accommodations Policies and Guidelines Chief Instructional Officers Update March 12, 2013.
Overview of the CCSSO Criteria– Content Alignment in English Language Arts/Literacy Student Achievement Partners June 2014.
JHLA Junior High Literacy Assessment. The school year saw the first administration of the Junior High Literacy Assessment. The assessment was.
PARCC Accommodation: Text-to-Speech, Screen Reader Version, ASL Video, Human Reader/Human Signer For the ELA/Literacy Assessment December 2014.
N C E O National Center on Educational Outcomes Accommodation Decisions: Policy, Training, and Monitoring as Critical Aspects of an Objective Approach.
Georgia Modification Research Study Spring 2006 Sharron Hunt Melissa Fincher.
Accessibility for Michigan’s Assessments Region IV Assistive Technology Consortium Fall Conference November 14, 2014.
New Hampshire Enhanced Assessment Initiative: Technical Documentation for Alternate Assessments Alignment Inclusive Assessment Seminar Brian Gong Claudia.
Designing Accessible Reading Assessments Research on Making Large Scale Assessments More Accessible for Students with Disabilities Institute of Education.
Principles of Assessment
PARCC Information Meeting FEB. 27, I Choose C – Why We Need Common Core and PARCC.
Donna McNear 1 Welcome to… Essential Supports for Beginning Readers in the 21st Century: Examining the Teacher’s Role Presented at:
Understanding Students with Visual Impairments
Nancy Lister Grant Administrator, Career, Standards, and Assessment Services Kansas State Department of Education Julia Shaftel, Ph.D. Principal Investigator,
Building Effective Assessments. Agenda  Brief overview of Assess2Know content development  Assessment building pre-planning  Cognitive factors  Building.
Junior High Literacy Assessment May 26-28, 2008.
Assessment Update Testing Students with Disabilities District Test Coordinator Meeting Douglas Alexander Anne Mruz Suzanne Swaffield June 11,
Standardization and Test Development Nisrin Alqatarneh MSc. Occupational therapy.
National Accessible Reading Assessment Projects Research on Making Large-Scale Reading Assessments More Accessible for Students with Disabilities June.
TOM TORLAKSON State Superintendent of Public Instruction National Center and State Collaborative California Activities Kristen Brown, Ph.D. Common Core.
By: Kathryn Sheriff Segers, PhD, NBCT, CTVI Program Specialist -Accessible Instructional Materials (AIMs) Georgia Department of Education.
Cara Cahalan-Laitusis Operational Data or Experimental Design? A Variety of Approaches to Examining the Validity of Test Accommodations.
 The Pennsylvania National Agenda (PANA) committee, with the help of the Pennsylvania Training and Technical Assistance Network (PaTTAN) and the support.
Technology Assisted Reading Assessment Cara Cahalan Laitusis Educational Testing Service Martha Thurlow NCEO.
Michigan Educational Assessment Program MEAP. Fall Purpose The Michigan Educational Assessment Program (MEAP) is Michigan’s general assessment.
In the State-Required Assessment and Accountability Programs 703 KAR 5:070 1.
Assessment and Testing
Alternate Proficiency Assessment Erin Lichtenwalner.
PARCC Accessibility Features and Accommodations Manual Training for Parents Presented on November 20, 2014 Presented by the: Office of Special Education.
So What is Going to be Happening with State Assessment for Students with Disabilities for 2007/2008? Peggy Dutcher Fall 2007 Assessment and Accountability.
MontCAS CRT-Alternate (CRT-Alt) Spring 2011 Test Administrator Training Grades 3-8 and 10 in Reading and Math Grades 4, 8, and 10 in Science Presentation.
Summer Institutes 2013 Changing Teacher Practice Changing Student Outcomes.
FACULTY MEETING OCTOBER 14, 2009 Standards-Based IEPs.
New Assessments and Accommodations
Perspectives on Equating: Considerations for Alternate Assessments
Presentation transcript:

Technology Assisted Reading Assessment National Accessible Reading Assessment Projects General Advisory Committee December 7, 2007 Overview of TARA project Psychometric work Test Development Elizabeth Stone Survey and Interviews of Teachers of Students with Visual Impairments (TVIs) Martha Thurlow

Technology Assisted Reading Assessment Overview of TARA project

Technology Assisted Reading Assessment Focus on improving state reading assessments for students with visual impairments Jointly directed by Cara Cahalan-Laitusis (ETS) and Martha Thurlow (NCEO) Subcontract to Center for Applied Special Technology (CAST)

Technology Assisted Reading Assessment Project Tasks Examine the performance of operational ELA tests for students with visual impairments Develop a prototype Technology Assisted Reading Assessment Contribute research findings to the NARAP Principles and Guidelines Conduct the NARAP/TARA field test with VI students

Technology Assisted Reading Assessment Psychometric work

Technology Assisted Reading Assessment Nature and Purpose of DIF and DDF Studies Evaluate comparability of measurement characteristics for students without disabilities and students who are blind or visually impaired –No reason to consider overall abilities of groups to be different –Look for construct-irrelevant causes A priori DIF hypotheses (position, visual content, textual content, language, format)

Technology Assisted Reading Assessment Description of Test ELA component of large-scale state standards test Focused analyses on 4th and 8th grades 75 multiple choice questions per grade –56% Reading / 44% Writing –Essay component in 4th grade, excluded from analyses

Technology Assisted Reading Assessment Description of Samples: Summary Statistics

Technology Assisted Reading Assessment Methods of Analysis DIF (Differential Item Functioning): See whether the groups get an item right or wrong in different proportions after being matched on ability –Mantel-Haenszel with purification –ELA score as matching criterion –ETS delta-DIF categories (-C, -B, A, B, C) DDF (Differential Distractor Functioning): See whether the groups choose distractors in different proportions after being matched on ability –Standardization method –Typical cut-offs for significance
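The Mantel-Haenszel approach can be made concrete with a short sketch. Below is a minimal, illustrative Python implementation (not the project's operational code) that pools the 2x2 tables across matching-score strata into a common odds ratio, converts it to the ETS delta scale, and assigns the -C/-B/A/B/C category; the operational ETS rules also apply statistical significance tests, and the purification step (re-matching after removing flagged items) is omitted here.

```python
# A minimal sketch of Mantel-Haenszel DIF on the ETS delta scale.
# Assumes per-examinee records of (group, matching_score, correct), where
# group is "ref" or "focal"; the data layout is illustrative only.
import math
from collections import defaultdict

def mh_d_dif(records):
    # Tally a 2x2 table per matching-score stratum:
    # [ref right, ref wrong, focal right, focal wrong]
    strata = defaultdict(lambda: [0, 0, 0, 0])
    for group, score, correct in records:
        i = (0 if correct else 1) if group == "ref" else (2 if correct else 3)
        strata[score][i] += 1
    num = den = 0.0
    for a, b, c, d in strata.values():
        n = a + b + c + d
        if n:
            num += a * d / n  # ref right * focal wrong
            den += b * c / n  # ref wrong * focal right
    alpha = num / den               # MH common odds ratio
    return -2.35 * math.log(alpha)  # delta scale; positive favors the focal group

def delta_dif_category(d):
    # Simplified size-only flags; -B and -C indicate DIF against the focal group.
    size = abs(d)
    label = "A" if size < 1.0 else ("B" if size < 1.5 else "C")
    return label if (d >= 0 or label == "A") else "-" + label
```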

Technology Assisted Reading Assessment DIF Results Summary A priori hypotheses were not supported The number of DIF items was smaller than expected (a sample size issue?) Reading/Writing divide by grade –Most grade 4 Reading DIF items favored the focal group (4/5) –Most grade 4 Writing DIF items favored the reference group (4/5) –Most grade 8 Reading DIF items favored the reference group (5/8) –All grade 8 Writing DIF items favored the focal group (2/2)

Technology Assisted Reading Assessment DIF Results Summary Where DIF did seem to show up: –Metaphor –Items and passages related to how people experience the world (e.g., through touch) –Unusual document format Where DIF did not seem to show up: –Items and passages involving typically sighted activities or interests, e.g., photography (in fact, some favored the focal group) –Effects did not seem to apply to all items associated with a passage

Technology Assisted Reading Assessment [Scatterplot of DIF results, Grade 4. Axes: item difficulty (Easy to Difficult) and DIF direction (Favors Students without Disabilities to Favors Students who are Visually Impaired using Large Print). Reference: Students without disabilities taking standard form. Focal: Students who are visually impaired taking large print form.]

Technology Assisted Reading Assessment [Scatterplot of DIF results, Grade 4. Axes: item difficulty (Easy to Difficult) and DIF direction (Favors Students without Disabilities to Favors Students who are Blind or Visually Impaired using Large Print or Braille). Reference: Students without disabilities taking standard form. Focal: Students who are blind or visually impaired taking large print or braille form.]

Technology Assisted Reading Assessment [Scatterplot of DIF results, Grade 8. Axes: item difficulty (Easy to Difficult) and DIF direction (Favors Students without Disabilities to Favors Students who are Visually Impaired using Large Print). Reference: Students without disabilities taking standard form. Focal: Students who are visually impaired taking large print form.]

Technology Assisted Reading Assessment [Scatterplot of DIF results, Grade 8. Axes: item difficulty (Easy to Difficult) and DIF direction (Favors Students without Disabilities to Favors Students who are Blind or Visually Impaired using Large Print or Braille). Reference: Students without disabilities taking standard form. Focal: Students who are blind or visually impaired taking large print or braille form.]

Technology Assisted Reading Assessment DDF Results (grade 4 only) Results were not as interpretable as hoped –For the large print focal group (10 B or C DIF items): 2 items had a highly significant distractor; 8 items had at least one moderately significant distractor –For the large print or braille focal group (5 B or C DIF items): 1 item had a highly significant distractor; 3 items had at least one moderately significant distractor –Review did not reveal any obvious causes
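For reference, the standardization method behind these distractor flags can be sketched as follows: at each matched score level, compare the proportion of each group choosing a given distractor, then average the differences weighted by the focal group's score distribution. The data layout and the flagging thresholds in the comments are illustrative assumptions, not the study's exact settings.

```python
# A minimal sketch of the standardization method for DDF. Assumes records of
# (group, matching_score, choice) with group "ref" or "focal"; illustrative only.
from collections import defaultdict

def std_p_dif(records, distractor):
    by_score = defaultdict(lambda: {"ref": [0, 0], "focal": [0, 0]})
    for group, score, choice in records:
        tally = by_score[score][group]
        tally[0] += choice == distractor  # chose the studied distractor
        tally[1] += 1                     # total examinees at this score level
    focal_n = sum(cell["focal"][1] for cell in by_score.values())
    stat = 0.0
    for cell in by_score.values():
        f_chosen, f_total = cell["focal"]
        r_chosen, r_total = cell["ref"]
        if f_total and r_total:
            weight = f_total / focal_n  # focal-group weight at this score level
            stat += weight * (f_chosen / f_total - r_chosen / r_total)
    return stat  # e.g., flag |stat| >= .10 as large, .05-.10 as moderate
```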

Technology Assisted Reading Assessment Review of Forms Large print form issues: –Portions were merely an enlargement, so some text elements, e.g. footnotes, were not at the proper font size. –As with the standard form, many different fonts were used; this is not considered best practice for large print. –We noted one passage description that was slightly different, and one item whose wording differed from the standard form.

Technology Assisted Reading Assessment Review of Forms Braille form issues: –Distracting logo encoding and page numbering –Paragraph numbering –Contracted vs. uncontracted braille –Symbols used Symbols not encountered before (e.g., italics) Nemeth code vs. literary braille

Technology Assisted Reading Assessment Discussion There are some caveats: –Sample sizes were smaller than desired: these are low-incidence populations. –We would need more of each type of item to support the inference of a pattern. However, some inferences can be drawn that may be useful in instruction, test development, administration, and research: –access to the curriculum –access to various document formats for test preparation –careful review of test forms for these issues –importance of proctor awareness –need to include these groups in fairness and validity measures

Technology Assisted Reading Assessment Presentations and Publications The DIF and DDF work has been presented at: –Association of Test Publishers (ATP) February 2007 –Association for Supervision and Curriculum Development (ASCD) March 2007 –National Council on Measurement in Education (NCME) April 2007 –Council for Exceptional Children (CEC) April 2007 –Institute of Education Sciences (IES) Research Conference June 2007 –Northeastern Educational Research Association (NERA) October 2007 The TARA DIF work is to be published as an ETS Research Report (RR).

Technology Assisted Reading Assessment Future Steps for TARA psychometric work Examine items and item types across administrations Investigate other DIF methods Test analysis Trend analysis Contribute test development recommendations Creation of TARA using ECD

Technology Assisted Reading Assessment Test Development

Technology Assisted Reading Assessment Evidence-Centered Design Structure –Purpose of Test –High Level Claim –Population of Test Takers –Test Structure –Proficiency Levels

Technology Assisted Reading Assessment Elements of TARA via ECD –Purpose of Test accountability assessment for instruction of technology assisted reading evaluate a student's readiness to participate in the regular state assessment for English language arts using assistive technology accommodations does not replace the state reading assessment –High Level Claim measure the degree to which a student can independently access grade-level English language arts text using assistive technology

Technology Assisted Reading Assessment –Population of Test Takers Students who are blind or visually impaired (using the IDEA 2004 definition) are a heterogeneous group who access text or printed materials in a variety of formats, often using a wide assortment of assistive technology. Of these students, the test will have the following audience: –Students in grades 7 to 10 whose primary method of reading includes assistive technology –Students whose IEP includes instruction in reading with assistive technology –Not appropriate for students with significant cognitive impairments who participate in the state's alternate assessment based on alternate achievement standards –Not appropriate for students who do not have at least literal comprehension skills as measured by a 5-question screening test Elements of TARA via ECD

Technology Assisted Reading Assessment Elements of TARA via ECD –Test Structure Section 1: Literal Comprehension (Screening) Section 2: Using Assistive Technology (Access) –Screen reader tasks would be »adjusting the speed at which material is read, »moving by word or sentence, »accessing embedded links, etc. –Screen-magnification tasks might include »changing the magnification level of the screen, »adjusting the colors, »finding particular portions of the material on-screen. –Other tasks will involve »locating material that is specified literally, e.g. the second paragraph of section three or the sentence reading, "Morse developed the telegraph," »opening and closing documents, »locating structural elements (index, glossary, a particular word in the glossary, or a particular cited work in a bibliography).
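To illustrate how such access tasks might be captured in ECD task models, here is a hypothetical sketch; the field names, task types, and example values are assumptions made for illustration, not the project's actual task shells.

```python
# A hypothetical ECD-style task shell for a Section 2 (access) task.
# All field names and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TaskShell:
    section: int        # 1 = literal comprehension (screening), 2 = AT access
    technology: str     # e.g., "screen reader" or "screen magnifier"
    skill: str          # the targeted access skill
    stimulus: str       # the material the student must access
    evidence_rule: str  # observable behavior that counts as success

example = TaskShell(
    section=2,
    technology="screen reader",
    skill="move by word or sentence",
    stimulus="grade-level ELA passage",
    evidence_rule="navigates to the second paragraph of section three",
)
```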

Technology Assisted Reading Assessment –Proficiency Levels Advanced: Student can access grade-level text independently and efficiently, with satisfactory literal comprehension, using one or more forms of assistive technology. Proficient: Student can access grade-level text independently, with satisfactory literal comprehension, using one or more forms of assistive technology. Below Proficient: Student has, at most, limited ability to access text independently using one or more forms of assistive technology, or has less than satisfactory literal comprehension when using assistive technology. Elements of TARA via ECD
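These levels reduce to a small decision: independence and satisfactory literal comprehension separate Proficient from Below Proficient, and efficiency separates Advanced from Proficient. A hedged sketch follows; how each input would be operationalized is an assumption, not something the project has specified.

```python
# A sketch of the proficiency-level logic; the boolean inputs are an assumed
# summarization of the screening and access sections, shown only to make the
# classification rule explicit.
def proficiency_level(independent: bool, efficient: bool, literal_ok: bool) -> str:
    if not (independent and literal_ok):
        return "Below Proficient"
    return "Advanced" if efficient else "Proficient"
```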

Technology Assisted Reading Assessment Test Development Timeline Draft Evidence Centered Design (ECD) Guide –Population of Test Takers –Score Reports –Proficiency Models Draft ECD Models and Test Blueprint based on research findings (Spring 2008) –Evidence Models, –Task Models and Task Shells, –Assembly Models, and –Scoring Models Develop pilot test items (Summer 2008) Pilot test (Fall 2008) Revise and assemble prototype assessment (Spring 2009) Field test prototype assessment ( )

Technology Assisted Reading Assessment Survey and Interviews of Teachers of Students with Visual Impairments (TVIs)

Technology Assisted Reading Assessment Survey Conducted: May 11-25, 2007 Purpose: To obtain information from teachers of students who are blind or have visual impairments – information about current platforms, reading approaches, and other aspects of reading for these students.

Technology Assisted Reading Assessment Target Sample – TVIs Typically certified special education teachers with extensive coursework and professional development experiences in communication skills, braille instruction, access to assistive technology, and providing support to general education classroom teachers Typically work in state schools for the blind or in one or more regular public schools; usually work across grades and have a wide variety of duties and responsibilities

Technology Assisted Reading Assessment Methods 25-question survey Piloted by peers in assessment and visual impairment Revisions made to ensure logic and ease of use for both sighted respondents and respondents with visual impairments Survey provided online and via paper Volunteer survey respondents selected for interviews

Technology Assisted Reading Assessment Coverage of Survey Items Demographic information on TVI Information about students on caseload Information on instruction Information on assessments

Technology Assisted Reading Assessment Responses 185 online responses 12 paper responses Total responses = 197 (participation request sent to listservs of AER Div 17, AFB lists, & NFB) Roughly a 30% return rate if there was no overlap across lists and all members received the request (unlikely)

Technology Assisted Reading Assessment Analyzed Responses Focused on TVIs with caseloads that included students in grades 7-10 Final count = 146 Those not included may not have had any students in grades 7-10 or may not have been TVIs. The number of responses per item varied, dropping as low as 98

Technology Assisted Reading Assessment Summary of TARA Survey Results TVIs had an average of 12.4 years of experience (range = years; median = 8.5 years) Average caseload of students in grades 7-10 was 5.8 (overall caseload average = 16.1 students) Respondents spent an average of 35% (median 30%, mode 50%) of their instructional time using computer software assistive technology The primary goals most often cited for instructional time were "become a proficient user of assistive technology" (42%) and "read using a combination of approaches" (30%)

Technology Assisted Reading Assessment Summary of TARA Survey Results Respondents spent an average of: –27% of reading instruction time on direct instruction of how to use assistive technologies to assist in reading –19% of time in supported reading aloud –only 9% of time in direct instruction of phonemic strategies (braille or print) Survey data showed that most students had congenital vision loss (81%) rather than adventitious (19%) Most students (80%) also have an additional disability documented on their IEP –largest among them cognitive impairment (28%), physical impairment (17%), and learning disability (16%)

Technology Assisted Reading Assessment Summary of TARA Survey Results The largest percentage of students (28%) receive their services in a general classroom with itinerant support or a general classroom with resource room support (23%), and few receive services at a school for the blind (10%) Students access print through visual (25%) or visual + audio (29%) means a majority of the time A majority of students (96%) use some kind of accommodation or assistive technology at times in the classroom, largest among them: –audio (38%), large print (35%), read aloud (26%), and braille (25%)

Technology Assisted Reading Assessment Summary of TARA Survey Results Students with visual impairments use JAWS for Windows (26%), ZoomText Magnifier (13%), Duxbury (13%), and ZoomText Magnifier/Reader (10%) to access text most often A positive correlation occurred between the number of years spent as a TVI and the % of students using tactile + audio to access print (p=.010) The % of students whose primary goal is to become a proficient user of assistive technology correlates positively with the proportion of instructional time TVIs spend using computer software assistive technology (p<.001)
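The correlations reported here are standard bivariate tests; a minimal sketch of how such a statistic might be computed, using scipy with made-up placeholder values rather than the survey data:

```python
# Illustrative correlation test; the numbers below are placeholders, not the
# TARA survey data, and the survey's actual statistical method is not stated.
from scipy.stats import pearsonr

years_as_tvi = [3, 7, 12, 20, 25, 30]
pct_tactile_audio = [0.05, 0.10, 0.20, 0.30, 0.35, 0.50]

r, p = pearsonr(years_as_tvi, pct_tactile_audio)
print(f"r = {r:.2f}, p = {p:.3f}")  # e.g., report as significant if p < .05
```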

Technology Assisted Reading Assessment Summary of TARA Survey Results There was an inverse relationship between the size of the caseload and the percent of students using these accommodations: –Braille (p=.044) –Audio (p=.004) –CCTVs (p=.030) –Screen Readers (p=.003) The % of students whose primary goal is to read using a combination of approaches is inversely correlated with the relative importance the student's TVI places on sounding out words (p=.011)

Technology Assisted Reading Assessment Implications of Survey Results AT use is a large part of instruction for students with VI TVIs take a blended approach to teaching reading (e.g., using a variety of modalities) Standardization of assessments may be a challenge because students use different technology for different purposes

Technology Assisted Reading Assessment TARA Interview Status Interview sample consisted of TVIs from a variety of settings (schools for the blind, resource rooms, itinerant) By the end of November 2007, 27 interviews had been completed.

Technology Assisted Reading Assessment Preliminary Findings Other assessments exist from which to build the TARA assessment (e.g., Texas School for the Blind) Assistive technology use appears to depend on the motivation of teachers and students In many states, the only option for large-scale assessment is large print or braille

Technology Assisted Reading Assessment Preliminary Findings The non-portability of some technologies makes it difficult for students to practice at home or in some settings (e.g., community settings, different classrooms) Product selection often depends on teacher knowledge of the product (teachers are more willing to recommend products with which they are familiar)