Study of Device Comparability within the PARCC Field Test.

Presentation transcript:

Study of Device Comparability within the PARCC Field Test

Goal of Test Delivery
PARCC's ultimate goal for test delivery: digital delivery of the PARCC ELA and mathematics assessments
– On the widest variety of devices that will support interchangeable scores
– E.g., desktop computers, laptops, and tablets
Fairness

Quantitative Comparability Study
"Tablets" = full-size (10") iPads
One form of each of the following tests was chosen for administration on iPads:
– Grade 4 ELA/Literacy and Mathematics
– Grade 8 ELA/Literacy and Mathematics
– Grade 10 ELA/Literacy
– Geometry
Selected "condition 1" forms so that the same students took both the PBA and EOY components of the selected forms

Research Questions
1. Do the individual items/tasks perform similarly across computers and tablets?
2. Are the psychometric properties of the test scores similar across computers and tablets?
3. Do students perform similarly on the overall test across computers and tablets?

Methodology

Data Collection Design
Grade 8 and high school studies used random assignment of Burlington, MA students to computer and tablet conditions
– Random assignment to conditions by homeroom or class section
Grade 4 study used a matched sample from MA
– Burlington students assigned to the tablet condition were matched to other MA students who tested on computer
– Matching based on previous scores on the state assessment, the Massachusetts Comprehensive Assessment System (MCAS) (see the sketch below)
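The Grade 4 matched-sample design can be illustrated with a minimal sketch. This is not the study's actual procedure: the greedy 1:1 nearest-neighbor rule, the column names (`student_id`, `mcas_scaled_score`), and matching on a single prior score are simplifying assumptions; the real matching may have used additional variables or a different algorithm.

```python
# Illustrative sketch only: 1:1 nearest-neighbor matching on a prior MCAS score.
# Assumes the computer-condition pool is at least as large as the tablet group.
import pandas as pd

def match_on_prior_score(tablet: pd.DataFrame, computer: pd.DataFrame,
                         score_col: str = "mcas_scaled_score") -> pd.DataFrame:
    """Greedily pair each tablet-condition student with the computer-condition
    student whose prior MCAS score is closest, without replacement."""
    available = computer.copy()
    pairs = []
    for _, t in tablet.iterrows():
        # Index of the closest remaining computer-condition student.
        idx = (available[score_col] - t[score_col]).abs().idxmin()
        c = available.loc[idx]
        pairs.append({"tablet_id": t["student_id"],
                      "computer_id": c["student_id"],
                      "score_gap": abs(t[score_col] - c[score_col])})
        available = available.drop(idx)  # 1:1 matching without replacement
    return pd.DataFrame(pairs)
```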

Analysis Methods
Item/Task-Level Analysis
– Comparison of p-values and item means
– Analysis of IRT item difficulty estimates
Component-Level Analysis
– Correlation between PBA and EOY scores
Test-Level Analysis
– Reliability
– Validity
– Score Interpretation
(A brief code sketch of these item-, component-, and test-level checks follows below.)
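The sketch below gives a rough sense of the checks listed above. The column names (a `device` flag with values "computer"/"tablet", scored item columns, and `pba_score`/`eoy_score` component totals) and the 0.10 flagging threshold are illustrative assumptions, not the study's actual criteria; the IRT difficulty comparison and the validity and score-interpretation analyses are not shown.

```python
# Illustrative sketch only: classical item statistics by device condition,
# a PBA-EOY component correlation, and coefficient alpha as a reliability index.
import pandas as pd

def item_stats_by_device(responses: pd.DataFrame, item_cols: list[str],
                         device_col: str = "device") -> pd.DataFrame:
    """Compare mean item scores (p-values for dichotomous items) across devices."""
    means = responses.groupby(device_col)[item_cols].mean().T  # rows = items, cols = devices
    means["difference"] = means["computer"] - means["tablet"]
    means["flagged"] = means["difference"].abs() > 0.10  # arbitrary illustrative threshold
    return means

def component_correlation(scores: pd.DataFrame, device_col: str = "device") -> pd.Series:
    """Correlate PBA and EOY component scores within each device condition."""
    return scores.groupby(device_col).apply(
        lambda g: g["pba_score"].corr(g["eoy_score"]))

def cronbach_alpha(item_scores: pd.DataFrame) -> float:
    """Coefficient alpha as a simple internal-consistency (reliability) index."""
    k = item_scores.shape[1]
    item_var = item_scores.var(axis=0, ddof=1).sum()
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)
```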

Summary of Results

Observed Device Effect
Grade 4 Mathematics
– Device effect found for 18 of 51 (35%) items
– Elementary students less familiar with taking mathematics tests online
– Degree of success in matching samples for Grade 4
Grade 8 Mathematics
– Device effect found in component-level and reliability analyses
– Highest number of items (29 of 67, or 43%) excluded from the study

Observed Device Effect
Grade 4 ELA
– Device effect found in validity and score interpretation analyses
– Elementary students less familiar with taking items/tasks online that are not selected-response
– Degree of success in matching samples for Grade 4
A consistent device effect across analyses was not observed for any of the tests in the study
– No device effect was found in any of the analyses for Grade 8 ELA and Geometry

Conclusions and Implications

Conclusions
1. Do the individual items/tasks perform similarly across computers and tablets?
o YES, for most items/tasks in the study
o More items showed a device effect in Grade 4
– Unfamiliarity of elementary school students with taking certain item types online
– Degree of success in matching the samples
o Too few items showed a device effect to draw conclusions about item features

Conclusions
2. Are the psychometric properties of the test scores similar across computers and tablets?
o YES, for all but one test in this study
o Exception: Grade 8 Mathematics (component-level and reliability analyses)
– Exclusion of the highest number of items from the study may have led to less stable correlation estimates (see the sketch below)
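One way to see why excluding many items can weaken these estimates is the Spearman-Brown relationship between test length and reliability; lower reliability in turn attenuates observed component correlations. The full-length reliability of 0.90 below is a purely hypothetical value chosen for illustration, not a result reported in the study.

```python
# Minimal sketch: Spearman-Brown projection of reliability for a shortened test.
# The 0.90 full-length reliability is a hypothetical assumption.
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability when test length is multiplied by length_factor."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

full_items, excluded_items = 67, 29                 # 29 of 67 Grade 8 math items excluded
factor = (full_items - excluded_items) / full_items # test shortened to ~57% of its length
print(round(spearman_brown(0.90, factor), 2))       # ~0.84: fewer items, lower reliability
```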

Conclusions
3. Do students perform similarly on the overall test across computers and tablets?
o In general, YES – no consistent device effect was observed across analyses for any test in the study
o Device effect found in the score interpretation analysis for Grade 4 ELA
– Unfamiliarity of elementary school students with taking non-selected-response tasks online
– Degree of success in matching the samples

Implications
Comparability of assessments administered on computers and tablets
– No evidence of large or consistent differences in comparability was found in this study
– Also supported by device comparability research conducted outside of PARCC (e.g., Davis, Orr, Kong, & Lin, 2014; Olsen, 2014; Davis, Kong, & McBride, 2015)
– Further supported by policies in other large-scale assessment programs (e.g., SBAC and other statewide assessments)

Implications
Item development and user interface design
– Consider familiarity of younger students with nontraditional item types online
– Additional focus groups and/or cognitive labs with elementary school students
– Minimize the use of item features (e.g., drag and drop) that may lead to differential performance across computers and tablets