
On-Line Student Assessment
Richard Hill, Center for Assessment
Nov. 5, 2001

Speaking Points
- Current paper-and-pencil-based assessments
- Image scoring
- Computer administration
- Computer scoring

Typical Current Paper-and-Pencil-Based Statewide Assessment
- 3 grades
- Reading, writing, math, science, social studies
- 30 MC and 6 OE questions for each of the four non-writing areas; one essay for writing
- 50,000 students per grade

Materials Processed
- 150,000 test booklets: 2 million sheets of paper, 10 tons, a stack 700 feet high (rough arithmetic check below)
- 150,000 answer documents: 1.5 million sheets of special paper, 7.5 tons
- 600 boxes to store (per year)
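
The tonnage and stack-height figures can be sanity-checked with ordinary paper specs. In the sketch below, the per-sheet weight and thickness are assumptions (typical office paper), not figures from the presentation.

```python
# Rough sanity check of the paper figures on this slide.
# Sheet weight (~4.5 g) and thickness (~0.004 in) are assumed typical
# values for office paper, not numbers from the presentation.
test_sheets = 2_000_000          # sheets across all test booklets
sheet_weight_g = 4.5             # assumption
sheet_thickness_in = 0.004       # assumption

tons = test_sheets * sheet_weight_g / 1000 / 907.2   # grams -> kg -> short tons
stack_ft = test_sheets * sheet_thickness_in / 12     # inches -> feet

print(f"~{tons:.0f} tons of test-booklet paper")     # ~10 tons (slide: 10 tons)
print(f"stack ~{stack_ft:.0f} feet high")            # ~667 ft (slide: 700 feet)
```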

Process
- Materials shipped to schools
- Materials shipped back to contractor
- Materials logged in
- Count everything, resolve discrepancies
- Note that one misplaced school can stop the entire process

Process for Receiving Materials
- Separate answer booklets from test booklets
- Test booklets placed in temporary storage in original boxes, then destroyed after reporting is complete
- Answer sheets guillotined
- MC answer sheets scanned
- OE sheets packaged for scoring

Processing of OE Sheets
- Separate by content area
- Sorted by form, randomized across schools
- Scanned to capture ID numbers
- Scoring headers prepared, then merged with answer sheets

Scoring
- Hire, train, qualify
- Score
- On-going evaluation of quality of scoring
- Determine papers that need adjudication, then rescore as necessary
- Scan scoring headers
- Merge MC, OE and writing scores

Scoring Time
- 20 seconds per OE question
- 5 minutes per essay (2 scorings plus adjudication, if necessary)
- 13 minutes per student
- 32,500 hours
- 1,000 person-weeks, plus training, qualifying, quality control and equating (arithmetic sketched below)
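
A minimal sketch of the arithmetic behind these figures, assuming the 6 OE questions apply to each of the four non-writing areas (24 OE responses per student) and roughly 32.5 productive scoring hours per person-week, consistent with the slide's totals:

```python
# Sketch of the scoring-time arithmetic. Assumes 6 OE questions in each
# of the four non-writing content areas and ~32.5 productive scoring
# hours per person-week (consistent with the slide's totals).
oe_per_student = 6 * 4
seconds_per_oe = 20
essay_minutes = 5                     # two scorings plus adjudication, if needed
students = 50_000 * 3                 # three grades, 50,000 students each

minutes_per_student = oe_per_student * seconds_per_oe / 60 + essay_minutes
total_hours = minutes_per_student * students / 60
person_weeks = total_hours / 32.5

print(minutes_per_student, total_hours, person_weeks)   # 13.0 32500.0 1000.0
```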

Equating to Previous Year
- MC
- OE
  - Difficulty of items
  - Changes in scoring

Count, Count, Count
- Initial log-in counts
- After packaging
- Every time a box is opened or closed
- Count boxes, too

Final Steps
- Ship reports back to schools
- Resolve problems
  - Missing or misplaced students
  - Challenges to scoring (requires finding answer sheets, perhaps all for one student)
- Destroy test materials
- Long-term storage for answer documents

Solution #1: Image Scoring
- High-speed scanners capture images of documents
- All processing is done on CRTs by looking at an electronic image of the original paper

Advantages
- Control
  - Scoring: blind read-behinds, real-time tracking of the accuracy of every scorer (see the tracking sketch below), multiple sites
  - Equating: blind rescores from the previous year
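
A minimal sketch of how blind read-behinds can feed real-time scorer tracking; the data format and the 80% agreement threshold are illustrative assumptions, not details from the presentation.

```python
# Illustrative only: track each scorer's exact agreement with expert
# read-behind scores and flag anyone who drifts below a threshold.
from collections import defaultdict

def read_behind_report(pairs, min_agreement=0.80):
    """pairs: iterable of (scorer_id, scorer_score, expert_score) tuples."""
    totals, matches = defaultdict(int), defaultdict(int)
    for scorer, score, expert in pairs:
        totals[scorer] += 1
        matches[scorer] += int(score == expert)
    return {s: {"agreement": matches[s] / totals[s],
                "flagged": matches[s] / totals[s] < min_agreement}
            for s in totals}

print(read_behind_report([("A", 3, 3), ("A", 2, 3), ("B", 4, 4)]))
# {'A': {'agreement': 0.5, 'flagged': True}, 'B': {'agreement': 1.0, 'flagged': False}}
```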

Advantages (cont’d)
- Scoring speed
  - Next response is ready to be scored when the first is done
  - Scoring stops when rates decline
  - No fumbling for papers
  - Up to 1/3 faster

Advantages (cont’d)
- Tracking
  - No need for counting
  - Nothing is lost
  - Nothing is damaged
  - Records automatically linked
  - Special-request papers easy to obtain: prep for next year's scoring, challenged papers, adjudication

Advantages (cont’d)
- Reporting: send a sample of work home to parents
- Storage: permanent, compact

Disadvantages
- Hardware and software costs
  - Costs have dropped dramatically (a server that sold for $150,000 two years ago now sells for $16,000)
- Need to prove that scoring is the same
  - Writing vs. OE
- Connectivity
- Power outages

Computer-Administered Tests
- Web-based vs. CD
- Comparability standards, especially for writing
  - Students who write on paper and then just type it in
  - Full use of computer capabilities
  - Underestimation of (some) students' abilities

Georgia’s Proposed System
- Huge item bank, three levels
- Teachers can create tests
- Capacity concerns for Level III tests

Advantages
- Elimination of paper
- Accommodations
- Adaptive testing (see the sketch below)
- Shorter tests
- Diagnostic tests
- Lower frustration levels
- Real-time scoring
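
Adaptive testing is why online tests can be shorter: each item is chosen near the current ability estimate, so little time is spent on items that are far too easy or too hard. The loop below is a deliberately simplified sketch; real engines use IRT-based estimation, and the item bank and update step here are made up.

```python
# Deliberately simplified adaptive-testing loop (illustrative only):
# pick the unused item closest in difficulty to the current ability
# estimate, then nudge the estimate up or down by a fixed step.
def adaptive_test(difficulties, answer_fn, n_items=5, step=0.5):
    ability, used = 0.0, set()
    for _ in range(n_items):
        item = min((i for i in range(len(difficulties)) if i not in used),
                   key=lambda i: abs(difficulties[i] - ability))
        used.add(item)
        ability += step if answer_fn(item) else -step
    return ability

# Hypothetical item bank; the simulated student answers easy items correctly.
bank = [-1.5, -0.5, 0.0, 0.5, 1.0, 1.5]
print(adaptive_test(bank, lambda i: bank[i] < 0.3))
```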

Issues
- Administration time: all schools have some computers, but how many?
- Transition: recommendation is to test all schools the same way
- Comparability
- Logistics of operating two programs at the same time

Computer Scoring
- Major vendors (NCME Session N1, April 12, 2001):
  - ETS Technologies: E-rater (Princeton, NJ)
  - Vantage Learning: Intellimetric (Yardley, PA)
  - TruJudge: Project Essay Grade (PEG) (Purdue)
  - Knowledge Analysis Technologies: Intelligent Essay Assessor (Boulder, CO)

Advantages
- Time
- Cost
- Objective (or at least impersonal)

Issues
- Accuracy rates
  - PA study, computers vs. humans (see the agreement sketch below)
    - Computer more accurate than one human
    - Computer less accurate than two humans
  - Bias vs. random error
- Beating the system (“Stakes changes everything”)
- Capacity of contractors to deliver logistics
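
Accuracy comparisons of this kind are usually reported as agreement rates between score sets. The sketch below shows exact and adjacent agreement on made-up data; it does not reproduce the Pennsylvania study's method or results.

```python
# Illustrative agreement metrics for comparing automated and human scores.
# The score vectors are invented; they do not come from the PA study.
def agreement(a, b):
    exact = sum(x == y for x, y in zip(a, b)) / len(a)
    adjacent = sum(abs(x - y) <= 1 for x, y in zip(a, b)) / len(a)
    return {"exact": exact, "adjacent": adjacent}

human_1 = [3, 4, 2, 5, 3, 4]
human_2 = [3, 3, 2, 5, 4, 4]
machine = [3, 4, 2, 4, 3, 4]

print("human vs. human:  ", agreement(human_1, human_2))
print("machine vs. human:", agreement(machine, human_1))
```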

Alternate Testing Modes
- Listening
- Special education adaptations (see Tindel)
- Virtual reality