1
From State-Wide to State Consortium Assessment Systems: Test Administration Lessons Learned From a Consortium Pilot
National Conference on Student Assessment
June 22, 2013
2
PRESENTERS:
Norma Sinclair, Psychometrician and Education Consultant, Connecticut State Department of Education
Juan D’Brot, Executive Director, West Virginia Department of Education
Beth Fultz, Program Consultant, NAEP, Kansas State Department of Education
Paula Hutton, Maine Department of Education
Deborah Matthews, SWD Educational Program Consultant, Kansas State Department of Education
MODERATOR:
Jennifer Paul, EL Assessment Consultant, Michigan Department of Education
3
Overview
–Try out 10,000 new CB Math/ELA test items
–Try out new item types
–Try out test delivery system
–Secure data to analyze the stability of reporting scales
–Secure data to build CAT system
4
Before the Pilot: Expected Scope
–25 member states
–1.2 million students
–9,000 schools
–February to May 2013 testing window
–Untimed pilot administrations
–Desktops, laptops, notebooks, tablet support
–Support for multiple operating systems
5
Test Administration Lessons From a Consortium Pilot
Paula Hutton, Maine Department of Education
Norma Sinclair, Connecticut State Department of Education
Paper presented at the NCSA 2013, Washington, D.C.
6
Without standardized test administrations and testing conditions, the accuracy and comparability of score interpretations as well as student opportunity to demonstrate skills and abilities could be diminished (AERA, APA & NCME, 1999).
7
Our Goals
Describe resources to foster standardization.
Record pilot participant experiences.
Discuss implications for transitioning to multi-state SBAC Assessments.
8
Standardizing the Pilot Administration
Downloadable administration manuals
Webinars and training modules
Practice tests
Help-desk support
Student instructions (test navigation, tools)
Recruitment materials, pilot updates
9
Data Collection
On-site observations:
–Connecticut
–Maine
Email surveys
Observation notes
10
Pilot Participant Comments: Kudos
Manuals/modules: comprehensive coverage of fundamentals
TIDE (Test Information Distribution Engine): efficient for managing students and teachers
Secure browser easy to install
Help desk support
11
Standardization Challenges: Highlights
Challenges based on Standard 5 of the test administration standards.
Administration materials, procedures, resources (5.1)
–Difficulties with operating systems/internet providers
–Voluminous/inconsistent manuals and resource materials
–Reduced performance and ease of use of software (due to multiple TDS technical difficulties)
–Online tools and student supports: inconsistent quality
Pilot administration disruptions/modifications (5.2)
–Untimed pilot testing (non-standardized test lengths)
–Technical difficulties (arbitrary log-offs, computer freezes, error messages, volume control)
12
Pilot Standardization Challenges (cont’d)
Distraction-free testing environment (5.4)
–Testing in open areas in libraries
–Technical difficulties (repeated log-offs, freezes, volume control)
Student test instructions (5.5)
–Misleading instructions
–Quality of videos and audio prompts
–Incorrect pilot component assignments
13
Pilot Standardization Challenges (cont’d)
Unfamiliar test equipment and tools (5.5)
–Incomplete/missing instructions for online tools and student supports
–Practice items unavailable at the top of the session
Responding to test items using unfamiliar equipment (5.5)
–Limited opportunity to learn keyboarding
–No opportunity to practice using navigation and test tools
14
Pilot Participant Wish List
Quick-start administration guide
In-person training
Top-down communications
Have states upload student information
Minimize technical difficulties
Reduce the length of the field test
Reduce keyboarding requirements
15
SBAC Field Test Administration: Implications
Attend to differing needs of CBT and non-CBT states and their students
Implement top-down communication system for test administration
Institute a top-down approach in TIDE to manage students and test administrators
Improve TDS to reduce technical issues
More responsive and accurate help desk
Clarify roles and responsibilities of test administrators (the difference between test administration facilitation and cheating)
16
Test Delivery
Beth Fultz, Kansas Department of Education
National Conference on Student Assessment
Saturday, June 22, 2013
17
Technology Readiness
18
Data
My school was prepared with the technology required to administer the SBAC computer-based Pilot test:
Strongly Agree: 26.4%
Agree: 48.8%
Neither Agree nor Disagree: 6.8%
Disagree: 13.6%
Strongly Disagree: 4.3%
19
Common Issues
Lack of understanding around length of test
Volume Control
Headphones
Ability to move to the next question
Programming issues
–Skipped questions
–Split screens
20
Computer Delivery States
Assumption: SBAC computer test technology would work just like the state assessment
Required closer coordination with district IT staff
Multiple test takers using the same computer on the same day (access point/reboot)
21
Paper-Pencil States
Bandwidth and wireless connectivity
Age of school computers
Unexpected “internal system updates”
Students not as familiar with online matching, drag-and-drop, and calculator tools
Lack of experience in dealing with small problems
–Student getting logged out
–Browser getting stuck
22
Browser/Delivery Systems
23
Data
How easy or difficult did you find the SBAC delivery system to use?
Very Easy: 8.7%
Easy: 39.8%
Neither Easy nor Difficult: 28.9%
Difficult: 18.8%
Very Difficult: 3.7%
24
Data
          SBAC      Kansas
Windows   81.29%    74.89%
iPad       1.89%    10.15%
Mac       14.74%    14.92%
Other      2.08%     0.04%
25
Common Issues
Audio
–Film clips: often audio would not play
–Re-listen to entire section, not a selected section
Drag & drop didn’t always work
–Issue: test would not allow the student to go on to the next question
26
iPad and Tablets
Keyboards
Difficulty seeing the entire question/scrolling
Software would only work if all other programs were closed
Overall, very successful pilot
Fewer technology issues
27
Technical Difficulties
28
Data
Did you, or any of the test takers, experience any technical difficulties during the administration of the test?
Yes: 78.6%
No: 21.4%
29
What technical difficulties did you, or any of the test takers, experience? (Reasons/Responses)
Problems with school computer equipment: 36.0%
Issues with the testing platform/the test itself: 51.5%
Lack of resources to conduct testing: 10.1%
Overall the system was difficult to use: 14.2%
Training materials were inadequate: 15.0%
Insufficient time allowed for introducing students to the test environment: 21.3%
Other: 22.1%
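Because respondents could report more than one type of difficulty, the percentages above sum to well over 100%. A minimal sketch of how such multi-select survey responses can be tabulated follows; the respondent IDs and category labels are hypothetical, not the survey's actual data.

```python
from collections import Counter

# Hypothetical multi-select responses: respondent id -> difficulty types reported.
responses = {
    "r1": ["equipment", "platform"],
    "r2": ["platform"],
    "r3": ["training", "intro_time", "other"],
    "r4": ["platform", "equipment"],
}

counts = Counter(cat for cats in responses.values() for cat in cats)
n_respondents = len(responses)

# Percentages are taken against the number of respondents, so a
# multi-select item can sum to more than 100%, as in the table above.
for category, count in counts.most_common():
    print(f"{category:12s} {100 * count / n_respondents:5.1f}%")
```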
30
Conclusions
31
Comment from a Kansas Testing Coordinator: “Most problems are of the sort that will be resolved before the final test is operational. In other words, the pilot is serving its purpose.”
32
Accessibility and Accommodation for SBAC Pilot Assessment
National Conference on Student Assessment
June 22, 2013
33
Pilot Test
Universal Design: Increased accessibility
–To include 2% students
–Only looking to exempt ALT 1%
Work Group Members
Contract Work
–State Practices
–Literature Reviews
–Policy Recommendations
Advisory Committees
–ELL Advisory Committee
–SWD Advisory Committee
–Cross-consortia ELL Advisory Committee
34
Pilot Test
ELA Writing Tools for Performance Tasks: available to all students
General tools for Math and ELA: most available to all students
Accessibility Pilot Studies: content and grades specified
35
Digital Tools – All Grades
Writing ELA Performance Tasks
Universal Digital Tool    All Students
Bold                      Yes
Italics                   Yes
Underline                 Yes
Indent                    Yes
Cut                       Yes
Copy                      Yes
Paste                     Yes
Spell Check               Yes
Undo/Redo                 Yes
36
Additional Tools and Resources – All Grades
Universal Digital Tool (ELA and Math, all students):
–Tab-Enter Navigation: Yes
–Font Background Color Alternatives: Yes
–Breaks: Yes
–Additional Time: Yes
–Calculator*: N/A for ELA, Yes for Math
* Note: Calculators available on items when they do not interfere with the intended construct.
37
Accessibility Pilot Studies
Feature                          Math Grades    ELA Grades
Full Spanish Translation         3, 7, 11       ---
Customized Spanish glossaries    3, 7, 11       ---
Online refreshable Braille*      3, 7, 11       4, 7, 11
Text to speech                   3, 7, 11       4, 7, 11 (items only)
Customized English glossaries    3              4 (items only)
* Note: Special equipment provided locally.
38
English Language Learners and the Pilot Test
Full Spanish translation
Customized Spanish glossaries
39
Students with Disabilities and the Pilot Test
Technology
Students and Technology
Braille
Text to Speech
40
Item Types & Content Areas
ELA and Math
–Multiple-choice
–Constructed Response
–Performance Tasks
–Classroom Activities
–Technology Enhanced
What new accessibility issues does the item create?
*Special considerations for SWDs and ELLs
41
Technology Enhanced Items
Unnecessary item format (increased cognitive load)
Unclear/confusing design and layout
Appropriateness for visually impaired
Embedded identification of tools and commands for use
All included tools are necessary
42
SBAC Efforts
For Pilot Test:
–All items went through bias, sensitivity, and accessibility vendor review
–SBAC A & A workgroup reviewed items
For Field Test:
–All items to go through vendor review
–SBAC A & A workgroup quality check
–Large, coordinated SBAC review by state members
–Strengthen training
43
Pilot Studies Results
Gather information about the process of providing accommodations and the results of offering them.
Provide feedback to states, work groups, experts.
Incorporate what we learn into field test development work.
Continue to develop materials and resources that are state friendly.
44
Test Security: From State to Consortium and Back
National Conference on Student Assessment
June 22, 2013
45
Purpose of the Pilot
–Test thousands of new CB Math/ELA test items
–Test new item types
–Test a new delivery system
–Analyze the stability of reporting scales
–Secure data to build CAT system
–Unnamed: to identify gaps in policy, process, and procedure
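One purpose listed above, securing data to build a CAT system, ultimately means estimating item parameters that an adaptive engine can use to pick the most informative next item. The sketch below is only a generic illustration of that idea, maximum-information item selection under a Rasch model with made-up item difficulties; it is not the Smarter Balanced engine.

```python
import math

def rasch_prob(theta, b):
    """Probability of a correct response under the Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def item_information(theta, b):
    """Fisher information of a Rasch item at ability estimate theta."""
    p = rasch_prob(theta, b)
    return p * (1.0 - p)

def next_item(theta, item_bank, administered):
    """Pick the unadministered item with maximum information at theta."""
    candidates = [(item, b) for item, b in item_bank.items()
                  if item not in administered]
    return max(candidates, key=lambda ib: item_information(theta, ib[1]))[0]

# Hypothetical item difficulties: the kind of parameter the pilot data
# would be used to estimate before a CAT can be assembled.
bank = {"item01": -1.2, "item02": -0.3, "item03": 0.4, "item04": 1.1}
print(next_item(theta=0.0, item_bank=bank, administered={"item02"}))  # item03
```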
46
Timeline and Challenges at Two Levels
Pilot Test (SY 2012-2013) & Field Test (SY 2013-2014)
2 opportunities:
–Smarter Balanced: System readiness
–Smarter Balanced & States: Field readiness, policy availability, defined processes and procedures
47
Test Security
TILSA Test Security Guidebook
Three Key Areas
1. Prevention
2. Detection
3. Follow-up Investigation
Primary Goal:
–Ensure the reliability of results and the validity of student responses
–Validity: the consequential kind (accountability for schools and teachers)
48
Prevention
Program Management
–Security plans and staffing
Test Design and Deployment
–Item pools mitigating over-exposure
Test Administration and Scoring
–Procedures, rules, documentation
Quality Control
–Security of items and materials
–Web monitoring
–Training and security awareness
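The "item pools mitigating over-exposure" point above can be made concrete with a small monitoring sketch: compute each item's exposure rate from administration logs and flag items above a ceiling. The log format and the 0.25 ceiling are assumptions for illustration, not consortium or TILSA rules.

```python
from collections import Counter

def flag_overexposed(sessions, ceiling=0.25):
    """sessions: one set of item ids per examinee. Returns items whose
    exposure rate (share of examinees who saw them) exceeds the ceiling."""
    seen = Counter(item for items in sessions for item in set(items))
    n = len(sessions)
    return {item: count / n for item, count in seen.items()
            if count / n > ceiling}

# Hypothetical administration logs for four examinees.
logs = [{"i1", "i2"}, {"i1", "i3"}, {"i1", "i4"}, {"i2", "i5"}]
print(flag_overexposed(logs))  # i1 (0.75) and i2 (0.5) exceed the ceiling
```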
49
Detection
Reporting Protocols
Investigation Protocols
Data Forensics
–Erasure
–Person-fit
–Answer change
–Gains and losses
–Similarity/collusion
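To make the similarity/collusion idea listed above concrete, one crude index is the share of jointly answered items on which two examinees chose the same wrong option. The sketch below illustrates that idea only; operational forensics programs use formal indices with known null distributions.

```python
def identical_incorrect_share(resp_a, resp_b, key):
    """Share of jointly answered items on which both examinees chose the
    same wrong option (a crude stand-in for formal similarity indices)."""
    joint = [q for q in key if q in resp_a and q in resp_b]
    if not joint:
        return 0.0
    same_wrong = sum(1 for q in joint
                     if resp_a[q] == resp_b[q] and resp_a[q] != key[q])
    return same_wrong / len(joint)

# Hypothetical answer key and two response vectors.
key = {"q1": "A", "q2": "C", "q3": "B", "q4": "D"}
student1 = {"q1": "A", "q2": "B", "q3": "D", "q4": "D"}
student2 = {"q1": "A", "q2": "B", "q3": "D", "q4": "C"}
print(identical_incorrect_share(student1, student2, key))  # 0.5
```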
50
Follow-up Investigations
Evidence guidelines and criteria
–What kinds, how much, and from whom?
Roles and Responsibilities
–State staff
–Vendor staff
–LEA and school staff
Investigation Tool Kit for SEAs and LEAs
–Expectations, requirements, roles, responsibilities, and types of information
Timelines
–A dedicated and transparent plan
51
The Pilot Test Experience
States may have been unprepared beyond their current policies
Pilot test administration manual:
–Informed administrators about security of the assessment
–If everyone read it
Consortium data
–What kinds of analyses will be conducted?
State feedback (WV):
–General adherence to security requirements of state policy
–Multiple cases of “breach” events
–A few cases of test impropriety
52
The Pilot Test Experience (cont’d)
State feedback (WV):
–A few cases of test impropriety
Recourse
–Little, due to lack of a paper trail
–Non-state-directed process led to a degree of disconnect between LEA and SEA, which affected prevention
–Lessons learned for Field Test and Operational
53
A Potential State Example
What if an administrator posed as a student?
–Threats to:
  Response validity
  Item over-exposure
  Influencing field test scaling
–What recourse does a state have?
  Against what state policy?
  Against what LEA and school required process?
  Depends on documentation…
54
What’s Needed for the Field Test
Processes (co-chairs present…)
–More support from the Smarter Balanced test administration standpoint
–Increased integration with state-specific best practices
–States requiring training from the consortium
–Leads to:
Documentation of Evidence
–Signed agreements
–Training verification
–Creating a paper trail
55
What States Need for Operational Administration
Solid Policy and Guidelines
–Consortium-sponsored guidelines, potentially policy
–State-defined policies, requirements, and guidance
–LEA-focused toolkits, plans, and requirements
–Signed user agreements
–Minimum standards for states to engage in the consortium
56
What States Need for Operational Administration
Agreements
–Vendor agreements to get data/reports
Data Themselves
–Student-level responses
  Answer changing (think erasure)
  Pattern analyses
  Person-fit analyses
–Latency data
–Time-stamped data
–New flagging criteria
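Latency data support one of the simplest flagging criteria alluded to above: unusually short response times can signal rapid guessing or item pre-knowledge. The sketch below is illustrative only; the 10-second threshold and 30% share cutoff are assumptions, not consortium policy.

```python
def flag_rapid_responders(latencies, min_seconds=10, max_share=0.3):
    """latencies: student id -> list of per-item response times in seconds.
    Flags students whose share of very fast responses exceeds max_share."""
    flagged = {}
    for student, times in latencies.items():
        if not times:
            continue
        share = sum(1 for t in times if t < min_seconds) / len(times)
        if share > max_share:
            flagged[student] = round(share, 2)
    return flagged

# Hypothetical time-stamped response data reduced to per-item latencies.
times = {"s01": [45, 62, 8, 71], "s02": [6, 4, 9, 30], "s03": [40, 55, 38, 60]}
print(flag_rapid_responders(times))  # {'s02': 0.75}
```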
57
General Questions?