Julie Quinn
Computer-Based Assessments Specialist
Utah State Office of Education
27 Multiple Choice CRTs
◦ Grades 3–11 English language arts
◦ Grades 3–7 math, Pre-Algebra, Algebra 1, Geometry, Algebra 2
◦ Grades 4–8 science, Earth Systems Science, Physics, Chemistry, Biology
Direct Writing Assessment
◦ Grades 5 and 8
◦ Plus a formative tool available year-round, grades 5 & 8
Utah Test Item Pool Service (UTIPS)
◦ Formative tool – USOE item pool and/or educator items
◦ Available year-round for all content areas, K-12
◦ Facilitates local benchmark/interim tests
41 districts, 81 charter schools
530,000 students
Lowest per-pupil spending in the nation
Infrastructure: 50% Windows, 40% Macintosh, 10% Linux
Strong technical skills among LEAs
◦ Wireless, thin clients, multiplied workstations
Utah Education Network
◦ ISP for districts and secondary schools, some charter schools
◦ Many elementary schools served by only a single T1 line
Year      | Participation Rate | CRTs Administered
2001–2006 | 4–8%               | Max 90,000
2007      | 8%                 | 92,000
2008      | 50%                | 495,000
2009      | 66%                | 659,000
2010      | 80% (projected)    | 815,000 (projected)
Year        | Key Events
2001        | All 27 CRTs available online
2004        | UTIPS available online
2004 & 2007 | One-time legislative funding, focused on hardware acquisition
2007        | CBT Summit – to define state vision
2009        | Change in CBT vendor
2009 & 2010 | CAT pilot available as a local LEA assessment option
2010        | Change in CRT development vendor (ELA & math)
2010        | Shorter CRTs, embedded pilot items
2010        | Text-to-speech pilot, embedded within CRTs
2010        | Innovative item research & small-scale pilot
2010        | DWA online with AI scoring
Hardware + Software + Test items & forms + Bandwidth + Local configurations + Student preparation + Test administration procedures = Testing experience

It’s not just a new test – it’s an ambitious technology implementation project
Different skills needed to support testing
◦ Cleaning answer documents vs. providing technical support
◦ Different and more preparation prior to testing
Low tolerance for interruptions
◦ Browser loading of pages
◦ System interruptions
Aging infrastructure
◦ One-time funding creates “bubbles”
◦ HVAC and electrical upgrades needed
◦ Participation tied to what is physically possible
Balancing innovation with stability
◦ Impact of item types and accessibility features on the system
◦ What are LEAs purchasing? Can it be supported?
What is standardized presentation?
◦ PBT (paper-based) version of the CBT format
◦ Change in vendor/software
◦ LEA configurations (e.g., screen resolution)
What is comparable?
◦ Year to year
◦ Form to form
Redesigning processes to be CBT-centric, while still producing PBT
◦ Development QA timeline is different
Require industry best practices for software development and deployment
Clear communication with all parties
◦ Assessment and Technology staff brainstorming, preparing, and resolving problems together
Plan for crisis management
◦ There will be problems
◦ Philosophy shift to “not if, but when”
Set clear expectations for participation
◦ What is voluntary? What flexibility do LEAs have?
◦ Each school CAN do something
All efforts focused on the lowest-risk implementation
Solid LEA and school readiness checklists
◦ Compare system technical specifications to LEA-reported configurations to what is actually in use (a scripted sketch of this check follows below)
Strong support for issue resolution
◦ Separate policy issues from system training and technical troubleshooting issues
◦ Well-defined tier 1, 2, and 3 support
◦ Local configuration vs. system-wide problems
◦ How to respond to administration anomalies
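The three-way readiness comparison above is the kind of check that can be scripted rather than done by hand. A minimal Python sketch follows, assuming hypothetical field names and minimum specs; nothing here is a USOE or vendor specification:

```python
# Minimal sketch of the three-way readiness check: published minimum
# specs vs. what the LEA reported vs. what is actually detected in the
# lab. Field names and thresholds are hypothetical, for illustration.

MINIMUM_SPECS = {"ram_mb": 512, "screen_width": 1024, "screen_height": 768}

def readiness_gaps(reported, detected):
    """Flag specs that fall short, or that disagree between reports."""
    gaps = []
    for key, required in MINIMUM_SPECS.items():
        if detected.get(key, 0) < required:
            gaps.append(f"{key}: detected {detected.get(key)} below required {required}")
        if reported.get(key) != detected.get(key):
            gaps.append(f"{key}: LEA reported {reported.get(key)}, detected {detected.get(key)}")
    return gaps

# Example: a lab reported as compliant but actually running at 800x600.
print(readiness_gaps(
    reported={"ram_mb": 1024, "screen_width": 1024, "screen_height": 768},
    detected={"ram_mb": 1024, "screen_width": 800, "screen_height": 600},
))
```

Run before each testing window, a check like this separates labs that only look ready on paper from labs that are ready in practice.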
Long-term vision for assessments
◦ More options for validly assessing students
Students more engaged
Student results in teachers’ hands faster
Technology resources available to support instruction
CBT shines a light on many issues
◦ Test administration processes and ethics
◦ Appropriate accommodations
◦ SIS (student information system) and course scheduling
◦ Better picture of technology infrastructure
More time to spend acting on the data instead of generating the data
◦ Automatic scoring & use of artificial intelligence (see the scoring sketch below)
Increases assessment literacy
◦ What do good questions look like?
◦ How can we make our questions better?
Easier to tailor assessments to instruction and student needs
Encourages conscious alignment of individual assessments to the curriculum, K-12
◦ Why am I asking this question?
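To make the automatic-scoring point concrete, here is a minimal Python sketch; the answer key and responses are invented for illustration, and AI essay scoring (as piloted for the DWA) is a far more involved vendor system than this:

```python
# Minimal sketch of automatic multiple-choice scoring: the raw score and
# the items missed are available the moment a student submits, with no
# answer documents to clean or ship. Key and responses are hypothetical.

ANSWER_KEY = {"item_1": "B", "item_2": "D", "item_3": "A"}

def score_student(responses):
    """Return the raw score plus the items missed, for teacher review."""
    missed = [item for item, key in ANSWER_KEY.items()
              if responses.get(item) != key]
    return {"raw_score": len(ANSWER_KEY) - len(missed), "missed": missed}

print(score_student({"item_1": "B", "item_2": "C", "item_3": "A"}))
# -> {'raw_score': 2, 'missed': ['item_2']}
```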
Julie Quinn
Computer-Based Assessments Specialist
Utah State Office of Education
julie.quinn@schools.utah.gov
http://schools.utah.gov/assessment