Julie Quinn, Computer-Based Assessments Specialist, Utah State Office of Education



- 27 Multiple Choice CRTs
  - Grades 3–11 English language arts
  - Grades 3–7 math, Pre-Algebra, Algebra 1, Geometry, Algebra 2
  - Grades 4–8 science, Earth Systems Science, Physics, Chemistry, Biology
- Direct Writing Assessment (DWA)
  - Grades 5 and 8
  - Plus a formative tool available year-round, grades 5 & 8
- Utah Test Item Pool Service (UTIPS)
  - Formative tool – USOE item pool and/or educator items
  - Available year-round for all content areas, K-12
  - Facilitates local benchmark/interim tests

- 41 districts, 81 charter schools
- 530,000 students
- Lowest per-pupil spending in the nation

Infrastructure
- 50% Windows, 40% Macintosh, 10% Linux
- Strong technical skills among LEAs
  - Wireless, thin clients, multiplied workstations
- Utah Education Network
  - ISP for districts and secondary schools, some charter schools
  - Few elementary schools with a single T1 line
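A back-of-the-envelope capacity estimate shows why a single T1 line constrains online testing at a school. The sketch below is an illustration only: the standard T1 throughput of 1.544 Mbps is factual, but the per-student bandwidth figure and headroom reserve are assumptions, not measured requirements of any test delivery system.

```python
# Rough capacity estimate: how many simultaneous online testers can one
# T1 line support? The per-student figure and headroom below are
# assumptions for illustration, not requirements of a specific system.

T1_MBPS = 1.544              # standard T1 line throughput
PER_STUDENT_KBPS = 50        # assumed average demand per active tester
HEADROOM = 0.7               # reserve 30% for other school traffic

def max_concurrent_testers(line_mbps: float = T1_MBPS,
                           per_student_kbps: float = PER_STUDENT_KBPS,
                           headroom: float = HEADROOM) -> int:
    """Return how many students can test at once on one line."""
    usable_kbps = line_mbps * 1000 * headroom
    return int(usable_kbps // per_student_kbps)

print(max_concurrent_testers())  # about 21 students under these assumptions
```

Under these assumed numbers, a whole grade level cannot test at once on one T1, which is why participation was "tied to what is physically possible."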

[Table: CRT participation by year — participation rate and number of CRTs administered, growing from an initial 8% (roughly 90,000 CRTs) to a projected 815,000; individual years and values were garbled in transcription.]

Year – Key Events
2001 – All 27 CRTs available online
2004 – UTIPS available online
2004 & 2007 – One-time legislative funding, focused on hardware acquisition
2007 – CBT Summit to define the state vision
2009 – Change in CBT vendor

Year – Key Events
2009 & 2010 – CAT pilot available as a local LEA assessment option
2010 – Change in CRT development vendor (ELA & math)
2010 – Shorter CRTs, embedded pilot items
2010 – Text-to-speech pilot, embedded within CRTs
2010 – Innovative item research & small-scale pilot
2010 – DWA online with AI scoring

- Hardware + software + test items & forms + bandwidth + local configurations + student preparation + test administration procedures = testing experience
- It's not just a new test – it's an ambitious technology implementation project
- Different skills needed to support testing
  - Cleaning answer documents vs. technical support
  - Different and more preparation prior to testing

- Low tolerance for interruptions
  - Browser loading of pages
  - System interruptions
- Aging infrastructure
  - One-time funding creates "bubbles"
  - HVAC and electrical upgrades needed
  - Participation tied to what is physically possible
- Balancing innovation with stability
  - Impact of item types and accessibility features on the system
  - What are LEAs purchasing? Can it be supported?

- What is standardized presentation?
  - PBT version of the CBT format
  - Change in vendor/software
  - LEA configurations (e.g., screen resolution)
- What is comparable?
  - Year to year
  - Form to form
- Redesigning processes to be CBT-centric, while still producing PBT
  - Development QA timeline is different

- Require industry best practices for software development and deployment
- Clear communication with all parties
  - Assessment and Technology staff brainstorming, preparing, and resolving problems together
- Plan for crisis management
  - There will be problems
  - Philosophy shift to "not if, but when"
- Set clear expectations for participation
  - What is voluntary? Flexibility for LEAs?
  - Each school CAN do something

- All efforts focused on the lowest-risk implementation
- Solid LEA and school readiness checklists
  - Compare system technical specifications to LEA-reported configurations to what is actually used
- Strong support for issue resolution
  - Separate policy issues from system training and technical troubleshooting issues
  - Well-defined tier 1, 2, and 3 support
  - Local configuration vs. system-wide problems
  - How to respond to administration anomalies
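The checklist comparison of required specifications against what an LEA reports can be automated. The sketch below is a minimal illustration of that idea; every field name and minimum value in it is hypothetical, invented for the example, and not an actual USOE requirement.

```python
# Minimal readiness-check sketch: compare required minimum specs against
# an LEA's reported configuration. All field names and thresholds here
# are hypothetical examples, not actual state requirements.

REQUIRED = {
    "ram_mb": 512,            # minimum memory per workstation (assumed)
    "screen_width": 1024,     # minimum horizontal resolution (assumed)
    "screen_height": 768,     # minimum vertical resolution (assumed)
    "bandwidth_kbps": 50,     # minimum per-seat bandwidth (assumed)
}

def readiness_gaps(reported: dict) -> list[str]:
    """Return the spec names the reported configuration fails to meet."""
    gaps = []
    for spec, minimum in REQUIRED.items():
        value = reported.get(spec)
        if value is None or value < minimum:
            gaps.append(spec)
    return gaps

# Example: a lab that meets everything except screen resolution.
lab = {"ram_mb": 1024, "screen_width": 800,
       "screen_height": 600, "bandwidth_kbps": 128}
print(readiness_gaps(lab))  # ['screen_width', 'screen_height']
```

A report like this lets tier 1 support immediately distinguish a local configuration shortfall from a system-wide problem.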

- Long-term vision for assessments
  - More options for validly assessing students
- Students more engaged
- Student results in teachers' hands faster
- Technology resources available to support instruction
- CBT shines a light on many issues
  - Test administration processes and ethics
  - Appropriate accommodations
  - SIS system and course scheduling
  - Better picture of technology infrastructure

- More time to spend on what to do because of the data, instead of generating the data
  - Automatic scoring & use of artificial intelligence
- Increases assessment literacy
  - What do good questions look like?
  - How can we make our questions better?
- Easier to tailor assessments to instruction and student needs
- Encourages conscious alignment of individual assessments to curriculum, K-12
  - Why am I asking this question?
