26th Annual Management Information Systems [MIS] Conference February 14, 2013 Washington, DC Common Education Data Standards (CEDS) Supporting Assessment.

Similar presentations
Writing an NCATE/IRA Program Report

___________________________ NCDPI Division of Accountability Services/North Carolina Testing Program North Carolina Alternate Assessment Academic Inventory.
Student Learning Outcomes and Assessment An Overview of Purpose, Structures and Systems Approved by Curriculum Committee Approved by Academic.
APIP Goals Portability of item content Accessibility of item content
ESEA: Developing the Paraprofessional Portfolio Prepared by Carolyn Ellis Logan, Consultant Professional Development/Human Rights Department Michigan Education.
Iowa Assessment Update School Administrators of Iowa November 2013 Catherine Welch Iowa Testing Programs.
Field Tests … Tests of the test questions Jeff Nellhaus, PARCC, Inc. Louisiana Common Core Implementation Conference February 19,
NOCTI Overview Amie Birdsall and Patricia Kelley February 23, 2012.
Redesigning the Developmental Math Curriculum for Increased Student Success: A Case Study from the Virginia Community College System Redesigning the Developmental.
Student Learning Targets (SLT) You Can Do This! Getting Ready for the School Year.
E-Portfolios for Students
Nevada CTE & CTECS: Programs, Standards, Assessments & Credentials January, 2014 Nevada Department of Education Office of Career, Technical and Adult.
Common Core State Standards & Assessment Update The Next Step in Preparing Michigan’s Students for Career and College MERA Spring Conference May 17, 2011.
The Initiative for School Empowerment and Excellence (i4see) “ Empowering teachers, administrators, policy makers, and parents to increase student achievement.
April 11, 2012 Comprehensive Assessment System 1.
The Five New Multi-State Assessment Systems Under Development April 1, 2012 These illustrations have been approved by the leadership of each Consortium.
Consortia of States Assessment Systems Instructional Leaders Roundtable November 18, 2010.
Student Learning Objectives The SLO Process Student Learning Objectives Training Series Module 3 of 3.
What are Best Practices in Large Scale Testing? A New Book: 2013 Edition Outlines Those Practices Operational Best Practices for Statewide Assessment Programs.
BRINC Personalized Learning Project DLC/TechMACC Combined Meeting March 14, 2014.
Ohio’s Assessment Future The Common Core & Its Impact on Student Assessment Evidence by Jim Lloyd Source doc: The Common Core and the Future of Student.
PARCC Update June 6, PARCC Update Today’s Presentation:  PARCC Field Test  Lessons Learned from the Field Test  PARCC Resources 2.
1 Policy No Child Left Behind of 2001 HSP-C-005/State Board of Education –Annual Language Proficiency Assessment –No Exemptions –Same standard, Same content.
Open Source Innovations for Better Assessment Brandt Redd CTO NCSA 22 June 2015.
New Products for ©  2009 ANGEL Learning, Inc. Proprietary and Confidential, 2 Update Summary Enrich teaching and learning Meet accountability needs.
CLASS Keys Orientation Douglas County School System August /17/20151.
Common Core Transition 1 Keystone Exams Project Based Assessments PASCD Conference November 22,2011.
Welcome to PARCC Field Test Training! Presented by the PARCC Field Test Team.
Stronge Teacher Effectiveness Performance Evaluation System
Race To The Top (RttT) MSDE Division of Accountability and Assessment Data Systems (DAADS) Maryland Public Schools: #1 in the Nation Three Years in a Row.
CALIFORNIA DEPARTMENT OF EDUCATION Tom Torlakson, State Superintendent of Public Instruction California Measurement of Academic Performance and Progress.
TOM TORLAKSON State Superintendent of Public Instruction National Center and State Collaborative California Activities Kristen Brown, Ph.D. Common Core.
ITEC 3220M Using and Designing Database Systems
1 Chapter 9 Database Design. 2 2 In this chapter, you will learn: That successful database design must reflect the information system of which the database.
Comp 20 - Training & Instructional Design Unit 6 - Assessment This material was developed by Columbia University, funded by the Department of Health and.
Data Sources Artifacts: Lesson plans and/or curriculum units which evidence planned use of diagnostic tools, pre- assessment activities, activating strategies,
MAC Common Assessment Training Modules Session F3 Michigan School Testing Conference February 23, 2012.
NC State University Center for Urban Affairs and Community Services.
What is design? Blueprints of the instructional experience Outlining how to reach the instructional goals determined during the Analysis phase The outputs.
CURRICULUM, INSTRUCTION, AND ASSESSMENT UPDATE FEBRUARY 2014 VERONA PUBLIC SCHOOLS.
Why Do State and Federal Programs Require a Needs Assessment?
LANSING, MI APRIL 11, 2011 Title IIA(3) Technical Assistance #2.
Kansas State Department of Education June 23, 2014.
Transitioning to a Balanced Assessment System. Overview Professional Development in Assessment Smarter Balanced Logistics.
February 16, 2012 MIS Conference San Diego A History of and a Path Forward for the K-12 Standards Movement Copyright © SIF Association.
Measured Progress ©2011 Guide to the Smarter Balanced IT Architecture Connecticut Assessment Forum August 14, 2012.
Assessments aligned to Common Core State Standards August 2012IDEA Partnership1.
Standards Instruction Assessment Feedback. Session Outcomes Participants will:  Develop an understanding of the Test and Quizzes application in BCPS.
MTSS PRESENTATION SEPTEMBER, 2013 NCSC Summative Assessment Update.
Honors Level Course Implementation Guide Q & A Webinar Honors Rubric and Portfolio Review Process May 2, 2013.
1 BUILDING QUALITY LEARNING USING PERIODIC ASSESSMENTS Session Outcomes: Use diagnostic Periodic Assessments as instructional tools for quality enhancement.
North Carolina Educator Evaluation System Jessica Garner
Standards-Based Online Professional Development to Mentor and Retain Special Education Personnel Presented to Teacher Education Division of Council for.
2011 SMARTER Balanced Assessment Consortium: Technology Update November 21, 2011.
TestNav: Pearson’s Online Testing Engine Training and Practice Item Review Colorado Summative Science and Social Studies Field Test Spring 2013 Call in:
Differentiation Compacting Curriculum Carrollton-Farmers Branch ISD.
2011 SBAC Architecture: Implications for Technology and Testing Providers (Part 1) Co-Hosted with ATP February 6, 2011.
Understanding Assessment Projects. Learning Objectives for this Session After completing this session you should be able to… 1.Articulate the requirements.
Educator Recruitment and Development Office of Professional Development The NC Teacher Evaluation Process 1.
Why was the NCAAAI Developed?
Selective Interoperable Technology Standards
Interoperability.
Assessments aligned to Common Core State Standards
Georgia Department of Education
Kathy Cox State Superintendent of Schools GPS Day 3 Training
Exploring Assessment Options NC Teaching Standard 4
Iowa Statewide Assessment of Student Progress
Georgia’s Changing Assessment Landscape
Presentation transcript:

26th Annual Management Information Systems [MIS] Conference February 14, 2013 Washington, DC Common Education Data Standards (CEDS) Supporting Assessment Systems Development Copyright © IMS Global and SIF Association

Overview
- Need
- What is AIF
- Status
- Demonstration Prototypes
- Discussion and Questions

[Word-cloud image created with Wordle.net]

Need
- Standards – confusing
- Assessment – RTTA and Consortia; formative; accountability
- Inform instruction

Assessment Lifecycle
- Content Development: planning & blueprinting; item types; content development & universal design; learning standard alignment; content and data reviews; test form construction; field testing; item banking & statistics; content exchange / interoperability
- Pre-Test Administration: administration planning & scheduling; registration, assignment, form sampling; online infrastructure readiness assessment; pre-session planning (paper / online) & setup; alternate form assignment
- Test Administration: test form delivery; platform (paper, online, mobile) presentation; item content & tools; adaptive testing; response collection; proctoring controls; form content security; desktop security; accessibility; testing anomalies
- Scoring: computer scoring; professional scoring; algorithmic (AI) scoring; portfolio scoring; subtest / strand scoring; attemptedness; performance levels; scaling / norming; growth scores; range finding
- Reporting: individual reporting; diagnostic reporting; informing & personalizing instruction; performance on standards; dashboard / summary reporting; aggregation / disaggregation; exchanging results / data
- Post-Test Administration: psychometric analysis; equating; score tables (scaling, norming); performance levels / cut scores; field test analysis; aligning results with curriculum / instruction; program and teacher effectiveness

What is AIF?
- Systemic view of an assessment system and its applications
- Demonstrates points of interoperability between applications within the system
- Outlines the data model and transport

Assessment Platform Main Components (diagram)
- Assessment Creation & Management System (ACMS)
- Assessment Delivery System (ADS)
- Assessment Score Processing System (ASPS)
- Assessment Reporting System (ARS)
The components span STATE, REGIONAL, and LOCAL systems, exchanging content and data over APIP and SIF.

Status
- Use cases outlined
- Elements complete
- Public review complete
- Prototype demonstrations complete
- Best practice document complete
- Available on ceds.ed.gov/aif.aspx

Demonstration Pilots
Logical and physical testing:
- Arrow 1
- Arrow 10
- Arrow 14

Arrow 1 Item Bank to Item Bank

Arrow 1 Diagram

Arrow 1 Test Scenario
Scenario Description
The purpose of this scenario is to demonstrate assessment items moving from an item authoring system to an item bank, or from one item bank to another. This scenario covers only assessment items moving from an item authoring system to an item bank. A consortium has secured item authoring system A. This item authoring system will not house the items; it is simply a means for item creation. The items will reside in item banking system B. Item authoring system A has created the assessment items, and the editing and revision cycle is complete. Item authoring system A is ready to send the items to item banking system B for storage and assessment instrument creation.

Arrow 1 Test Scenario Cont'd
Test Scenario Components Required
- Item authoring system
- Item banking system
Pre-Condition
- Assessment content (items, instruments, etc.) in the sending item bank is ready to be transferred.
Post-Conditions
- Content is ready for use, including reviews, edits, extensions, assessment instrument creation, passing of instruments to the delivery system, etc.

Arrow 1 Process Flow
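The Arrow 1 flow (authoring system A handing finished items to item bank B) can be sketched in code. This is an illustration only, not the AIF or APIP specification: the `ItemBank` class, the `submit_item` method, and the simplified QTI-like XML shape are hypothetical stand-ins, and a real APIP package carries far richer metadata (accessibility information, rubrics, item statistics).

```python
# Sketch of Arrow 1: an authoring system serializes a finished item and
# submits it to an item bank. All names and the simplified QTI-like XML
# are illustrative, not defined by the AIF.
import xml.etree.ElementTree as ET


def build_item_xml(identifier: str, title: str, prompt: str) -> str:
    """Serialize an authored item as simplified QTI-like XML."""
    item = ET.Element("assessmentItem", identifier=identifier, title=title)
    body = ET.SubElement(item, "itemBody")
    ET.SubElement(body, "prompt").text = prompt
    return ET.tostring(item, encoding="unicode")


class ItemBank:
    """Receiving side of Arrow 1: stores items for later review,
    editing, and assessment instrument creation (the post-condition)."""

    def __init__(self) -> None:
        self._items: dict[str, str] = {}

    def submit_item(self, xml_payload: str) -> str:
        root = ET.fromstring(xml_payload)
        if root.tag != "assessmentItem" or "identifier" not in root.attrib:
            raise ValueError("payload is not a well-formed assessment item")
        item_id = root.attrib["identifier"]
        self._items[item_id] = xml_payload
        return item_id


bank = ItemBank()
payload = build_item_xml("ITEM-001", "Fractions", "What is 1/2 + 1/4?")
assert bank.submit_item(payload) == "ITEM-001"
```

The validation step in `submit_item` mirrors the scenario's pre-condition: the bank accepts only well-formed item payloads, so content arrives ready for review and instrument creation.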

Arrow 10 Assessment Registration

Arrow 10 Diagram

Arrow 10 Test Scenario
Scenario Description
This scenario describes the necessary components for an assessment registration. It covers student demographic, teacher, hierarchy, and PNP information. A consortium is ready to give an assessment. The student, school, LEA, PNP, and teacher information has already been entered into the SIS or data warehouse. Assessment registration and administration system A pulls all of the necessary information from the SIS or data warehouse B. The students are then assigned to a specific administration of an assessment. The registration and administration information is then passed from assessment registration and administration system A to assessment delivery system C. An alternate scenario is that the PNP information is entered into the registration system after the other information has been pulled from the SIS or data warehouse.

Arrow 10 Test Scenario Cont'd
Test Scenario Components Required
- Student information system or data warehouse that houses all of the necessary information for registration
- Assessment registration and administration system
- Assessment delivery system
Pre-Condition
- All of the student, school, LEA, and teacher information has been entered into the SIS and/or data warehouse.
- An administration of an assessment(s) has been identified.
Post-Conditions
- The students are registered for a specific administration(s) of an assessment.

Arrow 10 Process Flow
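The Arrow 10 flow can likewise be sketched: the registration system pulls student records (including PNP preferences) from the SIS or data warehouse, then assigns students to a specific administration. All class and field names here (`StudentRecord`, `RegistrationSystem`, `register`) are illustrative assumptions, not AIF definitions.

```python
# Sketch of Arrow 10: registration system A pulls records from SIS /
# data warehouse B and registers students for an administration; the
# roster would then be forwarded to delivery system C.
from dataclasses import dataclass, field


@dataclass
class StudentRecord:
    student_id: str
    school: str
    teacher: str
    pnp: dict = field(default_factory=dict)  # personal needs & preferences


class RegistrationSystem:
    def __init__(self, sis_records):
        # Pre-condition: demographics, hierarchy, and PNP data already
        # live in the SIS / data warehouse; we pull rather than re-key.
        self._sis = {r.student_id: r for r in sis_records}
        self._registrations: dict[str, list[str]] = {}

    def register(self, administration_id: str, student_ids):
        roster = [self._sis[s] for s in student_ids]  # KeyError if unknown
        self._registrations[administration_id] = [r.student_id for r in roster]
        return roster  # handed off to the assessment delivery system


sis = [
    StudentRecord("S1", "Lincoln ES", "T. Rivera", {"text_to_speech": True}),
    StudentRecord("S2", "Lincoln ES", "T. Rivera"),
]
reg = RegistrationSystem(sis)
roster = reg.register("SPRING-MATH-G4", ["S1", "S2"])
assert [r.student_id for r in roster] == ["S1", "S2"]
```

The alternate scenario from the slide (PNP entered after the pull) would simply mutate `record.pnp` inside the registration system before hand-off.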

Arrow 14 Reporting System to Data Warehouse

Arrow 14 Diagram

Arrow 14 Test Scenario
Scenario Description
The purpose of this scenario is to demonstrate sending assessment results from the consortium-administered assessment to the SEA. The consortium has administered an assessment. Assessment results system A has received and compiled all information from the assessment. Assessment results system A then compiles, packages, and disseminates the information to the SEA data warehouse B.

Arrow 14 Test Scenario Cont'd
Test Scenario Components Required
- Assessment results system
- State data warehouse or reporting system
Pre-Condition
- Assessment results are available and have been collected so that summaries can be produced.
Post-Conditions
- The state system has the results loaded.

Arrow 14 Process Flow
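The Arrow 14 flow can be sketched the same way: the results system compiles scores, packages them, and disseminates the package to the SEA data warehouse. The `package_results` function, the `SEADataWarehouse` class, and the JSON packaging are all hypothetical; a production exchange would use the AIF data model and transport.

```python
# Sketch of Arrow 14: results system A compiles and packages results,
# then disseminates them to SEA data warehouse B. Names and the JSON
# format are illustrative only.
import json


def package_results(administration_id, scores):
    """Compile and package results for transfer."""
    return json.dumps({
        "administration": administration_id,
        "results": [{"student_id": s, "scale_score": sc} for s, sc in scores],
    })


class SEADataWarehouse:
    def __init__(self):
        self.loaded = {}

    def load(self, package: str) -> int:
        data = json.loads(package)
        self.loaded[data["administration"]] = data["results"]
        return len(data["results"])  # post-condition: results are loaded


warehouse = SEADataWarehouse()
pkg = package_results("SPRING-MATH-G4", [("S1", 512), ("S2", 487)])
assert warehouse.load(pkg) == 2
```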

Data Available by Entity
- Assessment Family
- Assessment
- Assessment Form
- Assessment Form Section
- Assessment Form Subtest
- Assessment Subtest Result
- Assessment Performance Level
- Assessment Item
- Assessment Item Possible Response
- Assessment Item Response
- Assessment Item Rubric
- Assessment Participant Session
- Learner Action
- Assessment Administration
- Assessment Registration
- Assessment Session
- Organization
- Registration Accommodation
- Person
- Achievement Evidence
- Achievement
- Learning Goal
- Learning Assignment
- Learning Resource
- Learning Resource Learner Activity
- Learning Standard Document
- Learning Standard Item
- Competency
- Item_Competency Set
- Competency Set
- Learning Standard Item Grade Level
- Assessment Levels for Which Designed
- Assessment Form Subtest Levels for Which Designed
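A few of the entities above can be sketched as nested types to show how they relate: a form belongs to an assessment, sections belong to a form, and items to a section. The field choices below are assumptions for illustration; the authoritative element definitions are in the AIF documentation on ceds.ed.gov.

```python
# Illustrative nesting of a handful of AIF entities; field names are
# assumed, not taken from the specification.
from dataclasses import dataclass, field
from typing import List


@dataclass
class AssessmentItem:
    identifier: str


@dataclass
class AssessmentFormSection:
    name: str
    items: List[AssessmentItem] = field(default_factory=list)


@dataclass
class AssessmentForm:
    form_number: str
    sections: List[AssessmentFormSection] = field(default_factory=list)


@dataclass
class Assessment:
    title: str
    forms: List[AssessmentForm] = field(default_factory=list)


math_g4 = Assessment("Grade 4 Mathematics", forms=[
    AssessmentForm("A", sections=[
        AssessmentFormSection("Calculator", items=[AssessmentItem("ITEM-001")]),
    ]),
])
assert math_g4.forms[0].sections[0].items[0].identifier == "ITEM-001"
```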

Documentation
- Assessment Interoperability Framework Definitions and Requirements
- Use Cases for the Assessment Interoperability Framework
- Data Elements
- Best Practices
- Demonstration Prototype

Work Moving Forward
- Continue to evolve the technical standard within IMS and SIF respectively, and reflect back to CEDS
- Implementation
- Additional pilots
- API and transport
- Certification

Contact
- Jill Abbott, CEO, Abbott Advisor Group
- Rob Abel, CEO, IMS Global Consortium
- Larry Fruth, CEO, SIF Association
All current documentation can be found at - Assessment Interoperability Framework (AIF)