Development of Assessment Literacy Knowledge Base


Development of Assessment Literacy Knowledge Base
Kathleen Flanagan, Assessment Research Coordinator, Massachusetts Department of Elementary and Secondary Education
June 2016

Assessment System Features

What is Assessment Literacy?

Building Assessment Literacy

- LEAs have a great need to build capacity for assessment literacy:
  - More assessment engines available
  - More data available
  - More demands in the generation and use of local data (e.g., teacher evaluation)
  - Different assessment literacy needs (e.g., data team leader vs. classroom teacher)
- Better, more accessible guidance is needed to generate better assessments and to make better use of assessment data in decision making

Massachusetts Assessment Literacy Guidance Materials

- Provide users of assessment and reporting systems with key assessment literacy and data use information
- Support high-quality assessment and data use practices in LEAs
- Provide educator-facing materials on assessment literacy and data use
- Develop go-to reference materials for educators that will grow and evolve to encompass more topics and greater specificity over time

Assessment Literacy Materials Development

- Co-authored by NCIEA's Charlie DePascale and Karin Hess
- Elicited contributions from:
  - Massachusetts educators
  - Graduate students (test construction class, Boston College)

Demonstration: Writing Content with MA LEAs

- The basic logic in establishing scoring reliability for OR items is that scoring should be consistent across scorers; in other words, an individual student's response would receive the same score regardless of who scored it (a sketch of checking this agreement appears below).
- When a single person is scoring a student's work, reliability is enhanced by:
  - Preparing a scoring guide that establishes points for each level of response
  - Grading one question at a time to ensure uniformity in scoring
  - Blocking the identity of the student when scoring the response
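Scoring consistency across scorers can also be checked empirically. Below is a minimal sketch, not part of the Massachusetts materials, that computes exact agreement and Cohen's kappa for two hypothetical scorers rating the same set of anonymized responses; the score values and function names are illustrative assumptions.

```python
# A minimal sketch (not from the MA guidance materials) of checking scorer
# agreement on open-response (OR) items: two hypothetical scorers each assign
# a score of 0-4 to the same set of anonymized student responses.

from collections import Counter

def percent_agreement(scores_a, scores_b):
    """Proportion of responses given the identical score by both scorers."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(scores_a)
    p_o = percent_agreement(scores_a, scores_b)
    counts_a = Counter(scores_a)
    counts_b = Counter(scores_b)
    # Chance agreement expected from each scorer's marginal score distribution.
    p_e = sum((counts_a[k] / n) * (counts_b[k] / n) for k in counts_a)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores for ten anonymized responses.
scorer_1 = [3, 2, 4, 1, 3, 2, 0, 4, 3, 2]
scorer_2 = [3, 2, 3, 1, 3, 2, 1, 4, 3, 2]

print(f"Exact agreement: {percent_agreement(scorer_1, scorer_2):.2f}")  # 0.80
print(f"Cohen's kappa:   {cohens_kappa(scorer_1, scorer_2):.2f}")       # 0.73
```

Percent agreement is easy to interpret but can be inflated by chance; kappa discounts the agreement expected from each scorer's marginal score distribution alone.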

Revised Scoring Reliability Section

Materials Description

Comparative Advantages of Item Types

Sampling of curriculum
- Selected response (objective items): samples a lot of curriculum in a short period of time
- Constructed response (subjective items) and performance/portfolio/observational items: sample less curriculum than selected-response items; take longer examinee administration time

Item development
- Selected response: requires the development of many items
- Constructed response: fewer items are needed
- Performance/portfolio/observational: fewer items are needed, but the items are written to break out the components of the task

Rigor
- Selected response: can sample the range of Bloom's Revised Taxonomy from Remembering to Evaluating; takes skill to write items at the higher levels of rigor
- Constructed response: items should be written for higher levels of rigor
- Performance/portfolio/observational: items can range across the levels of rigor, although some should represent higher-level demands

Complexity
- Selected response: low to moderate complexity
- Constructed response: can range from low to high complexity
- Performance/portfolio/observational: tasks should reflect moderate to high levels of complexity

Scoring
- Selected response: objective scoring -- efficient with a scoring key
- Constructed response: subjective scoring -- requires the use of rubrics/scoring papers and scorer training
- Performance/portfolio/observational: subjective scoring -- requires the use of rubrics; students can participate in scoring

Currently:
- 125 pages of body text with graphs, tables, and illustrations
- Hierarchical nesting, with larger topics broken out into digestible parts
- Specific heading styles (e.g., fonts and sizes)
- Multiple examples for each topic area
- ~5-page glossary
- ~5 small datasets
- ~80 pages of links
- Excerpt: Grammatical Cuing
- Excerpt: Cycle of Inquiry

Table of Contents (Sections 1 and 2)

Why Build a Knowledge Database?

Discrete documents:
- Can quickly overwhelm audiences
- Dense single documents
- Disparate small documents
- Interrelationships unclear
- Collection cultivated by a single organization
- Static documents hard to update

Knowledge database:
- Navigational tools can provide customized information to audiences
- Interrelationships clear in the content and organization of materials
- Can be cultivated by multiple organizations (e.g., cross-state use of materials)
- Updates made in real time; scalable

Example: Social Science Research Knowledge Base (Publication-Style Illustration)

- Left-hand navigation for the document, with nested topics (see the sketch below)
- Hierarchically arranged short topic presentations with unified graphics and illustrations
- Many illustrations -- most are embedded as links
- Structure is scalable and allows for additional topics
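To illustrate how hierarchically nested topics can drive a left-hand navigation pane, here is a minimal sketch; the Topic class and the topic titles are hypothetical, not the actual data model behind the knowledge base.

```python
# A minimal sketch of hierarchically nested topics, from which a table of
# contents or navigation pane can be generated. Topic titles are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Topic:
    title: str
    subtopics: list["Topic"] = field(default_factory=list)

def render_toc(topic: Topic, depth: int = 0) -> str:
    """Render a topic and its nested subtopics as an indented outline."""
    lines = ["  " * depth + topic.title]
    for sub in topic.subtopics:
        lines.append(render_toc(sub, depth + 1))
    return "\n".join(lines)

toc = Topic("Assessment Literacy", [
    Topic("Item Types", [
        Topic("Selected Response"),
        Topic("Constructed Response"),
        Topic("Performance/Portfolio/Observational"),
    ]),
    Topic("Scoring Reliability"),
])

print(render_toc(toc))
```

Because the navigation is generated from the same tree that organizes the content, adding a topic automatically extends the outline, which is what makes this kind of structure scalable.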

Literasee

- Repository developed and hosted by the NCIEA (Charlie DePascale, Damian Betebenner)
- State-level authorship and control over collections
- Technical expertise provided by NCIEA
- Publications generated using GitHub (see the sketch below):
  - Git = version control system (for collaborative work and multiple versions)
  - Hub = project repository hosting
- Allows authors to connect to other web-based material
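As a rough illustration of what a Git-based publication workflow enables, the sketch below drives standard git commands from Python; the repository URL, branch name, and file name are hypothetical, and only ordinary git subcommands are used.

```python
# A minimal sketch of a collaborative GitHub workflow for publication
# materials. The repository URL, branch, and file names below are
# hypothetical placeholders.

import subprocess

def run(*args):
    """Run a git command and fail loudly if it errors."""
    subprocess.run(["git", *args], check=True)

# Clone a state's collection, create a working branch for revisions,
# then commit and push the updated materials for review.
run("clone", "https://github.com/example-org/assessment-literacy.git")
run("-C", "assessment-literacy", "checkout", "-b", "revise-scoring-section")
# Assumes the revised file already exists in the working tree.
run("-C", "assessment-literacy", "add", "scoring_reliability.md")
run("-C", "assessment-literacy", "commit", "-m", "Revise scoring reliability section")
run("-C", "assessment-literacy", "push", "origin", "revise-scoring-section")
```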

Why Open Content?

- Knowledge database contents are available to all users (attribution to original authors, states, etc., can be requested via Creative Commons licensing)
- Leverage materials across the Web
- Encourage fluidity of contents (revisions, updates)

Example from Literasee

A Work in Progress

- Complete online publication of the Assessment Literacy materials
- Collect and revise the draft per user feedback
- Other areas of needed guidance:
  - Data use
  - Data visualizations (e.g., Student Growth Percentiles)