
Development of Assessment Literacy Knowledge Base


1 Development of Assessment Literacy Knowledge Base
Kathleen Flanagan
Assessment Research Coordinator, Massachusetts Department of Elementary and Secondary Education
June 2016

2 Assessment System Features
Massachusetts Department of Elementary and Secondary Education

3 What is Assessment Literacy?

4 Building Assessment Literacy
LEAs have a great need to build capacity for assessment literacy:
- More assessment engines available
- More data available
- More demands on the generation and use of local data (e.g., teacher evaluation)
- Different assessment literacy needs (e.g., data team leader vs. classroom teacher)
Better, more accessible guidance is needed to generate better assessments and to make better use of assessment data in decision-making.

5 Massachusetts Assessment Literacy Guidance Materials
- Provide users of assessment and reporting systems with key assessment literacy and data-use information
- Support high-quality assessment and data-use practices in LEAs
- Provide educator-facing materials on assessment literacy and data use
- Develop go-to reference materials for educators that will grow and evolve to encompass more topics and greater specificity over time

6 Assessment Literacy Materials Development
Co-authored by NCIEA's Charlie DePascale and Karin Hess
Elicited contributions from:
- Massachusetts educators
- Graduate students (test construction class, Boston College)

7 Demonstration: Writing Content with MA LEAs
The basic logic in establishing scoring reliability for open-response (OR) items is that scoring should be consistent across scorers; in other words, an individual student's response would receive the same score regardless of who scored it. When a single person is scoring a student's work, reliability is enhanced by:
- Preparing a scoring guide that establishes points for each level of response
- Grading one question at a time to ensure uniformity in scoring
- Blocking the identity of the student when scoring the response

8 Revised Scoring Reliability Section

9 Materials Description
Comparative Advantages of Item Types

| Type of Item | Selected Response (Objective Items) | Constructed Response (Subjective Items) | Performance/Portfolio/Observational Items |
|---|---|---|---|
| Sampling of Curriculum | Samples a lot of curriculum in a short period of time | Samples less curriculum than selected-response items; takes longer examinee administration time | |
| Item Development | Requires the development of many items | Fewer items are needed | Fewer items are needed, but the items are written to break out the components of the task |
| Rigor | Can sample the range of Bloom's Revised Taxonomy from Remembering to Evaluating; takes skill to write items at the higher levels of rigor | Constructed-response items should be written for higher levels of rigor | Performance items can range across the levels of rigor, although some should represent higher-level demands |
| Complexity | Low to moderate complexity | Can range from low to high complexity | Tasks should reflect moderate to high levels of complexity |
| Scoring | Objective scoring: efficient with a scoring key | Subjective scoring: requires rubrics/scoring papers and scorer training | Subjective scoring: requires rubrics; students can participate in scoring |

- Currently, 125 pages of body text with graphs, tables, and illustrations
- Hierarchical nesting, with larger topics broken out into digestible parts
- Specific heading styles (e.g., fonts and sizes)
- Multiple examples for each topic area
- ~5-page glossary
- ~5 small datasets
- ~80 pages of links
- Excerpts: Grammatical Cuing; Cycle of Inquiry

10 Table of Contents (Section 1, Section 2)

11 Why Build a Knowledge Database?
Discrete Documents:
- Can quickly overwhelm audiences
- Dense single documents
- Disparate small documents
- Interrelationships unclear
- Collection cultivated by a single organization
- Static documents are hard to update

Knowledge Database:
- Navigational tools can provide customized information to audiences
- Interrelationships clear in the content and organization of materials
- Can be cultivated by multiple organizations (e.g., cross-state use of materials)
- Updates made in real time; scalable

12 Example: Social Science Research Knowledge Base
Publication-Style Illustration
- Left-hand navigation for the document, with nested topics
- Hierarchically arranged short topic presentations with unified graphics and illustrations
- Many illustrations; most are embedded as links
- Structure is scalable and allows for additional topics

13 Literasee Repository developed and hosted by the NCIEA
Charlie DePascale, Damian Betebenner
- State-level authorship and control over collections
- Technical expertise provided by NCIEA
- Publications generated using GitHub
  - Git: version control system (for collaborative work across multiple versions)
  - Hub: project repository
- Allows authors to connect to other web-based material
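The version-control workflow the slide alludes to can be sketched with a few Git commands. This is a minimal, hypothetical illustration of collaborative authoring; the file names, commit messages, and branch name are invented for the demo and are not taken from the actual Literasee repository.

```shell
#!/bin/sh
# Sketch: multi-author, version-controlled editing of a publication collection.
set -e
workdir=$(mktemp -d)
cd "$workdir"
git init -q collection
cd collection
git config user.email "author@example.org"   # local identity for this demo only
git config user.name "State Author"
default_branch=$(git symbolic-ref --short HEAD)

# First author publishes a section of the materials.
printf '# Scoring Reliability\n' > scoring.md
git add scoring.md
git commit -qm "Add scoring reliability section"

# A second contributor drafts a revision on a branch, leaving the
# published version untouched until the change is reviewed and merged.
git checkout -qb revise-scoring
printf 'Blind-score responses to reduce scorer bias.\n' >> scoring.md
git commit -qam "Clarify blind-scoring guidance"
git checkout -q "$default_branch"
git merge -q revise-scoring

git log --oneline   # two commits: the original section and its revision
```

Hosting such a repository on GitHub adds the "Hub" side: a shared remote, pull-request review, and web links into the published materials.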

14 Why Open Content?
- Knowledge database contents are available to all users (attribution to original authors/states can be requested via Creative Commons licensing)
- Leverage materials across the Web
- Encourage fluidity of contents (revisions, updates)

15 Example from Literasee

16 A Work in Progress
- Complete online publication of the Assessment Literacy materials
- Collect user feedback and revise the draft accordingly
- Other areas of needed guidance:
  - Data use
  - Data visualizations (e.g., Student Growth Percentiles)

