Race to the Top Assessment Program: Public Hearing on Common Assessments
January 20, 2010, Washington, DC
Presenter: Lauress L. Wise, HumRRO
Overview
1. Through-Course Summative Assessments
   – Key evidence required
2. Common High School End-of-Course Exams
   – Comparability across assessments
3. Challenges for Computer-Based Testing
   – Comparability with paper-and-pencil and other alternatives
4. Continuous Process Improvement
   – Support for continued improvements after the initial grants
5. Further Research Needs
   – Value-added, use of performance tasks, and ???
Through-Course Accountability Systems
Alternative models for through-course assessments
– Parallel forms of the same test at different times (best, last, or average of the repeated scores; see the sketch below)
   • Oregon assessment model
   • Does not really support increased depth of assessment
   • Students are tested in areas not yet covered in their curriculum
– Segmented assessments + summative end-of-course
   • Each assessment covers a unique piece of the curriculum; timing may not matter (states must ensure curricular coverage)
   • Provides the greatest depth of coverage of particular objectives
   • Does not provide mid-term measures of “catching up”
– Cumulative assessments
   • Each assessment covers current and prior curriculum
   • Gives students chances to demonstrate improved mastery of earlier objectives (impact of remediation)
   • Less coverage of specific topics than with segmented assessments
   • More (but less clear) options for obtaining overall scores
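As an illustration of the scoring rules named for repeated parallel forms, here is a minimal Python sketch of the best, last, and average combination rules; the student records and score scale are hypothetical.

```python
# Sketch: collapsing repeated parallel-form scores into one summative score
# under the "best", "last", or "average" rules. All data are hypothetical.
from statistics import mean

# scores from three administrations of parallel forms across the year
through_course_scores = {
    "student_a": [412, 435, 458],
    "student_b": [500, 488, 495],
}

def summative_score(scores, rule):
    """Collapse a list of parallel-form scores into a single summative score."""
    if rule == "best":
        return max(scores)
    if rule == "last":
        return scores[-1]
    if rule == "avg":
        return mean(scores)
    raise ValueError(f"unknown rule: {rule}")

for sid, scores in through_course_scores.items():
    print(sid, {rule: summative_score(scores, rule) for rule in ("best", "last", "avg")})
```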
Through-Course Accountability Systems
Establishing the validity of summative scores
– Content-related validity evidence: alignment studies
– Correlation with other indicators of achievement (see the sketch below)
   • Teacher ratings and course grades
   • Cognitive lab analysis of student knowledge and skill
– Predictive evidence
   • Correlation with achievement at the next grade
   • Correlation with subsequent postsecondary preparedness
– Consequences: impact on instruction and learning
   • Surveys and observation of curriculum and instruction
   • Analysis of achievement trends over time
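A minimal sketch of the correlational validity evidence listed above, assuming hypothetical vectors of summative scores, concurrent course grades, and next-grade achievement; operational studies would work from full student-level files and apply corrections such as those for range restriction.

```python
# Sketch: concurrent and predictive validity checks via simple correlations.
# All score vectors below are hypothetical.
import numpy as np

summative    = np.array([420, 455, 470, 390, 505, 480])   # through-course summative scores
course_grade = np.array([2.7, 3.1, 3.3, 2.3, 3.8, 3.5])   # concurrent course grades (GPA scale)
next_grade   = np.array([430, 460, 475, 400, 515, 490])   # achievement at the next grade

concurrent_r = np.corrcoef(summative, course_grade)[0, 1]
predictive_r = np.corrcoef(summative, next_grade)[0, 1]

print(f"concurrent validity (course grades): r = {concurrent_r:.2f}")
print(f"predictive validity (next grade):    r = {predictive_r:.2f}")
```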
Common High School End-of-Course Exams
Comparability of exams across courses
– Agreement on content must precede common exams
– Similar process for each course for establishing content specs
– Evaluate how well different state curricula support mastery of the targeted content and skills in each subject
High school accountability (and student accountability)
– Example: percent of all students who (see the sketch below)
   • Pass all core EOC exams (e.g., Algebra, English II)
   • Pass at least some number of other EOC exams
– A better indicator of high school “value added” than is obtained with core exams alone
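A minimal sketch of the two accountability indicators suggested above, assuming hypothetical EOC pass/fail records and an invented threshold standing in for "some number" of other exams.

```python
# Sketch: computing the suggested EOC-based accountability indicators.
# Exam names, records, and the "other exams" threshold are hypothetical.
core_exams = {"Algebra", "English II"}
min_other_passes = 2  # stand-in for "at least some number" of other EOC exams

students = {  # student -> {exam: passed}
    "s1": {"Algebra": True,  "English II": True,  "Biology": True,  "US History": True},
    "s2": {"Algebra": True,  "English II": False, "Biology": True,  "US History": True},
    "s3": {"Algebra": True,  "English II": True,  "Biology": False, "US History": True},
}

def passed_all_core(record):
    return all(record.get(exam, False) for exam in core_exams)

def passed_enough_others(record):
    return sum(passed for exam, passed in record.items() if exam not in core_exams) >= min_other_passes

n = len(students)
pct_core   = 100 * sum(passed_all_core(r) for r in students.values()) / n
pct_others = 100 * sum(passed_enough_others(r) for r in students.values()) / n

print(f"% passing all core EOC exams: {pct_core:.0f}%")
print(f"% passing at least {min_other_passes} other EOC exams: {pct_others:.0f}%")
```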
Challenges for Computer-Based Testing
Technology (including computer platforms and software systems) will continue to evolve rapidly
– Target testing systems to the best that can be imagined today (they may still be obsolete tomorrow)
– Some tasks assessed on the computer (e.g., extensive simulations) cannot be covered in a paper test
– Where adaptation or accommodation is needed for some students, look at how they learn the targeted skills in the classroom
Universal Design principles apply to computer tests
– Eliminating inappropriate sources of variation increases the likelihood of comparability across testing modes or platforms
– Plenty of good comparability and equating studies exist as models for the mode-difference studies needed wherever comparability across modes is important (see the sketch below)
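One common design for the mode-comparability studies mentioned above is a linear (mean-sigma) equating between randomly equivalent groups taking the paper and computer versions; the sketch below uses invented score arrays and is illustrative only.

```python
# Sketch: mean-sigma linear equating of computer-based scores onto the
# paper-and-pencil scale, using hypothetical randomly equivalent groups.
import numpy as np

paper    = np.array([480, 455, 510, 470, 495, 460, 505])  # paper-and-pencil group
computer = np.array([474, 452, 500, 468, 488, 455, 498])  # computer-based group

# linear transform matching the computer-mode mean and SD to the paper-mode scale
slope = paper.std(ddof=1) / computer.std(ddof=1)
intercept = paper.mean() - slope * computer.mean()

equated = slope * computer + intercept
print(f"equating line: paper ≈ {slope:.2f} * computer + {intercept:.1f}")
print("equated computer-mode scores:", np.round(equated, 1))
```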
Continuous Process Improvement
Process for identifying items that do not work, finding out why, and revising item writing and review (see the sketch below)
– Test/item specifications
– Item-writing (exercise development) guides
– Item and test review processes
Process for evaluating impact on teaching and learning
– Are changes in the desired directions?
Analysis of the predictive power of assessment results
– Success at the next grade
– College preparedness by the end of high school
– Are some essential skills not covered?
Need ongoing develop-test-revise cycles
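A minimal sketch of the kind of classical item analysis that feeds such an item-improvement cycle: item difficulty (p-value) and corrected item-total correlation with simple flagging rules. The response matrix and cutoffs are hypothetical.

```python
# Sketch: classical item statistics used to flag items that may not be working.
# Rows are students, columns are items (1 = correct); data and cutoffs are hypothetical.
import numpy as np

responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 0, 1],
])

total = responses.sum(axis=1)
for j in range(responses.shape[1]):
    item = responses[:, j]
    p_value = item.mean()                      # item difficulty
    rest = total - item                        # total score excluding this item
    discrimination = np.corrcoef(item, rest)[0, 1]
    flag = "review" if p_value < 0.20 or p_value > 0.95 or discrimination < 0.20 else "ok"
    print(f"item {j + 1}: p = {p_value:.2f}, corrected item-total r = {discrimination:.2f} [{flag}]")
```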
Further Research Needs
Value-Added (see the sketch below)
– Focus on the goal more than on the method
– Really about how to coach, evaluate, and improve teacher (and principal) effectiveness
Performance Tasks
– Really about how to assess and encourage performance not well covered by other modes
   • Complex inquiry, extended problem-solving
   • Team skills?
Preparedness and Prerequisite Skills
– The initial choice of common standards is not necessarily final
– Develop-test-revise learning trajectory models
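A minimal residual-gain sketch of the value-added idea, assuming only prior scores, current scores, and a teacher link; operational value-added models add covariates, multiple priors, and shrinkage, so this is illustrative only.

```python
# Sketch: residual-gain "value added" per teacher. All data are hypothetical.
import numpy as np

prior   = np.array([400, 430, 450, 410, 470, 445, 420, 460])
current = np.array([415, 452, 470, 418, 495, 468, 428, 480])
teacher = np.array(["A", "A", "A", "B", "B", "B", "B", "A"])

# regress current scores on prior scores; residuals are gains beyond expectation
slope, intercept = np.polyfit(prior, current, 1)
residuals = current - (slope * prior + intercept)

# a teacher's "effect" here is simply the mean residual of that teacher's students
for t in sorted(set(teacher)):
    print(f"teacher {t}: mean residual gain = {residuals[teacher == t].mean():+.1f} points")
```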
Summary of Recommendations
1. Through-Course Summative Assessments
   – Use cumulative or segmented models to increase depth
   – Require multiple types of validity and impact evidence
2. Common High School End-of-Course Exams
   – Use in developing high school accountability models
3. Challenges for Computer-Based Testing
   – Count on continuing technological advances
   – Avoid extraneous features (universal design) for comparability
4. Continuous Process Improvement
   – Require feedback systems for improving item/test development
   – Ongoing analyses of impact to support improvement of content standards, test specifications, and reporting
5. Further Research Needs
   – Start from goals rather than methods
   – Add analyses and improvement of learning trajectory models