1 Race to the Top Assessment Program
General & Technical Assessment Discussion
Jeffrey Nellhaus, Deputy Commissioner
January 20, 2010

2 Through-Course Summative Assessment System

Definition
A through-course summative assessment system includes multiple assessment components. Components are administered periodically over the course of the school year, and student performance on each component is aggregated to produce summative results.

Implications for Curriculum
- This type of assessment system will require states/consortia to identify the standards assessed by each component, the component exam schedule, and whether LEAs must administer the components in a particular sequence.
- This will require significant changes in local curricula, because most, if not all, state assessment programs do not currently dictate any particular instructional sequence.
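A minimal sketch of what "aggregating student performance on each component" could look like in practice is shown below; the component names, weights, 0-100 score scale, and performance-level cut scores are invented for illustration and are not part of the proposal.

```python
# Hypothetical aggregation of through-course component scores into one
# summative result. Component names, weights, and cut scores are invented
# for illustration only.

COMPONENT_WEIGHTS = {
    "fall_task": 0.25,      # extended performance task, administered in fall
    "winter_exam": 0.35,    # mid-year component exam
    "spring_exam": 0.40,    # end-of-year component exam
}

def summative_score(component_scores):
    """Weighted average of component scores, each on a 0-100 scale."""
    return sum(COMPONENT_WEIGHTS[name] * score
               for name, score in component_scores.items())

def performance_level(score):
    """Map the summative score to a reporting category (illustrative cuts)."""
    if score >= 85:
        return "Advanced"
    if score >= 70:
        return "Proficient"
    if score >= 55:
        return "Needs Improvement"
    return "Warning"

if __name__ == "__main__":
    student = {"fall_task": 78, "winter_exam": 82, "spring_exam": 88}
    total = summative_score(student)
    print(f"Summative score: {total:.1f} -> {performance_level(total)}")
```

Whatever weighting scheme a consortium adopted, the design questions flagged above remain: which standards each component carries, and in what sequence the components must be given.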

3 Through-Course Assessments: Construct Validity
The Whole Is Greater Than the Sum of the Parts

Premise
Proficiency means going beyond demonstrating a basic grasp of individual standards or groups of closely related standards, and includes the application of multiple standards from any aspect of the content area to solve complex problems.

Applicants should be asked to describe their concept of proficiency and their approach to its measurement using the through-course summative assessment system by indicating:
- The standards that will be assessed by each component exam
- How each component will address standards assessed previously
- How individual test items will address multiple, as well as single, standards
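One way to make these item-level requests concrete is a simple blueprint check: record which standards each item targets and flag items that tap more than one. The standard codes and item map below are hypothetical placeholders, not actual consortium data.

```python
# Illustrative test blueprint check: which standards does each component
# assess, and which items tap multiple standards (complex problems)?
# Standard codes and the item map are hypothetical.

ITEM_STANDARDS = {
    # item_id: standards the item is written to measure
    "fall_task_01": {"A.REI.4", "F.IF.7"},   # multi-standard performance task
    "winter_item_12": {"A.REI.4"},
    "spring_item_03": {"F.IF.7", "A.SSE.3"},
}

def standards_covered(item_map):
    """Union of all standards assessed across components."""
    covered = set()
    for standards in item_map.values():
        covered |= standards
    return covered

def multi_standard_items(item_map):
    """Items that require applying more than one standard."""
    return [item for item, stds in item_map.items() if len(stds) > 1]

if __name__ == "__main__":
    print("Covered standards:", sorted(standards_covered(ITEM_STANDARDS)))
    print("Multi-standard items:", multi_standard_items(ITEM_STANDARDS))
```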

4 Through-Course Assessments: External Validity
The exams will need to measure and report "readiness"

Premise
An important measure of the external validity of the through-course assessment system will be the extent to which summative results for each content area accurately predict whether students are on track/ready for college or a career.

Applicants should be asked to describe:
- How they plan to report the results of each component exam
- How they plan to aggregate component results, including implications the plan will have for item development, and provide a rationale for the method selected
- How they plan to determine the summative score on each exam that predicts readiness, and how those scores will be validated over time
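The sketch below illustrates one basic way a readiness cut score could be validated against a later outcome, such as success in a first credit-bearing course; the scores, outcomes, and cut value are fabricated, and a real validation study would use far larger samples and richer statistics.

```python
# Sketch of checking whether a summative "readiness" cut score predicts a
# later outcome (e.g., earning a C or better in a credit-bearing course).
# All data and the cut score are fabricated for illustration.

students = [
    # (summative_score, succeeded_in_first_credit_bearing_course)
    (62, False), (71, True), (68, False), (85, True),
    (74, True),  (59, False), (90, True), (66, True),
]

READINESS_CUT = 70  # hypothetical cut score under validation

def classification_accuracy(data, cut):
    """Fraction of students whose predicted readiness matches the outcome."""
    correct = sum((score >= cut) == outcome for score, outcome in data)
    return correct / len(data)

if __name__ == "__main__":
    acc = classification_accuracy(students, READINESS_CUT)
    print(f"Cut {READINESS_CUT} correctly classifies {acc:.0%} of students")
```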

5 Through-Course Assessments: Reliability & Comparability

Premise
- The level of reliability needed will depend on reporting plans and intended uses of the results.
- High levels of reliability are required for accountability uses.
- Comparability requires high levels of reliability and standardization of all elements of the exams.

Applicants should be asked to describe:
- How they will achieve a level of reliability that adequately supports their reporting plans and planned uses of results
- The extent to which their plans require standardized test administration within and across schools, and how it will be achieved
- How they plan to establish high levels of reliability and accuracy in the scoring of constructed-response questions within and across years, whether scored externally by contractors or locally by teachers
- Preliminary plans for equating results across years
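Two of these questions lend themselves to small worked examples: rater agreement on constructed-response scoring, and equating of raw scores across years. The sketch below assumes two raters scoring the same responses on a 1-4 rubric and uses a simple mean/standard-deviation (linear) equating; all numbers are invented.

```python
# Illustrative checks tied to two of the reliability questions above:
# (1) rater agreement on constructed-response scoring, and (2) a simple
# linear equating of raw scores across years. All numbers are invented.

from statistics import mean, pstdev

def agreement_rates(rater_a, rater_b):
    """Exact and within-one-point agreement between two raters."""
    pairs = list(zip(rater_a, rater_b))
    exact = sum(a == b for a, b in pairs) / len(pairs)
    adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)
    return exact, adjacent

def linear_equate(score, old_scores, new_scores):
    """Place a new-form score on the old form's scale by matching mean and SD."""
    return mean(old_scores) + pstdev(old_scores) / pstdev(new_scores) * (
        score - mean(new_scores))

if __name__ == "__main__":
    a = [3, 2, 4, 1, 3, 2, 4, 3]   # rater A scores on a 1-4 rubric
    b = [3, 2, 3, 1, 4, 2, 4, 3]   # rater B scores on the same responses
    exact, adjacent = agreement_rates(a, b)
    print(f"Exact agreement {exact:.0%}, adjacent agreement {adjacent:.0%}")
    equated = linear_equate(27, [20, 25, 30, 35], [22, 26, 31, 37])
    print(f"Equated score: {equated:.1f}")
```

Operational programs would typically use item response theory or equipercentile methods rather than linear equating, but the design question is the same: how scores from different years are placed on a common scale.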

6 Through-Course Assessments: Feasibility/Practicability
Validity and Reliability Are Not Enough

Premise
In addition to issues of validity and reliability, feasibility (long-term sustainability) is a major factor that needs to be considered in designing, developing, and implementing large-scale assessment systems.

Applicants should be asked to provide:
- An estimate of the average yearly cost (per student) of the assessment system, and a rationale for its sustainability
- An estimate of the testing time for each component, and a rationale indicating that the amount of testing time is sustainable
- An estimate of LEA staffing time required to prepare for and administer exams, provide accommodations, and score items (where applicable)
- The amount of initial and ongoing training and professional development that will be required to launch and maintain the system over time
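The back-of-the-envelope arithmetic below shows how per-student cost and testing-time estimates might be assembled from component-level figures; every number (component costs, development budget, enrollment) is a made-up placeholder.

```python
# Back-of-the-envelope arithmetic for the per-student cost and testing-time
# estimates requested above. Every figure here is a made-up placeholder.

components = [
    # (name, testing_minutes, scoring_cost_per_student, admin_cost_per_student)
    ("fall performance task", 90, 4.50, 1.00),
    ("winter exam", 60, 1.25, 0.75),
    ("spring exam", 120, 3.00, 1.00),
]

FIXED_DEVELOPMENT_COST = 12_000_000   # annual item development, hypothetical
STUDENTS_TESTED = 2_500_000           # hypothetical consortium enrollment

total_minutes = sum(minutes for _, minutes, _, _ in components)
variable_cost = sum(score + admin for _, _, score, admin in components)
per_student_cost = variable_cost + FIXED_DEVELOPMENT_COST / STUDENTS_TESTED

print(f"Total testing time per student: {total_minutes} minutes")
print(f"Estimated cost per student: ${per_student_cost:.2f}")
```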

7 End-of-Course High School Exams

Premise
Applicants will be invited to propose a "system" for developing and certifying the quality and rigor of a set of common end-of-course summative exams in multiple high school subjects. If these exams are decentralized (developed, scored, administered, and reported by individual LEAs or high schools), then, to ensure consistency in quality and rigor across schools, applicants should be asked to describe:
- The criteria that will be used to certify the quality and rigor of each exam in the set
- The criteria that will be used to certify that the quality and rigor of the exams in the set are comparable
- The criteria that will be used to certify that the results of individual exams, or collections of different exams, can be used to identify students who are ready for the next course in the content area or for post-secondary pursuits

8 Computer-Based Test Administration

Issues
- Implementation challenges
- Comparability with paper-and-pencil exams
- Students with disabilities

Applicants should be asked to describe:
- How exams will be administered in schools where computer-to-student ratios are low and there is limited or no access to broadband, and their approach to ensuring that students will have the opportunity to learn computer-based test-taking skills
- Applicants should not be asked how they will ensure that computer-based tests and any needed paper-and-pencil versions assess comparable levels of student knowledge and skill, if preserving the full power of the computer-based item types is required
- Computer-based assessments provide more advantages than challenges for students with disabilities (SWD); applicants should be asked how they plan to take advantage of computer-based assessments to improve the validity of results for this population
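For the first bullet, a rough scheduling calculation can make the computer-to-student ratio constraint concrete; the sketch below assumes one shared computer lab and a fixed number of testing sessions per day, with all figures hypothetical.

```python
# Rough scheduling arithmetic for the low computer-to-student ratio issue:
# how many testing days does a school need per component? All inputs are
# hypothetical.

import math

def days_needed(students, working_computers, sessions_per_day):
    """Testing days needed if each student needs one seat for one session."""
    seats_per_day = working_computers * sessions_per_day
    return math.ceil(students / seats_per_day)

if __name__ == "__main__":
    # e.g., 600 tested students, one 30-computer lab, 3 sessions per day
    days = days_needed(students=600, working_computers=30, sessions_per_day=3)
    print(f"Testing window needed: {days} school days per component")
```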

9 Innovation and Improvement

Requirement
Applicants must ensure that they have a structured process and/or approach that will lead to innovation and improvement over time.

Applicants should be asked to:
- Set aside a certain percentage of their budget for research and development
- Develop a 4-year research and development agenda identifying the specific questions applicants want to answer; specifically, questions that, once answered, would help them innovate or improve
- Identify university and other partners who would help move their research agenda forward and/or serve on advisory panels during the four years of the grant and beyond
- Agree to share the findings of their research with other consortia/states at periodic conferences sponsored by the USDE, and through USDE-supported research networks

10 Issues for Focused Research

Issue: Use of value-added methodology for teacher and school accountability
Comment: This is important, but (1) it is a statistical modeling issue more than a measurement issue, and (2) many states will use their RTTT grant funds to conduct research in this area. It might be more productive to provide additional support to states through those grants.

Issue: Comparability, generalizability, and growth modeling for assessments that include performance tasks
Comment: Yes, research is needed here, assuming this is about the practical challenges of equating exams that include extended performance tasks.
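As context for the first issue, the sketch below shows a deliberately oversimplified version of the value-added idea: regress current scores on prior scores and average each teacher's residuals. Real value-added models condition on many more variables and face exactly the statistical modeling issues noted above; the data here are invented.

```python
# A deliberately simplified illustration of the value-added idea: regress
# current scores on prior scores, then average each teacher's residuals.
# Real VAM models add many controls; the data here are invented.

from statistics import mean
from collections import defaultdict

records = [
    # (teacher, prior_score, current_score)
    ("T1", 50, 58), ("T1", 62, 70), ("T1", 45, 50),
    ("T2", 55, 54), ("T2", 60, 61), ("T2", 48, 46),
]

prior = [p for _, p, _ in records]
current = [c for _, _, c in records]

# Ordinary least squares slope/intercept for current ~ prior
slope = (sum((p - mean(prior)) * (c - mean(current)) for _, p, c in records)
         / sum((p - mean(prior)) ** 2 for _, p, _ in records))
intercept = mean(current) - slope * mean(prior)

# Residual = actual score minus the score predicted from prior achievement
residuals_by_teacher = defaultdict(list)
for teacher, p, c in records:
    residuals_by_teacher[teacher].append(c - (intercept + slope * p))

for teacher, resids in residuals_by_teacher.items():
    print(f"{teacher}: mean residual (value-added estimate) {mean(resids):+.1f}")
```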