Deanna L. Morgan The College Board


Best Practices for Setting Placement Cut Scores in Postsecondary Education
Deanna L. Morgan, The College Board

Standard Setting Terms

Content Standards: The knowledge and skills students should demonstrate for placement into a course.
Cut Score: The point on the score scale indicating that students have demonstrated enough of the content standards.
Standard Setting: The process of defining the cut score on the score scale.
Standard Setting Methods: Specific combinations of standardized actions and processes used to develop cut score recommendations.

Hypothetical Placement Decision
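A hypothetical placement decision reduces to a simple rule: compare a student's score to the cut score(s) and assign the corresponding course. A minimal sketch follows; the course names and cut scores are invented for illustration and are not taken from the presentation.

```python
# Hypothetical placement rule. Course names and cut scores are
# illustrative only, not from the presentation.
CUT_SCORES = [
    (85, "College Algebra"),       # score >= 85
    (60, "Intermediate Algebra"),  # 60 <= score < 85
    (0,  "Developmental Math"),    # score < 60
]

def place_student(score: int) -> str:
    """Return the course placement for a given test score."""
    for cut, course in CUT_SCORES:
        if score >= cut:
            return course
    return CUT_SCORES[-1][1]  # fallback for scores below every cut

print(place_student(90))  # College Algebra
print(place_student(72))  # Intermediate Algebra
print(place_student(40))  # Developmental Math
```

With two cut scores, three course levels are defined; adding a cut score adds a level, which is one reason the Authoritative Body must decide how many cut scores are needed.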

Major Areas To Consider

The Authoritative Body, or Policymaker
- Signs off on final cut scores, allowing implementation
- May be one or more people
- Hires the facilitator
- Determines the number of cut scores needed
- Is assisted by the facilitator in deciding on a standard setting method
- Makes policy decisions about the use of data and related information during the study

Major Areas To Consider

Participants

Facilitator
- Works with the Authoritative Body to identify desirable characteristics of panelists, methods to use, and decisions that must be made
- Ensures procedures are followed
- Must be external to the process

Subject Matter Experts (SMEs)
- Experts in the subject to which the cut scores will be applied
- Familiar with the student population
- Representative of the college population in terms of gender, ethnicity, experience, geographic location, etc.
- Panels should include at least 10 SMEs but usually fewer than 25

Major Areas To Consider

Performance Level Descriptors (PLDs)
- Clearly delineate the differences in content and expectations between examinees in one course level and those in the adjacent course(s)
- Serve to calibrate the SMEs to a shared understanding of each course and provide the basis for decision making during the standard setting task
- May be developed during the standard setting meeting or, for greater emphasis, in advance at a meeting dedicated solely to that purpose

Major Areas To Consider

Standardization and Defensibility
- Standard setting methods have been designed to work with a variety of test configurations and levels of data availability
- When followed, the standardization of processes and activities increases the defensibility of cut score decisions: taking the test; training on the method; multiple iterative rounds; independent judgments combined with discussion from multiple perspectives; collection of SME feedback; and possibly the presentation of impact data
- Weigh the choice of method against the stakes attached to the decision
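As a concrete illustration of combining independent SME judgments, a modified-Angoff-style aggregation is one common approach (the presentation does not prescribe a specific method). Each SME estimates, per item, the probability that a borderline student answers correctly; summing an SME's ratings yields that SME's implied cut score, and the panel mean yields the recommended cut. All numbers below are invented.

```python
# Sketch of a modified-Angoff aggregation (an assumed method; the
# presentation does not name one). Final-round ratings are each SME's
# per-item probability that a borderline student answers correctly.
# All data are invented for illustration.
final_round_ratings = {
    "SME_1": [0.6, 0.7, 0.5, 0.8],
    "SME_2": [0.5, 0.6, 0.6, 0.7],
    "SME_3": [0.7, 0.8, 0.4, 0.9],
}

# Each SME's implied raw cut score is the sum of their item ratings.
sme_cuts = {sme: sum(ratings) for sme, ratings in final_round_ratings.items()}

# The panel's recommended cut is the mean across SMEs.
recommended_cut = sum(sme_cuts.values()) / len(sme_cuts)

print(sme_cuts)
print(round(recommended_cut, 2))
```

In practice, ratings from multiple iterative rounds, panel discussion, and feedback data would precede this final aggregation, and the Authoritative Body may still adjust the result.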

Major Areas To Consider

Documentation and Evaluation
- Keep all materials used, data entered, evaluation forms, copies of the PLDs, summaries of SME characteristics, rating forms, handouts, etc. This is the supporting material needed to prove that a valid and reliable process was followed should the cut score be challenged.
- Evaluation form data can serve as evidence that SMEs understood and followed the process, as well as provide input into future improvements to the process.
- The Authoritative Body may use these data to justify an adjustment to the recommended cut score.
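Impact data, mentioned earlier as something the Authoritative Body may weigh, simply describes how a proposed cut score would distribute examinees. A minimal sketch, with an invented score sample and cut values:

```python
# Impact data sketch: share of examinees placed at or above a proposed
# cut score. Scores and cuts are invented for illustration.
scores = [45, 52, 58, 61, 67, 70, 74, 80, 86, 91]

def impact(cut: int) -> float:
    """Percent of examinees scoring at or above the cut."""
    return 100 * sum(s >= cut for s in scores) / len(scores)

print(impact(60))  # 70.0
print(impact(75))  # 30.0
```

Comparing impact at candidate cut scores lets the Authoritative Body see, before implementation, how many students each choice would place into the higher-level course.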

Concluding Remarks
- Standard setting is an important part of the assessment process.
- Due diligence in the process and adequate documentation are essential for defensibility.
- No decision should be made on the basis of a single test score; multiple sources of evidence are best when making decisions.
- A key part of the defensibility of cut scores is ensuring that the assessment used produces valid and reliable scores for that purpose.

MORE INFORMATION Download event materials and learn how to participate in the online follow-up discussion: www.PostsecondaryResearch.org/conference/afterevent.html NCPR IS FUNDED BY THE INSTITUTE OF EDUCATION SCIENCES OF THE U.S. DEPARTMENT OF EDUCATION and is a partnership of the Community College Research Center, Teachers College, Columbia University; MDRC; the Curry School of Education at the University of Virginia; and faculty at Harvard University.