Assessment Training Nebo School District

Assessment Literacy

Test Acronyms
- CRT - Criterion-Referenced Test (grades 1-11)
- IOWA - Iowa Test of Basic Skills and Iowa Test of Educational Development (grades 3, 5, 8, and 11)
- UBSCT - Utah Basic Skills Competency Test
- DWA - Direct Writing Assessment (grades 6 and 9)
- UAA - Utah Alternate Assessment (grades 1-12, students with severe cognitive disabilities)
- UALPA - Utah Academic Language Proficiency Assessment (grades 1-12, English language learners)

Norm-Referenced Tests
- Standardized tests
- Scores are interpreted by comparison to a specific (norm) group
- Percentile scores are the most common measure of achievement
- Percentile scores range from the 1st to the 99th, with the 50th percentile representing the national average
- The ITBS and ITED (IOWA) tests are the state-adopted norm-referenced assessments

Criterion-Referenced Tests
- Standardized tests
- Every question/item is aligned to an explicitly stated educational objective
- Used to identify which standards and objectives the examinee has mastered
- CRT or end-of-level tests in Language Arts, Math, and Science

Summative Assessment
- Used to determine students' final understanding of material
- The state CRT tests are an example

Formative Assessment
- Used to identify students' understanding of material and to provide feedback for teachers and learning experiences for students
- Benchmarks, UTIPS, running records, and student interviews are all included in this category

Raw Score
- The number of correct responses on a test
- Example: a student answered 48 questions correctly

Percent Correct Score
- The number of correct responses divided by the total number of items
- Example: 49 out of 70 = 70%
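Both of these scores are simple arithmetic. A minimal Python sketch, using the slide's numbers (49 correct out of 70 items); the function names are illustrative only:

```python
def raw_score(responses, answer_key):
    """Raw score: the number of correct responses on a test."""
    return sum(given == correct for given, correct in zip(responses, answer_key))

def percent_correct(num_correct, num_items):
    """Percent correct: correct responses divided by the total number of items."""
    return 100 * num_correct / num_items

# Example from the slide: 49 correct out of 70 items
print(percent_correct(49, 70))  # 70.0
```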

Percentile Score
- The percent of students who scored lower on the test
- Example: 75th percentile means 75% of examinees scored lower on the test than this examinee
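Using the definition above (the percent of examinees who performed worse), a percentile rank can be sketched in a few lines of Python. The norm group below is invented:

```python
def percentile_rank(score, all_scores):
    """Percent of examinees in the group who scored lower than this score."""
    below = sum(other < score for other in all_scores)
    return 100 * below / len(all_scores)

# Hypothetical norm group of 20 examinees
group = [12, 15, 18, 20, 21, 23, 25, 27, 28, 30,
         31, 33, 35, 36, 38, 40, 42, 45, 47, 50]
print(percentile_rank(40, group))  # 75.0 -> the 75th percentile
```

Operational percentile ranks sometimes give credit for half of any tied scores; this sketch follows the slide's simpler definition.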

Scaled Score
- The student's performance is reported on an arbitrary numerical scale (it can even be alphabetical)
- A scaled score provides comparable information on student performance across different years and different tests
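Operational scaled scores for tests like the CRTs come from item response theory models rather than a single formula, but a linear transformation is enough to illustrate the idea of moving a raw score onto an arbitrary reporting scale. Every number in this sketch (the scale mean and spread, the raw-score statistics) is a hypothetical placeholder:

```python
def to_scaled(raw, raw_mean, raw_sd, scale_mean=150, scale_sd=20):
    """Illustrative linear transformation of a raw score onto a reporting scale.

    Real scaling for the CRTs and UBSCT is model-based (item response theory);
    the scale_mean and scale_sd values here are made up for the example.
    """
    z = (raw - raw_mean) / raw_sd          # how far above or below average the raw score is
    return round(scale_mean + scale_sd * z)

# Hypothetical: a raw score of 53 on a form whose raw scores average 45 (SD 10)
print(to_scaled(53, raw_mean=45, raw_sd=10))  # 166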

ACT
- What is a 36? What is a 28? What is a 12?
- These numbers have meaning only because of the value we place on points along the scale
- Often others, such as colleges, help set that value
- Utah State University and the University of Utah, for example, say you must have a score of at least 18

Scaled Scores
- ACT scores range from 1 to 36; what counts as proficient depends on the school (the previous slide's example used 18)
- Advanced Placement tests range from 1 to 5; a 3 is typically considered proficient
- UBSCT and CRT scores are reported on a scale where 160 is the proficient cut score

Scaled Scores
- Scaled scores simplify the reporting of results
- Score reporting can be common across levels and across tests
- There is no longer a separate percent-correct cut score for each subject
- Comparability between tests and across years is far greater

Scaled Scores
- The CRTs and UBSCT use a cut score of 160
- Each proficiency level has its own cut score
- Proficiency levels range from 1-4 under NCLB and from 1a-4 under U-PASS (we will discuss this in the next session)
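The mapping from scaled scores to proficiency levels is just a set of cut scores. In the sketch below, the 160 proficient cut comes from this slide; the 140 and 180 cuts for the surrounding levels are hypothetical placeholders:

```python
def proficiency_level(scaled_score, cuts=(140, 160, 180)):
    """Map a scaled score to a proficiency level from 1 to 4.

    160 is the proficient cut used by the CRTs and UBSCT; the 140 and 180
    cuts for the other levels are invented for this example.
    """
    level = 1
    for cut in cuts:
        if scaled_score >= cut:
            level += 1
    return level

print(proficiency_level(155))  # 2 - below the 160 proficient cut
print(proficiency_level(165))  # 3 - at or above the proficient cut
```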

Example
- If John has a raw score of 65 in 2004 and a raw score of 58 in 2005, does this show a decrease in performance?
- If John has a scaled score of 165 in 2004 and a scaled score of 155 in 2005, does this show a decrease in performance?

Why Not Raw Scores?
- Most states do not release raw scores
- Looking at raw scores can lead to incorrect assumptions
- It is incorrect to compare raw scores from one year to those of the next
- It is incorrect to compare the raw scores of one test to those of another

Equating: A Career Home Runs Analogy

Who Is the Greatest?
- Individual factors: ability, strength, skill, technique, knowledge
- Difficulty of the game: tightly wound baseballs, improved bats, higher pitcher's mound, changes in season length, steroids

Comparisons
- Impossible to compare Barry Bonds with Babe Ruth
- Impossible to compare a game in 1914 to a game in 2006

Comparisons
- Possible to compare John's ability on the 2005 Language Arts CRT with John's ability on the 2006 Language Arts CRT (scaling)
- Possible to compare the difficulty of the 2005 Language Arts CRT to the 2006 Language Arts CRT (equating)

Equating
- A statistical process that takes different tests and makes them equal in difficulty
- Disentangles differences in test difficulty from differences in student ability

Equating
- Common (anchor) items appear on both test forms
- The common items are compared statistically to establish equivalent difficulty levels
- This statistical process ensures that results are accurately comparable from test to test and are not subject to fluctuations caused by unintentional changes in item difficulty

Equating
[Diagram: Form X and Form Y each contain a shared set of anchor items]

Anchor Items
- It is the performance on the two sets of anchor items across years that allows us to make interpretations about the relative difficulty of the non-anchor items
- If student performance on the anchor items is the same, we conclude that student achievement is the same
- If student performance on the anchor items increases, we interpret that student achievement increased
- If student performance on the anchor items decreases, we interpret that student achievement decreased
- We use this information to judge the difficulty of the non-anchor items
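Operational equating for state tests is done with item response theory, but the anchor-item logic above can be illustrated with a simple chained mean-equating sketch: because the anchor items are identical on both forms, differences on them are attributed to the groups of examinees, and whatever difference remains in the total scores is attributed to form difficulty. All of the numbers below are invented:

```python
from statistics import mean

def chained_mean_equate(y_score, form_x_totals, anchor_x_group,
                        form_y_totals, anchor_y_group):
    """Place a Form Y total score on the Form X scale (chained mean equating).

    Step 1: express the Form Y score relative to its group's anchor performance.
    Step 2: re-express it relative to the Form X group's anchor performance and
            Form X totals. Differences the anchors cannot explain are treated
            as form difficulty.
    """
    return (y_score
            - mean(form_y_totals) + mean(anchor_y_group)   # Form Y -> anchor scale
            - mean(anchor_x_group) + mean(form_x_totals))  # anchor scale -> Form X

# Invented data: the two groups score identically on the anchor items,
# but Form Y totals run about 5 points lower, so Form Y is harder.
form_x_totals  = [48, 52, 50]   # Form X group, whole-test scores
anchor_x_group = [6, 7, 5]      # Form X group, anchor-item scores
form_y_totals  = [43, 47, 45]   # Form Y group, whole-test scores
anchor_y_group = [6, 7, 5]      # Form Y group, anchor-item scores

print(chained_mean_equate(45, form_x_totals, anchor_x_group,
                          form_y_totals, anchor_y_group))   # 50.0
```

In this made-up example a Form Y score of 45 is reported as the equivalent of a 50 on Form X, which is exactly the kind of adjustment that keeps a cut score such as 160 meaning the same thing from year to year.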

Why Equate?
- One test may be more difficult than another
- One group of examinees may be more intelligent than another
- Or both