
PRACTICE MAKING VALID INFERENCES (3.2.1T, Version 1.0)
3–Information Module

Purpose: To practice making valid inferences.
Related Documents:
Description: This activity can be used within your Data Team or with other audiences to improve data analysis skills. During this activity, you will have the chance to view multiple data displays and “check” the inferences made by another data team for validity.
Time: About 30 minutes.

Practice Making Valid Inferences
1. Read a scenario.
2. Review the accompanying data display to observe what the data say.
3. Consider the statements provided.
4. As a Data Team, decide which statements are observations (factual interpretations) and which are inferences (conclusions, explanations, or conjectures).
5. Also, note whether each statement is true, not necessarily true, or false.
6. Possible answers are provided at the end.

Scenario #1

SCENARIO: The Data Team in District A wanted to examine the performance of 8th grade students on the 2007 MCAS ELA and Mathematics tests. The Team posed this focusing question.

FOCUSING QUESTION: How did our 8th graders, district-wide, perform on the 2007 MCAS tests?

Data display: ESE Data Warehouse Pre-defined Report R-303, District Performance Distribution

District Performance Distribution Report (R-303)

For each statement, decide whether it is an observation or an inference, and whether it is true, not necessarily true, or false.

1A. Our students are smarter in English than they are in mathematics.
1B. Compared to the state, our students performed poorly in mathematics.
1C. About one third of our students performed below proficient in mathematics.
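As an aid while working through these statements, the sketch below shows how a performance-distribution display reduces to directly checkable facts, which is what separates an observation from an inference. The percentages are invented placeholders, not the actual R-303 values; the four MCAS performance levels (Warning, Needs Improvement, Proficient, Advanced) are the only assumption carried over from the reports.

```python
# Illustrative sketch only: the percentages below are invented, not the real
# R-303 report values.
district = {"Warning": 7, "Needs Improvement": 24, "Proficient": 45, "Advanced": 24}
state    = {"Warning": 10, "Needs Improvement": 28, "Proficient": 42, "Advanced": 20}

# An observation is directly checkable against the display:
below_proficient = district["Warning"] + district["Needs Improvement"]
print(f"District percent below proficient: {below_proficient}%")

# A district-versus-state claim (like statement 1B) can be checked level by level:
for level in district:
    print(f"{level}: district {district[level]}% vs. state {state[level]}%")
```

A claim like 1A, by contrast, cannot be computed from any such display at all, which is what makes it an inference.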

Scenario #2

SCENARIO: After comparing the performance of the students in District A to the performance of students statewide, the Data Team posed a clarifying question.

CLARIFYING QUESTION: How did the mathematics performance of the 8th graders in our district change over the past three years?

Data display: ESE Data Warehouse Pre-defined Report R-305, District Performance Distribution by Year

District Distribution by Year Report (R-305)

For each statement, decide whether it is an observation or an inference, and whether it is true, not necessarily true, or false.

2A. Students who were in 8th grade in 2007 made gains from year to year since 2005.
2B. 8th grade performance has improved from year to year.
2C. Our year-to-year trend performance follows the state’s trend performance.

Scenario #3

SCENARIO: The District Data Team reviewed the longitudinal performance of the district’s students and concluded that the percent of students scoring at the lowest level decreased each year and the percent scoring at the Advanced level increased dramatically in 2007. This was encouraging, but the Team felt that performance could still be improved. The Team formulated the following clarifying question.

CLARIFYING QUESTION: With which specific strands and standards did the students have the most difficulty?

Data display: ESE Data Warehouse Pre-defined Report R-306, District Standards Summary Report

District Standards Summary Report (R-306)

For each statement, decide whether it is an observation or an inference, and whether it is true, not necessarily true, or false.

3A. Our students performed better than students statewide in each of the strands.
3B. Out of all the strands, our students performed worst in Measurement.
3C. Compared to student performance statewide in the strand Patterns, Relations, and Algebra, our students performed the best on the symbols standard.
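Statements 3A–3C all rest on the same arithmetic: a district-minus-state gap per strand or standard. The minimal sketch below makes that comparison explicit; the standard names and percent-correct values are hypothetical, not taken from the R-306 report.

```python
# Hypothetical sketch: district vs. state percent correct by standard within
# one strand. Standard names and numbers are invented for illustration.
district = {"symbols": 71, "models": 80, "patterns": 65}
state    = {"symbols": 66, "models": 73, "patterns": 62}

# "Performed the best compared to the state" means the widest positive gap:
gaps = {std: district[std] - state[std] for std in district}
best = max(gaps, key=gaps.get)

print("Gaps (percentage points):", gaps)
print(f"Largest lead over the state: {best} (+{gaps[best]})")
```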

Scenario #4

SCENARIO: The review of the District Standards Summary Report helped the Team determine specific areas where the students were weak on the 2007 test. The Team delegated several members to review this report for the three prior years to see if these strands were problems for the students on those tests. The Data Team also wanted to learn more about the performance of subgroups on specific test items. The Team posed the following clarifying question.

CLARIFYING QUESTION: How did the ELL students in our LEP program perform across all test items as compared to all students in the district and in the state?

Data display: ESE Data Warehouse Pre-defined Report R-302, District Item Analysis Graph

District Item Analysis Graph (R-302)

For each statement, decide whether it is an observation or an inference, and whether it is true, not necessarily true, or false.

4A. The performance pattern of all students in our district follows the state more closely than the pattern for LEP students.
4B. Item 36 is the most difficult item.
4C. Our LEP program is not preparing our students adequately.

Possible Answers

Instructions: Reflecting on the statements that have been made using the data reports:
- Which are observations and which are inferences?
- Which are true, false, or not necessarily true (more data needed)?
- What clarifying questions would help make better inferences?

Scenario #1
1A. Inference (NNT). The English (reading comprehension and writing) and mathematics assessments are developed to assess completely different knowledge and skills, so a mathematics score cannot be directly compared to an English score.
1B. Inference (F). The statement is false because our students performed BETTER than the state at each performance level. It is an inference because “poorly” is a conclusion that is not factually precise.
1C. Observation (T). 29% of our students performed below proficient.

Scenario #2
2A. Inference (NNT). While the data do show increases in MCAS performance from 2006 to 2007, there could be slight differences in the student cohort due to changes in population since 2005. Additionally, the data do not reflect individual student growth over time, only MCAS scores for a class from one year to another. You would need more data to be sure.
2B. Observation (T). The percent of students below proficient decreased over time. Caveat: again, these are different groups of students.
2C. Observation (F). For the first two years, our students showed a small decrease in the percent of students below proficient, while the state stayed at about the same level (the percent at Warning actually increased slightly). In the most recent year, there was a decrease in percent below proficient at the state level, but a much larger decrease among the tested students in District A.
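The distinction behind answers 2A and 2B, comparing the same grade across years versus following the same students across years, can be made concrete with a small sketch. All figures below are hypothetical; the point is only that a by-year display like R-305 supplies the first kind of data, not the second.

```python
# Hypothetical sketch of why 2A is "not necessarily true." All numbers invented.

# What a by-year display provides: grade 8 percent proficient-or-above per
# test year. Each year is a DIFFERENT group of students.
grade8_by_year = {2005: 58, 2006: 61, 2007: 67}

# What statement 2A actually requires: the 2007 eighth graders' own earlier
# scores (grade 6 in 2005, grade 7 in 2006). The display does not contain
# these, so they are unknown here.
cohort_2007 = {2005: None, 2006: None, 2007: 67}

missing = [year for year, score in cohort_2007.items() if score is None]
print("Grade-level trend (different students each year):", grade8_by_year)
print("Years of cohort data missing for the 2007 8th graders:", missing)
```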

Possible Answers (continued)

Scenario #3
3A. Observation (T). A larger percentage of District A students was successful in each strand than students statewide.
3B. Observation (T). Relative to all other strands, our students did indeed score the poorest in Measurement.
3C. Observation (F). They performed best in Models relative to the state (a 10 percentage point difference).

Scenario #4
4A. Observation (T). The LEP students’ pattern is up and down, while the district and state patterns are very similar overall. Stress that this is probably due to the relatively small size of the LEP population; smaller populations show greater variation.
4B. Inference (NNT). A factual interpretation (observation) is that the LEP group and state scores are lowest for Item 36, but not the district’s. It is an inference that this is the most difficult item for these groups, as there could be other reasons why so many students scored low on it.
4C. Inference (NNT). You can’t infer this from the data. For example, the most difficult items for the LEP group may be those with the most language, such as story problems. The next step is to look at the actual items and draw conclusions about what might have made them difficult for the LEP subgroup. Also, the LEP students did better than the other two populations on several items.
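The “smaller populations show greater variation” point in answer 4A can be demonstrated with a quick simulation: give a small group and a large group the same true chance of answering each item correctly, and the small group’s per-item percent correct will swing far more. Every figure here (group sizes, item count, success rate) is an arbitrary assumption chosen for illustration.

```python
# Simulation of answer 4A's point: with identical true ability, a small
# subgroup's per-item percent correct varies much more than a large group's.
import random

random.seed(1)  # reproducible illustration

def percent_correct_per_item(n_students, n_items=40, p=0.70):
    """Simulated percent correct on each item for a group of n_students,
    where every student has the same chance p of answering correctly."""
    return [100 * sum(random.random() < p for _ in range(n_students)) / n_students
            for _ in range(n_items)]

small_group = percent_correct_per_item(n_students=15)   # e.g., a small LEP subgroup
large_group = percent_correct_per_item(n_students=900)  # e.g., all district students

def spread(values):
    return max(values) - min(values)

print(f"Small group: item percents range over {spread(small_group):.0f} points")
print(f"Large group: item percents range over {spread(large_group):.0f} points")
```

Running this, the small group’s line jumps around by tens of points while the large group’s stays nearly flat, even though both groups were simulated with identical ability. That is why a jagged subgroup pattern, by itself, is weak evidence about the program serving that subgroup.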