Evaluation Results 2002-2004: Missouri Reading Initiative

Presentation transcript:

Evaluation Results: Missouri Reading Initiative

MRI’s Evaluation Activities:
Surveys
– Teacher Beliefs and Practices (pre/post)*
– Annual Participant Questionnaire
Data Collection*
– Test Scores: Standardized Tests, Classroom Assessments (DRA), MAP
– Demographics
– Special Education Information
MAP Analyses
* For schools beginning in 2002

MAP Analyses
MAP analyses compare schools that have finished the MRI program with randomly chosen samples of non-MRI elementary schools.
Results indicate that MRI schools generally outperform non-MRI schools.
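The comparison can be illustrated with a short sketch in Python. This is not MRI's analysis code: the school names, MAP index values, pool size, and the compare_to_random_sample helper are all hypothetical, and it only shows the basic idea of drawing a same-sized random sample of non-MRI schools and comparing means.

    import random
    import statistics

    # Hypothetical data: each school is recorded with a MAP index value.
    mri_schools = [{"school": f"MRI-{i}", "map_index": 705 + i} for i in range(15)]
    non_mri_pool = [{"school": f"non-MRI-{i}", "map_index": 690 + (i % 30)} for i in range(400)]

    def compare_to_random_sample(mri, pool, seed=0):
        """Draw a random non-MRI sample the same size as the MRI cohort
        and compare mean MAP index values."""
        rng = random.Random(seed)
        sample = rng.sample(pool, k=len(mri))
        mri_mean = statistics.mean(s["map_index"] for s in mri)
        sample_mean = statistics.mean(s["map_index"] for s in sample)
        return mri_mean, sample_mean

    mri_mean, sample_mean = compare_to_random_sample(mri_schools, non_mri_pool)
    print(f"MRI mean: {mri_mean:.1f}, random non-MRI sample mean: {sample_mean:.1f}")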

Notes for MAP Analyses
Note: In all of the following MAP Analyses charts, the numbers themselves are not as important as the comparative performance between MRI and non-MRI schools. This is because:
1. There is a lot of variation in the data from year to year and from school to school.
2. The calculation of the baseline changes as more data become available:
– For the 2002 schools, 1999 was the baseline.
– For the 2003 schools, an average of 1999/2000 was the baseline.
– For the 2004 schools, an average of 1999/2001 was the baseline.
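The effect of the shifting baseline is easiest to see with a worked example. The sketch below uses hypothetical MAP index values for a single school; it also reads "an average of 1999/2001" as the full 1999 through 2001 span, which is an assumption rather than something stated on the slide.

    def baseline(map_index_by_year, years):
        """Average a school's MAP index over the baseline years used for its cohort."""
        return sum(map_index_by_year[y] for y in years) / len(years)

    # Hypothetical MAP index values for one school.
    map_index_by_year = {1999: 698.0, 2000: 702.0, 2001: 706.0}

    print(baseline(map_index_by_year, [1999]))              # 2002 cohort baseline: 698.0
    print(baseline(map_index_by_year, [1999, 2000]))        # 2003 cohort baseline: 700.0
    print(baseline(map_index_by_year, [1999, 2000, 2001]))  # 2004 cohort baseline: 702.0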

Notes for Chart 1
1. This chart compares MRI and non-MRI schools' performance on the 2002 MAP Communication Arts Index.
2. Baseline year is 1999; outcome year is 2002.
3. Each sample has 15 schools, the number of schools that finished MRI in Spring 2002.
4. Total random sample = 150 (a number large enough to satisfy statistical significance at high confidence levels).
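The sample sizes in these notes follow from simple arithmetic. The sketch below assumes the 150-school total is assembled from 10 independent random draws of 15 non-MRI schools each (15 x 10 = 150); that reading, the pool size, and the school names are assumptions for illustration, not a description of MRI's actual sampling procedure.

    import random

    COHORT_SIZE = 15      # schools that finished MRI in Spring 2002
    TOTAL_SAMPLE = 150    # total random non-MRI sample reported for Charts 1 and 2
    DRAWS = TOTAL_SAMPLE // COHORT_SIZE  # 10 same-sized draws, by assumption

    rng = random.Random(2002)
    non_mri_pool = [f"non_mri_school_{i}" for i in range(1200)]  # placeholder pool

    samples = [rng.sample(non_mri_pool, k=COHORT_SIZE) for _ in range(DRAWS)]
    print(sum(len(s) for s in samples))  # 150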

Notes for Chart 2
1. This chart compares MRI and non-MRI schools' performance on the 2002 MAP Reading Index.
2. Baseline year is 1999; outcome year is 2002.
3. Each sample has 15 schools, the number of schools that finished MRI in Spring 2002.
4. Total random sample = 150 (a number large enough to satisfy statistical significance at high confidence levels).

Notes for Chart 3
1. This chart compares MRI and non-MRI schools' performance on the 2003 MAP Communication Arts Index.
2. Baseline is an average of 1999/2000 (which smooths out variations); outcome year is 2003.
3. Each sample has 20 schools, the number of schools that finished MRI in Spring 2003.
4. Total random sample = 200 (a number large enough to satisfy statistical significance at high confidence levels).

Notes for Chart 4
1. This chart compares MRI and non-MRI schools' performance on the 2003 MAP Reading Index.
2. Baseline is an average of 1999/2000 (which smooths out variations); outcome year is 2003.
3. Each sample has 20 schools, the number of schools that finished MRI in Spring 2003.
4. Total sample = 200 (a number large enough to satisfy statistical significance at high confidence levels).

Notes for Chart 5
1. This chart compares MRI and non-MRI schools' performance on the 2004 MAP Communication Arts Index.
2. Baseline is an average of 1999/2001 (which smooths out variations); outcome year is 2004.
3. Each sample has 27 schools, the number of schools that finished MRI in Spring 2004.
4. Total random sample = 270 (a number large enough to satisfy statistical significance at high confidence levels).

Notes for Chart 6
1. This chart compares MRI and non-MRI schools' performance on the 2004 MAP Reading Index.
2. Baseline is an average of 1999/2001 (which smooths out variations); outcome year is 2004.
3. Each sample has 27 schools, the number of schools that finished MRI in Spring 2004.
4. Total sample = 270 (a number large enough to satisfy statistical significance at high confidence levels).

Adequate Yearly Progress
As mandated by federal law, Missouri schools must meet yearly progress goals on MAP scores. For 3rd Grade Communication Arts, those goals were defined as 19.4% of students achieving levels of Proficient or better in 2003, and 20.4% in 2004. The following table provides a comparison between MRI schools and state-wide results.
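The AYP test described here reduces to comparing a school's percentage of Proficient-or-better students against the year's target. Below is a minimal sketch; the thresholds come from this slide, while the function name and the example school's counts are hypothetical.

    AYP_TARGETS = {2003: 19.4, 2004: 20.4}  # % Proficient or Advanced, 3rd Grade Communication Arts

    def meets_ayp(year, proficient_or_advanced, tested):
        """True if the school's percent Proficient or Advanced meets that year's AYP target."""
        percent = 100.0 * proficient_or_advanced / tested
        return percent >= AYP_TARGETS[year]

    # Hypothetical school: 22 of 100 third graders scored Proficient or Advanced in 2004.
    print(meets_ayp(2004, 22, 100))  # True, since 22.0% >= 20.4%

The same fraction-to-percentage arithmetic underlies the school-level table that follows: 60 of 74 MRI schools is about 81.1%, and 1,046 of 2,053 schools statewide is about 50.9%.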

Percentage of Schools Meeting AYP Levels (Proficient and Advanced)
Targets: 2003 = 19.4%; 2004 = 20.4%

Year    MRI                  State
2003    81.1% (60 / 74)      50.9% (1,046 / 2,053)
2004    100% (50 / 50)       57.4% (1,167 / 2,053)

Participant Survey
Participants rate component usefulness and utilization, practice change, "buy-in", attitudes toward the program and trainer, etc.
Results drive program change; e.g., the redesign of Orientation.

Notes for Participant Survey Slide
This slide introduces the survey and its uses. The table on the next slide demonstrates how the survey is often used. In this case:
1. Survey respondents identified the problem of being "overwhelmed."
2. The program responded by redesigning Orientation and other details.
3. Program satisfaction improved from 2002 to 2004.

Special Education
We track the effects of MRI on Special Education in two ways:
1. Beginning with schools that started MRI in the Fall of 2002, all students with IEPs are identified and the type of IEP is described (Reading, Math, Speech, etc.).
2. Schools report annually on their IEP evaluation process: referrals, evaluations, and IEPs.
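The two tracking streams can be made concrete with a small sketch of the records they imply. The class names, fields, and counts below are hypothetical and are not MRI's actual data formats.

    from dataclasses import dataclass
    from collections import Counter

    @dataclass
    class IEPRecord:
        student_id: str
        iep_type: str  # e.g., "Reading", "Math", "Speech"

    @dataclass
    class AnnualIEPReport:
        school: str
        year: int
        referrals: int
        evaluations: int
        new_ieps: int

    # Stream 1: student-level IEP records for schools that started MRI in Fall 2002.
    students = [IEPRecord("s1", "Reading"), IEPRecord("s2", "Speech"), IEPRecord("s3", "Reading")]
    print(Counter(r.iep_type for r in students))  # Counter({'Reading': 2, 'Speech': 1})

    # Stream 2: each school's annual report on its IEP evaluation process.
    reports = [
        AnnualIEPReport("Elementary A", 2003, referrals=14, evaluations=10, new_ieps=7),
        AnnualIEPReport("Elementary A", 2004, referrals=9, evaluations=6, new_ieps=4),
    ]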

Notes for Special Education Chart
This chart is for schools that were in their 3rd year. Many schools do not have this data, or it is not easily accessed (9 of 18 in …, … of 18 in …, … of 23 in …, … of 23 in …, … of 34 in 2004).
The data point to a decrease in referrals, evaluations, and assignment of IEPs over the time schools participate in MRI. Data collection began at the onset for schools starting in Fall 2002, with a complete report in 2005.

Developmental Reading Assessment (DRA)
The following table describes the changes in the percentage of students in each cohort who scored At or Above Grade Level on the DRA at the 15 MRI schools that have completed two years in the program.
Key (all values are % At or Above Grade Level):
S031 = 1st graders in Spring 2003
S042 = 2nd graders in Spring 2004
F022 = 2nd graders in Fall 2002
S043 = 3rd graders in Spring 2004
Δ = percentage change
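As a worked example of the Δ columns, the sketch below computes the change for one hypothetical cohort; Δ is treated here as a simple difference in percentage points, which is one reading of "percentage change."

    def delta(percent_before, percent_after):
        """Change in the percent of a cohort At or Above Grade Level, in percentage points."""
        return percent_after - percent_before

    # Hypothetical cohort at one school:
    s031 = 54.0  # % of 1st graders At or Above Grade Level, Spring 2003
    s042 = 71.0  # the same cohort as 2nd graders, Spring 2004
    print(delta(s031, s042))  # 17.0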

Change in DRA Grade Level
Table columns: School, S031, S042, Δ, F022, S043, Δ (per-school percentages for the 15 schools, plus a TOTALS row).