West Mifflin Area School District
An Analysis of 4Sight and How It Is Used to Inform Instruction
Dr. Bart Rocco, Assistant Superintendent
Dr. Daniel Castagna, Principal, Homeville Elementary

Today's Presentation
- Role of Central Office in Using Data to Make Informed Decisions to Improve Student Performance
- Four Key Points to the Process
- A Quantitative Case Study Analysis of the 4Sight Benchmark Assessment
- Implications
- Questions/Answers

West Mifflin Area School District
- 3,400 Students
- 4 Elementary Schools, an Early Childhood Center, a Middle School, and a High School
- Pre-K-12 Program
- 45% of Students Economically Disadvantaged
- 40% Special Education Population
- AYP Status of Schools

How did we get here?
- Two years ago, there were no formalized benchmark assessments
- Reviewed Benchmark Assessments Approved for Grants
- Developed a Five-Year Plan
- Classroom Excellence Initiative
- EdInsight
- Technology Limitations/Needs
- Curriculum Alignment Process

Never Enough Time
- Rearranged professional development time
- Act 80 two-hour delays
- Insisted on before-school/after-school grade-level and department-level meetings to address student data as a continuous process
- The 10,000-hour rule (Outliers, Malcolm Gladwell)
- More time with students

Question Data/Hypothesize
- Review data and develop focused questions
- Look at specific student performance on tests in math and reading
- Examine attendance
- Look at target populations
- What are root causes?
- Speak to previous year's teachers about student performance

Best Practice
- Staff need to examine best teaching practices
- What is the best way to teach a concept, such as developing main idea or a specific math skill?
- Develop a toolbox of activities
- Share ideas with each other
- Replicate the testing experience

Leadership
- Principals need to allocate time for collaboration
- Put data into the hands of teachers
- Communicate with your staff and support them in their needs
- Be engaged in instruction and learning

Leadership
- Curriculum Alignment to Standards
- Create a balance between improving test scores and maintaining a culturally enriched curriculum
- Don't forget about what standardized tests cannot do for our students
- Build trust with your administrators
- Work the vineyard

- Debbie Raubenstrauch
- Jim Turner
- Steve Biancaniello
- OnHand Schools

Why use the 4Sight tests? Comprehensive District Data Plan:
- Tier III: Annual large-scale assessment (the PSSA test). Primary target of feedback: the general accountability audience (policymakers, community, administrators, others).
- Tier II: Periodic grade-level/subject-area assessments (the 4Sight tests). Primary target of feedback: administrators and teachers.
- Tier I: Ongoing classroom assessment. Primary target of feedback: teachers and students.
Moving from Tier III to Tier I, the rate of feedback runs from infrequent to frequent, and the type of feedback runs from general and broad to specific and narrow.

Source: taken from Pennsylvania and 4Sight, newsletter published by Success for All, October 2007.

- Are the predictions accurate?
- Is the money worth it? (over $10,000 annually)
- Are we giving in to over-testing?
- Why are PVAAS projections lower than the 4Sight projections?
- Are the skills tested on the 4Sight correlated with those tested on the PSSA?

- Do the 4Sight exams accurately predict students' raw scores on the PSSA exam?
- Is there a high correlation between students' predicted categorical classifications on the 4Sight tests and their actual PSSA results?
- Is the information gained from the fourth 4Sight test worth the loss of instructional time?

Four benchmark tests were administered:
- 4Sight Test #1 – September 11, 2007
- 4Sight Test #2 – November 22, 2007
- 4Sight Test #3 – January 12, 2008
- 4Sight Test #4 – April 14, 2008
- PSSA Exams – April 1, 2, 3, 2008
West Mifflin Middle School students in grades 6, 7, and 8; 686 out of 731 qualified for the study (94% of the total population).

- Linear regression with an (r) value: the independent variable, the 4Sight score (X), was used to predict the dependent variable, the PSSA score (Y).
- The January/February 2008 newsletter published by the Success for All Foundation states that "The 4Sight correlations are determined by a statistical process called linear regression" (p. 2).
- Kappa coefficient formula: measured the agreement between students' predicted categorical classifications and their actual PSSA categories.
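The slide names the kappa coefficient formula without writing it out. Assuming the standard Cohen's kappa is what was used (an assumption, not stated in the deck), the formula is:

```latex
% Cohen's kappa, assumed to be the "kappa coefficient formula" cited above
\kappa = \frac{p_o - p_e}{1 - p_e}
```

where \(p_o\) is the observed agreement between predicted (4Sight) and actual (PSSA) categories and \(p_e\) is the agreement expected by chance.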

Linear Regression
- The (r) value ranges from -1 to 1, with values closer to 1 showing a stronger correlation and values close to 0 showing a weaker correlation.
Kappa Coefficient
- Landis & Koch (1977) suggest the following kappa interpretation scale:
  - Below 0.00: Poor
  - 0.00-0.20: Slight
  - 0.21-0.40: Fair
  - 0.41-0.60: Moderate
  - 0.61-0.80: Substantial
  - 0.81-1.00: Almost perfect
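To make the regression step concrete, here is a minimal sketch (not the study's actual code); the score arrays are invented placeholders, and the analysis simply mirrors the approach the slides describe.

```python
# Minimal sketch of the regression analysis described above; the score
# arrays are invented placeholders, not actual West Mifflin data.
import numpy as np
from scipy.stats import pearsonr, linregress

four_sight = np.array([18, 25, 31, 12, 27, 22, 35, 16])           # 4Sight raw scores (X)
pssa = np.array([1150, 1320, 1410, 980, 1365, 1270, 1480, 1105])  # PSSA scores (Y)

# Pearson correlation (r) between the benchmark and the state test.
r, p_value = pearsonr(four_sight, pssa)
print(f"Pearson r = {r:.3f} (p = {p_value:.4f})")

# Simple linear regression: predict a PSSA score from a 4Sight score.
fit = linregress(four_sight, pssa)
predicted = fit.slope * 24 + fit.intercept  # hypothetical student with a 4Sight score of 24
print(f"Prediction line: PSSA = {fit.slope:.1f} * 4Sight + {fit.intercept:.1f}")
print(f"Predicted PSSA for a 4Sight score of 24: {predicted:.0f}")
```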

Linear regression: R (Pearson correlation) values for raw scores, reported for Grade 6, Grade 7, Grade 8, and the total population, for four comparisons: Third 4Sight Math vs. PSSA Math, Fourth 4Sight Math vs. PSSA Math, Third 4Sight Reading vs. PSSA Reading, and Fourth 4Sight Reading vs. PSSA Reading. [The correlation values themselves were not preserved in this transcript.]

Total School Population – Math: crosstab of the Third 4Sight math category (Below Basic, Basic, Proficient, Advanced) against the actual PSSA math category (Below Basic, Basic, Proficient, Advanced). [The cell counts were not preserved in this transcript.]

Total School Population – Math: crosstab of the Fourth 4Sight math category (Below Basic, Basic, Proficient, Advanced) against the actual PSSA math category (Below Basic, Basic, Proficient, Advanced). [The cell counts were not preserved in this transcript.]
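As a companion to the crosstabs above, the sketch below shows how Cohen's kappa can be computed directly from a 4x4 table of predicted versus actual categories. Every count in the matrix is an invented placeholder, not district data.

```python
# Hedged sketch: Cohen's kappa computed directly from a 4x4 crosstab like the
# ones above. Every count below is an invented placeholder, not district data.
import numpy as np

# Rows = 4Sight predicted category, columns = actual PSSA category,
# both ordered Below Basic, Basic, Proficient, Advanced.
table = np.array([
    [40, 12,   3,  0],
    [15, 80,  25,  2],
    [ 4, 30, 120, 18],
    [ 0,  2,  28, 55],
], dtype=float)

n = table.sum()
p_observed = np.trace(table) / n                               # exact category agreement
p_expected = (table.sum(axis=1) @ table.sum(axis=0)) / n ** 2  # agreement expected by chance
kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"Cohen's kappa = {kappa:.3f}")
```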

- Do the 4Sight exams accurately predict students' raw scores on the PSSA exam?
- Is there a high correlation between students' predicted categorical classifications on the 4Sight tests and their actual PSSA results?
- Is the information gained from the fourth 4Sight test worth the loss of instructional time?

MATH

1. All teachers were given a copy of the updated anchors for Reading and Math.
2. Open-ended items were scored by the school's teachers.
3. Department meetings were held to review data and identify trends across students.
4. Item-analysis breakdowns for each question (and its corresponding anchor) were provided to all teachers.

Flexible grouping template (Teacher ________, Subject ________, Period ________): each student is listed against the math reporting categories Numbers/Operations, Measurement, Geometry, Algebraic Concepts, and Data Analysis/Probability. Flexible groups were created within all subjects to highlight students with common areas of concern.
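As one illustration of how such a template might be filled in, here is a hedged sketch (invented scores, not the district's EdInsight workflow) that assigns each student to a flexible group based on his or her weakest reporting category.

```python
# Hedged sketch: assigning students to flexible groups by weakest reporting
# category. Scores are invented; this is not the district's EdInsight workflow.
import pandas as pd

scores = pd.DataFrame(
    {
        "Numbers/Operations": [85, 60, 72, 90],
        "Measurement": [55, 88, 64, 70],
        "Geometry": [78, 45, 80, 66],
        "Algebraic Concepts": [62, 70, 58, 92],
        "Data Analysis/Probability": [90, 52, 49, 61],
    },
    index=["Student A", "Student B", "Student C", "Student D"],
)

# Each student's weakest category, then the flexible-group rosters.
weakest = scores.idxmin(axis=1)
for category, students in weakest.groupby(weakest).groups.items():
    print(f"{category}: {', '.join(students)}")
```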

- Assess strengths and weaknesses in light of the PA Reporting Categories and Academic Anchors
- Develop strategies to address concerns at both the class and individual student levels
- Use this information to plan large-group instruction (across grade levels)
- TEST NARROWLY BUT TEACH BROADLY

With this degree of information, teachers can more easily embed these skills and knowledge in the larger curriculum. SO… the classroom curriculum is enhanced rather than narrowed! We are not teaching to the test; we are teaching to the needs of our students!

- A practice run for PSSA testing, beneficial for both students and staff!
- Students get used to the "testing room."
- Scheduling issues were worked out (what do we do with teachers who do not have homerooms?).
- We have enough pencils, paper, and calculators!!!
- Students report that they were less nervous for the 2nd round of the 4Sight tests.
- Found/identified building-level trends in open-ended responses; this has opened communication between departments.
- Created incentives for attendance and participation (grade-level challenges!).

Better Use of Support Resources
- EdInsight software
- Skills Tutor software
- Study Island
- PSSA coaching books

2008 Final PSSA Results, Including Subgroups: net increases for All Students, White, Black, IEP, and Economically Disadvantaged subgroups, shown for PSSA Math and PSSA Reading. [The numeric values were not preserved in this transcript.]

- Investigate a longitudinal study that follows the use of the 4Sight test with a cohort of students over several years.
- Identify a model of best practices to use when implementing 4Sight assessments.
- Compare specific classroom interventions used in conjunction with benchmark assessments against student achievement growth.

- Expand this study to include all students in grades 3-8 and 11.
- Narrow the focus of the study to a particular sub-group or groups.
- Analyze specific skills assessed on the PSSA and compare them with skills tested on the 4Sight exam.