School Improvement Network Impact Assessment: Higher Engagement Schools versus Lower Engagement Schools
Steven H. Shaha, PhD, DBA, Professor, Center for Public Policy and Administration, Independent Evaluator
December 2012

Overarching Research Question: Does engagement in PD 360 and Observation 360, tools within the Educator Effectiveness System, significantly affect student success and school-wide metrics?

Sample Description
High Video Utilizers:
– 39 States
– 211 Districts
– 734 Schools
Metrics: Educator Engagement, Student Success, School Impacts

Study of Educator Engagement
– 32 data elements collected or computed through PD 360 and Observation 360
– Contrasted higher engagement schools versus lower engagement schools:
  – Improvement in percentages of students who tested advanced or proficient in math and reading
  – Classified into four quartiles
  – Analyses of highest and lowest quartiles only
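A minimal sketch of the quartile split and high-versus-low contrast described above, assuming hypothetical column names and a Welch t-test (the deck reports p-values but does not name the test used):

```python
# Illustrative only: split schools into engagement quartiles and contrast
# the highest and lowest quartiles on proficiency gains. The input file and
# column names are hypothetical; the actual test used is not stated.
import pandas as pd
from scipy import stats

schools = pd.read_csv("school_engagement.csv")  # hypothetical data file

# Four engagement quartiles: Q1 = lowest engagement, Q4 = highest
schools["quartile"] = pd.qcut(schools["engagement_score"], 4,
                              labels=["Q1", "Q2", "Q3", "Q4"])

high = schools.loc[schools["quartile"] == "Q4", "proficiency_gain"]
low = schools.loc[schools["quartile"] == "Q1", "proficiency_gain"]

# Compare mean gains of the highest vs. lowest quartile (Welch's t-test)
t_stat, p_value = stats.ttest_ind(high, low, equal_var=False)
print(f"Highest-quartile mean gain: {high.mean():.1f} points")
print(f"Lowest-quartile mean gain:  {low.mean():.1f} points")
print(f"p-value: {p_value:.4f}")
```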

Metrics for Differentiating Advantages for Higher Engagement Organizations
These are the 15 metrics on which higher engagement schools were significantly higher than their lower engagement counterparts:
– Focus Objectives Set Up
– Observations Performed
– Percent Registered Users
– Percent of Users in Communities
– Minutes Viewed
– Forums Viewed
– Programs Viewed
– Segments Viewed
– Links Viewed
– Follow-up Questions Answered
– Reflection Questions Answered
– Forums Posted
– Downloaded Files
– Uploaded Files
– Participation in Communities
(Model dimensions shown on the slide: Leadership, Implementation and Accountability; Educator Participation; Educator Engagement)

Sample of Differentiating Metrics of Utilization and Engagement
Passive participation (e.g., video viewing alone) is LESS influential than active engagement.
– Links Viewed: 63.8% advantage (p<.001)
– Minutes Viewed: 39.0% advantage (p<.001)
– Follow-up Questions Answered: 70.3% advantage (p<.001)
– Observations Performed: 4.3% advantage (p<.001)

Sample of Differentiating Metrics of Utilization versus Engagement
Passive participation (e.g., video viewing alone) is LESS influential than active engagement.
– Uploaded Files: 47.3% advantage (p<.001)
– Downloaded Files: 30.5% advantage (p<.001)
– Forums Viewed: 79.5% advantage (p<.001)
– Forums Posted: 68.6% advantage (p<.001)

Who Cares? Who cares if educators used it more? Did it make a difference for kids and schools?

Study of Student Success
Student success measured as performance on standardized tests: the percent of students scoring either proficient or advanced in Reading and Math.
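A minimal sketch of this outcome metric, using illustrative counts; the helper below is hypothetical, and the example is scaled so that a 0.5 percentage-point gain equals one additional proficient-or-advanced student per 200 tested, matching the interpretation given on a later slide:

```python
# Illustrative only: the student-success metric is the percent of tested
# students scoring proficient or advanced; the "gain" is the change in that
# percent from the previous year, in percentage points.
def proficient_or_advanced_rate(n_proficient: int, n_advanced: int, n_tested: int) -> float:
    """Percent of tested students scoring proficient or advanced."""
    return 100.0 * (n_proficient + n_advanced) / n_tested

# Example (hypothetical counts): one more proficient student among 200 tested
prior_rate = proficient_or_advanced_rate(70, 30, 200)    # 50.0%
current_rate = proficient_or_advanced_rate(70, 31, 200)  # 50.5%
gain_points = current_rate - prior_rate                  # 0.5 percentage points
print(f"Gain: {gain_points:.1f} percentage points")
```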

Improved Student Performance
– 4.9% gain for lower engagement schools (p<.01)

Improved Student Performance
– 4.9% gain for lower engagement schools (p<.01)
– 18.0% gain for higher engagement schools (p<.001)
Closed the Gap: 267% advantage in gains for higher engagement schools (p<.001), nearly 4 times the impact.
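The formula behind the "267% advantage" is not shown on the slide, but the figures are consistent with a relative comparison of the two groups' gains; a worked sketch under that assumption:

```latex
% Relative advantage in gains, consistent with the slide's figures:
\[
\text{advantage} \;=\; \frac{18.0\% - 4.9\%}{4.9\%} \;\approx\; 2.67 \;=\; 267\%,
\qquad
\frac{18.0\%}{4.9\%} \;\approx\; 3.7 \;\;\text{(``nearly 4 times the impact'')}.
\]
```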

Improved Student Performance
– 0.5% gain for lower engagement schools (p=ns)
Actually Important Gains: for every 200 students, 1 more performed at the proficient or advanced level than in the previous year.

Improved Student Performance
– 0.5% gain for lower engagement schools (p=ns)
– 18.9% gain for higher engagement schools (p<.001)
Surpassed the Gap: 3,520% advantage in gains for higher engagement schools (p<.001), 36 times greater impact.

Metrics of School Impact
Performance on key indicators gathered from the Internet (when publicly available) and structured telephone interviews:
– Dropout Rates
– Student Discipline Rates
– Teacher Retention Rates
– College-Bound Rates

Improved Dropout Rates
– 4.9% improvement for lower engagement schools (p<.01): for every 100 students, 5 fewer dropped out than in the previous year.
Figures reflect rounding; projections reflect correct math.

Improved Dropout Rates
– 4.9% improvement for lower engagement schools (p<.01)
– 20.0% improvement for higher engagement schools (p<.001): for every 100 students, 20 fewer dropped out than in the previous year.
– 309.1% advantage in gains for higher engagement schools (p<.001)
Executive Summary: Higher engagement schools began statistically equal, then significantly outperformed their counterparts (p<.01).

Improved Student Discipline Rates
– 7.4% fewer disciplinary incidents for lower engagement schools (p<.01)
Figures reflect rounding; projections reflect correct math. Y-axis is inverted so that improvement appears as an upward trend.

Improved Student Discipline Rates
– 7.4% fewer disciplinary incidents for lower engagement schools (p<.01)
– 33.2% fewer disciplinary incidents for higher engagement schools (p<.001): for every 100 students, 33 fewer problem students than in the previous year.
– 351% advantage in gains for higher engagement schools (p<.001), roughly 4½ times the impact
Executive Summary: Higher engagement schools significantly outperformed their counterparts (p<.01).

Improved Teacher Retention Rates
– 1.7% more teachers stayed for lower engagement schools (p<.01)
Figures reflect rounding; projections reflect correct math.

Improved Teacher Retention Rates
– 1.7% more teachers stayed for lower engagement schools (p<.01)
– 2.8% more teachers stayed for higher engagement schools (p<.01): for every 100 teachers, nearly 3 fewer left than in the previous year.
– 65.9% advantage in gains for higher engagement schools (p<.001), nearly twice the impact
Executive Summary: Higher engagement schools significantly outperformed their counterparts (p<.01).
Figures reflect rounding; projections reflect correct math.

Improved College-Bound Rates
(Percentage of students schools report as being college-bound.)
– No decrease or gain for lower engagement schools (ns)
Figures reflect rounding; projections reflect correct math.

Improved College-Bound Rates
– No decrease or gain for lower engagement schools (ns)
– 9.6% improvement for higher engagement schools (p<.001): for every 100 students, 10 more were college-bound than in the previous year.
– Incalculable advantage in gains for higher engagement schools (p<.001)
Executive Summary: Higher engagement schools began statistically equal, then significantly outperformed their counterparts (p<.01).
Figures reflect rounding; projections reflect correct math.
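The "incalculable" label is consistent with the same relative-advantage calculation suggested by the earlier outcome slides: with a baseline gain of essentially zero for lower engagement schools, the ratio is undefined. A sketch under that assumption:

```latex
% With a lower-engagement baseline gain of ~0%, the relative advantage
% cannot be computed (division by zero), hence "incalculable":
\[
\text{advantage} \;=\; \frac{9.6\% - 0\%}{0\%} \;\longrightarrow\; \text{undefined}.
\]
```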

Summary of School Impacts
– Dropout Rates: approx. 15 fewer dropouts per 100 students than for lower engagement school counterparts
– Student Discipline Rates: approx. 33 fewer students "in the office" per 100 students than for lower engagement school counterparts
– Teacher Retention Rates: approx. 3 fewer teachers leaving per 100 teachers, which is 1 fewer than for lower engagement school counterparts
– College-Bound Rates: approx. 10 more college-bound students per 100 students than for lower engagement school counterparts

A Model for Educational Success
Components: Leadership, Implementation and Accountability; Educator Participation; Educator Engagement; Student Success; School Impacts