Program Evaluation and Impact Assessment: Internet-delivered, On-Demand Professional Development Participating Schools versus Their Respective Districts.


Steven H. Shaha, PhD, DBA
Professor, Center for Public Policy and Administration
Independent Evaluator
July

Overarching Research Question: Does teacher engagement in PD 360 and Observation 360, tools within the Educator Effectiveness System, significantly affect student success?

Methods
Design: Quasi-experimental, retrospective, pre-post, normalized treatment-control (participation vs. non-participation) ( , )
Goal: Multi-state, large-n, with comparable student populations (matched, controlled)
Student change: metric was the percent of students classified as Proficient or Advanced in their respective states.

Sample
Participation
– Systematic sample of 169 elementary schools, in 73 districts, in 19 states
– N determined by an a priori power analysis
– Schools were eligible for inclusion as participating Schools if they met the following criteria:
– More than 10 teachers total
– 80% or more of teachers viewed materials
– A minimum average of 90.0 minutes of viewing per teacher for the school
– Only districts containing eligible schools were included; the Districts cumulatively served as the statistical comparison group, normalized for socio-economic and demographic differences from the participating Schools
Data
– Participation data were extracted from the usage records of the Internet-based professional development application
– Student performance data were captured from publicly available, Internet-accessed sources (school as unit of measure, percent Proficient or Advanced as metric)
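The three-part eligibility screen above can be sketched as a simple filter. This is an illustration only; the record fields and school records below are hypothetical, not taken from the study's actual data extract.

```python
def is_eligible(school):
    """Apply the three participation criteria from the sampling slide."""
    return (
        school["n_teachers"] > 10
        and school["pct_teachers_viewing"] >= 80.0
        and school["avg_viewing_minutes"] >= 90.0
    )

# Hypothetical example records (not study data)
schools = [
    {"name": "A", "n_teachers": 24, "pct_teachers_viewing": 92.0, "avg_viewing_minutes": 140.0},
    {"name": "B", "n_teachers": 8,  "pct_teachers_viewing": 95.0, "avg_viewing_minutes": 200.0},
    {"name": "C", "n_teachers": 30, "pct_teachers_viewing": 60.0, "avg_viewing_minutes": 120.0},
]
eligible = [s["name"] for s in schools if is_eligible(s)]
print(eligible)  # ['A'] -- B fails the teacher count, C fails the viewing rate
```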

Impacts on Math
Improvement is percent change: [(Year2-Year1)/Year1]
Comparative change: [School change/District change]
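As a rough illustration of the two change metrics, the snippet below back-solves hypothetical baseline values from the reported math figures (the ~62% and ~59% baselines are assumptions chosen to reproduce the slides' numbers, not study data):

```python
def percent_change(year1, year2):
    """Improvement relative to baseline: (Year2 - Year1) / Year1."""
    return (year2 - year1) / year1

def comparative_change(school_change, district_change):
    """School change expressed as a multiple of district change."""
    return school_change / district_change

# Hypothetical baselines, back-solved from the reported figures
# (2.6 points is ~4.2% of baseline; 11.1 points is ~18.9% of baseline):
district_y1, district_y2 = 61.9, 64.5
school_y1, school_y2 = 58.7, 69.8

print(round(100 * percent_change(district_y1, district_y2), 1))  # 4.2
print(round(100 * percent_change(school_y1, school_y2), 1))      # 18.9
print(round(comparative_change(school_y2 - school_y1,
                               district_y2 - district_y1), 1))   # 4.3 (reported as 4.2x)
```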

Districts improved by 2.6 net percentage points (p<.01); that is 4.2% better than baseline.
Participating Schools improved by 11.1 net percentage points (p<.001); that is 18.9% better than baseline.
Participating Schools improved by 8.4 points more than Districts (p<.001); that is 14.7% more versus baselines.
Participating Schools experienced 4.2 times greater improvement, or Effect Size, than Districts (p<.001); that is a 4.5 Effect Size versus baselines (p<.001).

Impacts on Math: Comparative Growth for Participating Schools
In percentage of students Proficient or Advanced: 11.1 net improvement for Schools (p<.001)
– 8.4 more than their respective Districts (p<.001)
– 18.9% better than their Yr. 1 baseline (p<.001)
– 14.7% better than Districts vs. baselines (p<.001)
Effect Sizes:
– 4.2 times greater improvement for net growth vs. Districts (p<.001)
– 4.5 times greater improvement for growth from baselines vs. Districts (p<.001)

Impacts on Reading
Districts improved by 1.6 net percentage points (p<.01); that is 2.5% better than baseline.
Participating Schools improved by 10.3 net percentage points (p<.001); that is 15.3% better than baseline.
Participating Schools improved by 8.7 points more than Districts (p<.001); that is 12.8% more versus baselines.
Participating Schools experienced 6.5 times greater improvement, or Effect Size, than Districts (p<.001); that is a 6.1 Effect Size versus baselines (p<.001).

Impacts on Reading: Comparative Growth for Participating Schools
In percentage of students Proficient or Advanced: 10.3 net improvement for Schools (p<.001)
– 8.7 more than their respective Districts (p<.001)
– 15.3% better than their Yr. 1 baseline (p<.001)
– 12.8% better than Districts vs. baselines (p<.001)
Effect Sizes:
– 6.5 times greater improvement for net growth vs. Districts (p<.001)
– 6.1 times greater improvement for growth from baselines vs. Districts (p<.001)
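As a quick sanity check, the effect-size multiples reported for reading are consistent with simple ratios of the two groups' gains as stated on the slides (the 6.5x figure appears to reflect rounding in the underlying, unrounded values):

```python
# Net percentage-point gains from the reading slides
school_gain, district_gain = 10.3, 1.6
# Percent-of-baseline gains from the reading slides
school_pct, district_pct = 15.3, 2.5

print(round(school_gain / district_gain, 1))  # 6.4 (reported as 6.5x)
print(round(school_pct / district_pct, 1))    # 6.1
```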