PD 360 Impact for Title I Schools. Steven H. Shaha, PhD, DBA. July 2011.

Presentation transcript:

PD 360 Impact for Title I Schools. Steven H. Shaha, PhD, DBA. July 2011

Overarching Research Question: Does engagement in PD 360 significantly affect student success in Title I schools?

Methods
Design: Quasi-experimental, retrospective, pre-post, normalized treatment-control (participation vs. non-participation)
Goal: Multi-state, large n, with comparable student populations (matched, controlled)
Student Change:
* Metric was the percent of students classified as Proficient or Advanced in their respective states. Change was computed as net change year-over-year, divided by the Year 1 baseline.
* Improvement is percent change: [(Year 2 – Year 1)/Year 1]
* Comparative change (advantage): School change minus District change
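The change metrics described above can be expressed as a short sketch. This is illustrative only; the function names and the example proficiency figures are assumptions, not values from the study.

```python
# Minimal sketch of the change metrics described in the Methods slide.
# The proficiency figures below are hypothetical placeholders, not study data.

def percent_change(year1: float, year2: float) -> float:
    """Year-over-year improvement: (Year 2 - Year 1) / Year 1."""
    return (year2 - year1) / year1

def advantage(school_change: float, district_change: float) -> float:
    """Net advantage: school percent change minus district percent change."""
    return school_change - district_change

# Example with made-up numbers: a school moves from 48% to 52% of students
# Proficient or Advanced, while its district moves from 50% to 49%.
school_change = percent_change(48.0, 52.0)     # ~ +0.083 (8.3% gain)
district_change = percent_change(50.0, 49.0)   # -0.020 (2.0% decline)
print(f"Advantage: {advantage(school_change, district_change):+.1%}")  # ~ +10.3%
```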

Sample Description
High Video Utilizers – 422 Schools
Metrics:
– Percent of users viewing
– Student success
Notes: High video utilizers were defined as schools averaging a minimum of 90.0 minutes of participation in PD 360 per teacher per academic year, school-wide. Percent of users viewing was the percentage of teachers within a school verified as participating in any module within PD 360. Student success was quantified as the sum of the percents of students classified as either Proficient or Advanced on the respective standardized state test.
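As a concrete illustration of the utilization definitions above, the sketch below encodes the 90.0-minute threshold and the percent-of-users-viewing metric. The class name, field names, and example record are hypothetical; only the threshold and metric definitions come from the slide.

```python
# Illustrative sketch of the sample-selection rule and engagement metrics
# defined on this slide. The example record is invented, not study data.
from dataclasses import dataclass

@dataclass
class SchoolUsage:
    teachers: int                 # total teachers in the school
    teachers_viewing: int         # teachers verified as viewing any PD 360 module
    total_minutes_viewed: float   # school-wide PD 360 minutes for the academic year

    @property
    def avg_minutes_per_teacher(self) -> float:
        return self.total_minutes_viewed / self.teachers

    @property
    def pct_users_viewing(self) -> float:
        return self.teachers_viewing / self.teachers

    def is_high_video_utilizer(self, threshold_minutes: float = 90.0) -> bool:
        """High video utilizer: at least 90.0 average minutes per teacher per year."""
        return self.avg_minutes_per_teacher >= threshold_minutes

example = SchoolUsage(teachers=40, teachers_viewing=32, total_minutes_viewed=4200.0)
print(example.is_high_video_utilizer())      # True (105 minutes per teacher)
print(f"{example.pct_users_viewing:.0%}")    # 80%
```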

School-wide Title I Findings
Title I PD 360 Schools significantly outperformed their respective Districts.
Math: – 13.2% advantage for combined percent Proficient and Advanced (p < .001)
Reading: – 4.6% advantage for combined percent Proficient and Advanced (p < .001)
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss. Advantage reflects the net difference in percent change for Schools vs. Districts: (Pct Sch – Pct Dist). A 0.0% advantage would indicate no difference between Schools and Districts in the percent change year-over-year.

Math Advantages
The Districts fell 5.9% while PD 360 Schools gained 7.3%, a 13.2% advantage: 7.3% - (-5.9%) = 13.2% (p < .001).
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss. Advantage reflects the net difference in percent change for Schools vs. Districts: (Pct Sch – Pct Dist). A 0.0% advantage would indicate no difference between Schools and Districts in the percent change year-over-year.

Reading Advantages
The Districts rose 0.1% while PD 360 Schools gained 4.8%, a 4.6% advantage* (p < .001).
* Figures do not sum perfectly due to rounding.
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss. Advantage reflects the net difference in percent change for Schools vs. Districts: (Pct Sch – Pct Dist). A 0.0% advantage would indicate no difference between Schools and Districts in the percent change year-over-year.

Predictors of Change in PD 360 Title I Schools
Math: – #1 predictor – Percent of Users Viewing
Reading: – #1 predictor – Percent of Users Viewing
* Statistically significant predictors from the regression model
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss.

Predictors of Change in PD 360 Title I Schools
Math predictors:
1 – Percent of Users Viewing
2 – Average Minutes Viewed
3 – Total Users
Reading predictors:
1 – Percent of Users Viewing
2 – Registered Users Viewing
3 – Average Minutes Viewed
* Statistically significant predictors from the regression model
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss.
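The slides report which engagement measures ranked as significant predictors but not the exact model specification. The sketch below shows one plausible way such a school-level regression could be run; the column names, simulated data, and model form are assumptions for illustration only.

```python
# Illustrative sketch: regress year-over-year proficiency change on school-level
# PD 360 engagement metrics. Data are simulated; this is not the study's model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical number of schools
df = pd.DataFrame({
    "pct_users_viewing": rng.uniform(0.2, 1.0, n),
    "avg_minutes_viewed": rng.uniform(30.0, 300.0, n),
    "total_users": rng.integers(10, 120, n).astype(float),
})
# Simulated outcome, constructed so viewing breadth dominates (mirrors the slide's ranking).
df["proficiency_change"] = (
    0.15 * df["pct_users_viewing"]
    + 0.0003 * df["avg_minutes_viewed"]
    + rng.normal(0.0, 0.05, n)
)

X = sm.add_constant(df[["pct_users_viewing", "avg_minutes_viewed", "total_users"]])
model = sm.OLS(df["proficiency_change"], X).fit()
print(model.summary())  # coefficients and p-values rank the candidate predictors
```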

Summary of School Impacts
Math Advantages – Approx. 13.2% growth advantage per 100 students over their respective district counterparts
Reading Advantages – Approx. 4.6% performance advantage per 100 students over their respective district counterparts
NOTES: Results reflect comparative percent change year-over-year: (Year 2 – Year 1)/Year 1. A 0.0% change would indicate the same scores for Year 1 and Year 2 – no gain or loss. Advantage reflects the net difference in percent change for Schools vs. Districts: (Pct Sch – Pct Dist). A 0.0% advantage would indicate no difference between Schools and Districts in the percent change year-over-year.