Striving to Link Teacher and Student Outcomes: Results from an Analysis of Whole-school Interventions Kelly Feighan, Elena Kirtcheva, and Eric Kucharik.

Presentation transcript:

Striving to Link Teacher and Student Outcomes: Results from an Analysis of Whole-school Interventions
Kelly Feighan, Elena Kirtcheva, and Eric Kucharik
Research for Better Schools, Philadelphia, PA
American Evaluation Association Annual Meeting, November 12, 2009, Orlando, Florida

Study Purpose
- Investigate which variables best explain student reading outcomes following teacher professional development
- Explore the contextual reasons that help explain why no intervention "impact" was detected
- Inform educational policy and improve the rigor of educational research

Project Background
- Federal Striving Readers program aimed at improving pedagogy and student achievement
- Schools were matched in pairs and then randomly assigned to the treatment or control condition
- Professional development: four-semester course, onsite literacy coaching, leadership seminar, and curricular materials
- Developer's hypothesis: integrating literacy strategies in content areas will yield student gains

Factors Affecting Student Learning
- Student-level: SES, socio-demographic variables, family background, early development (Barton & Coley, 2009)
- Teacher/classroom-level: expectations, preparation, experience, class size (Cohen, McCabe, Michelli, & Pickeral, 2009)
- School-level: school climate, including safety, student-adult and peer relationships, and curriculum rigor (Cohen, McCabe, Michelli, & Pickeral, 2009)

Study Participants
- 30 ELA teachers taught at eight schools
  – 16 taught at intervention schools
  – 14 taught at comparison schools
- 2,114 students linked to these teachers
  – state assessment reading scores (N = 2,064)
  – ITBS scale reading scores (N = 1,741)

Methodology
- Quantitative data sources:
  – RBS teacher survey
  – School district school climate survey
  – Department of Education highly qualified teacher (HQT) statistics and student discipline data
  – Students' scores on the state assessment and ITBS

Methodology
- Qualitative data sources:
  – Observations: 56 classrooms (Year 1), 48 classrooms (fall of Year 2), 10 paired observations (spring of Year 2)
  – Interviews: 8 principals and 19 school improvement team members in Years 1 and 2
  – Focus groups: seven groups with 62 teachers

Research Hypotheses
- Students whose teachers participate in the professional development will show gains in reading achievement
- Including contextual variables in the impact analysis will increase the explanatory power of the results

Quantitative Analysis
- Used hierarchical linear modeling (HLM) to predict student performance based on student-, teacher-, and school-level characteristics
- The fully unconditional model represents how variation in an outcome measure is allocated across the three levels
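
As a sketch (the notation below is assumed, not taken from the slides), the fully unconditional three-level model for student i taught by teacher j in school k can be written as:

Y_{ijk} = \gamma_{000} + u_{00k} + r_{0jk} + e_{ijk}, \qquad
u_{00k} \sim N(0, \tau_{\text{school}}), \quad
r_{0jk} \sim N(0, \tau_{\text{teacher}}), \quad
e_{ijk} \sim N(0, \sigma^2)

The variance decomposition reported later is each level's share of the total variance, e.g. for schools:

\%_{\text{school}} = 100 \times \frac{\tau_{\text{school}}}{\tau_{\text{school}} + \tau_{\text{teacher}} + \sigma^2}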

Variables Included in the HLM

Student level: gender, race, ELL status, state pre-test, ITBS pre-test, grade level, attendance

Teacher level: had >8 hours of PD in the past year, education level, years teaching, completed the intervention, level of preparedness and frequency of using literacy strategies

School level: % of classes taught by HQTs, principal climate score, % suspensions, % of students perceiving they are college bound, school safety score, staff stability
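
To make the structure concrete (again with assumed notation, and showing only one illustrative predictor per level rather than the full set above), predictors enter the conditional model at their own level:

Y_{ijk} = \gamma_{000}
+ \gamma_{100}\,(\text{pre-test})_{ijk}
+ \gamma_{010}\,(\text{completed intervention})_{jk}
+ \gamma_{001}\,(\text{\% HQT})_{k}
+ u_{00k} + r_{0jk} + e_{ijk}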

Outcome Variables: Reading Scores

For each reading outcome, the fully unconditional model produced a variance component, degrees of freedom, chi-square statistic, p-value, and variance decomposition (% of variance by level) at each of the three levels:
- State test: students, teachers, schools
- ITBS: students, teachers, schools
(The numeric estimates shown on this slide are not preserved in the transcript.)
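
A minimal sketch of how variance components like those summarized above could be estimated in Python with statsmodels; the data file and column names (score, teacher_id, school_id) are hypothetical, not taken from the study:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per student, with the reading outcome
# and identifiers for the teacher and school the student is linked to.
df = pd.read_csv("student_scores.csv")  # columns: score, teacher_id, school_id

# Fully unconditional three-level model: a random intercept for schools (groups)
# plus a nested random intercept for teachers (variance component).
model = smf.mixedlm(
    "score ~ 1",
    data=df,
    groups="school_id",
    re_formula="1",
    vc_formula={"teacher": "0 + C(teacher_id)"},
)
fit = model.fit(reml=True)
print(fit.summary())

# Variance decomposition: each level's share of the total variance.
school_var = float(fit.cov_re.iloc[0, 0])   # school-intercept variance
teacher_var = float(fit.vcomp[0])           # teacher-within-school variance
student_var = fit.scale                     # residual (student-level) variance
total = school_var + teacher_var + student_var
for label, var in [("Students", student_var),
                   ("Teachers", teacher_var),
                   ("Schools", school_var)]:
    print(f"{label}: {100 * var / total:.1f}% of variance")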

Student-Level Variation
- Across multiple model specifications, the only statistically significant predictors were the student's:
  – pre-test score
  – gender
  – ELL status
- Modeling teacher-level factors produced no significant results

Classroom Observation Results
- No baseline differences in levels of engagement and cognitive demand, or in instructional strategies
- Cognitive demand of lessons was low in Year 2, irrespective of research condition
- Intervention teachers tended to use more literacy strategies than comparison teachers in Year 2
  – 38.5% of intervention teachers used multiple literacy strategies vs. 18.2% of comparison teachers

Why We May Not Find Impact
- Low cognitive demand of lessons
- Counterfactual situations may "water down" the treatment's effect
- Low implementation fidelity
- Limitations in outcome measures (measurement error)

Implications for Further Research
- Better understanding of:
  – the relationship between a school-level intervention and its potential to affect student achievement
  – correlates of student achievement
  – why an intervention that did not show impact may nevertheless be of value