How Do We Know if a Charter School is Really Succeeding? – Various Approaches to Investigating School Effectiveness (October 2012, Missouri Charter Public Schools Association)

Presentation transcript:

How Do We Know if a Charter School is Really Succeeding? – Various Approaches to Investigating School Effectiveness
Missouri Charter Public Schools Association 2012 Conference, October 2012
Presented by: Otoniel Lopez and Jing Zhu

Slide 1: Presentation Outline
Introduction
Why data and evaluation?
Evaluation components
Impact/outcome studies
Implementation studies
Choosing the right design

Slide 2: Metis Associates
National applied research and consulting firm (New York City, Philadelphia, Atlanta)
Over 35 years of expertise in research/evaluation, grants development, and information technology
Focus areas: K-12 education, higher education, children and family services, youth development, juvenile justice, etc.
Conducted six evaluations of two charter school programs spanning six years

Slide 3: Why data and evaluation?
Demonstrate program impact
Identify successful practices and challenges
Assess overall program fidelity
Engage key stakeholders
Facilitate the daily management of the grant
Inform programmatic decisions
Fulfill federal and state reporting requirements

Slide 4: Evaluation Components
Impact/Outcome Study
– Purpose: assess program impact on (1) academic performance and (2) customer impact and satisfaction
– Data sources and methods: statistical analyses; review of school characteristics and their association with outcomes; stakeholder surveys; analysis of demographic, program participation, academic achievement, and attendance data
Implementation Study
– Purpose: assess implementation regarding (1) program fidelity and (2) promising practices, challenges, and lessons learned
– Data sources and methods: review of project documentation; interviews with project staff and partners; observations of cross-school activities

Slide 5: Impact Study Designs (I)
Randomized controlled trial (RCT)
– The gold standard
– Random assignment of students, classes, or schools
– A number of long-standing concerns (e.g., ethical, logistical, and financial)
– Attrition and other issues
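In charter school evaluations, the RCT described above is typically implemented through an admissions lottery: applicants to an oversubscribed school are randomly offered seats, and the unadjusted impact estimate is the difference in mean outcomes between lottery winners and losers. The sketch below is a minimal illustration of that logic, not the presenters' actual analysis; the function names and simulated scores are hypothetical, and a real study would add covariate adjustment and proper inference.

```python
import random
import statistics

def lottery_assignment(applicant_ids, seats, seed=0):
    """Randomly assign oversubscribed applicants: winners form the treatment
    group, losers the control group. Hypothetical helper for illustration."""
    rng = random.Random(seed)
    shuffled = applicant_ids[:]
    rng.shuffle(shuffled)
    return shuffled[:seats], shuffled[seats:]

def difference_in_means(treatment_scores, control_scores):
    """Unadjusted impact estimate: mean outcome difference between groups,
    with a simple two-sample standard error (unequal variances)."""
    diff = statistics.mean(treatment_scores) - statistics.mean(control_scores)
    se = (statistics.variance(treatment_scores) / len(treatment_scores)
          + statistics.variance(control_scores) / len(control_scores)) ** 0.5
    return diff, se

if __name__ == "__main__":
    # Hypothetical data: 200 applicants for 80 seats, plus simulated scale scores.
    applicants = list(range(200))
    winners, losers = lottery_assignment(applicants, seats=80, seed=42)
    rng = random.Random(1)
    treat = [rng.gauss(655, 30) for _ in winners]
    ctrl = [rng.gauss(650, 30) for _ in losers]
    impact, se = difference_in_means(treat, ctrl)
    print(f"Estimated impact: {impact:.1f} points (SE {se:.1f})")
```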

Slide 6: Impact Study Designs (II)
Quasi-experimental design (QED)
– Need for a comparison group: naturally occurring, or statistically well-matched
– Common matching characteristics (baseline achievement, gender, race/ethnicity, ELL status, poverty status, etc.)
– Assess baseline equivalence of the two groups
– Cannot control for potential unobserved differences between groups
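Slide 6 describes matching charter students to statistically similar comparison students and then checking baseline equivalence. The following is a minimal sketch assuming a single baseline test score as the matching variable; real QEDs typically match on several characteristics (often via propensity scores), and under commonly cited WWC conventions a baseline gap larger than about 0.25 standard deviations fails equivalence, while gaps between roughly 0.05 and 0.25 SD require statistical adjustment. The data and function names here are hypothetical.

```python
import statistics

def nearest_neighbor_match(treated, comparison_pool):
    """Greedy 1:1 nearest-neighbor matching on a single baseline score,
    without replacement. Inputs are lists of (student_id, baseline_score)."""
    matched_pairs = []
    available = dict(comparison_pool)  # id -> baseline score
    for t_id, t_score in treated:
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        matched_pairs.append(((t_id, t_score), (c_id, available.pop(c_id))))
    return matched_pairs

def standardized_mean_difference(group_a, group_b):
    """Baseline gap in pooled-SD units, a simple effect-size style check."""
    pooled_var = (statistics.variance(group_a) + statistics.variance(group_b)) / 2
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_var ** 0.5

if __name__ == "__main__":
    # Hypothetical baseline reading scores for charter and district students.
    charter = [("c1", 641), ("c2", 655), ("c3", 662), ("c4", 688)]
    district = [("d1", 630), ("d2", 645), ("d3", 652),
                ("d4", 664), ("d5", 690), ("d6", 701)]
    pairs = nearest_neighbor_match(charter, district)
    charter_baseline = [t[1] for t, _ in pairs]
    matched_baseline = [c[1] for _, c in pairs]
    smd = standardized_mean_difference(charter_baseline, matched_baseline)
    print(f"Matched {len(pairs)} pairs; baseline SMD = {smd:.2f}")
```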

Slide 7: WWC Study Ratings
The What Works Clearinghouse (WWC)
– An initiative of the U.S. Department of Education's Institute of Education Sciences (IES)
– Started in 2002; publishing reports since 2005
Three possible study ratings
– Meets WWC Evidence Standards without Reservations (RCT with low attrition)
– Meets WWC Evidence Standards with Reservations (RCT with high attrition OR QED; must establish baseline equivalence)
– Does Not Meet WWC Evidence Standards
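The attrition distinction in the ratings above rests on two quantities: overall attrition (the share of the randomized sample missing from the analysis) and differential attrition (the gap between treatment and control attrition rates). The WWC judges the combination against a published boundary table, so the sketch below only computes the two rates; the enrollment counts are hypothetical.

```python
def attrition_rates(n_treat_randomized, n_treat_analyzed,
                    n_ctrl_randomized, n_ctrl_analyzed):
    """Compute overall and differential attrition for a two-group RCT.
    Overall: share of the randomized sample missing from the analysis sample.
    Differential: absolute gap between group-level attrition rates."""
    treat_attrition = 1 - n_treat_analyzed / n_treat_randomized
    ctrl_attrition = 1 - n_ctrl_analyzed / n_ctrl_randomized
    overall = 1 - (n_treat_analyzed + n_ctrl_analyzed) / (
        n_treat_randomized + n_ctrl_randomized)
    differential = abs(treat_attrition - ctrl_attrition)
    return overall, differential

if __name__ == "__main__":
    # Hypothetical lottery study: 400 winners, 350 with follow-up scores;
    # 600 losers, 480 with follow-up scores.
    overall, differential = attrition_rates(400, 350, 600, 480)
    print(f"Overall attrition: {overall:.1%}, differential attrition: {differential:.1%}")
```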

Slide 8: Rigorous Charter School Evaluations
RCT studies – random lottery (oversubscription to enrollment)
– Gleason, P., Clark, M., Tuttle, C. C., & Dwoyer, E. (2010). The Evaluation of Charter School Impacts: Final Report.
– Dobbie, W., & Fryer, R. G., Jr. (2009). Are High-Quality Schools Enough to Close the Achievement Gap? Evidence from a Social Experiment in Harlem.
QED studies – statistical matching of students
– Center for Research on Education Outcomes. (2011). Charter School Performance in Indiana.
– Center for Research on Education Outcomes. (2010). Charter School Performance in New York City.

Slide 9: Two Popular Qualitative Methods in Impact Studies for Charter Schools
Survey research
– Captures the experience of participants
– Provides quantifiable data that can be used to associate that experience with other hard data (e.g., student achievement)
– Measures changes in perceptions over time
Observations
– Assess instruction using quantitative tools developed from a set of standards or known best practices
– Quantify a set of items or behaviors within a school or classroom
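Slide 9 notes that surveys can quantify participant perceptions and then associate them with hard data such as achievement. A minimal sketch of that idea follows, assuming hypothetical pre/post Likert-scale survey ratings and test scores; all names and numbers are illustrative, not drawn from the presenters' studies.

```python
import statistics

def mean_change(pre_scores, post_scores):
    """Average change in survey ratings between two administrations."""
    return statistics.mean(post - pre for pre, post in zip(pre_scores, post_scores))

def pearson_r(xs, ys):
    """Pearson correlation, e.g., between perception change and achievement."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

if __name__ == "__main__":
    # Hypothetical 1-5 Likert ratings of school climate, fall vs. spring,
    # and spring test scale scores for the same ten students.
    fall = [3, 2, 4, 3, 3, 2, 4, 3, 2, 3]
    spring = [4, 3, 4, 4, 3, 3, 5, 4, 3, 4]
    scores = [652, 640, 668, 660, 647, 638, 675, 659, 642, 655]
    change = [s - f for f, s in zip(fall, spring)]
    print(f"Mean perception change: {mean_change(fall, spring):+.2f} points")
    print(f"Correlation of change with spring scores: {pearson_r(change, scores):.2f}")
```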

Slide 10: Implementation Studies
Assessment of program fidelity
Question of resources and capacity
– Are intended populations being reached?
– Are services appropriate?
Alignment of outcomes and implementation
– Logic model

Slide 11: Logic Model (diagram)
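Because the logic model diagram itself is not transcribed, the sketch below shows a generic logic model as a plain data structure, following the common inputs, activities, outputs, and outcomes template (as in the W.K. Kellogg Foundation guide cited later in the deck). The specific entries and the alignment-check helper are placeholders, not the presenters' program.

```python
# A generic logic model as a plain data structure; all entries are hypothetical.
logic_model = {
    "inputs": ["grant funding", "school staff", "curriculum materials"],
    "activities": ["teacher professional development", "extended learning time"],
    "outputs": ["number of PD hours delivered", "students served"],
    "short_term_outcomes": ["improved instructional practice"],
    "long_term_outcomes": ["gains in student achievement", "school sustainability"],
}

def alignment_check(model, measured_indicators):
    """List outcome statements with no measured indicator yet: a simple way to
    check alignment between intended outcomes and implementation data."""
    outcomes = model["short_term_outcomes"] + model["long_term_outcomes"]
    return [o for o in outcomes if o not in measured_indicators]

if __name__ == "__main__":
    print(alignment_check(logic_model, {"gains in student achievement"}))
```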

Slide 12: Methods for Collecting Implementation Data
Interviews with key personnel
Focus groups with a set of individuals closely tied to the particular program (e.g., teachers)
Observations of instruction, faculty meetings, or school walkthroughs
Some survey research
Collection of program documentation

Slide 13: Advantages of Implementation and Outcome Components
Implementation:
– Provides ongoing (i.e., formative) data
– Provides a real-world look at what is actually going on at a school
– Does not require long periods to gather useful information
– Does not require a comparison group
Outcome:
– Measures program impact
– Can provide an evidence base
– Provides useful information to policy makers

Slide 14: One Size Doesn't Fit All
Complex designs vs. point-in-time descriptive studies
Balancing design approaches in the current economic climate
Before identifying the right fit:
– Use theories of change, logic models, information systems, and self-evaluation to inform the research

Slide 15: Evaluation Resources
What Works Clearinghouse (WWC) official website
American Evaluation Association online resources
American Education Evaluation Association
W.K. Kellogg Foundation evaluation resources
W.K. Kellogg Foundation Logic Model Development Guide
W.K. Kellogg Foundation Evaluation Handbook
The Evaluator's Institute
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A Systematic Approach (7th ed.). Thousand Oaks, CA: Sage Publications.

Slide 16: Otoniel Lopez and Jing Zhu