Evaluating the Outcomes of SSOS: Qualities of an Effective Evaluation Steven M. Ross, Ph.D., Johns Hopkins University

Questions to Ponder Is evaluation substantively and routinely embedded in your SSOS? Are we as states going beyond completing the rubrics and looking at root causes and data? Is evaluation resulting in scaling up successful activities and discontinuing others (e.g., certain providers)? Are external evaluators being used for support? Should they be?

The Evaluation Rubric: A Quick Review. Part A: SSOS Plan and Design. 1. Specified comprehensive plan 2. Defined evidence-based programs 3. Plan for evaluation

The Evaluation Rubric: A Quick Review. Part B: Resources. 4. Staff 5. Funding 6. Data analysis and storage 7. Distinguished educators, consultants, & experts 8. External providers

The Evaluation Rubric: A Quick Review. Part C: Implementation. 9. Removal of barriers for change and innovation 10. Incentives for change 11. Communications 12. Technical assistance 13. Dissemination of knowledge 14. Monitoring and program audits

The Evaluation Rubric: A Quick Review. Part D: Outcomes for Schools Served by SSOS. 15. Student achievement 16. Student attendance 17. Graduation rate

43 Rubrics and Their Rating Scales I. Limited or No Development or Implementation II. Limited Development or Partial Implementation III. Mostly Functional Level of Development and Implementation IV. Full Level of Implementation and Evidence of Impact

SSOS Evaluation Rubric 2.5 (Indicator 2: Defined evidence-based programs/interventions for all students & subgroups). From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 24, Table 2. Copyright 2009 by Academic Development Institute. Reprinted with permission.

SSOS Evaluation Rubric 15.1 From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 37, Table 15. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Why Evaluate SSOS: Isn’t There Enough to Do Already? Being able to make reliable and valid judgments of the status of the services provided: How fully are services being implemented? To what extent are expected outcomes being achieved?

Why Evaluate SSOS: Isn’t There Enough to Do Already? Accountability for SDE and external organizations.

Why Evaluate SSOS: Isn’t There Enough to Do Already? Demonstrating accountability to consumers (districts, schools, educators, parents).

Why Evaluate SSOS: Isn’t There Enough to Do Already? Developing a process for continuous program improvement.

Properties of an Effective Evaluation Validity (rigorous, reliable) Evidence-Based – Documented plan – Meeting agenda – Survey responses – AYP outcomes

Properties of an Effective Evaluation. Strong Evidence: “80% of principals surveyed rated the Distinguished Educators’ support as very helpful and provided a specific example of how the DE helped their school.” Weak Evidence: “The Governor touted the state’s progress in assisting low-performing schools during his tenure.”

Properties of an Effective Evaluation: Evaluative rather than descriptive. From Evaluating the Statewide System of Support, by S. Hanes, T. Kerins, C. Perlman, S. Redding, & S. Ross, 2009, p. 91, Table 1. Copyright 2009 by Academic Development Institute. Reprinted with permission.

Evaluating Educational Outcomes (Part D, Section 15 of the Rubric). SSOS is ultimately about improving student achievement and educational success. These are “distal” or “culminating” outcomes that may not show immediate change. Nonetheless, it is educationally and politically important to monitor these indicators.

Evaluating Educational Outcomes: Recommendation 1. Treat the essential indicators (Part D, Section 15) as a starting point only. Given the 43 rubric targets, which services appear to be the most essential to improve? Prioritize these improvement needs.
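As an illustration of how such prioritization might be carried out, the sketch below converts I–IV ratings for a handful of indicators into numbers and sorts them lowest first. The assigned ratings and the numeric mapping are hypothetical assumptions for illustration, not part of the rubric materials; a real evaluation would cover all 43 indicators.

```python
# Minimal sketch, assuming hypothetical ratings: flag the lowest-rated rubric
# indicators so improvement needs can be prioritized.

# Numeric mapping of the rubric's four rating levels (an assumption made here for sorting)
RATING_LEVELS = {
    "I": 1,   # Limited or no development or implementation
    "II": 2,  # Limited development or partial implementation
    "III": 3, # Mostly functional level of development and implementation
    "IV": 4,  # Full level of implementation and evidence of impact
}

# Hypothetical ratings for a few of the 43 indicators
ratings = {
    "2. Defined evidence-based programs": "II",
    "12. Technical assistance": "III",
    "14. Monitoring and program audits": "I",
    "15. Student achievement": "II",
}

# Sort indicators from lowest to highest rating; the lowest-rated come first
prioritized = sorted(ratings.items(), key=lambda item: RATING_LEVELS[item[1]])
for indicator, level in prioritized:
    print(f"Level {level}: {indicator}")
```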

Evaluating Educational Outcomes: Recommendation 2. Supplement the essential indicators with follow-up analyses of “root causes” and data: potentially successful turnaround strategies that may be scalable; unsuccessful strategies that need replacement; explanation of the outcomes relative to the SSOS services provided. Example: Although we train distinguished educators and revise the training each year, what evidence is there that the training is effective?

Analyses of “Root Causes” Example A: Schools showing the strongest increases in mathematics are visited by the SDE and found to be using highly interactive teaching strategies and expanded learning opportunities. Implication for SSOS?

Analyses of “Root Causes” Example B: School B increased its reading scores significantly over the past year. A follow-up study of student enrollment patterns reveals that student rezoning decreased the number of disadvantaged students by 50%. Implication for SSOS?

Analyses of “Root Causes” Example C: School C had several student subgroups fail to attain AYP in reading. Follow-up interviews with the principal and literacy coaches reveal that the new R/LA curriculum was poorly supported by the provider. Implication for SSOS?

Evaluating Educational Outcomes: Recommendation 3. Supplement the beginning evaluation (Recommendation 1) and follow-up analyses (Recommendation 2) with rigorous evaluations of selected interventions: RFPs for external studies; assistance to school districts interested in evaluation research; rigorous data analyses by the SDE to study achievement patterns associated with SSOS interventions.
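As one hedged sketch of the kind of SDE data analysis mentioned above, the code below compares mean achievement gains for schools served by SSOS interventions with a comparison group. The file name, column names, and the choice of a simple Welch t-test are illustrative assumptions only; a rigorous study would use stronger designs (e.g., matching or regression adjustment for prior achievement and demographics).

```python
# Minimal sketch, assuming a hypothetical data file with columns:
#   school_id, served_by_ssos (1 = served, 0 = comparison), gain (post-test minus pre-test score)
import pandas as pd
from scipy import stats

df = pd.read_csv("school_achievement_gains.csv")  # hypothetical file name

served = df.loc[df["served_by_ssos"] == 1, "gain"]
comparison = df.loc[df["served_by_ssos"] == 0, "gain"]

print(f"Mean gain, SSOS-served schools: {served.mean():.2f}")
print(f"Mean gain, comparison schools:  {comparison.mean():.2f}")

# Welch's t-test (does not assume equal variances); descriptive evidence only,
# not proof that SSOS caused any difference in gains
t_stat, p_value = stats.ttest_ind(served, comparison, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```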

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “Teachers liked the professional development activities.”

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “Systematic observation by independent observers shows significant increases in student-centered instruction.”

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “Teachers indicate that they use more student-centered instruction than in the past.”

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “Principals and grade-level leaders indicate observing more frequent cooperative learning than last year.”

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “The providers of the professional development believed the offerings to be successful.”

Accurate and Relevant Evidence: Strong, Suggestive, or Weak? “Reading scores increased by 15% for the schools receiving SSOS in literacy.”

Working with External Evaluators Question: Is it more or less “costly” than using SDE staff? Answer: It depends on the expertise and availability of the latter.

Working with External Evaluators. What types of evaluation tasks most need external evaluators? The Basic Rubric (“Study I”) and the essential indicators (“Study II”) might best be performed in-house. The external evaluator (at low cost) would be helpful to corroborate the Study I and II findings. Rigorous studies of specific interventions (“Study III”) are most appropriate for external evaluators.

Working with External Evaluators. Advantages of external evaluators: expertise in research design and data analysis; school and district staff likely to be more forthcoming; independence and credibility.

Working with External Evaluators. Key steps: Use a systematic process to select the evaluator (careful review of prior work and client satisfaction). Establish a clear plan of work and budget. Clearly define the evaluation/research questions. Monitor the study via regular meetings, reports, etc. Work with the evaluator to disseminate results to improve policies and practices.

Concluding Comments. Unless SSOS is evaluated, it is unlikely to improve. The benefits of the evaluation depend on its rigor and quality. There is little to gain by painting rosy pictures of mediocre outcomes: the message is that all is well and should be left alone, whereas a truthful negative evaluation is a stimulus for change. There is much to gain by presenting objective results to sustain services that work and improve those that are ineffective.