Evaluating SES Providers
Steven M. Ross
Allison Potter
Center for Research in Educational Policy, The University of Memphis

Determining Evaluation Measures
Effectiveness: increased student achievement in reading/language arts or mathematics.
Customer satisfaction: positive perceptions by parents of SES students.
Service delivery and compliance: positive perceptions by principals, teachers, LEA staff, etc.

Figure 1. Components of a Comprehensive SES Evaluation Plan. The overall provider assessment draws on three strands: Effectiveness (student achievement, measured by state tests and additional tests), Customer Satisfaction (parent survey), and Service Delivery and Compliance (provider, district coordinator, principal/liaison, and teacher surveys).

Effectiveness Measures
1. Student-level test scores from state-mandated assessments
Considerations:
- Scores may be available only for certain grades (e.g., grade 3 and higher)
- Lack of pretest scores prevents gains from being determined

Effectiveness Measures
2. Supplementary individualized assessments in reading/language arts or math
Considerations:
- Without pretest scores and comparison students, SES gains cannot be determined
- Validity may be suspect if assessments are not administered by trained independent testers

Effectiveness Measures
3. Provider-developed assessments in reading/language arts or math
Considerations:
- Test results may not be valid or suitable for the state's evaluation purpose
- Tests may favor the provider's strategies

Customer Satisfaction Measures
1. Parent and family perceptions
Considerations:
- Parent respondents may not be representative of the population served by the provider
- Sample sizes will vary with provider size
- Comparisons are limited because parents are typically familiar with only one provider

Customer Satisfaction Measures
2. Student perceptions
Considerations:
- Young students may have difficulty judging the quality of services and communicating their impressions
- Collecting responses is time consuming and may require parent permission

Service Delivery and Compliance Measures
1. Records of services provided, student attendance rates, and costs
Considerations:
- States may obtain data from a variety of sources, including providers, teachers, principals, and district staff
- Corroborating data from multiple sources can increase the accuracy of evaluation conclusions

Service Delivery and Compliance Measures
2. Feedback from SES customers
Considerations:
- First-hand impressions or observations may be lacking
- Translation may be needed to reach parents who do not speak English
- Obtaining representative samples may be difficult

Service Delivery and Compliance Measures
3. Feedback from district staff
Considerations:
- Districts may lack first-hand impressions or observations of tutoring services
- Some districts may themselves be SES providers

Service Delivery and Compliance Measures
4. Feedback from school staff
Considerations:
- Teachers may also be SES instructors, or may lack first-hand impressions of providers
- Teachers may need to provide information on multiple providers, which can be confusing and time consuming
- Identifying which teachers to solicit may be difficult

Technology and Database Considerations
States will need to collect a large amount of data to evaluate SES providers, which may require a regional database that connects:
- Achievement data and related characteristics for all students who are eligible for SES
- Each student served by SES with a specific SES provider
- Details about the services offered by each SES provider
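The linkage described above can be sketched with an in-memory database. This is a minimal illustration, not a production schema; all table names, columns, and data are invented for the example.

```python
# Hypothetical sketch of the linkage a state SES database needs: eligible
# students, each served student's provider, and provider service details.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE eligible_students (student_id INTEGER PRIMARY KEY, grade INTEGER, pretest REAL);
CREATE TABLE ses_enrollments (student_id INTEGER, provider_id TEXT);
CREATE TABLE providers (provider_id TEXT PRIMARY KEY, subject TEXT, format TEXT);
""")
con.executemany("INSERT INTO eligible_students VALUES (?,?,?)",
                [(1, 4, 310), (2, 5, 295), (3, 4, 330), (4, 3, 280)])
con.executemany("INSERT INTO ses_enrollments VALUES (?,?)", [(1, "A"), (3, "B")])
con.executemany("INSERT INTO providers VALUES (?,?,?)",
                [("A", "reading", "small group"), ("B", "math", "individual")])

# LEFT JOIN keeps every eligible student; rows without a provider are
# eligible-but-unserved students, who can act as a comparison pool.
rows = con.execute("""
SELECT e.student_id, p.provider_id, p.subject
FROM eligible_students e
LEFT JOIN ses_enrollments s ON s.student_id = e.student_id
LEFT JOIN providers p ON p.provider_id = s.provider_id
ORDER BY e.student_id
""").fetchall()

served = [r for r in rows if r[1] is not None]
```

The left joins preserve the full eligible population, which matters later when evaluation designs compare served students to unserved peers.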

Evaluation Designs: Student Achievement
A. Benchmark Comparison (Rating = ++, low to moderate rigor)
Percentage of SES students, by provider, attaining "proficiency" on the state assessment

Evaluation Designs: Student Achievement
A. Benchmark Comparison: Upgrades
- Percentage of SES students in all performance categories ("Below Basic", "Basic", etc.)
- Comparison of performance relative to the prior year and to state norms
- Comparison to a "control" sample

Evaluation Designs: Student Achievement
A. Benchmark Comparison
Advantages:
- Inexpensive and less demanding
- Easily understood by practitioners and the public
- Linked directly to NCLB accountability
Disadvantages:
- Does not control for student characteristics
- Does not control for schools
- Uses broad achievement indices
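The benchmark comparison reduces to a simple percentage per provider. A minimal sketch, with an invented proficiency cut score and invented scale scores:

```python
# Design A sketch: percentage of SES students per provider reaching
# "proficient" on the state assessment. Cut score and data are invented.
PROFICIENT_CUT = 300

scores = {
    "Provider A": [310, 280, 305, 320],
    "Provider B": [290, 285, 330],
}

pct_proficient = {
    provider: 100.0 * sum(s >= PROFICIENT_CUT for s in results) / len(results)
    for provider, results in scores.items()
}
# Provider A: 3 of 4 proficient (75.0%); Provider B: 1 of 3 (about 33.3%)
```

The same loop extended over all performance categories ("Below Basic", "Basic", etc.) yields the upgraded version of the design described above.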

Evaluation Designs: Student Achievement
B. Multiple Linear Regression Design (Rating = +++, moderate rigor)
Compares actual gains to predicted gains for students enrolled in SES, using district data to control for student variables (e.g., income, ethnicity, gender, ELL status, special education status).

Evaluation Designs: Student Achievement
B. Multiple Linear Regression Design
Advantages:
- More costly than the benchmark design, but still relatively economical
- Student characteristics are statistically controlled
Disadvantages:
- Does not control for school effects
- Less understandable to practitioners and the public
- Effect sizes may be less stable than for Design C
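The regression design's actual-versus-predicted-gain comparison can be sketched with ordinary least squares. The data below are synthetic and the covariates are only a subset of those named above; a real analysis would use district records.

```python
# Design B sketch: fit a gain-prediction model on non-SES students,
# then compare SES students' actual gains to the model's predictions.
# All data are simulated; covariates are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 200
pretest = rng.normal(300, 25, n)
low_income = rng.integers(0, 2, n)   # example binary covariate
ses = rng.integers(0, 2, n)          # 1 = received SES tutoring

# Simulated gains with a built-in tutoring effect of +3 points
gain = 10 - 0.02 * pretest + 3 * ses - 2 * low_income + rng.normal(0, 5, n)

# Fit on non-SES students only, then score the SES students.
X = np.column_stack([np.ones(n), pretest, low_income])
coef, *_ = np.linalg.lstsq(X[ses == 0], gain[ses == 0], rcond=None)
predicted = X[ses == 1] @ coef

# A positive mean residual suggests SES students outgained similar peers.
ses_effect = float(np.mean(gain[ses == 1] - predicted))
```

Because the prediction model holds student characteristics constant, the mean residual functions as an adjusted estimate of the tutoring effect; it can then be standardized into an effect size per provider.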

Evaluation Designs: Student Achievement
C. Matched Samples Design (Rating = ++++, high moderate to strong rigor)
- Match and compare SES students to similar students attending the same school (or, if not feasible, a similar school)
- Use multiple matches if possible

Evaluation Designs: Student Achievement
C. Matched Samples Design
Advantages:
- Some control over school effects
- Easily understood by practitioners and the public
- Highest potential rigor of the designs presented
Disadvantages:
- More costly and time consuming
- Within-school matches may be difficult to achieve
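The within-school matching step can be sketched as a nearest-neighbor search on pretest score, restricted to the student's own school. Names and data are invented; real matching would typically use multiple covariates or propensity scores.

```python
# Design C sketch: for each SES student, find the non-SES student in the
# same school with the closest pretest score. Illustrative data only.
ses_students = [
    {"id": 1, "school": "Elm", "pretest": 300},
    {"id": 2, "school": "Oak", "pretest": 275},
]
non_ses = [
    {"id": 10, "school": "Elm", "pretest": 298},
    {"id": 11, "school": "Elm", "pretest": 340},
    {"id": 12, "school": "Oak", "pretest": 276},
]

def match_within_school(student, pool):
    """Closest-pretest match restricted to the student's own school."""
    candidates = [p for p in pool if p["school"] == student["school"]]
    if not candidates:   # no in-school match; a similar school would be tried
        return None
    return min(candidates, key=lambda p: abs(p["pretest"] - student["pretest"]))

# Assumes every SES student has at least one in-school candidate here.
matches = {s["id"]: match_within_school(s, non_ses)["id"] for s in ses_students}
```

Gains for each matched pair are then compared directly, which is why this design is both the most rigorous and the easiest to explain to practitioners.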

Data Collection Tools
- Surveys for LEAs, principals/site coordinators, teachers, parents, and providers
- A common core set of questions across all groups to permit triangulation
- Response choices of "frequently", "occasionally", "not at all", and "don't know"
- An open-ended "Additional comments" question

Data Collection Tools
Selected survey questions:
- What was the start date of provider services?
- In which subjects did your students receive services from this provider?
- Are you employed by the provider for which you are completing this survey?
- How often does the provider:
  - Communicate with you during the school year?
  - Meet the obligations for conducting tutoring sessions?

Data Collection Tools
Selected survey questions. The provider:
- Adapted the tutoring services to this school's curriculum
- Aligned their services with state and local standards
- Offered services to special education and ESL students
- Complied with applicable federal, state, and local laws

Data Collection Tools
Selected survey questions. Overall assessment:
- I believe the services offered by this provider positively impacted student achievement
- Overall, I am satisfied with the services of this provider

Data Collection Tools
Sample teacher responses to the open-ended question:
- "The program began much too late in the school year (after testing) to impact learning this year."
- "I have never spoken to the instructors. I have no knowledge as to the structure of the program."
- "[Provider] never called his classroom teacher, never looked at student records, or coordinated efforts until finally his classroom teacher got through and spoke of learning problems."
- "I saw great gains with the kids who were served by this provider – they benefited from this program."

Data Collection Tools
Selected provider survey questions:
- Describe the format of your services: program duration, setting, format (small groups, individual)
- What is your general instructional plan?
- Describe the qualifications of tutors (including data on background checks)
- List information regarding students served, goals achieved, and tutoring sessions attended

Rubric for Overall Evaluation of Provider Effectiveness
(Rating levels: Insufficient Information, Below Standards, Marginal Quality, Acceptable, Above Standards)

1. Student Achievement
- Insufficient Information: There is insufficient information available to determine student achievement outcomes.
- Below Standards: Students have not shown gains related to tutoring received from service providers.
- Marginal Quality: About half of the students have made some gain related to tutoring received from service providers.
- Acceptable: There has been some gain for the majority (over 60%) of students related to tutoring received from service providers.
- Above Standards: The effect size for students in the provider's program is in the top one-third of all effect sizes demonstrated by providers meeting standards for student achievement.

2. Communication
- Insufficient Information: There is insufficient information available to determine communication outcomes.
- Below Standards: Provider has not communicated with the principals, teachers, and parents of students served.
- Marginal Quality: There has been limited communication throughout the year between the provider and at least two of the following: principals, teachers, and parents.
- Acceptable: There has been some regular communication throughout the year between the provider and the principals, teachers, and parents of students served.
- Above Standards: There is an ongoing and sustained system of communication between the provider and the school-level educators as well as parents of students served.

3. Instructional Plans
- Insufficient Information: There is insufficient information available to determine the instructional plans of the provider.
- Below Standards: Provider does not plan instruction explicitly geared to student needs or to reinforce their regular academic program.
- Marginal Quality: Provider is in the planning stages of gearing instruction to student needs and reinforcing the regular academic program.
- Acceptable: Provider has made some attempt with the majority of students to plan instruction explicitly geared to student needs and to reinforce the regular academic program.
- Above Standards: Provider instructional plans are explicitly geared to the needs of most or all students and reinforce the regular academic program.

4. Local and State Standards
- Insufficient Information: There is insufficient information available to determine alignment with local and state standards.
- Below Standards: None of the instructional plans used by the provider are aligned with local and state academic standards for students.
- Marginal Quality: Provider is in the process of aligning instructional plans with local and state academic standards for students.
- Acceptable: Some of the instructional plans used by the provider are presently aligned with local and state academic standards for students.
- Above Standards: Most or all of the instructional plans are presently aligned with local and state academic standards for students.

5. Special Ed/ELL Students
- Insufficient Information: There is insufficient information available to determine special ed/ELL student outcomes.
- Below Standards: Provider does not offer accommodations for addressing the needs of special ed or ELL students.
- Marginal Quality: Provider has made limited accommodations for addressing the needs of special ed and ELL students.
- Acceptable: Provider has made some accommodations for addressing the needs of special ed and ELL students.
- Above Standards: Provider offers appropriate services, if needed, to special education and ELL students.

6. Provider Overall
- Insufficient Information: There is insufficient information available to determine provider overall outcomes.
- Below Standards: There is overall dissatisfaction with the provider at the district and school levels.
- Marginal Quality: There is more dissatisfaction than satisfaction with the provider at the district and school levels.
- Acceptable: There are mixed but mostly positive reactions about the provider at the school and district levels.
- Above Standards: There is overall satisfaction with the provider at the district and school levels.
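The rubric's "Above Standards" achievement criterion (top one-third of effect sizes among providers meeting standards) is a simple ranking rule. A minimal sketch, with invented provider labels and effect sizes:

```python
# Sketch of the top-one-third effect-size criterion from the rubric.
# Provider IDs and effect sizes are invented for illustration.
effect_sizes = {"A": 0.35, "B": 0.10, "C": 0.22, "D": 0.05, "E": 0.28, "F": 0.18}

ranked = sorted(effect_sizes, key=effect_sizes.get, reverse=True)
cutoff = len(ranked) // 3          # top one-third of 6 providers = 2
above_standards = set(ranked[:cutoff])
# Providers A and E qualify for the "Above Standards" achievement rating
```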

Decision Tree for SES Providers (figure not reproduced; visible label: "Probation I")

CONCLUSION
- SES evaluation models that are both suitably rigorous and practical for states to employ are still evolving.
- Each state has unique needs, priorities, access to resources, and procedures for implementing SES.
- States may face a trade-off between practicality (cost and time) and rigor (the reliability and accuracy of findings).

CONCLUSION
Each state should begin its SES evaluation planning process by identifying:
a) the specific questions that its SES evaluation needs to answer, and
b) the resources that can reasonably be allocated to support further evaluation planning, data collection, analysis, reporting, and dissemination.

CONCLUSION
- Work through the hierarchy of evaluation designs presented here and select the design that allows the highest feasible level of rigor.
- States may wish to engage third-party evaluation experts to help plan and conduct these evaluations.