Supplemental Educational Services: Approving, Monitoring, Evaluating
Chair: Steven M. Ross, Center for Research in Educational Policy; Center on Innovation & Improvement
Collaborating Researchers: Jen Harmon, Center on Innovation & Improvement; Kenneth Wong, Brown University; Center on Innovation & Improvement
Promising Practice Briefs: Approving, Monitoring, and Evaluating Providers
Commissioned by the Office of Innovation and Improvement
To be developed and released in fall 2008
Promising Practice Briefs: Sources of Data
– State SES Director Survey completed by all states
– National meetings
– Site visits to states
– Interviews with SES directors
– Authors' experiences as SES consultants and researchers
Recruitment
Two-thirds of the states actively (14%) or informally (52%) recruit providers via:
– Direct invitations
– Web announcements
– District publicity
– State meetings and other means
Application Requirements
Aside from core application information, states include as optional components:
– Attendance at informational meetings
– Recommendations from former clients
– A detailed plan for communicating with teachers, parents, and district coordinators
– In-person interview
– Demonstration/description of a tutoring lesson
– Identification of minimum tutor qualifications
Strategies Used in the Approval Process
The Most Successful Practices
– Application review using independent review teams (f = 19)
– Clear scoring rubrics (f = 9)
– Technical assistance to applicants (f = 5)
– Requesting curriculum and tutoring descriptions (f = 2)
– Provider interview (f = 2)
Challenges
Desired Improvements
Multiple states want to improve their process by:
– Requiring submission of lesson plans
– Adding an interview process
– Strengthening scoring rubrics
– Improving reviewer training
Increased Federal Assistance
Increased federal assistance is desired in the areas of:
– Specific guidance on practices and policies
– Facilitating networking and information sharing between states
Monitoring Focus
Nearly all states view the main focus of monitoring to be:
– Provider compliance with rules and regulations (93%)
– Districts' implementation of SES (84%)
Applications
Three-fourths (74%) of the states use a "formal" monitoring process.
Almost 80% use monitoring results formally (38%) or informally (40%) in evaluating providers.
Feedback and Capacity
Feedback
– 55% of states produce a written report
– 23% have face-to-face meetings
Capacity
– 45% monitor all providers each year
– 75% monitor at least half of their providers each year
Types of Technical Assistance
On-Site Monitoring Activities (33%)
– Visits may be announced or random
– Includes online and in-home providers
– Review of tutoring documents, materials, etc.
– Uses a checklist, rubric, or rating scale
– May be one person or a team
– Tutors or students may be interviewed
– Most often at a school or community site
Desk Monitoring
– End-of-year fiscal and participation report
– Quarterly reports
– Online implementation tracking
– Provider self-evaluation
– Parent and student satisfaction surveys
– Complaints regarding provider compliance
– Comparison of provider vs. district enrollment data
District Monitoring
– Supplementary for some states
– The only monitoring done in other states
Most Successful Practices
Challenges
Desired Improvements
Implementation of Provider Evaluations
– 30 states "regularly" evaluate
– 15 are still in planning stages
– The remainder "informally" evaluate
Is the Provider Evaluation Effective?
Evaluation Component
Student Achievement Analysis Approaches
Most Successful Evaluation Practices
Challenges
Desired Improvements
Contact Information
Sam Redding, sredding@centerii.org
Marilyn Murphy, mmurphy@centerii.org
Steven Ross, smross@memphis.edu
Jen Harmon, jharmon@centerii.org
Kenneth Wong, kenneth_wong@brown.edu
Visit our web site at www.centerii.org