Final Summary Evaluation: State Projects Serving Individuals with Deaf-Blindness October 23, 2006 NCDB Sponsored Webinar Presented by Richard Zeller.

Presentation transcript:

Final Summary Evaluation: State Projects Serving Individuals with Deaf-Blindness October 23, 2006 NCDB Sponsored Webinar Presented by Richard Zeller

Presentation Overview
Self-assessment and verification review: purpose and design
Summary of Year One (37) and Year Two (11) project self-assessments
Summary of on-site reviews
Feedback from projects and review teams
Evaluator recommendations
Discussion

Evaluation Requirement
During year 2, each project must... "conduct a comprehensive self-evaluation. The evaluation must include a review of the degree to which the project is meeting proposed objectives and goals and an evaluation of outcome data... In addition, the Department of Education intends to conduct a limited number of on-site evaluations based on a stratified randomized sample of sites." (RFP, page C-4)

Evaluation Purposes
Summative project evaluation questions:
Are projects' goals and objectives being achieved?
Do projects address each RFP priority?
Do projects have appropriate outcome data?
Formative project evaluation: provide a continuous improvement process for individual projects to use.
National report (both summative and formative): provide information OSEP can use to guide needed system improvements.

Evaluation Constraints
48 projects, single and multi-state
Staffing from partial to several FTE
Common design to allow summaries
Resources come primarily from the projects
Assess whether projects are addressing RFP priorities

Evaluation Design
"Work scope" standards: Priorities (a)-(i) and General Requirements (a)-(c)
Priority questions:
What types of strategies are used?
Is work being completed in a timely fashion?
Are intended results being achieved?
Are outcome data available (efforts and effects)?
Are improvement plans in effect?
General Requirements: Are these requirements being appropriately addressed?

Evaluation Design (continued)
Self-assessments parallel MSIP's Continuous Improvement and Focused Monitoring System (now the SPP & APR).
Verification reviews (site visits): during this evaluation, reviews became a check on the self-evaluation process and a way to provide TA to the project.
Adjustments were made in both the self-assessment and review designs during implementation.

Self-Assessment Summary
Priorities (a)-(e):
Strategies
Timeliness
Results
Data: effort and effect
Adjustments/future plans
Priorities (g)-(h) and General (a)-(c):
Are the priorities addressed? (Yes/No)
Are there standards that apply?

Strategies Described
While the relative application of "ongoing" strategies was higher in year two, strategies were also more widely distributed across projects than in year one.
             Year One   Year Two
Linear          1%          -
Cyclical       15%         17%
Ongoing        36%         45%
Combined       48%         38%

For all timeliness item ratings:
                               Year One   Year Two
active/behind schedule           15%        19%
not implemented/on schedule      1.4%        3%
active/on schedule               83%        78%

For all result item ratings:
                                    Year One   Year Two
exceeding expectations                13%         -
meeting expectations                  72%        65%
below & approaching expectations      12%        20%
well below expectations                1%         -
cannot rate                            3%         1%

For all effort item ratings:
Ratings cluster: 3 projects in year one and 2 in year two rated "extensive" data for more than 6 items.
                                             Year One   Year Two
no data                                         2%         -
some level of effort data                      13%        33%
some quality of effort data                     6%         7%
some level & quality of effort data            62%        45%
extensive level & quality of effort data       18%        13%
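The percentages in these rating tables are simple tallies of item-level ratings pooled across all projects within each year. A minimal sketch of that aggregation, assuming hypothetical (year, rating) records rather than the projects' actual self-assessment data:

```python
from collections import Counter

# Hypothetical item-level ratings: one (year, rating) record per priority item
# per project; the real data came from the project self-assessment forms.
records = [
    ("Year One", "some level & quality of effort data"),
    ("Year One", "extensive level & quality of effort data"),
    ("Year Two", "some level of effort data"),
    ("Year Two", "some level & quality of effort data"),
]

def percent_by_year(records):
    """Tally ratings within each year and convert counts to percentages."""
    by_year = {}
    for year, rating in records:
        by_year.setdefault(year, Counter())[rating] += 1
    return {
        year: {rating: round(100 * count / sum(counts.values()))
               for rating, count in counts.items()}
        for year, counts in by_year.items()
    }

print(percent_by_year(records))
```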

For all effect item ratings:
                 Year One   Year Two
no data            12%        20%
outcome data       71%        68%
impact data        16%        12%
Clarification:
Outcomes are the immediate results of your assistance (e.g., teacher skills gained in training).
Impacts are the results your clients achieve when they apply what you have taught them (e.g., they teach and children learn communication skills).

Overall Item Ranking (high to low)
Year Two:
(a)(3) R-B practices
(a)(1) State capacity
(d) Collaboration
(a)(4) Provider skills
(b)(1) Census
(a)(5) Address child/family needs
(e) Disseminate
(b)(2) Assess critical child needs
(a)(2) Systemic change
(c)(1) Evaluate effectiveness
(b)(3) Assess state needs
(c)(3) Advisory evaluation design
(c)(2) Measure child outcomes
Year One:
(c)(3) Advisory evaluation design
(d) Collaboration
(b)(1) Census
(a)(1) State capacity
(a)(3) R-B practices
(a)(4) Provider skills
(b)(3) Assess state needs
(a)(5) Address family/child needs
(e) Disseminate
(a)(2) Systemic change
(c)(1) Evaluate effectiveness
(b)(2) Assess critical child needs
(c)(2) Measure child outcomes

Areas Needing More Attention?
(c)(2) Measure child outcomes
(a)(2) Systemic change
(b)(3) Assess state needs
(b)(2) Assess child needs
(c)(1) Evaluate effectiveness
(c)(3) Advisory evaluation design?

Adjustments/Future Plans
All projects/all strategies: about 81% (Year One) vs. 63% (Year Two) of strategies are to "continue as proposed."
In Year One, 8 projects accounted for 55% of planned changes.
In Year Two, all projects plan some changes in strategy, with 7 adopting new strategies.
The most common areas of adjustment across both years were priorities (c)(1), evaluation of effectiveness, and (c)(2), measurement of child outcomes.

Priorities (g) & (h): Affirmative Responses
                              Year One   Year Two
(g) OSEP Directed TA             27%        0%
(g) Web-based TA                 92%       82%
(g) Community of Practice        92%       82%
(h) Advisory Standards           97%      100%
(h) Act on Advisory Recs        100%        -
(h) Advisory Change              38%       36%

General Priorities (a)-(c)
General Priority Area                     Year One   Year Two
(a) Employ people with disabilities?         59%        73%
(a) Try to employ people?                    68%        73%
(a) Advance people with disabilities?        62%        36%
(a) Change employment practices?             35%        36%
(b) Involve people with disabilities?       100%         -
(c) Does project have a website?             86%        91%
(c) Is website accessible?                   81%        82%
(c) Planning website improvement?            95%        45%

How Projects Relate Priorities to Work Design (Part 1, Year 2)
[Table in the original slide: Objs or Goals, Priority Cites, Priorities/Obj, Objs/Priority; the values were not captured in this transcript]

Total Priority Cites by 11 Projects (Year 2, Part 1)
(a)(1) State capacity                     117
(a)(2) Systemic change                     76
(a)(3) R-B practices                       96
(a)(4) Provider skills                     87
(a)(5) Assess family & child needs         64
(a)(6) Other                               17
(b)(1) Census                              34
(b)(2) Assess critical child needs         60
(b)(3) Assess state needs                  54
(c)(1) Evaluate effectiveness              84
(c)(2) Measure child outcomes              49
(c)(3) Advisory evaluation                 42
(d) Collaboration                          70
(e) Disseminate                            80
(g) OSEP specified TA                      26
(h) Maintain Advisory                      23
(a) Employ individuals with disabilities    7
(b) Involve individuals                    43
(c) Web site accessibility                 33

Verification Visit Summary
Sites visited (in order):
Year One: IN, FL, NJ, NY, WA, MO, CO, MT, CA
Year Two: KY, MI, NC, TN
Process: A team of 3 reviewers each rated their agreement with the project's ratings for each priority and offered comments on each priority.
Revisions to the process and report form were made during the first year (simplifications) and again before year two (in response to suggestions).

Site Review Participation
        # Staff   # Stakeholders
IN         4           20
FL         5           12
NJ         5            8
NY         5           19
WA         4           14
CO         4            9
MT         4           10
MO         5            7
CA         8           12
KY         6            5
MI         5           13
NC         2           12
TN         3            5

Who were the reviewers and how many sites did they visit?
Reviewer Name        Year One   Year Two
Zambone                  1          -
Sharpton                 1          1
Bove                     2          -
McLetchie                2          2
Rafalowski Welch         2          -
Syler                    2          2
Fankhauser               3          1
Dalke                    4          3
Rachal                   4          2
Steveley                 6          1

Did site reviews tend to validate project self-assessments?
Agreement with project: "any reviewer's rating of each project item self-assessment rating"
Agreed with Project    Year One   Year Two
Strongly Agree           85%        60%
Mostly Agree             10%        25%
Somewhat Agree            3%         2%
Somewhat Disagree         2%         1%
Strongly Disagree        <1%         0%

Were reviewer ratings (after discussion) reliable?
Agreement here was defined as "all reviewers rated the same way on a given item."
Year One: 32 actual disagreements, or 96.9% complete agreement on all items
Year Two: 1 disagreement, or 99%+ agreement on all items
Disagreements among reviewers occurred at only 5 sites, most of them at one site.
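A minimal sketch of how a complete-agreement rate like the 96.9% figure can be computed, assuming hypothetical per-item reviewer ratings rather than the actual site review report forms:

```python
from typing import Dict, List, Tuple

# Hypothetical data: for each (site, priority item), the agreement ratings
# given by the three reviewers. The real data came from the review forms.
item_ratings: Dict[Tuple[str, str], List[str]] = {
    ("IN", "(a)(1)"): ["Strongly Agree", "Strongly Agree", "Strongly Agree"],
    ("IN", "(a)(2)"): ["Strongly Agree", "Mostly Agree", "Strongly Agree"],
    ("FL", "(a)(1)"): ["Mostly Agree", "Mostly Agree", "Mostly Agree"],
}

def complete_agreement_rate(ratings: Dict[Tuple[str, str], List[str]]) -> float:
    """Percent of items on which all reviewers gave the same rating."""
    agreed = sum(1 for item in ratings.values() if len(set(item)) == 1)
    return 100 * agreed / len(ratings)

print(f"{complete_agreement_rate(item_ratings):.1f}% complete agreement")
```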

How did projects and reviewers view the value of these two processes? Self-assessment and improvement planning is a necessary function for the system of state projects. The past and current processes and forms are complex and redundant, given the way the work is organized. Both processes (self-assessments and review visits) have value, but both need substantial redesign.

Projects' View of the Value of the Self-Assessments
Project Ratings (# reporting)       Year 1   Year 2
High value                             9        3
Moderate value                        18        7
Some value                             9        1
More trouble than it was worth         1        -

Year Two

What Some Projects Liked
Prompted communication with state program sites
Forced staff to consider value of work
The forms forced the project to focus and limit narrative
Improvement over earlier self-evaluation processes
Aligned proposal to RFP, so not hard to use
Format was easy and more logical [than year 1]
Separate narrative allowed the project to show how priorities were woven into goals & objectives

What Projects Didn't Like
Priorities, criteria & proposed work not aligned
Evaluation rules were not in the RFP
Accessibility problems with the form
Redundancy (e.g., attachments & narratives)
Word functions don't work in the template form
Too many reports for one year
Too much time - takes away from TA
Form accessibility (couldn't enlarge print?)
Narrative, priorities & ratings in three documents
Format - impossible to match priorities to activities

Review Team Recommendations to Sites
Expand partnerships (B, C, 619, others) - others must do the work of system change
Family networking/support (parent-to-parent)
Define/structure TA and intent - child change, local capacity building, systems change
Systematize data collection (census, needs, efforts and effects on individuals/systems)
Use evaluation for program improvement

Review Team Suggestions
In future evaluation & review processes:
Better self-assessment instructions
Consolidate Progress Report & Project Evaluation
Clarify evaluation standards in the RFP
Cluster priorities (eliminate redundancies)
Value of the review process is the TA provided
Effort & effect need better definition
Change forms: neither Year 1 nor Year 2 worked for all
Align priorities and evaluation model

Evaluator Recommendations
The next RFP should have 5 program priorities (e.g., skill development, system capacity/change, child census/performance, family services, dissemination of R-B practices)
Combine self-assessment and reporting in a single system, with prescribed indicator measures for each priority for all projects
For larger projects (>$500K), adopt standard 3+2 procedures

Discussion: Were Evaluation Purposes Achieved?
Summative project evaluation questions:
Are projects' goals and objectives being achieved?
Do projects address each RFP priority?
Do projects have appropriate outcome data?
Formative project evaluation: provide a continuous improvement process for individual projects to use.
National report (both summative and formative): provide information OSEP can use to guide needed system improvements.