Descriptive Evaluation Summary Year 4 & Grant-to-Date December 1, 2007 – November 30, 2008 — Presentation transcript:

1 Descriptive Evaluation Summary Year 4 & Grant-to-Date December 1, 2007 – November 30, 2008

2 Evaluation Plan
Designed from RFP Objectives:
–Obj. 1: Knowledge Development
–Obj. 2: Technical Assistance
–Obj. 3: Recommendations from Advisory Committee
–Obj. 4: Evaluate & Manage NPSO Center

3 Data Sources
Aggregate of monthly report data:
–Year 4
–Grant-to-Date (Years 1–4)
Survey of individuals who received any TA from NPSO across the 4 years:
–108 respondents (27% response rate)
–43 of 60 states were represented (72%)

4 Objective 1: Knowledge Development Activities
SPP Analyses:
–Reporting for OSEP
–Used for defining TA topical needs and targeting TA to specific states in need of Intensive TA
Product & Tool Development

5 Year 4: Tools
–District-by-District Data Display (SUNY)
–Trend Data Display (SUNY)
–Indicator 14 SEA Timeline

6 Year 4: Products
–Making Connections Across Indicators to Improve Post-School Outcomes: Early State Efforts (NDPC-SD, Year 3 product)
–Part B Indicator 14 APR Writing Suggestions & Examples (for Feb. 2009)
–NASDSE Roundtable Discussion: Collecting & Using Post-School Outcome Data on Dropout & Other Hard-to-Locate Former Students (NASDSE)
–Advice from the Field: Perspectives of State Directors of Special Education Regarding PSO Data & Ind. 14 (NASDSE, Year 3)
–NPSO Student & Parent Fliers (PACER)

7 Use of Tools/Products (chart: number of states using each tool/product)

8 Quality of Year 4 Products (chart: percent of respondents rating each product; respondents who used each product: 44, 64, 38, 47, 49, 52, 47, 49, 92)

9 Positive Product Feedback
“All the products I have used are excellent and our Indicator 14 heavily depends on the tools developed by NPSO.”
“These [products] are excellent and indispensable to states. It would be nice to have such well-organized and pertinent material for APR development for all indicators.”
“We have used several NPSO resources to create state resources that are a bit shorter but have nearly the same content.”

10 Product Improvement Feedback
“I would like to see more innovation both highlighted and suggested in Indicator 14 in collecting the data and survey techniques.”
“Suggestions on how to manipulate data display tables - we used them in our state and SPP report, but we were only able to insert them as images. Would like to have had them in a format we could manipulate (labeling, design, color, etc.).”
“Sometimes they [tools & products] are not available when we really need them as they are being vetted, etc., and this takes time.”

11 Objective 2: Technical Assistance Activities
NPSO Targeted TA Activities
Collaborative TA with:
–NDPC-SD
–NSTTAC
OSERS Transition Initiative

12 Broad TA Strategies
–Teleconferences
–Information Requests
–Conference Presentations
–Website Access
Intensive TA Strategies
–Phone/e-mail Consultation
–Structured Workshops
–On-site Consultation

13 Year 4 Intensive & Broad TA in Specified States
58 of 60 states, jurisdictions, & territories received TA from the NPSO
(map legend: Brown = Intensive TA; Tan = Broad TA; Light Blue = No TA)

14 Year 4: Number of States Participating in NPSO TA Activities (chart)

15 On-Site Consultations 9 states participated in on-site consultations: CA, KS, KY, NM, OH, OR, PR, SC, VI

16 Interactive Workshops
51 of 60 states (85%) participated
–37 states participated in Building for the Future: State Planning Institute
–42 states participated in 3 regional Making Connections Across Indicators workshops (Baltimore, Kansas City, Salt Lake City)

17 Phone/e-mail Consultations
22 states (37%) requested & participated in phone/e-mail consultations
44 phone/e-mail consultations were conducted
Primary topics included:
–Defining & calculating representativeness
–SPP/APR calculations
–Assistance with use of products (e.g., response calculator, data display templates)

18 Information Requests
49 total requests with responses
32 states (53%) participated
Primary requests included:
–Information on tools & products
–Requests to identify states with similar situations
–Support in SPP/APR submissions

19 Teleconferences
31 states (52%) participated
Multiple types:
–NPSO Community of Practice
–Webinars for tools
–RRC collaborations
–OSEP oriented
Community of Practice topics:
–Reviewing & adjusting data collection protocols
–Strategies for Improving Student Outcomes
–Tips for Working with Contractors
–State Updates & Lessons Learned
–Representativeness: Why It’s Important
–Webinar of products
–This Much We Know: Anecdotal Findings from Data Collection
–APR Writing Suggestions & Examples

20 Conferences
5 National Conference Presentations*
–National Alliance Conference
–National High School Center Summer Institute
–TATRA
–National Accountability Conference
–OSEP Data Manager’s Meeting
*Number of participating states could not be measured

21 Year 4: Number of States Accessing Multiple TA Types
Of 13 states using only 1 type of TA:
–8 used Collaborative TA event(s) (intensive)
–6 used Information requests (primarily PacRim)
–1 used Teleconference
29 states accessed 3 or more types of TA

22 Grant-to-Date Summary

23 Use of TA by States

24 Number of States Accessing Multiple Types of TA
All states have participated in at least 1 Intensive TA event
52 of 60 states (87%) have accessed 4 or more types of TA
TA events per state range from 6 to 72
Average number of TA events per state: 21 (SD = 13.3)
Median number of TA events: 17.5

25 Use of TA Across Years 1–4 (chart: number of states)

26 Survey Responses to Use of TA & Products/Tools (chart: number of states)

27 Objective 3: Gaining Perspective & Feedback from Advisory Committee
Combined TWG & Advisory Committee meeting in February
Used feedback to plan Year 4 products & targeted TA

28 Objective 4: Management & Evaluation
Conducted bi-monthly core staff meetings
Hosted site visit of OSEP Project Officer
Held monthly teleconferences with OSEP Project Officer
External Evaluation:
–Check-in with Project Officer
–Perceptions of states representing 4 quadrants
Developed Year 5 contracts for partners

29 Recommendations

30 Survey Respondents’ Recommendations for Products
“Couldn't make it without the availability of this Center for guidance, feedback and resources. We love tools...more tools!”
“I'm looking forward to the next steps, e.g., using data to effect change.”
“We need more of your expertise on implementation, measuring outcomes...”
“We are in need of a relatively simple tool to help districts make the most of what the data shows.”

31 Survey Respondents’ Recommendations for TA
“Continue to provide opportunities for state and national parties to discuss the needs of students and families and collaborate on problem-solving strategies and action plans.”
“I would like to see more innovation both highlighted and suggested in Indicator 14 in collecting the data and survey techniques.”
“Continue making connections with other Indicators.”
“Papers and teleconference formats are very helpful given restrictions on travel, which will continue for the foreseeable future.”

32 Recommendations from OSEP Project Officer
–Continue to work on replicating your successes with other states in need
–Continue to work on how SEAs can help LEAs understand data utility
–Continue to work on and improve the procedural fidelity of data collection systems in LEAs
–Continue to work on involving the hard-to-reach (minority and underrepresented groups) in project planning and implementation activities

33 QUESTIONS
