Descriptive Evaluation Summary Year 4 & Grant-to-Date December 1, 2007 – November 30, 2008.


Evaluation Plan
Designed from RFP Objectives
–Obj. 1: Knowledge Development
–Obj. 2: Technical Assistance
–Obj. 3: Recommendations from Advisory Committee
–Obj. 4: Evaluate & Manage NPSO Center

Data Sources
Aggregate of monthly report data
–Year 4
–Grant-to-Date (Years 1-4)
Survey of individuals who received any TA from NPSO across the 4 years:
–108 respondents (27% response rate)
–43 of 60 states represented (72%)
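The survey figures above can be sanity-checked with simple arithmetic. A minimal sketch (all numbers are taken from the slide; the implied number of individuals contacted is a derived estimate, not a figure reported by the Center):

```python
# Back-of-envelope check of the survey figures reported above.
respondents = 108
response_rate = 0.27            # 27% reported response rate
states_represented = 43
total_states = 60               # states, jurisdictions, & territories

# Implied number of individuals contacted (derived, not reported)
implied_surveyed = round(respondents / response_rate)
state_coverage = round(100 * states_represented / total_states)

print(implied_surveyed)   # 400 individuals contacted (implied)
print(state_coverage)     # 72 (% of states represented)
```

This confirms the slide's 72% state-coverage figure and suggests roughly 400 individuals received the survey.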

Objective 1: Knowledge Development Activities
SPP Analyses
–Reporting for OSEP
–Used to define topical TA needs and to target TA to specific states in need of Intensive TA
Product & Tool Development

Year 4: Tools
–District-by-District Data Display (SUNY)
–Trend Data Display (SUNY)
–Indicator 14 SEA Timeline

Year 4: Products
–Making Connections Across Indicators to Improve Post-School Outcomes: Early State Efforts (NDPC-SD, Year 3 product)
–Part B Indicator 14 APR Writing Suggestions & Examples (for Feb. 2009)
–NASDSE Roundtable Discussion: Collecting & Using Post-School Outcome Data on Dropout & Other Hard-to-Locate Former Students (NASDSE)
–Advice from the Field: Perspectives of State Directors of Special Education Regarding PSO Data & Ind. 14 (NASDSE, Year 3)
–NPSO Student & Parent Fliers (PACER)

Use of Tools/Products
[chart: number of states using each tool/product]

Quality of Year 4 Products
[chart: percent of respondents who used and rated each product]

Positive Product Feedback
"All the products I have used are excellent and our Indicator 14 heavily depends on the tools developed by NPSO."
"These [products] are excellent and indispensable to states. It would be nice to have such well organized and pertinent material for APR development for all indicators."
"We have used several NPSO resources to create state resources that are a bit shorter but have nearly the same content."

Product Improvement Feedback
"I would like to see more innovation both highlighted and suggested in Indicator 14 in collecting the data and survey techniques."
"Suggestions on how to manipulate data display tables - we used them in our state and SPP report, but we were only able to insert them as images. Would like to have had them in a format we could manipulate (labeling, design, color, etc.)."
"Sometimes they [tools & products] are not available when we really need them, as they are being vetted, etc., and this takes time."

Objective 2: Technical Assistance Activities
NPSO Targeted TA Activities
Collaborative TA with:
–NDPC-SD
–NSTTAC
OSERS Transition Initiative

Broad TA Strategies
–Teleconferences
–Information Requests
–Conference Presentations
–Website Access
Intensive TA Strategies
–Phone Consultations
–Structured Workshops
–On-site Consultations

Year 4: Intensive & Broad TA in Specified States
58 of 60 states, jurisdictions, & territories received TA from the NPSO
[map: brown = Intensive TA; tan = Broad TA; light blue = no TA]

Year 4: Number of States Participating in NPSO TA Activities
[chart: number of states per TA activity]

On-Site Consultations 9 states participated in on-site consultations: CA, KS, KY, NM, OH, OR, PR, SC, VI

Interactive Workshops
51 of 60 states (85%) participated
–37 states participated in Building for the Future: State Planning Institute
–42 states participated in 3 regional Making Connections Across Indicators workshops (Baltimore, Kansas City, Salt Lake City)

Phone Consultations
22 states (37%) requested & participated in phone consultations
44 phone consultations were conducted
Primary topics included:
–Defining & calculating representativeness
–SPP/APR calculations
–Assistance with use of products (e.g., response calculator, data display templates)
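"Defining & calculating representativeness" refers to checking whether survey respondents mirror the full population of school leavers. As an illustrative sketch only (this is a common approach, not NPSO's published method, and the subgroup names, counts, and 3% threshold below are all hypothetical):

```python
# Illustrative representativeness check: compare each subgroup's share
# among survey respondents to its share among all school leavers, and
# flag gaps larger than a chosen threshold. All data here is hypothetical.

def representativeness(leavers, respondents, threshold=0.03):
    """Return {group: (leaver_share, respondent_share, within_threshold)}."""
    n_leavers = sum(leavers.values())
    n_resp = sum(respondents.values())
    report = {}
    for group, count in leavers.items():
        p_pop = count / n_leavers                    # share among all leavers
        p_resp = respondents.get(group, 0) / n_resp  # share among respondents
        report[group] = (p_pop, p_resp, abs(p_resp - p_pop) <= threshold)
    return report

# Hypothetical counts for two subgroups of school exiters
leavers = {"dropout": 120, "graduate": 480}
respondents = {"dropout": 22, "graduate": 98}
for group, (p_pop, p_resp, ok) in representativeness(leavers, respondents).items():
    print(f"{group}: population {p_pop:.0%}, respondents {p_resp:.0%}, within threshold: {ok}")
```

A subgroup flagged as outside the threshold would signal that survey results may over- or under-represent that group of former students.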

Information Requests
49 total requests with responses
32 states (53%) participated
Primary requests included:
–Information on tools & products
–Requests to identify states in similar situations
–Support with SPP/APR submissions

Teleconferences
31 states (52%) participated
Multiple types:
–NPSO Community of Practice
–Webinars for tools
–RRC collaborations
–OSEP-oriented
Community of Practice topics:
–Reviewing & Adjusting Data Collection Protocols
–Strategies for Improving Student Outcomes
–Tips for Working with Contractors
–State Updates & Lessons Learned
–Representativeness: Why It's Important
–Webinar of products
–This Much We Know: Anecdotal Findings from Data Collection
–APR Writing Suggestions & Examples

Conferences
5 national conference presentations*
–National Alliance Conference
–National High School Center Summer Institute
–TATRA
–National Accountability Conference
–OSEP Data Manager's Meeting
*The number of participating states could not be measured.

Year 4: Number of States Accessing Multiple TA Types
Of 13 states using 1 type of TA:
–8 used Collaborative TA event(s) (intensive)
–6 used Information Requests (primarily PacRim)
–1 used Teleconference
29 states accessed 3 or more types of TA

Grant-to-Date Summary

Use of TA by States

Number of States Accessing Multiple Types of TA
All states have participated in at least 1 Intensive TA event
52 of 60 states (87%) have accessed 4 or more types of TA
Range of TA events per state: 6 to 72
Average number of TA events per state: 21 (sd = 13.3)
Median number of TA events: 17.5
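The summary statistics above come from a per-state count of TA events. A minimal sketch of how such figures are computed, using Python's standard `statistics` module; the ten counts below are hypothetical (the slide's actual figures span all 60 states), so only the min/max and median of this toy list happen to match the reported values:

```python
# Computing grant-to-date summary statistics from per-state TA event counts.
import statistics

events_per_state = [6, 9, 12, 15, 17, 18, 21, 25, 34, 72]  # hypothetical counts

print(min(events_per_state), max(events_per_state))  # 6 72 (range of events)
print(statistics.median(events_per_state))           # 17.5 (median events)
print(round(statistics.mean(events_per_state), 1))   # 22.9 (mean of toy list)
print(round(statistics.stdev(events_per_state), 1))  # sample sd of toy list
```

With all 60 states' counts in `events_per_state`, the same four calls would reproduce the reported range, mean, standard deviation, and median.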

Use of TA Across Years
[chart: number of states per year]

Survey Responses to Use of TA & Products/Tools
[chart: number of states]

Objective 3: Gaining Perspective & Feedback from Advisory Committee
Combined TWG & Advisory Committee meeting in February
Used feedback to plan Year 4 products & targeted TA

Objective 4: Management & Evaluation
Conducted bi-monthly core staff meetings
Hosted site visit of OSEP Project Officer
Held monthly teleconferences with OSEP Project Officer
External evaluation
–Check-ins with Project Officer
–Perceptions of states representing 4 quadrants
Developed Year 5 contracts for partners

Recommendations

Survey Respondents' Recommendations for Products
"Couldn't make it without the availability of this Center for guidance, feedback and resources. We love tools...more tools!"
"I'm looking forward to the next steps, e.g., using data to affect change"
"We need more of your expertise on implementation, measuring outcomes..."
"We are in need of a relatively simple tool to help districts make the most of what the data shows."

Survey Respondents' Recommendations for TA
"Continue to provide opportunities for state and national parties to discuss the needs of students and families and collaborate on problem-solving strategies and action plans."
"I would like to see more innovation both highlighted and suggested in Indicator 14 in collecting the data and survey techniques."
"Continue making connections with other Indicators."
"Papers and teleconference formats are very helpful given restrictions on travel, which will continue for the foreseeable future."

Recommendations from OSEP Project Officer
Continue to work on replicating your successes with other states in need
Continue to work on how SEAs can help LEAs understand the utility of their data
Continue to work on and improve the procedural fidelity of data collection systems in LEAs
Continue to work on involving hard-to-reach groups (minority and underrepresented populations) in project planning and implementation activities

QUESTIONS