So What? How Do We Know States Are Making an Impact, and How Do We Report It? Patricia Mueller and Jocelyn Cooledge, Evergreen Evaluation & Consulting, Inc.

Presentation transcript:

So What? How Do We Know States Are Making an Impact, and How Do We Report It? Patricia Mueller and Jocelyn Cooledge, Evergreen Evaluation & Consulting, Inc.
Pat to do the intro.

Identifying Impact
Early to show impact: Poll #1 results – 22/46 grantees (about 48%) are in their second year; 65% of grantees are in Year 2 or 3. http://signetwork.org/content_pages/5
Jocelyn to do the intro. Pat: Poll #1 – what year of the grant are you in? Point out where they are with the worksheet, etc.

Demonstrating Impact
Discussion about impact and points to keep in mind when working with states. Jim Knight's work on research to practice (KU Center for Research on Learning, KU-CRL) drew parallels with SPDG evaluation. Please share your experiences!
Jocelyn: Jim Knight's research-to-practice work was with the KU Center for Research on Learning. Pat: please share experiences.

Concepts for Demonstrating Impact
1. Explicit – Can you describe an impact? How do you take an "I know there is something different" sense to measurable, explicit data?
2. Take Time – Impacts at scale take time. Report small-scale impacts qualitatively; report qualitative data on the 524B.
Jocelyn to take this slide.

Concepts for Demonstrating Impact
3. Generalized Impact – Widespread impact vs. impact for a few depends on the scale of the initiative. Example: 3 students out of 200 showed enormous gains on an assessment. Good, but is that an initiative-level impact? For low-incidence initiatives, does it make a difference from a state's public-relations perspective? How do you address impact that is not generalized?
Jocelyn will start the slide. Pat's example: an SLP master's program in a rural state. 11 graduates total, not sure if they will stay in the state…no service requirement. Sustainability: the IHE is picking up the program. E.g., improvement in 2–3 schools/districts, but not statewide.

Concepts for Demonstrating Impact
4. Routine – Do you see reliable changes in the practices? If not, is there still an impact? You may not see reliable changes in Indicator data even with changes in practice. Do you focus on Indicator data for your outcomes?
Jocelyn to do the intro. Pat: is there an impact? If you see changes and they aren't consistent, is there an impact? How do you interpret the data? What should we be looking at? Talk about the story behind the data…the explanation of progress. Maybe dependent on the Project Officer. Do projects focus on SPP/APR Indicators?

Concepts for Demonstrating Impact
5. Barrier-Free Implementation – Reducing barriers increases the likelihood of positive outcomes, so we encourage states to mitigate barriers. Do you talk to states about barriers to implementation? If so, how do you help them identify the barriers?
6. Changing the Ladder – From the student to the State Department. How do you address the issue with states if the changes need to be made at the state level?
Jocelyn: #5. Pat: #6, changing the ladder (the implementation science triangle). Creating impact requires change all the way up the cascade: student – teacher – school – district – region – state. How do you address issues if changes need to be made at the state level?

Demonstrating an Impact
What if you're not able to demonstrate any impacts? Possible reasons: the initiative stalled or is moving slowly; data are not available; data measures are weak; data results are weak.
Jocelyn's slide.

Demonstrating an Impact
Working with partners: PTIs, IHEs, and early childhood initiatives. What kinds of strategies do you use for collecting and obtaining data from partners?

Activities for #######
Time        Activity   Purpose              Participants
4/2/2013    PD         PD for schools
4/22/2013   Planning   Planning with team   2
5/1/2013    Meeting                         5
6/3/2013    Office     Office work

Pat: this slide. PD logs and getting data from partners; how to focus the data collection; strategies for working with partners.
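Where partners keep activity logs like the one above, one lightweight way to focus the data collection is to store the log as a tabular file and summarize it programmatically. Below is a minimal sketch in Python; the file name activity_log.csv and the column names mirror the example table but are otherwise illustrative assumptions, not anything prescribed by the presentation.

```python
import csv
from collections import Counter

# Minimal sketch: summarize a partner activity log kept as a CSV with the
# (assumed) columns Time, Activity, Purpose, Participants.
with open("activity_log.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# How often does each activity type occur?
activity_counts = Counter(row["Activity"] for row in rows)

# Total participants across activities, treating blank cells as zero.
total_participants = 0
for row in rows:
    participants = (row.get("Participants") or "").strip()
    if participants:
        total_participants += int(participants)

print("Activity counts:", dict(activity_counts))
print("Total participants:", total_participants)
```

Even a summary this small turns a raw PD log into reportable counts, which is often what a 524B narrative or a data conversation with a partner needs.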

Reporting on Impact
How do you allocate your time for evaluating impact when the performance measures emphasize implementation? Reporting outcome data on the 524B. Reporting outcome data for states. Poll #2.
Jocelyn's slide. Poll #2: how many are in a no-cost extension (NCE)?

Reporting on Impact
PM #1 worksheets: who completes them for your states? No-cost extensions: how do they affect impact, if at all? Qualitative measures.
Pat's slide.

Thank you! Please contact Pat or Jocelyn at pat@evergreenevaluation.net or jcooledge@evergreenevaluation.net. Adios! See you in November.