2018 OSEP Project Directors’ Conference

Presentation transcript:

OSEP Disclaimer: The contents of this presentation were developed by the presenters for the 2018 OSEP Project Directors’ Conference. However, these contents do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the Federal Government. (Authority: 20 U.S.C. 1221e-3 and 3474)

Realizing the Promise of Learning Networks: Getting to Real Results
Rorie Fitzpatrick, NCSI / WestEd
Meg Kamman, CEEDAR / University of Florida
Megan Vinh, DaSy & ECTA / UNC-Chapel Hill
OSEP Project Directors’ Conference | July 2018

A Look at the Research
Effective Networked Improvement Communities (NICs) are:
Focused on a well-specified aim
Guided by a deep understanding of the problem, the system that produces it, and a theory of improvement relative to it
Disciplined by the rigor of improvement science
Coordinated to accelerate the development, testing, and refinement of interventions and their effective integration into practice across varied educational contexts.
Bryk, A. S., Gomez, L. M., Grunow, A., & LeMahieu, P. G. (2015). Learning to Improve: How America’s Schools Can Get Better at Getting Better. Cambridge, MA: Harvard Education Press. 280 pp. ISBN-10: 1612507913.

Learning Networks Featured Today

CEEDAR Cross-State Learning Groups

ECTA/DaSy Cross-State Opportunities

NCSI Networked Improvement Communities

Launching, Growing/Sustaining, and Measuring the Impact of Learning Networks: Lessons Learned

Launching Learning Networks
Ensuring adequate virtual infrastructure: Will you use an existing platform or create a new space?
Identifying common problems of practice: What are burning issues related to your mission?
Planning supports: How can you provide assistance to staff leading networks?
Getting started: What strategies can we use to get started?

Launching Learning Networks: Ensuring Adequate Virtual Infrastructure
Will you use an existing platform or create a new space?

Launching Learning Networks: Identifying Common Problems of Practice
What are burning issues related to your mission?
Affinity groups: State leads submitted topics based on state interests and needs; common areas were compiled to form groups: (1) dyslexia/literacy, (2) licensure/shortages, (3) data/outcomes for preparation programs. Two groups were added in topics where the center hopes to push the field forward: (4) inclusive leadership and (5) culturally responsive policy and practice.
Topical Action Groups: Selected topics aligned to common state blueprint goals: high-leverage practices, inclusive leadership, and clinical practice.

Launching Learning Networks: Planning Supports
How can you provide assistance to staff leading networks?
Templates for planning
Scope and sequence (synchronous and asynchronous)
Engagement strategies

Launching Learning Networks: Getting Started
What strategies can we use to get started?
Initial activities: building relationships, setting group norms, and establishing a shared vision and goals

Growing and Sustaining Learning Networks
TA agreements: How will you clarify what participation means?
Align work with ongoing efforts: How will you support alignment of the work with ongoing efforts?
Capacity-building practices: What “practices” will you use?
Formative evaluation: What data do you need to collect to ensure the community meets participants’ needs?

Growing and Sustaining Learning Networks (continued): TA Agreements
Application process: participants apply; selection criteria guide acceptance.
Clearly articulate the TA activities and the expectations of participation in the learning network: cross-state activities, individualized action plans, and evaluation expectations.
Clearly articulate outcomes.

Growing and Sustaining Learning Networks (continued): Align Work with Ongoing Efforts
How will you support alignment of the work with ongoing efforts?
Ensure cross-state activities are driven by common needs.
Action plans should be based on data, connect with ongoing participant work, and be meaningful.

Growing and Sustaining Learning Networks (continued): Capacity-Building Practices
What “practices” will you use? Capacity building is key!
With a mix of cross-state and individualized TA, providers should use best practices to support participants in building capacity and making progress.
Structure activities to maximize participant engagement and learning.
All participants bring important knowledge but may need support in engaging in learning networks.

Growing and Sustaining Learning Networks (continued): Formative Evaluation
What data do you need to collect to ensure the community meets participants’ needs?
Gather data on an ongoing basis about the processes used within the learning network: Are people actively engaging and sharing knowledge? Are people making progress and building capacity? If not, why not?

Measuring the Impact of Learning Networks: Lessons Learned

Measuring Impact | Step 1: Define impact
What does impact mean relative to the TA/PD effort?
Look to existing evaluation guidance: logic model, theory of action, outcomes chart, etc.
Get concrete: What does it mean to “succeed” in the TA/PD? What tangible actions or artifacts will show change?
Be aware of outputs vs. outcomes: Did policies, procedures, and/or practices change?

Measuring Impact | Step 2: Determine appropriate measures
What data will show resulting impact?
Data access: Do you have connections with the right sources to access the data that would tell your story?
Data legitimacy: Think about causality; consider inference.
Be realistic: Think about grain size and level of impact within pragmatic parameters (child/student/family impact; systems impact; incremental benchmarks vs. large-scale impact).
Plan for mixed methods: qualitative and quantitative outcomes.
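One way to keep outputs, outcomes, and mixed-methods indicators straight is to record them in a simple structured form before data collection begins. The sketch below is a minimal, hypothetical illustration only; the measure statements, levels, and indicators are assumptions for the example, not the presenters’ actual evaluation plan.

```python
# Hypothetical sketch: an impact-measures table for a TA/PD learning network.
# All measure names and indicators are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Measure:
    statement: str      # the change (or activity) being tracked
    kind: str           # "output" (what we did) vs. "outcome" (what changed)
    level: str          # e.g., "participant", "system", "child/student/family"
    quantitative: list  # countable indicators
    qualitative: list   # evidence that supplies context and nature of change

measures = [
    Measure(
        statement="States revise licensure policy to reflect high-leverage practices",
        kind="outcome",
        level="system",
        quantitative=["number of states with revised policy language"],
        qualitative=["policy documents", "interviews with state leads"],
    ),
    Measure(
        statement="Cross-state webinars delivered on inclusive leadership",
        kind="output",
        level="participant",
        quantitative=["sessions held", "attendance counts"],
        qualitative=["participant feedback on usefulness"],
    ),
]

# Quick check that the plan mixes methods and is not weighted entirely toward outputs.
outcomes = [m for m in measures if m.kind == "outcome"]
print(f"{len(outcomes)} of {len(measures)} measures are outcomes (not just outputs).")
```

Structuring the measures this way makes it easy to see, up front, whether the plan pairs each outcome with both quantitative and qualitative evidence and whether it goes beyond counting activities.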

Measuring Impact | Step 3: Get the data
How can you get the information you need?
Partner among program staff and project evaluators to develop and implement data collection; use the data.
Coach clients on the purpose of impact data so they can give you the kinds of evidence you need (not fabricating results, but providing useful information when it exists).
Help staff understand the importance of impact data and the project’s approaches to capturing and using it.
Set expectations up front for data collection; coordinate data collection to reduce burden.
Elevate the importance of measuring impact, e.g., director-level communication about the purpose, approach, and use of impact data, both internally and externally.

Measuring Impact | Step 4: Tell the story
Who needs to know about impact, and how?
Use your data for formative planning, return on investment, Annual Performance Reports, and stakeholder partnerships.
Lead with qualitative data and bolster with quantitative data to support contextual understanding of the nature of the systems change.

Let’s Talk!
The research: To what extent are you anchoring the design and implementation of your current learning networks in the research-based principles of networked improvement communities? Do you need to up your game in translating research to practice?
The pragmatics: How can these lessons learned from national TA projects inform the launch, growth, sustainability, or measurement of learning networks in your own projects?

For more info or follow-up:
Meg Kamman, mkamman@coe.ufl.edu
Megan Vinh, megan.vinh@unc.edu
Rorie Fitzpatrick, rfitzpa@wested.org

Thank you to our funders! This content was produced under U.S. Department of Education, Office of Special Education Programs, Award Nos. H325A170003, H326R140006, and H325A17… (DaSy award number needed). The views expressed herein do not necessarily represent the positions or policies of the U.S. Department of Education. No official endorsement by the U.S. Department of Education of any product, commodity, service, or enterprise mentioned in this content is intended or should be inferred.
