Dial-in: 1-877-512-6886, pass code: 16 16 32 2775. SPDG Directors’ Webinar: Communities of Practice; State Systemic Improvement Plan Evaluation Coaching

Presentation transcript:

Roll Call / Attendance: Please enter your name, state, and role with the SPDG in the chat pod.

Mute phones and computers *6 / #6

Overview of Today’s Webinar: Jennifer Coffey, State Systemic Improvement Plan Evaluation Coaching

Office of Special Education Programs (OSEP) Using Best Practices from the State Personnel Development Grants (SPDG) Work to Build a Stronger State Systemic Improvement Plan (SSIP) Tara Courchaine, Ed.D. and Janine Rudder Education Program Specialists Office of Special Education Programs

Proposed Mission Create a forum for States to learn from and collaborate with peers across the U.S. consistently and in real time around how the SPDG work supports the SSIP work to improve student outcomes.

Why Have a SPDG / SSIP CoP?
- SPDG best practices in professional development and implementation may be a solid foundation for SSIP work
- Relationship building inspires and creates opportunities
- Problem solving with peers who may have similar issues
- Capacity building
- Increasing sustainability

Format (Webinar)
- State Spotlight: Webinars will include presentations from states on how they are currently integrating their SPDG and SSIP work, including the biggest barriers and any successful practices they would like to share with the community for feedback and discussion.
- Experts from the Field: In addition, some webinars will include presentations from experts in the field on specific topics to align with participant needs.

Proposed Problem of Practice (Virtual Meeting)
One to three states will describe, in great detail, a problem or barrier they are encountering with their SPDG or SSIP work, including previous attempts to solve the problem. Community members will respond and offer suggestions and/or strategies.

Problem of Practice, Cont’d
The problem would focus on a specific aspect of the work, such as:
◦ Evaluation
◦ Sustainability
◦ Supporting LEAs in implementing evidence-based practices
◦ Infrastructure development

Past Webinar Topics
- December: State Spotlight, state example of SPDG/SSIP alignment
- February: Evidence-based practices and evaluation
- April: Data collection and data-based decision making

Tools Shared Hexagon Tool

Tools Shared Data Collection Tool

SSIP Community of Practice Alayna Gee Idaho SPDG

Takeaways from the Community Participation Problem of Practice Presentations

- Created the opportunity for collaboration between the SPDG and SSIP state leads
- Required we carve out dedicated time to meet
- Led to scheduled collaborative meetings
- Conversations moved from being SSIP- or SPDG-related to including both

- Required us to identify many barriers and narrow them down to the most important
- Feedback provided a place for us to begin our conversations
- Led to the addition of team members (assessment)

- Guided us in how to begin to identify similarities in our processes:
  ◦ Evaluation paradigms
  ◦ Hexagon Tool
  ◦ Evidence-based practices
  ◦ Resources (National Center on Intensive Intervention and What Works Clearinghouse)

- Amount of feedback
- Time
- Magnitude of not knowing what we don’t know

- Dedicate 5-10 minutes during a regularly scheduled meeting to get on the wiki
- Identify states who have similar barriers and build a bridge

Alignment of Nevada SSIP and SPDG Presented by: Ann Alexander and Julie Bowers - Nevada Department of Education

Alignment of Nevada SSIP and SPDG
- Goals
- Objectives
- Professional Development
- Evaluation

Common Goal for SPDG & SIMR The NDE will support improved performance of third-grade students with disabilities on statewide assessments of reading/language arts through building LEA capacity to strengthen the skills of special education teachers in assessment, instructional planning, & teaching.

Alignment of NV Initiatives to SSIP

Professional Development: Assess, Plan, Teach (APT)
- Structured framework for ongoing professional development to strengthen the skills of teachers
- Aligns to key state and district improvement strategies
- APT model will be expanded to other districts through SPDG

Evaluation
- SSIP Evaluation Plan aligns to the Evaluation Plan in the SPDG
- Identified data collection and analysis systems are the same for both SPDG and SSIP
- Use of the State/District Data System provides efficient data collection

Benefits
- Clear alignment between all state and LEA initiatives
- Maximizes resources, human and fiscal
- No need for multiple data collections and analyses for each plan
For more information:

- Welcome & Overview: Corinne Weidenthal, OSEP
- Survey of Needs/Interests: Polly Maccini, OSEP
- Presentations
  ◦ Brent Garrett, SPDG Evaluator: Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models
  ◦ Jennifer Gonzales, SPDG Director and SSIP Coordinator, Arkansas Department of Education: Alignment of Arkansas’ SPDG and SSIP
- Next meeting: June 28th, 1pm EST. Presentation on SSIP/SPDG Alignment: Barb Guy, Iowa

The purpose of the SPDG Evaluators’ ISC is to share knowledge and resources related to the implementation of SPDG evaluation plans. Meetings to date:
- Jan. 12: Kick-Off
- Feb. 23rd: Brent Garrett, Evaluation Frameworks
- April 26th: Janine Rudder & Jennifer Gonzales, SSIP/SPDG Evaluation Alignment
Through bimonthly meetings and the use of the wiki for follow-up discussions, the following items may be considered for future meetings, as reported on the survey:


Topics by survey ranking (√ = completed):
- √ Ideas, examples and implications on evaluation of the alignment of the SPDG with the SSIP and SIMR (*4.0)
- Processes and dashboards related to data reporting and data-based decision making (4.7)
- Assessments/tools for measuring the effectiveness of implementation and professional development (4.7)
- Resources for performance-based assessments, site visit protocols and observation tools (4.8)
- Assessments/tools for measuring the effectiveness of instructional coaching (quality) and the impact of coaching on student performance (5.2)
- Ideas for measuring sustainability (5.3)
- √ Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models (5.8)
- Methods for evaluating big data sets and reporting results to reach a wide audience (7.7)
- Ideas for ensuring sustainability and completion of evaluations (7.9)
- The evaluators’ role, particularly during times of project staff turnover (8.5)

Meeting dates and topics:
- June 28th, 1pm EST: Continued Evaluation Alignment of the SPDG with the SSIP and SIMR (Barb Guy, IA Department of Education)
- August 1st-3rd, during PD Conference: TBD
- Oct., SPDG National Meeting: TBD
- Dec 13th, 1pm EST: TBD

Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models
◦ Overview of the Presentation and 3 Takeaways
◦ Questions

Evaluation Frameworks Used in SPDG Grant Applications May 6, 2016 Brent Garrett

Discussion Topics
- Reduction in the length of the SPDG narrative
- Goals, objectives, and activities
- Logic model
- Evaluation plan

Goals, Objectives, and Activities
A well-thought-out Project Design section is critical for a well-thought-out logic model and evaluation plan.
- Goals: long-term outcomes
- Objectives: intermediate outcomes
- Activities: short-term outcomes or outputs
- Aligning objectives with SPDG Program Measure 1, evidence-based professional development practices

Logic Models
Resource: Using Logic Models to Enhance Evaluation, Amy Germuth, WESTAT
- Conceptual presentation of a program/initiative
- Links goals/objectives/activities with expected outputs and outcomes
- Represents causal, or if-then, relationships
- Logic models provide a strong foundation for the evaluation plan
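As an illustrative aside (not part of the original presentation), the if-then chain of a logic model can be captured as a plain data structure so that each objective's inputs, outputs, and outcomes stay explicitly linked for evaluation planning. All names and entries below are hypothetical.

```python
# Hypothetical sketch: a logic model's causal chain as a simple dict.
# Every entry below is invented for illustration only.

logic_model = {
    "goal": "Improve reading outcomes for students with disabilities",
    "objectives": [
        {
            "name": "Build LEA capacity through evidence-based training",
            "inputs": ["SEA staff", "training materials"],
            "outputs": ["3 training days per performance zone per year"],
            "short_term": ["Participants report increased knowledge"],
            "intermediate": ["Teachers implement practices with fidelity"],
            "long_term": ["More students with IEPs score proficient"],
        }
    ],
}

def if_then_chain(objective):
    """Flatten one objective into its causal chain: inputs -> outputs -> outcomes."""
    stages = ["inputs", "outputs", "short_term", "intermediate", "long_term"]
    return [item for stage in stages for item in objective[stage]]

chain = if_then_chain(logic_model["objectives"][0])
print(len(chain))  # 6
```

Keeping the stages in one structure makes it easy to check that every output has a downstream outcome, which mirrors the slide's point that the logic model grounds the evaluation plan.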

Goal 2: The DOE will support improved performance of third-grade students with disabilities on statewide assessments of reading/language arts through building LEA capacity to strengthen the skills of special education teachers in assessment, instructional planning, & teaching.

Inputs (Program Investments):
- Office of Special Education
- Title 1 Office
- Brent County Special Education administrators, coaches, & teachers
- Materials & processes from the Consortium on Excellence in Reading
- Reading assessments
- PEP
- Distance technology for meetings & trainings

Objectives (Process):
1: To select LEAs & PD providers with the capacity & expectations necessary to implement APT.
2: To enhance the capacity of LEA personnel to implement, replicate, & sustain APT through evidence-based (EB) training strategies.
3: To enhance the capacity of districts & schools to implement, replicate, & sustain APT through EB coaching strategies.
4: To increase the use of implementation, intervention, & outcome data to support decision making at schools, LEAs, & the SEA.
5: To ensure administrators are trained to support their staff & initiatives to develop & sustain APT.

Outputs:
1a: 75 CCSD schools implement APT
1b: 40 schools in 2 additional districts implement APT
2a: 3 days of trainings in 15 Performance Zones/year
2b: 4 trainer meetings/year to review training data
2c: Participants satisfied with quality, relevance, & usefulness of training
3a: Coaching model validated
3b: Monthly coaching contacts with teachers
3c: Monthly coach meetings to review coaching data
4a: APT evaluation tools administered & reported on a regular basis
4b: Evaluation data are shared on a regular basis
5a: Administrators in all APT schools trained
5b: Administrators in all APT schools coached

Short-Term Outcomes (less than a year):
1a: PD meets the needs of participating LEAs.
1b: Evidence-based practices are used by participating LEAs.
2a: Statistically significant increases in pre/post training evaluations.
2b: PD participants report training increased their knowledge of APT.
3: PD participants report coaching increased their knowledge of APT.
4a: Project staff know how to use the data for future program action.
4b: SPDG stakeholders report data are shared regularly & are useful.
5: Administrators report training increased their knowledge of APT.

Intermediate Outcomes (one to two years):
- 90% of teachers implement APT with fidelity.
- 80% of teachers demonstrate greater skills in meeting students’ academic needs.
- 75% of students show growth on the Core Phonics Survey & other assessments.
- 80% of administrators have the necessary skills to support and sustain APT.
- Increased parent understanding of APT, the importance of the Read by 3rd Grade statewide initiative, & how it applies to special education students.

Long-Term Outcomes (three years out):
- Increased percentage of students with IEPs score proficient on statewide assessments.
- Gaps in performance between students with IEPs & all students are decreased.
- Increased number of students with IEPs receive the majority of their instruction in general education classrooms.

Objectives, with inputs (program investments), outputs (process measures), and short-term, intermediate, and long-term outcomes:

Obj. 1: To develop the capacity of those providing PD on ELOs, transition planning, and parent engagement, and to define the expectations and commitment of those receiving PD.
- Inputs: DOE staff, PTI, VR, MCST, KSC, SLC
- Outputs: Regional trainers/coaches established & trained; annually, a new cohort of committed LEAs is recruited; reviews of LEA transition policies & practices are conducted; plans for transition-related PD are developed for each LEA
- Short-term outcomes: LEAs report satisfaction with the quality of PD; PD meets the needs of participating LEAs; PD is provided collaboratively
- Intermediate outcomes: Evidence-based practices are used by participating LEAs; PD is aligned with related DOE initiatives; all LEAs fulfill SPDG commitments
- Long-term outcomes: A greater percentage of students with disabilities in participating LEAs graduate college- and career-ready; a decrease in the percentage of students with disabilities in participating LEAs dropping out; an increase in the percentage of LEAs meeting Indicator 13 compliance; improved outcomes for Indicator 14; parents report greater levels of satisfaction (Indicator 8)

Obj. 2: To increase and expand the use of ELOs in all regions of the state.
- Inputs: DOE staff, QED, VR, PTI, KSC, MCST, SLC
- Outputs: Training curriculum established; initial training of LT and first LEA cohort by QED; annual training thereafter conducted by LT
- Short-term outcomes: Statistically significant increases in pre/post training evaluations; PD participants report training increased their knowledge of ELOs & transition planning; parents report they are more knowledgeable about transition planning
- Intermediate outcomes: PD participants report they are more skilled with ELOs & transition planning; PTI has greater capacity to provide PD on family engagement
- Long-term outcomes: Increased use of ELOs for core courses

Obj. 3: To increase the use of best-practice, EB transition planning, including enhanced parent engagement strategies.
- Inputs: DOE staff, VR, PTI, KSC, MCST, SLC
- Outputs: Training curricula established (transition planning, family engagement, RENEW, etc.); training of PD providers; regional and LEA training workshops

Evaluation Plan
- Evaluation theory
- Organize by RFP criteria
- Formatting information
- Use of performance indicators, particularly addressing GPRA/Program Measures
- Formative versus summative evaluation

Evaluation Criteria (C-21 of RFP)
(i) The extent to which the methods of evaluation are thorough, feasible, and appropriate to the goals, objectives, and outcomes of the proposed project.
(ii) The extent to which the methods of evaluation provide for examining the effectiveness of project implementation strategies.
(iii) The extent to which the methods of evaluation include the use of objective performance measures that are clearly related to the intended outcomes of the project and will produce quantitative and qualitative data to the extent possible.
(iv) The extent to which the evaluation will provide performance feedback and permit periodic assessment of progress toward achieving intended outcomes.

Sample Instrument Table (Criterion i)

Instrument: Training/coaching evaluation forms
- Data source: Collected after each training session and after 2 coaching sessions/year
- Purpose: To improve delivery of training and coaching

Instrument: IC Team Level of Implementation (LOI)
- Data source: Interviews with a random sample of Case Managers & Requesting Teachers, each principal, & Team Members; review of a sample of CDFs
- Purpose: Objective measure of the integrity of program implementation and use

Instrument: Case Documentation Form Review (CDF)
- Data source: Case documentation reviewed, scored, and rated by the project coordinator
- Purpose: Measure of case goal attainment; provides a measure of accuracy in documentation

Instrument: District Implementation Sustainability Checklist
- Data source: District leadership completes the assessment annually
- Purpose: To assess the capacity of district leadership to support and sustain ICAT implementation

Sample Evaluation Table (Criterion i)

Selection Activity 1.1.1: Develop selection criteria for participating districts.
- Data sources: Meeting minutes; final agreement form
- Analysis/Method: Qualitative analysis of minutes & selection criteria

Selection Activity 1.1.2: Application for selecting districts developed and disseminated.
- Data sources: Documentation of dissemination; completed applications
- Analysis/Method: Qualitative analysis of received applications

Selection Activity: Professional development providers meet certification standards for delivering PD.
- Data source: Documentation of ICAT certification
- Analysis/Method: Review of list of ICAT-certified trainers

Selection Activity 1.1.4: Roles & responsibilities for professional development providers are determined.
- Data sources: Copy of expectations, job descriptions, MOUs
- Analysis/Method: Qualitative analysis of expectations, job descriptions, MOUs

Sample Evaluation Table (Criterion i)

Outcome (short term, systems, G-4): LEA and building leadership report that DDOE communication has positively impacted their expectations for SWD.
- Evaluation question: To what degree & how well was DDOE communication used with LEAs in an effective manner?
- Performance indicator: 90% of LEA staff report that the communication with the DOE was effective.
- Measurement/Data collection: Communication logs; LEA survey
- Timeline: Middle & end of each school year

Outcome (short term, systems, G-4): Project partners report that DDOE communication efforts were effective in promoting increased expectations for students with disabilities.
- Evaluation question: To what degree & how well was DDOE communication used with partners in an effective manner?
- Performance indicator: 90% of project partners report that the communication with the DOE was effective.
- Measurement/Data collection: Communication logs; partner survey
- Timeline: Middle & end of each school year

Evaluating Implementation Strategies (Criterion ii)
- Measuring formative/process work
- Connection to outputs in the logic model
- Importance of implementation science
  ◦ Stages of implementation
  ◦ Implementation drivers

Program/GPRA Measures (Criterion iii)

Program Measure 1: We will assess the degree to which evidence-based PD is used to support implementation, using the SPDG Evidence-Based PD worksheet. This includes a focus on selection of schools and providers, training, coaching, performance assessment, and facilitative administrative supports. We will collect data related to growth in teachers’ knowledge and skills, and data on fidelity of training and coaching implementation. A mixed set of qualitative & quantitative data will be provided to assess the quality of PD.

Program Measure 2: We will use the IC Team Level of Implementation (LOI) as our ICAT fidelity measure. Data will be collected by a trained external observer from a random sample of Case Managers & teachers, each principal & Team Member, and a review of student outcomes to assess the fidelity of program implementation. We will use the CORE K-6 Implementation Rubric to assess the degree to which our second goal, Assess, Plan, Teach, is implemented with fidelity. Both instruments are in Appendices G and H.

Program Measure 3: To report on the efficiency performance measure, the degree of sustained PD activities will be tracked via an online database known as the PD Log. This has been used in successive SPDGs to collect data on the type, amount, and audience of PD. Through this database, we can track the percentage of staff time (which will be used to determine personnel cost) spent on sustained PD activities, such as coaching. Travel & other costs are tracked to determine the total cost/percentage of sustained activities.
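Program Measure 3 tracks staff time and cost for sustained PD through a PD Log database. As a hypothetical illustration (not the actual PD Log), the efficiency calculation it supports might look like the sketch below; the records, hours, and hourly rate are all invented.

```python
# Hypothetical sketch of the efficiency measure: the share of personnel
# cost spent on sustained PD activities such as coaching. All figures
# below are invented for illustration.

pd_log = [
    {"activity": "initial workshop",   "hours": 40,  "sustained": False},
    {"activity": "monthly coaching",   "hours": 120, "sustained": True},
    {"activity": "follow-up coaching", "hours": 60,  "sustained": True},
]

HOURLY_RATE = 50  # assumed flat personnel cost per hour

def sustained_cost_share(log, rate):
    """Percent of total personnel cost spent on sustained PD activities."""
    total = sum(rec["hours"] * rate for rec in log)
    sustained = sum(rec["hours"] * rate for rec in log if rec["sustained"])
    return 100.0 * sustained / total

share = sustained_cost_share(pd_log, HOURLY_RATE)
print(round(share, 1))  # 81.8
```

A real PD Log would also fold in travel and other costs, as the slide notes; the point here is only the shape of the percentage-of-sustained-activity calculation.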

Project Performance Indicators (Criterion iii)

Objective 1 (Selection Driver):
- ICAT implemented with fidelity in four new districts and approximately 20 schools.
- APT implemented with fidelity in 75 Brent County schools and in 40 schools in 2 additional districts.
- 6 new Level 5 and 6 ICAT Trainers are certified by

Objective 2 (Training Driver):
- 75% of training workshops across initiatives will result in statistically significant increases in participants’ knowledge, as measured by workshop pre/post evaluations.
- 80% of PD participants report that training increased their knowledge of ICAT/APT.

Objective 3 (Coaching Driver):
- 80% of PD participants report that SPDG coaching increased their skills to implement ICAT/APT.

Objective 4 (Performance Assessment Driver):
- 90% of participating schools implement ICAT/APT with fidelity.
- SPDG stakeholders report data are shared regularly & are useful.

Objective 5 (Facilitative Administrative Supports Driver):
- 80% of administrators report training increased their knowledge of ICAT/APT.
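Indicators like these are percentage thresholds checked against survey tallies. A minimal, hypothetical sketch of such a threshold check follows; the counts and targets are invented, not project data.

```python
# Hypothetical sketch: checking percentage-based performance indicators
# (e.g., "80% of PD participants report training increased their knowledge")
# against survey tallies. All counts and targets are invented.

indicators = {
    "training_increased_knowledge": {"target_pct": 80, "yes": 46, "total": 54},
    "coaching_increased_skills":    {"target_pct": 80, "yes": 41, "total": 54},
}

def indicator_met(ind):
    """True when the observed percentage meets or exceeds the target."""
    observed = 100.0 * ind["yes"] / ind["total"]
    return observed >= ind["target_pct"]

for name, ind in indicators.items():
    print(name, indicator_met(ind))
```

In practice an evaluator would compute these per site and per year; the sketch shows only the core met/not-met comparison.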

Performance Feedback (Criterion iv)
Reporting:
- Formative reports (training, coaching logs)
- Monthly/quarterly reporting to the client
- Annual performance/continuation report
Formats for reporting:
- Formal reports
- Infographics
- Abstracts

Presentation: Jennifer Gonzales, AR SPDG
Evaluation Alignment of Arkansas’ SPDG and SSIP
- Overview of the Presentation and 3 Takeaways
- Questions

Alignment of Arkansas’ SPDG and SSIP Evaluation Jennifer Gonzales Arkansas Department of Education Special Education Unit SPDG Director and SSIP Coordinator May 5, 2016

What do the SSIP and SPDG have in common?
- Individualized based upon state needs
- Based on implementation science frameworks
- Focuses on building state education agency capacity
- Scaling up of evidence-based practices at the district level
- Focuses on professional learning and implementation fidelity
- Includes family involvement and stakeholder engagement
- Involves rigorous and relevant evaluation methods
- Uses a theory of action
- Utilizes logic model development
- Required annual progress reports
- Supported by national communities of practice
- Shares similar national resources

Poll Question
To what degree does your SPDG evaluation align with or support your state’s SSIP evaluation plan?
a. Substantially
b. Somewhat
c. A little
d. Not at all

Strengths of SSIP and SPDG Alignment
- Rigorous and relevant evaluation plan
- Leveraged and shared programmatic and evaluation resources
- Coordinated supports
- State department of education leadership support
- A focus on SEA infrastructure development
- Time

Potential Challenges
- Who “owns” the work; where does the work “live”?
- Level of alignment
- Reporting timelines
- Vocabulary
- Compliance vs. results
- Evaluating change in attitudes, beliefs, culture

Considerations
- Develop networks of critical friends
- Be flexible with your alignment
  ◦ Changes in resources
  ◦ Leadership changes
  ◦ Political changes
- Never forget to evaluate the adaptive side of systems change (Leading by Convening)

Key Strategies for Building Communities of Practice

Purpose
To provide a brief overview of the following:
- The features of a Community of Practice and vision
- 5 key elements for building Communities of Practice
- Example and resources

Communities of Practice
“Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly.” (Wenger-Trayner, E., & Wenger-Trayner, B., 2015, p. 1)

Building Communities of Practice
- Sensing Issues: Consider issues from a community lens; survey the needs of group members
- Inviting Participants: Consider all possible stakeholders; make sure there is an incentive to participate
- Sharing Information: Encourage active participation
- Taking Action: Develop and implement an action plan; agree to an end goal and timeframe
- Determining Next Steps: Reflect on the process (e.g., Do we need to shift how we are working? Do we need new roles and actions?)
(Cashman, J., Linehan, P., & Rosser, M., 2007)

Rare Sharing of Data Example: True vision of CoP realized

References
Cashman, J., Linehan, P., Purcell, L., Rosser, M., Schultz, S., & Skalski, S. (2014). Leading by convening: A blueprint for authentic engagement. Alexandria, VA: National Association of State Directors of Special Education. Retrieved from:
- The report includes a description of the framework, Leading by Convening, and discusses the three habits of interaction from Dr. Wenger’s work with CoP (coalescing around issues, ensuring relevant participation, and doing work together), across elements of interaction and the depth of the interaction.
Cashman, J., Linehan, P., & Rosser, M. (2007). Communities of practice: A new approach for solving complex education problems. Alexandria, VA: National Association of State Directors of Special Education. Retrieved from
- The guide defines CoP, outlines the phases in building and creating them, and outlines tips for facilitating the process and participant engagement.

References
Cashman, J., Linehan, P., & Rosser, M. (2007). Facilitating community: Key strategies for building communities of practice to accomplish state goals. New Eyes: Meeting Challenges Through Communities of Practice. Retrieved from
- The brief outlines strategies for supporting CoP, including coalescing around issues and highlighting successful practices.
Wenger-Trayner, E., & Wenger-Trayner, B. (2015). Communities of practice: A brief introduction. Retrieved from: http://wenger-trayner.com/wp-content/uploads/2015/04/07-Brief-introduction-to-communities-of-practice.pdf
- The brief provides an overview of CoP, common myths associated with CoP, and resources for additional reading on the topic.

To join SIGnetwork CoPs, contact: Jennifer Coffey or John Lind