Dial-in: 1-877-512-6886 Pass code: 16 16 32 2775 SPDG Directors’ Webinar Communities of Practice State Systemic Improvement Plan Evaluation Coaching 1.

1 Dial-in: 1-877-512-6886 Pass code: 16 16 32 2775 SPDG Directors’ Webinar Communities of Practice State Systemic Improvement Plan Evaluation Coaching 1

2 Roll Call — Attendance: please enter your name, state, and role with the SPDG in the chat pod

3 Mute phones and computers *6 / #6

4 Overview of Today's Webinar (Jennifer Coffey): State Systemic Improvement Plan Evaluation Coaching

5 Office of Special Education Programs (OSEP) Using Best Practices from the State Personnel Development Grants (SPDG) Work to Build a Stronger State Systemic Improvement Plan (SSIP) Tara Courchaine, Ed.D. and Janine Rudder Education Program Specialists Office of Special Education Programs

6 Proposed Mission Create a forum for States to learn from and collaborate with peers across the U.S. consistently and in real time around how the SPDG work supports the SSIP work to improve student outcomes.

7 Why Have a SPDG / SSIP CoP?
- SPDG best practices in professional development and implementation may be a solid foundation for SSIP work
- Relationship building inspires and creates opportunities
- Problem solving with peers who may have similar issues
- Capacity building
- Increasing sustainability

8 Format (Webinar)
State Spotlight: Webinars will include presentations from states on how they are currently integrating their SPDG and SSIP work, including the biggest barriers and any successful practices they would like to share with the community for feedback and discussion.
Experts from the Field: In addition, some webinars will include presentations from experts in the field on specific topics aligned with participant needs.

9 Proposed Problem of Practice (Virtual Meeting)
One to three states will describe, in great detail, a problem or barrier they are encountering in their SPDG or SSIP work, including previous attempts to solve the problem. Community members will respond and offer suggestions and/or strategies.

10 Problem of Practice, Cont'd
The problem would focus on a specific aspect of the work, such as:
- Evaluation
- Sustainability
- Supporting LEAs in implementing evidence-based practices
- Infrastructure development

11 Past Webinar Topics
- December: State Spotlight (state example of SPDG/SSIP alignment)
- February: Evidence-Based Practices and Evaluation
- April: Data Collection and Data-Based Decision Making

12 Tools Shared Hexagon Tool

13 Tools Shared Data Collection Tool

14 SSIP Community of Practice Alayna Gee Idaho SPDG agee@sde.idaho.gov

15 Takeaways from the Community Participation Problem of Practice Presentations

16
- Created the opportunity for collaboration between the SPDG and SSIP State Leads
- Required we carve out dedicated time to meet
- Led to scheduled collaborative meetings
- Conversations moved from being SSIP- or SPDG-related to including both

17
- Required us to identify many barriers and narrow them down to the most important
- Feedback provided a place for us to begin our conversations
- Led to the addition of team members (assessment)

18
- Guided us in how to begin to identify similarities in our processes:
  - Evaluation paradigms
  - Hexagon Tool
  - Evidence-Based Practices
  - Resources (National Center on Intensive Intervention and What Works Clearinghouse)

19
- Amount of feedback
- Time
- Magnitude of not knowing what we don't know

20
- Dedicate 5-10 minutes during a regularly scheduled meeting to get on the wiki
- Identify states that have similar barriers and build a bridge

21

22 Alignment of Nevada SSIP and SPDG Presented by: Ann Alexander and Julie Bowers - Nevada Department of Education

23 Alignment of Nevada SSIP and SPDG
- Goals
- Objectives
- Professional Development
- Evaluation

24 Common Goal for SPDG & SIMR The NDE will support improved performance of third-grade students with disabilities on statewide assessments of reading/language arts through building LEA capacity to strengthen the skills of special education teachers in assessment, instructional planning, & teaching.

25 Alignment of NV Initiatives to SSIP

26 Professional Development: Assess, Plan, Teach (APT)
- Structured framework for ongoing professional development to strengthen the skills of teachers
- Aligns to key state and district improvement strategies
- The APT model will be expanded to other districts through the SPDG

27 Evaluation
- SSIP Evaluation Plan aligns to the Evaluation Plan in the SPDG
- Identified data collection and analysis systems are the same for both SPDG and SSIP
- Use of the state/district data system provides efficient data collection

28 Benefits
- Clear alignment between all state and LEA initiatives
- Maximizes resources, human and fiscal
- No need for multiple data collection and analysis efforts for each plan
For more information: jabowers@doe.nv.gov

29
- Welcome & Overview: Corinne Weidenthal, OSEP
- Survey of Needs/Interests: Polly Maccini, OSEP
- Presentations:
  - Brent Garrett, SPDG Evaluator: Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models, with evaluation questions and methods based on the models
  - Jennifer Gonzales, SPDG Director and SSIP Coordinator, Arkansas Department of Education: Alignment of Arkansas' SPDG and SSIP
- Next meeting: June 28th, 1pm EST: Presentation on SSIP/SPDG Alignment, Barb Guy, Iowa

30 The purpose of the SPDG Evaluators' ISC is to share knowledge and resources related to the implementation of SPDG evaluation plans.
Meetings to date:
- Jan. 12, 2016: Kick-Off
- Feb. 23rd: Brent Garrett, Evaluation Frameworks
- April 26th: Janine Rudder & Jennifer Gonzales, SSIP/SPDG Evaluation Alignment
Through bimonthly meetings and the use of the wiki for follow-up discussions, the following items may be considered for future meetings, as reported on the survey:

31
- Ideas, examples and implications on evaluation of the alignment of the SPDG with the SSIP and SIMR: *4.0
- Processes and dashboards related to data reporting and data-based decision making: 4.7
- Assessments/tools for measuring the effectiveness of implementation and professional development: 4.7
- Resources for performance-based assessments, site visit protocols and observation tools: 4.8
- Assessments/tools for measuring the effectiveness of instructional coaching (quality) and the impact of coaching on student performance: 5.2
- Ideas for measuring sustainability: 5.3
- Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models: 5.8
- Methods for evaluating big data sets and reporting results to reach a wide audience: 7.7
- Ideas for ensuring sustainability and completion of evaluations: 7.9
- The evaluators' role, particularly during times of project staff turnover: 8.5

32 Topic rankings (√ = completed):
√ Ideas, examples and implications on evaluation of the alignment of the SPDG with the SSIP and SIMR: *4.0
- Processes and dashboards related to data reporting and data-based decision making: 4.7
- Assessments/tools for measuring the effectiveness of implementation and professional development: 4.7
- Resources for performance-based assessments, site visit protocols and observation tools: 4.8
- Assessments/tools for measuring the effectiveness of instructional coaching (quality) and the impact of coaching on student performance: 5.2
- Ideas for measuring sustainability: 5.3
√ Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models: 5.8
- Methods for evaluating big data sets and reporting results to reach a wide audience: 7.7
- Ideas for ensuring sustainability and completion of evaluations: 7.9
- The evaluators' role, particularly during times of project staff turnover: 8.5

33 Upcoming Meetings
- June 28th, 1pm EST: Continued, Evaluation Alignment of the SPDG with the SSIP and SIMR (Barb Guy, IA Department of Education)
- August 1st-3rd, during PD Conference: TBD
- Oct 11-12th, SPDG National Meeting: TBD
- Dec 13th, 1pm EST: TBD

34 Evaluation frameworks used in SPDG grant applications and examples of developing evaluation plans based on logic models with evaluation questions and methods based on models
- Overview of the presentation and 3 takeaways
- Questions

35 Evaluation Frameworks Used in SPDG Grant Applications May 6, 2016 Brent Garrett brent@bgarrettconsulting.net

36 Discussion Topics
- Reduction in the length of the SPDG narrative
- Goals, objectives, and activities
- Logic model
- Evaluation plan

37 Goals, Objectives, and Activities
A well-thought-out Project Design section is critical for a well-thought-out logic model and evaluation plan.
- Goals: long-term outcomes
- Objectives: intermediate outcomes
- Activities: short-term outcomes or outputs
- Aligning objectives with SPDG Program Measure 1: evidence-based professional development practices

38 Logic Models
Resources:
- https://www.osepideasthatwork.org/logicModel/index.asp
- http://www.smartgivers.org/uploads/logicmodelguidepdf.pdf
- http://www.cdc.gov/oralhealth/state_programs/pdf/logic_models.pdf
- Using Logic Models to Enhance Evaluation, Amy Germuth, WESTAT
A logic model is a conceptual presentation of a program/initiative. It links goals/objectives/activities with expected outputs and outcomes, and represents causal, or if-then, relationships. Logic models provide a strong foundation for the evaluation plan.

39 Goal 2: The DOE will support improved performance of third-grade students with disabilities on statewide assessments of reading/language arts through building LEA capacity to strengthen the skills of special education teachers in assessment, instructional planning, & teaching.

Inputs (Program Investments):
- Office of Special Education
- Title 1 Office
- Brent County Special Education administrators, coaches, & teachers
- Materials & processes from the Consortium on Excellence in Reading
- Reading assessments
- PEP
- Distance technology for meetings & trainings

Objectives:
1: To select LEAs & PD providers with the capacity & expectations necessary to implement APT.
2: To enhance the capacity of LEA personnel to implement, replicate, & sustain APT through evidence-based (EB) training strategies.
3: To enhance the capacity of districts & schools to implement, replicate, & sustain APT through EB coaching strategies.
4: To increase the use of implementation, intervention, & outcome data to support decision making at schools, LEAs, & the SEA.
5: To ensure administrators are trained to support their staff & initiatives to develop & sustain APT.

Outputs:
1a: 75 CCSD schools implement APT
1b: 40 schools in 2 additional districts implement APT
2a: 3 days of trainings in 15 Performance Zones/year
2b: 4 trainer meetings/year to review training data
2c: Participants satisfied with quality, relevance, & usefulness of training
3a: Coaching model validated
3b: Monthly coaching contacts with teachers
3c: Monthly coach meetings to review coaching data
4a: APT evaluation tools administered & reported on a regular basis
4b: Evaluation data are shared on a regular basis
5a: Administrators in all APT schools trained
5b: Administrators in all APT schools coached

Short-Term Outcomes (less than a year):
1a: PD meets the needs of participating LEAs.
1b: Evidence-based practices are used by participating LEAs.
2a: Stat. sig. increases in pre/post training evaluations.
2b: PD participants report training increased their knowledge of APT.
3: PD participants report coaching increased their knowledge of APT.
4a: Project staff know how to use the data for future program action.
4b: SPDG stakeholders report data are shared regularly & are useful.
5: Administrators report training increased their knowledge of APT.

Intermediate Outcomes (one to two years):
- 90% of teachers implement APT with fidelity.
- 80% of teachers demonstrate greater skills in meeting students' academic needs.
- 75% of students show growth on the Core Phonics Survey & other assessments.
- 80% of administrators have the necessary skills to support and sustain APT.
- Increased parent understanding of APT, the importance of the Read by 3rd Grade statewide initiative, & how it applies to special education students.

Long-Term Outcomes (three years out):
- Increased percentage of students with IEPs score proficient on statewide assessments.
- Gaps in performance between students with IEPs & all students are decreased.
- Increased number of students with IEPs receive the majority of their instruction in general education classrooms.

40
Obj. 1: To develop the capacity of those providing PD on ELOs, transition planning, and parent engagement, and to define the expectations and commitment of those receiving PD.
- Inputs: DOE Staff, PTI, VR, MCST, KSC, SLC
- Outputs: Regional trainers/coaches established & trained; annually, a new cohort of committed LEAs is recruited; reviews of LEA transition policies & practices are conducted; plans for transition-related PD are developed for each LEA
- Short-term outcomes: LEAs report satisfaction with the quality of PD; PD meets the needs of participating LEAs; PD is provided collaboratively; evidence-based practices are used by participating LEAs; PD is aligned with related DOE initiatives
- Intermediate outcomes: All LEAs fulfill SPDG commitments
- Long-term outcomes: A greater percentage of students with disabilities in participating LEAs graduate college and career ready; a decrease in the percentage of students with disabilities in participating LEAs dropping out; increase in the percentage of LEAs meeting Indicator 13 compliance; improved outcomes for Indicator 14; parents report greater levels of satisfaction (Indicator 8)

Obj. 2: To increase and expand the use of ELOs in all regions of the state.
- Inputs: DOE Staff, QED, VR, PTI, KSC, MCST, SLC
- Outputs: Training curriculum established; initial training of LT and first LEA cohort by QED; annual training thereafter conducted by LT
- Short-term outcomes: Stat. sig. increases in pre/post training evaluations; PD participants report training increased their knowledge of ELOs & transition planning; parents report they are more knowledgeable about transition planning
- Intermediate outcomes: PD participants report they are more skilled with ELOs & transition planning; PTI has greater capacity to provide PD on family engagement
- Long-term outcomes: Increased use of ELOs for core courses

Obj. 3: To increase the use of best-practice, EB transition planning, including enhanced parent engagement strategies.
- Inputs: DOE Staff, VR, PTI, KSC, MCST, SLC
- Outputs: Training curricula established (transition planning, family engagement, RENEW, etc.); training of PD providers; regional and LEA training workshops

41 Evaluation Plan
- Evaluation theory
- Organize by RFP criteria
- Formatting information
- Use of performance indicators, particularly addressing GPRA/Program Measures
- Formative versus summative evaluation

42 Evaluation Criteria (C-21 of RFP)
(i) The extent to which the methods of evaluation are thorough, feasible, and appropriate to the goals, objectives, and outcomes of the proposed project.
(ii) The extent to which the methods of evaluation provide for examining the effectiveness of project implementation strategies.
(iii) The extent to which the methods of evaluation include the use of objective performance measures that are clearly related to the intended outcomes of the project and will produce quantitative and qualitative data to the extent possible.
(iv) The extent to which the evaluation will provide performance feedback and permit periodic assessment of progress toward achieving intended outcomes.

43 Sample Instrument Table (Criteria i)

Instrument: Training/coaching evaluation forms
- Data source: Collected after each training session and after 2 coaching sessions/year
- Purpose: To improve delivery of training and coaching

Instrument: IC Team Level of Implementation (LOI)
- Data source: Interviews with a random sample of Case Managers & Requesting Teachers, each principal, & Team Member; review of a sample of a CDF
- Purpose: Objective measure of the integrity of program implementation and use

Instrument: Case Documentation Form Review (CDF)
- Data source: Case documentation reviewed, scored, and rated by project coordinator
- Purpose: Measure of case goal attainment; provides a measure of accuracy in documentation

Instrument: District Implementation Sustainability Checklist
- Data source: District leadership completes the assessment annually
- Purpose: To assess the capacity of district leadership to support and sustain ICAT implementation

44 Sample Evaluation Table (Criteria i)

Activity 1.1.1: Develop selection criteria for participating districts.
- Data sources: Meeting minutes; final agreement form
- Analysis/method: Qualitative analysis of minutes & selection criteria

Activity 1.1.2: Application for selecting districts developed and disseminated.
- Data sources: Documentation of dissemination; completed applications
- Analysis/method: Qualitative analysis of received applications

Activity 1.1.3: Professional development providers meet certification standards for delivering PD.
- Data sources: Documentation of ICAT certification
- Analysis/method: Review of list of ICAT-certified trainers

Activity 1.1.4: Roles & responsibilities for professional development providers are determined.
- Data sources: Copies of expectations, job descriptions, MOUs
- Analysis/method: Qualitative analysis of expectations, job descriptions, MOUs

45 Sample Evaluation Table (Criteria i)

Short-term outcome (systems) (G-4): LEA and building leadership report that DDOE communication has positively impacted their expectations for SWD.
- Evaluation question: To what degree & how well was DDOE communication used with LEAs in an effective manner?
- Performance indicator: 90% of LEA staff report that the communication with the DOE was effective.
- Measurement/data collection method: Communication logs; LEA survey
- Timeline: Middle & end of each school year

Short-term outcome (systems) (G-4): Project partners report that DDOE communication efforts were effective in promoting increased expectations for students with disabilities.
- Evaluation question: To what degree & how well was DDOE communication used with partners in an effective manner?
- Performance indicator: 90% of project partners report that the communication with the DOE was effective.
- Measurement/data collection method: Communication logs; partner survey
- Timeline: Middle & end of each school year
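Indicator thresholds like the "90% of LEA staff report..." targets above reduce to a simple proportion check against survey responses. A minimal sketch of that arithmetic, where the function name and the boolean encoding of responses are illustrative assumptions, not part of the SPDG materials:

```python
def indicator_met(responses, target=0.90):
    """Evaluate a yes/no performance indicator, e.g. '90% of LEA staff
    report that communication with the DOE was effective'.

    responses: list of booleans (True = respondent reported 'effective').
    Returns (observed rate, whether the target threshold was reached).
    """
    if not responses:
        return 0.0, False
    rate = sum(1 for r in responses if r) / len(responses)
    return rate, rate >= target

# 47 of 50 LEA staff respond "effective": a 94% rate, so the indicator is met
rate, met = indicator_met([True] * 47 + [False] * 3)
```

Running the same check at the middle and end of each school year, as the timeline column suggests, would yield the periodic progress feedback that Criteria (iv) asks for.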

46 Evaluating Implementation Strategies (Criteria ii)
- Measuring formative/process work
- Connection to outputs in logic model
- Importance of implementation science
- Stages of implementation
- Implementation drivers

47 Program/GPRA Measures (Criteria iii)

Program Measure 1: We will assess the degree to which evidence-based PD is used to support implementation, using the SPDG Evidence-Based PD worksheet. This includes a focus on selection of schools and providers, training, coaching, performance assessment, and facilitative administrative supports. We will collect data related to growth in teachers' knowledge and skills, and data on fidelity of training and coaching implementation. A mixed set of qualitative & quantitative data will be provided to assess the quality of PD.

Program Measure 2: We will use the IC Team Level of Implementation (LOI) as our ICAT fidelity measure. Data will be collected by a trained external observer from a random sample of Case Managers & teachers, each principal & Team Member, and a review of student outcomes to assess the fidelity of program implementation. We will use the CORE K-6 Implementation Rubric to assess the degree to which our second goal, Assess, Plan, Teach, is implemented with fidelity. Both instruments are in Appendices G and H.

Program Measure 3: To report on the efficiency performance measure, the degree of sustained PD activities will be tracked via an online database known as the PD Log. This has been used in successive SPDGs to collect data on the type, amount, and audience of PD. Through this database, we can track the percentage of staff time (which will be used to determine personnel cost) spent on sustained PD activities, such as coaching. Travel & other costs are tracked to determine the total cost/percentage of sustained activities.
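Program Measure 3 describes an efficiency calculation: the share of total PD cost (personnel plus travel and other costs) spent on sustained activities such as coaching. A hedged sketch of that arithmetic follows; the record format, activity labels, and hourly rate are invented for illustration and are not drawn from the actual PD Log:

```python
# Hypothetical PD Log records: (activity_type, staff_hours, travel_and_other_cost)
SUSTAINED = {"coaching", "follow-up consultation"}  # assumed sustained-PD labels

def sustained_cost_share(entries, hourly_rate=40.0):
    """Return the fraction of total PD cost (personnel + travel/other)
    spent on sustained activities."""
    total = sustained = 0.0
    for activity, hours, other_cost in entries:
        cost = hours * hourly_rate + other_cost  # personnel cost + other costs
        total += cost
        if activity in SUSTAINED:
            sustained += cost
    return sustained / total if total else 0.0

log = [
    ("training workshop", 6, 200.0),  # one-time event
    ("coaching", 2, 50.0),            # sustained
    ("coaching", 3, 0.0),             # sustained
]
share = sustained_cost_share(log)  # 250 / 690, roughly 36% sustained
```

The actual PD Log presumably derives personnel cost from real salary data rather than a flat rate; the point is only that the measure is a cost-weighted proportion, not a count of activities.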

48 Project Performance Indicators (Criteria iii)

Objective 1 (Selection Driver):
- ICAT implemented with fidelity in four new districts and approximately 20 schools.
- APT implemented with fidelity in 75 Brent County schools and in 40 schools in 2 additional districts.
- 6 new Level 5 and 6 ICAT Trainers are certified by 2020.

Objective 2 (Training Driver):
- 75% of training workshops across initiatives will result in statistically significant increases in participants' knowledge, as measured by workshop pre/post evaluations.
- 80% of PD participants report that training increased their knowledge of ICAT/APT.

Objective 3 (Coaching Driver):
- 80% of PD participants report that SPDG coaching increased their skills to implement ICAT/APT.

Objective 4 (Performance Assessment Driver):
- 90% of participating schools implement ICAT/APT with fidelity.
- SPDG stakeholders report data are shared regularly & are useful.

Objective 5 (Facilitative Administrative Supports Driver):
- 80% of administrators report training increased their knowledge of ICAT/APT.

49 Performance Feedback (Criteria iv)
Reporting:
- Formative reports (training, coaching logs)
- Monthly/quarterly reporting to client
- Annual performance/continuation report
Formats for reporting:
- Formal reports
- Infographics
- Abstracts

50 Presentation: Jennifer Gonzales, AR SPDG
Evaluation Alignment of Arkansas' SPDG and SSIP
- Overview of the presentation and 3 takeaways
- Questions

51 Alignment of Arkansas’ SPDG and SSIP Evaluation Jennifer Gonzales Arkansas Department of Education Special Education Unit SPDG Director and SSIP Coordinator jennifer.gonzales@arkansas.gov 501-682-4221 May 5, 2016

52 What Do the SSIP and SPDG Have in Common?
- Individualized based upon state needs
- Based on implementation science frameworks
- Focus on building State Education Agency capacity
- Scaling up of evidence-based practices at the district level
- Focus on professional learning and implementation fidelity
- Include family involvement and stakeholder engagement
- Involve rigorous and relevant evaluation methods
- Use a theory of action
- Utilize logic model development
- Require annual progress reports
- Supported by national Communities of Practice
- Share similar national resources

53 Poll Question
To what degree does your SPDG evaluation align with or support your state's SSIP evaluation plan?
a. Substantially
b. Somewhat
c. A little
d. Not at all

54 Strengths of SSIP and SPDG Alignment
- Rigorous and relevant evaluation plan
- Leveraged and shared programmatic and evaluation resources
- Coordinated supports
- State department of education leadership support
- A focus on SEA infrastructure development
- Time

55 Potential Challenges
- Who "owns" the work; where does the work "live"?
- Level of alignment
- Reporting timelines
- Vocabulary
- Compliance vs. results
- Evaluating change in attitudes, beliefs, culture

56 Considerations
- Develop networks of critical friends
- Be flexible with your alignment:
  - Changes in resources
  - Leadership changes
  - Political changes
- Never forget to evaluate the adaptive side of systems change (Leading by Convening)

57 Key Strategies for Building Communities of Practice

58 Purpose
To provide a brief overview of the following:
- The features of a Community of Practice and vision
- 5 key elements for building Communities of Practice
- Example and resources

59 Communities of Practice
"Communities of practice are groups of people who share a concern or a passion for something they do and learn how to do it better as they interact regularly."
Wenger-Trayner, E., & Wenger-Trayner, B. (2015), p. 1

60 Building Communities of Practice
- Sensing Issues: Consider issues from a community lens; survey needs of group members
- Inviting Participants: Consider all possible stakeholders; make sure there is an incentive to participate
- Sharing Information: Encourage active participation
- Taking Action: Develop and implement an action plan; agree to an end goal and timeframe
- Determining Next Steps: Reflect on the process (e.g., Do we need to shift how we are working? Do we need new roles and actions?)
Cashman, J., Linehan, P., & Rosser, M. (2007), pp. 28-34

61 Rare Sharing of Data Example: True vision of CoP realized

62 References
Cashman, J., Linehan, P., Purcell, L., Rosser, M., Schultz, S., & Skalski, S. (2014). Leading by convening: A blueprint for authentic engagement. Alexandria, VA: National Association of State Directors of Special Education. Retrieved from https://www.nasdse.org/LinkClick.aspx?fileticket=uyIi21KRYB4%3D
- The report includes a description of the framework, Leading by Convening, and discusses the three habits of interaction from Dr. Wenger's work with CoP (coalescing around issues, ensuring relevant participation, and doing work together), across elements of interaction and the depth of the interaction.
Cashman, J., Linehan, P., & Rosser, M. (2007). Communities of practice: A new approach for solving complex education problems. Alexandria, VA: National Association of State Directors of Special Education. Retrieved from http://www.ideapartnership.org/documents/CoPGuide.pdf
- The guide defines CoP, outlines the phases in building and creating them, and offers tips for facilitating the process and participant engagement.

63 References
Cashman, J., Linehan, P., & Rosser, M. (2007). Facilitating community: Key strategies for building communities of practice to accomplish state goals. New Eyes: Meeting Challenges Through Communities of Practice. Retrieved from http://www.ideapartnership.org/documents/NewEyes-1-Strategies.pdf
- The brief outlines strategies for supporting CoP, including coalescing around issues and highlighting successful practices.
Wenger-Trayner, E., & Wenger-Trayner, B. (2015). Communities of practice: A brief introduction. Retrieved from http://wenger-trayner.com/wp-content/uploads/2015/04/07-Brief-introduction-to-communities-of-practice.pdf
- The brief provides an overview of CoP, common myths associated with CoP, and resources for additional reading on the topic.

64 To Join SIGnetwork CoPs, contact: Jennifer Coffey (Jennifer.Coffey@ed.gov) or John Lind (jlind@uoregon.edu)

