FSI Level IV Lisa Guzzardo Asaro Dr. Lisa Rivard January 2012.

1 FSI Level IV Lisa Guzzardo Asaro Dr. Lisa Rivard January 2012

2 Connector Activity "The Leadership" by Douglas B. Reeves: The Write Way Each person will read for: –3 Ideas –2 Insights –1 Question that Surfaced As a Table Team, identify your 3, 2, and 1. TAB 12

3 The Leadership for Learning Framework, by Doug Reeves –Lucky: High results, low understanding of antecedents; replication of success unlikely –Leading: High results, high understanding of antecedents; replication of success likely –Losing: Low results, low understanding of antecedents; replication of failure likely –Learning: Low results, high understanding of antecedents; replication of success likely Achievement of Results TAB 12

4 Today’s Outcomes –Receive updates and new content –Explore a Book Walk: From Questions to Actions by Victoria Bernhardt –Progress monitor a SI strategy –Hear two schools present how they collect and use SI and Walk-Through data –Network with colleagues on relevant topics –Provide approaches for completing the SPP/A –Explore the MI School Data web portal and Data Director 4.0

5 Today’s Roadmap –Welcome –Inclusion Activity: Doug Reeves –Noteworthy Updates –School Process Profile/Analysis: Rubrics –Networking: Roll the Dice –Monitoring the Impact of Strategy Implementation –Data Director/MISchoolData –Network and Planning TAB 12

6 Key Working Agreements: A Facilitation Tool –Respect all Points of View –Be Present and Engaged –Honor Time Agreements –Get All Voices in the Room These breathe life into our Core Values.

7 Parking Lot: A Facilitation Tool –Post questions that do not benefit the whole group –Place questions that do not pertain to the content at this time –Place questions that pertain, but that participants do not want to ask at this time

8 NOTEWORTHY –Cut Score Retrospective –Due Dates: SDP/A…February rollout –AdvancED NCA Accreditation –MDE/AdvancED NCA Due Dates –Make-up Dates –BAA School Improvement Work Teams –10-11 MISD Perception Data Surveys

9 TAB 2

10 Understanding the New Michigan Cut Scores Wayne RESA http://www.mistreamnet.com/vidflv.php?who=resa121611

11 Accreditation The meaning of your accreditation "status": new procedures, new possibilities –Accredited: Excellence –Accredited: Distinction –Accredited –Accredited: On Advisement –Accredited: Warned –Accredited: Probation © 2011 AdvancED TAB 12

12 The Accreditation Decision Components of the Accreditation Decision (Criteria: Evaluative Tools) –Organizational Performance: AdvancED Standards and Indicators; 4-Level Rubric for each indicator –Student Performance: Multiple sources of student assessment data (including applicable state assessment data); Student Performance 4-Level Rubric –Stakeholder Perceptions: AdvancED Surveys; Stakeholder Data 4-Level Rubric © 2011 AdvancED TAB 12

13 Document I (SDP/A) School Data Profile/Analysis: Due Online 09.01.12 Document II (SPP/A) School Process Profile/Analysis: Due Online 03.09.12 –a. MDE: School Process Rubrics 90 –b. MDE: School Process Rubrics 40 –c. NCA: ASSIST Self Assessment (ASSIST SA) –d. NCA: Self Assessment (SA) Document III (Summary Report/Goals Management) Summary Report/School Improvement Plan: Due Online 09.01.12

14 School Data Profile/Analysis (SDP/A) A collaboration between BAA, CEPI, and OEII. The information that local schools need to gather has been greatly reduced: –Students’ entire instructional program (K-12) –Extended Learning Opportunities available for all students –Length of time teachers have been teaching –Total teacher absences that resulted in a substitute teacher assigned to the classroom The report will be pre-populated with both last year’s evidence and the new and additional information MDE is gathering.

15 Make-Up Dates Make-up days (must register): –01.20.12 –04.25.12 SB-CEUs: 2.6, completed in May

16 REMINDER TO REGISTER: School Improvement Teams Work Day –04.26.12 (Title I only) –04.30.12 (Title I only) –05.01.12 –05.02.12 –05.03.12 –05.08.12

17 www.michigan.gov/baa-secure –Aggregate Data File (ADF) for 2011 MEAP –Student Analysis File Extract (SAFE): posted for the first time last year These files allow schools and districts to review summary and item-analysis data that was previously available only in PDF or printed reports, or by districts aggregating data locally from the student data file once it became available.

18 Common Core State Standards: Career and College Standards –2012-2013: 2012 MEAP minimally modified as necessary to reflect CCSS; 2013 MME remains the same; state focus will be on student learning –2013-2014: 2013 MEAP based on the 2012 model; 2014 MME remains the same; state focus will be on student learning –2014-2015: Full implementation of CCSS in assessment and instruction

19 From Questions to Actions by Victoria Bernhardt Chapter 5: Analyzing Questionnaire Data (PERCEPTION DATA) –Determine Purpose: What do you want to learn? How do you want to use the results in conjunction with your SIP? –Determine Content: What content is desired, and from whom? –Develop Instrument and Pilot: Create the instrument; pilot and revise as necessary. Is the questionnaire working the way you want it to work? –Collect the Data: How and when will the questionnaire be administered? –Analyze Results: How can the results be analyzed to show the information gleaned from the questionnaire? TAB 4

20 AdvancED Perception Surveys –Available 2011-2012, late February; FREE online (paper copies available for a small fee; paper responses are integrated into the online results and then aggregated) –Surveys for Parent, Staff, and Student stakeholders (preK-2, 3-5, 5-13+) –Response scale: Strongly Agree to Strongly Disagree –Directly aligned to quality indicators –Reports provide the number of responses and the % selecting each of the 5 choices –Turnaround: 10-15 business days

21 One Common Voice – One Plan Michigan Continuous School Improvement Stages and Steps Study Analyze Data Set Goals Set Measurable Objectives Research Best Practice (MI-CSI)

22 One Common Voice – One Plan Michigan Continuous School Improvement Stages and Steps Comprehensive Needs Assessment components: –I. School Data Profile/Analysis –II. School Process Profile/Analysis –III. Summary Report/Goal Management Study: Analyze Data, Set Goals, Set Measurable Objectives, Research Best Practice TAB 12

23 One Common Voice – One Plan Michigan Continuous School Improvement Stages and Steps –Gather (Comprehensive Needs Assessment): Getting Ready; Collect School Data; Build School Profile (I. School Data Profile, II. School Process Profile) –Study: Analyze Data (I. School Data Analysis, II. School Process Analysis, III. Summary Report/Goals Management); Set Goals; Set Measurable Objectives; Research Best Practice –Plan: Develop Action Plan –Do (School Improvement Plan): Implement Plan; Monitor Plan; Evaluate Plan TAB 12

24 Stage One: GATHER, Step 2: Collect School Data (GATHER: Getting Ready, Collect School Data, Build School Profile)

25 One Common Voice – One Plan Stage One Gather, Step 2: Collect School Data –What do you already know? –What data do you need to know? –What additional information/data do you need to know? –Where can the information/data be found? Definitions: –Achievement/Student Outcome Data: How our students perform on local, state, and federal assessments (subgroups) –Demographic or Contextual Data: Describes our students, staff, building, and community –Process Data: The policies, procedures, and systems we have in place that define how we do business –Perception Data: Opinions of staff, parents, community, and students regarding our school TAB 2

26 What types of data are/are not readily available in your building? –Demographic Data: Enrollment; subgroups of students; staff; attendance (students and staff); mobility; graduation and dropout; conference attendance; education status; parent involvement; teaching staff; course enrollment patterns; discipline referrals; suspension rates; alcohol-tobacco-drug violations; participation in extra-curriculars; physical, mental, social, and health data –Achievement/Outcome Data: Local assessments (district common assessments, classroom assessments, report cards); state assessments (MME, ACT, MEAP, MI-Access, MEAP-Access, ELPA); national assessments (ACT Plan, ACT Explore, ACT WorkKeys, NWEA, ITBS, CAT, MET, NAEP, PSAT); GPA; dropout rates; college acceptance –Process Data: Policies and procedures (e.g., grading, homework, attendance, discipline); academic and behavior expectations; parent participation (PT conferences, PTO/PTA, volunteers); suspension data; School Process Profile Rubrics (40 or 90) or SA/SAR (NCA); event records (who, what, when, where, why, how); what you did for whom (e.g., all 8th graders received violence prevention) –Perception Data: Survey data (student, parent, staff, community); opinions; clarifies what others think; people act based on what they believe; how do they see you/us? TAB 2

27 Document II (SPP/A) School Process Profile and Analysis: Due Online 03.09.12 –a. MDE: School Process Rubrics 90 –b. MDE: School Process Rubrics 40 –c. NCA: ASSIST Self Assessment (ASSIST SA) –d. NCA: Self Assessment (SA) All previous progress comments will migrate to the new format; previously made notes cannot be edited, which communicates a historical picture. TAB 4

28 School Process Rubrics (SPR), Document II: Two Road Maps MDE (MICHIGAN DEPARTMENT OF EDUCATION), www.advanc-ED.org/mde: –School Process Rubrics (CNA): 90 rubrics with discussion questions –EDYES! 40 Process Rubrics: 40 rubrics (required, Cycles 1-4) AdvancED NCA ACCREDITATION, www.advanc-ED.org: –ASSIST Self Assessment (ASSIST SA/ES): 56 rubrics with discussion questions (required for the year of the QAR) –Self Assessment (SA): 56 rubrics –Quality Assurance Review (QAR) TAB 4

29 STANDARD 1 - Curriculum Schools/districts have a cohesive plan for instruction and learning that serves as the basis for teachers' and students' active involvement in the construction and application of knowledge. BENCHMARK A: Aligned, Reviewed and Monitored. School/district written curriculum is aligned with, and references, the appropriate learning standards (MCF, AUEN, ISTE, GLCE, HSCE, METS, etc.). Rubric Definitions, Getting Started: Less than half of the local curriculum includes the Content Expectations (GLCE, HSCE) or Michigan Curriculum Framework, CTE program standards, or course content expectations as appropriate. The curriculum is not aligned to the standards. Content Area: Please check the content areas that this impacts: ELA, M, S, SS Enter Evidence TAB 4

30 Clickable POSSIBLE DATA SOURCE(S) and EXAMPLES OF DOCUMENTABLE/OBSERVABLE RESULTS –Curriculum guides: Guides reference the Michigan Curriculum Framework and contain benchmarks and content expectations; guides contain scope and sequence –Curriculum maps: Maps contain specific information regarding what is taught and where it is taught –Pacing guides/curriculum calendars: Guides organized with detailed information useful in daily instructional practice TAB 4

31 Standard 4 - Documenting & Using Results STANDARD: The school enacts a comprehensive assessment system that monitors and documents performance and uses these results to improve student performance and school effectiveness. Impact Statement: A school is successful in meeting this standard when it uses a comprehensive assessment system based on clearly defined performance measures. The system is used to assess student performance on expectations for student learning, evaluate the effectiveness of curriculum and instruction, and determine interventions to improve student performance. The assessment system yields timely and accurate information that is meaningful and useful to school leaders, teachers, and other stakeholders in understanding student performance, school effectiveness, and the results of improvement efforts. Indicators Rubric: Please indicate the degree to which the noted practices/processes are in place in the school. The responses to the rubric should help the school identify areas of strength and opportunities for improvement as well as guide and inform the school’s responses to the focus questions and examples of evidence. Indicators Evidence: For each Indicator, click the (Add Evidence) link to provide examples of evidence that support the rubric response. TAB 4

32 INDICATORS In fulfillment of this standard, the school (rated Not Evident, Emerging, Operational, or Highly Functional): –4.1 Establishes performance measures for student learning that yield information that is reliable, valid, and bias free –4.2 Develops and implements a comprehensive assessment system for assessing progress toward meeting the expectations for student learning –4.3 Uses student assessment data for making decisions for continuous improvement of teaching and learning processes –4.4 Conducts a systematic analysis of instructional and organizational effectiveness and uses the results to improve student performance –4.5 Communicates the results of student performance and school effectiveness to all stakeholders –4.6 Uses comparison and trend data of student performance from comparable schools in evaluating its effectiveness –4.7 Demonstrates verifiable growth in student performance –4.8 Maintains a secure, accurate, and complete student record system in accordance with state and federal regulations

33 ‘Ladder of Inference’ Peter Senge 1994 TAB 4

34 One Common Voice – One Plan Michigan Continuous School Improvement Stages and Steps Study Analyze Data Set Goals Set Measurable Objectives Research Best Practice

35 Stage Four: DO, Step 10: Monitor Plan (DO: Implement Plan, Monitor Plan, Evaluate Plan)

36 MONITOR vs. EVALUATE MONITOR: Monitor implementation of the plan (formative). Is it working? EVALUATE: Evaluate adult implementation and the impact on student achievement (summative). Did it work? Implementation (Adult Focused): –Monitor: Are strategies and activities being implemented with fidelity? Are we collecting and using student and adult data to modify and adjust ongoing implementation? –Evaluate: Did we implement the plan/strategies correctly and consistently? Did we give it enough time? Enough resources? Impact (Student Focused): –Monitor: Is what we are doing working? Are we showing evidence of student growth? What interim adjustments are suggested by implementation data? How might these adjustments affect the integrity of the results? –Evaluate: Did our strategies result in increased student achievement? What unintended consequences (good and bad) have occurred? Do: Implement Plan, Monitor Plan, Evaluate Plan

37 MONITOR vs. EVALUATE (continued) MONITOR: Monitor implementation of the plan (formative). Is it working? EVALUATE: Evaluate adult implementation and the impact on student achievement (summative). Did it work? Implementation (Adult Focused): Are strategies and activities being implemented with fidelity? Are we collecting and using student and adult data to modify and adjust ongoing implementation? Impact (Student Focused): Is what we are doing working? Are we showing evidence of student growth? What interim adjustments are suggested by implementation data? How might these adjustments affect the integrity of the results? Do: Implement Plan, Monitor Plan, Evaluate Plan

38 Activities MATRIX (Connection to SPR 40/90, SA/ASSIST SA*) How will you address the targeted areas in your Summary Report (SPP)? Getting Ready to Implement: How will you ensure readiness for implementation? How will you ensure that participants have the knowledge and skills to implement? POSSIBLE ACTIVITIES: –Professional development around the strategy –Purchase materials –Planning for implementation: identify schedule for strategy use, personnel, mechanism for monitoring, rollout, etc. –Communication vehicles Implement: How will you ensure successful implementation of your selected activities? POSSIBLE ACTIVITIES: –Communication: to whom? How? –Instructional technology* –Activities to support at-risk students (for Title One students)* –Parent involvement* (*Required components) Monitoring Fidelity of Implementation and Impact: How will you ensure the program/activity is implemented with fidelity? How will you monitor the program's impact on student achievement? POSSIBLE ACTIVITIES: –Walkthroughs –PLC/CASL meetings –Documentation of impact –Demonstration classrooms –Gathering achievement data Do: Implement Plan, Monitor Plan, Evaluate Plan

39 Just Do It! –Monitor Implementation (Adult Focused) –Evaluate Implementation (Adult Focused) –Monitor Impact (Student Focused) –Evaluate Impact (Student Focused) Do: Implement Plan, Monitor Plan, Evaluate Plan

40 Progress Monitoring Matrix Every 2 months, conduct progress monitoring at the STRATEGY LEVEL: –District administrators will… –Administrators will… –Teachers will… –Students will… Baseline, Benchmark/Interim, and Summative

41 NETWORKING Dialogue Dice Each person in your group will take a turn rolling the dice and sharing briefly an experience in response to the written prompt. TAB 12

42 School Improvement and Walk Through Data Jefferson Middle School PRESENTERS David Lavender, Principal Bob Schneider, Computer Teacher

43 School Improvement Data Collection: Quick and Dirty Richards Middle School PRESENTERS Jessica Carrier, Principal Kris Robinson, Asst. Principal Christine Biondo, Co-Chair Andy Brody, Co-Chair

44 Stage One: GATHER, Step 1: Getting Ready (GATHER: Getting Ready, Collect School Data, Build School Profile)

45 A Discussion Protocol A protocol for discussing a short reading. Adapted from the National School Reform Faculty, www.nsrfharmony.org TAB 4 (STUDY: Analyze Data, Set Goals, Set Measurable Objectives, Research Best Practice)

46 School Data Inventory Columns: Data Source | External | Types of Data | Who Uses the Data? | Purpose of this Data? | Accessibility | How is the Data Used? | Next Steps TAB 2 (GATHER: Getting Ready, Collect School Data, Build School Profile)

47 Data Director 4.0 / MISchoolData.org Stage One: GATHER, Step 2: Collect School Data; Step 3: Build School Profile Presenter: Dr. Jennifer Parker-Moore

48 Network and Team Time School Process Profile/Analysis Monitoring School Improvement Network with Colleagues Seek Assistance

