Scoring and Reporting of Results
Teachers and school leaders access reports online within six weeks of administration.
– Scoring is based on percent positive ("Most of the Time" or "Always" responses).
– Results are grouped into quintiles for reporting, as no LEAP ratings are given at the measure level.
– Results are combined with other measures to calculate LEAP performance categories when ratings are given.
– Individual teacher data are compared to the district and school percent positive.
– Data are reported at the school and teacher level, and disaggregated by:
  – Category and question
  – Demographic data (ethnicity, gender, ELA, SPED)
  – Response distribution
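The percent-positive scoring and quintile grouping described above can be sketched as follows. This is an illustration only, not DPS's actual implementation: the response labels match the slide, but the quintile rule and sample data are assumptions.

```python
# Illustrative sketch of percent-positive scoring and quintile grouping.
# Labels follow the slide; the quintile rule and data are hypothetical.

POSITIVE = {"Most of the Time", "Always"}

def percent_positive(responses):
    """Share of responses that are 'Most of the Time' or 'Always', 0-100."""
    return 100.0 * sum(r in POSITIVE for r in responses) / len(responses)

def quintile(score, all_scores):
    """Rank a score 1 (bottom 20%) to 5 (top 20%) among all teachers' scores."""
    below = sum(s < score for s in all_scores)
    return min(5, below * 5 // len(all_scores) + 1)

# Example: one teacher's responses to a single item.
teacher_pct = percent_positive(
    ["Always", "Sometimes", "Most of the Time", "Never"])  # 50.0
```

A real reporting pipeline would also compare `teacher_pct` against school- and district-level percent positive, as the slide describes.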

Supports, Alignment, and Next Steps
Teachers discuss results with school leaders in mid-year conversations. Results are part of a holistic conversation that encompasses all LEAP data to date, including Observation and Professionalism.
Recommendations and guiding questions are provided to school leaders, team leaders, and teachers in training materials (how to look at results in the context of other LEAP data).
Data analysis of alignment to other measures is ongoing. Teachers who received additional Observation support through Differentiated Teacher Leaders saw a 1% average increase in scores over expected; the Teacher Leaders themselves saw a 2% average increase.
Next steps:
– Best-practice recommendations and materials for involving students more deeply
– Formal Professional Learning materials correlated directly to Student Perception Survey results

Survey Alignment with MET

Measure what matters
– MET recommendation: Questions focus on what teachers do and on the learning environment they create.
– DPS LEAP system: Questions are revised based on results, extensive feedback, external review, and statistical analysis to ensure they are relevant and appropriate.

Ensure accuracy
– MET recommendation: Student responses should be honest and based on a clear understanding of the questions. Confidentiality is a must.
– DPS LEAP system: Administration protocols are continually examined. Administration is based on state testing protocols for confidentiality, and it is recommended that the teacher not administer the survey.

Ensure reliability
– MET recommendation: Reliability requires adequate sampling and enough items that teachers can be confident the survey produces reasonably consistent results.
– DPS LEAP system: We found no statistical difference between two administrations a year, so we reduced to one administration plus one make-up administration to lessen the impact on instructional time. A second optional administration class period was added for teachers.

Support improvement
– MET recommendation: Teachers should receive results in a timely manner, understand what they mean, and have access to PD.
– DPS LEAP system: Teachers and school leaders have access to results online approximately a month after administration, during mid-year conversations. We are still working on supports for improvement.

Source: Asking Students about Teaching: Student Perception Surveys and their Implementation (2012)

Engaging Students in the Educator Effectiveness Conversation: Building a Robust Student Perception Survey
Amy Farley, Director, Research and Impact, Colorado Legacy Foundation

Overview
– Why use a Student Perception Survey?
– What the research says
– Survey overview
– Survey development
– Pilot results
– Survey administration
– Use of survey results

Why Use a Student Perception Survey?
The survey is a unique form of actionable feedback that districts, schools, and teachers can use to inform practice. Students are especially well positioned to contribute to a comprehensive view of classroom practice because they experience it more than anyone else in the education system. Student perception data can offer a big-picture view of what is happening in classrooms, as well as school- and district-wide trends.

What the research says
The Measures of Effective Teaching (MET) Project had two significant findings around student perception surveys:
– When student surveys are combined with observation and student growth data, the three measures together tell us more, and predict future effectiveness better, than any one of them alone.
– Student perception survey results are correlated with student achievement gains.
The use of student feedback has also been shown to promote both reflection and responsibility on the part of students.
Research overview:
– Bill and Melinda Gates Foundation (2012). Asking students about teaching: Student perception surveys and their implementation (MET Project Policy and Practice Brief).
– Wiggins, G. (2011). Giving students a voice: The power of feedback to improve teaching. Education Horizons, 89(3).

Colorado’s Student Perception Survey
– Free and publicly available
– A 34-item survey about student learning experiences
– Two versions of the survey: grades 3-5 and 6-12
– Developed by the Colorado Legacy Foundation with input from more than 1,400 teachers
– Piloted in 16 Colorado districts
– Rigorous analyses confirm that the survey is fair, valid, and reliable
– The survey maps to Colorado’s Teacher Quality Standards
– A full technical report is available
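Reliability analyses of the kind reported in a survey technical report typically include an internal-consistency statistic such as Cronbach's alpha. As an illustration only (the data and code below are not from Colorado's technical report), a minimal computation looks like this:

```python
# Cronbach's alpha: internal-consistency reliability of a set of survey items.
# item_scores is a list of per-item response lists, one entry per item,
# with students in the same order in every list. Data here is hypothetical.

def cronbach_alpha(item_scores):
    k = len(item_scores)          # number of items
    n = len(item_scores[0])       # number of students

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(items) for items in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var / variance(totals))
```

When items move together across students (as they should if they measure one construct), alpha approaches 1; uncorrelated items drive it toward 0.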

What does the survey measure?
The survey does measure elements of the student experience that have been shown to correlate most closely with student growth.
The survey does not measure whether or how much a student likes or dislikes a teacher.

What does the survey measure? See the full surveys for grades 3-5 and 6-12.
– Student Learning: how teachers use content and pedagogical knowledge to help students learn, understand, and improve.
– Student-Centered Environment: how teachers create an environment that responds to individual students’ backgrounds, strengths, and interests.
– Classroom Community: how teachers cultivate a classroom learning community where student differences are valued.
– Classroom Management: how teachers foster a respectful and predictable learning environment.
These elements map to Colorado’s Teacher Quality Standards I, II, and III.

Survey Design & Development Process
– Construct definition & item development: April – May 2012
– Item/construct review, including district, expert, and teacher feedback: May 2012
– Psychometric field test (establish baseline psychometric properties and refine the instrument as needed before the use pilot): June 2012
– Think-alouds / cognitive interviews: August 2012
– Fall use pilot (administer the survey to integration and pilot districts): November 2012
– Fall pilot analyses (analyze data to inform the second round of instrument revisions): November 2012 – March 2013
– Teacher feedback survey (administered to 12 of the participating districts): January 2013
– Teacher focus groups, round 1 (convened to discuss the instruments, recommended changes, and preferences for reporting formats): March 2013
– Analyze & finalize results (prepare reports and guidance documents regarding analysis and use of survey data, with help of teacher focus groups): Dec – April 2012
– Spring validation pilot administration: April – May 2013
– Teacher focus groups, rounds 2 & 3 (convened to discuss the pilot process, lessons learned, and future communication materials): June & August 2013
– Prep & release full toolkit (free and publicly available): May – August 2013

Student Feedback
Students participated in “think-alouds,” where they talked through their responses to each question. Students responded thoughtfully:
– “In my class, we learn things that matter to me”: “She made the people who speak Spanish feel more important because we participated… we could teach about our culture. [It] taught us to trust in ourselves.”
– “My teacher knows when we understand the lesson and when we do not”: “I say most of the time… one of my friends didn’t understand and when she asked if we all understood, he didn’t say anything [and she didn’t know and kept on going].”

Teacher Feedback
Over 1,400 teachers provided input during the survey development process, through focus groups, survey pre-piloting, and online feedback forums.
We took feedback about the instrument very seriously; items were changed or eliminated for specific reasons. For example, between the fall and spring administrations:
– The item “I get bored in this class” was removed because many teachers found it troubling.
– “Schoolwork in this class is too easy” was removed because it was not related to students’ responses on other items.

Revisions to the Colorado SPS
– Removed all negatively stated items: the fall 2012 instrument included a handful of negatively worded items, all of which were removed from the final SPS instrument.
– Redefined the organizing elements (four elements mapped to the Teacher Quality Standards).
– Included an open-ended question.

Summary of Findings: Teacher-Level
[Figure 1. Overall teacher mean score vs. percent favorable]
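Figure 1 compares two common ways of summarizing a teacher's results: the mean item score (Likert responses coded numerically) and the percent favorable. The 0-3 coding below is an illustrative assumption, not necessarily the coding used in the actual analysis:

```python
# Two summaries of one teacher's survey responses: mean score vs. percent
# favorable. The 0-3 coding of the response scale is an assumption.

CODES = {"Never": 0, "Sometimes": 1, "Most of the Time": 2, "Always": 3}
FAVORABLE = {"Most of the Time", "Always"}

def mean_score(responses):
    """Average numeric code across all of a teacher's responses."""
    return sum(CODES[r] for r in responses) / len(responses)

def percent_favorable(responses):
    """Share of responses in the top two scale points, 0-100."""
    return 100.0 * sum(r in FAVORABLE for r in responses) / len(responses)
```

The two summaries usually track each other closely, which is what a scatter of mean score against percent favorable (as in Figure 1) makes visible; divergence flags teachers whose response distributions are unusually polarized.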