Scoring and Reporting of Results
Teachers and school leaders access reports online within 6 weeks of administration.
– Scoring is based on percent positive ("Most of the Time" or "Always" responses); a minimal scoring sketch follows this slide.
– Scores are grouped into quintiles for reporting, since no LEAP ratings are given at the measure level.
– Results are combined with other measures to calculate LEAP performance categories when ratings are given.
– Individual teacher data are compared to district and school percent positive.
– Data are reported at the school and teacher level, and disaggregated by:
– Category and question
– Demographic data (ethnicity, gender, ELA, SPED)
– Response distribution
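To make the scoring rule concrete, here is a minimal sketch of the percent-positive and quintile logic described above. The response data, function names, and the exact quantile method are assumptions for illustration, not DPS's actual pipeline:

```python
from statistics import quantiles

# A response counts as positive if it is "Most of the Time" or "Always".
POSITIVE = {"Most of the Time", "Always"}

def percent_positive(responses):
    """Share of a teacher's responses that are positive, as a percentage."""
    return 100 * sum(r in POSITIVE for r in responses) / len(responses)

def quintile_of(score, all_scores):
    """Quintile label (1 = lowest 20%, 5 = highest 20%) for one score."""
    cuts = quantiles(all_scores, n=5)  # 20th/40th/60th/80th percentile cuts
    return 1 + sum(score > c for c in cuts)

# Hypothetical responses for three teachers.
teachers = {
    "A": ["Always", "Sometimes", "Most of the Time", "Never"],
    "B": ["Always", "Always", "Most of the Time", "Sometimes"],
    "C": ["Sometimes", "Never", "Sometimes", "Most of the Time"],
}
scores = {t: percent_positive(rs) for t, rs in teachers.items()}
print(scores)  # {'A': 50.0, 'B': 75.0, 'C': 25.0}
print({t: quintile_of(s, list(scores.values())) for t, s in scores.items()})
# {'A': 3, 'B': 4, 'C': 2}
```

With real data the quintile cut points would be computed over all teachers in the district, so each reported quintile holds roughly 20% of teachers.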
Supports, Alignment, and Next Steps
Teachers discuss results with school leaders in mid-year conversations. Results are part of a holistic conversation that encompasses all LEAP data to date, including Observation and Professionalism.
Recommendations and guiding questions are provided to school leaders, team leaders, and teachers in training materials (how to look at results in the context of other LEAP data).
Data analysis of alignment to other measures is ongoing. Teachers who received additional Observation support through Differentiated Teacher Leaders saw a 1% average increase in scores over expected; the Teacher Leaders themselves saw a 2% average increase.
Next steps:
– Best-practice recommendations and materials for involving students more deeply
– Formal professional learning materials correlated directly to Student Perception Survey results
Survey Alignment with MET
MET recommendation: Measure what matters – questions should focus on what teachers do and on the learning environment they create.
DPS LEAP: Questions are revised based on results, extensive feedback, external review, and statistical analysis to ensure they are relevant and appropriate.
MET recommendation: Ensure accuracy – student responses should be honest and based on a clear understanding of the questions; confidentiality is a must.
DPS LEAP: Administration protocols are continually examined. Administration is based on state testing protocols for confidentiality, and it is recommended that the teacher not administer the survey.
MET recommendation: Ensure reliability – reliability requires adequate sampling and enough items so teachers can be confident that surveys produce reasonably consistent results.
DPS LEAP: We found no statistical difference between two administrations a year (one way to run such a comparison is sketched below), so we reduced to one administration plus one make-up administration to lessen the impact on instructional time, and added a second optional administration class period for teachers.
MET recommendation: Support improvement – teachers should receive results in a timely manner, understand what they mean, and have access to PD.
DPS LEAP: Teachers and school leaders can access results online approximately a month after administration, in time for mid-year conversations. We are still working on supports for improvement.
Source: Asking Students about Teaching: Student Perception Surveys and Their Implementation (2012)
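The "no statistical difference between two administrations" finding implies a paired comparison of teacher-level scores across the fall and spring administrations. The sketch below shows one plausible way to run such a check with a paired t-test; the data and the choice of test are assumptions, not the district's documented method:

```python
from scipy import stats

# Hypothetical percent-positive scores: one fall and one spring score per teacher.
fall   = [62.0, 71.5, 55.0, 80.2, 68.4, 74.1]
spring = [63.5, 70.0, 56.2, 79.8, 69.0, 75.3]

t_stat, p_value = stats.ttest_rel(fall, spring)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value is consistent with no detectable difference between
# administrations, which supports moving to a single yearly administration.
```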
Engaging Students in the Educator Effectiveness Conversation: Building a Robust Student Perception Survey
Amy Farley, Director, Research and Impact, Colorado Legacy Foundation
Overview
– Why use a Student Perception Survey?
– What the research says
– Survey overview
– Survey development
– Pilot results
– Survey administration
– Use of survey results
Why Use a Student Perception Survey?
The survey is a unique form of actionable feedback that districts, schools, and teachers can use to inform practice. Students are in a unique position to contribute to a comprehensive view of classroom practice because they experience it more than anyone else in the education system. Student perception data can offer a big-picture view of what is happening in classrooms, as well as school- and district-wide trends.
What the Research Says
The Measures of Effective Teaching (MET) Project had two significant findings around student perception surveys:
– When student surveys are combined with observation and student growth data, these three measures together tell us more, and predict future effectiveness better, than any of them alone.
– Student perception survey results are correlated with student achievement gains.
The use of student feedback has also been shown to promote both reflection and responsibility on the part of students.
Research overview:
– Bill and Melinda Gates Foundation (2012). Asking students about teaching: Student perception surveys and their implementation (MET Project Policy and Practice Brief). Retrieved from http://www.metproject.org/downloads/Asking_Students_Practitioner_Brief.pdf
– Wiggins, G. (2011). Giving students a voice: The power of feedback to improve teaching. Education Horizons, 89(3), 23-26.
Colorado’s Student Perception Survey
– Free and publicly available
– 34-item survey about student learning experiences
– Two versions of the survey: grades 3-5 and 6-12
– Developed by the Colorado Legacy Foundation with input from more than 1,400 teachers
– Piloted in 16 Colorado districts
– Rigorous analyses confirm that the survey is fair, valid, and reliable
– Maps to Colorado’s Teacher Quality Standards
See the full Technical Report for details.
What does the survey measure?
The survey does measure elements of student experience that have been demonstrated to correlate most closely with student growth. It does not measure whether or how much a student likes or dislikes a teacher.
What does the survey measure? (See the full surveys for grades 3-5 and 6-12.)
– Student Learning: how teachers use content and pedagogical knowledge to help students learn, understand, and improve.
– Student-Centered Environment: how teachers create an environment that responds to individual students’ backgrounds, strengths, and interests.
– Classroom Community: how teachers cultivate a classroom learning community where student differences are valued.
– Classroom Management: how teachers foster a respectful and predictable learning environment.
These elements map to Colorado Teacher Quality Standards I, II, and III.
Survey Design & Development Process
– Construct Definition & Item Development: April – May 2012
– Item/Construct Review, including district/expert/teacher feedback: May 2012
– Psychometric Field Test (establish baseline psychometric properties and refine the instrument as needed before the Use Pilot): June 2012
– Think-Alouds/Cognitive Interviews: August 2012
– Fall Use Pilot (administer the survey to integration and pilot districts): November 2012
– Fall Pilot Analyses (analyze data to inform the 2nd round of instrument revisions): Nov 2012 – March 2013
– Teacher Feedback Survey (administered to 12 of the participating districts): January 2013
– Teacher Focus Groups, Round 1 (convened to discuss the instruments, recommended changes, and preferences for reporting formats): March 2013
– Analyze & Finalize Results (prepare reports and guidance documents regarding analysis/use of survey data with help of teacher focus groups): Dec 2012 – April 2013
– Spring Validation Pilot Administration: April – May 2013
– Teacher Focus Groups, Rounds 2 & 3 (convened to discuss the pilot process, lessons learned, and future communication materials): June & August 2013
– Prep & Release Full Toolkit (free and publicly available): May – August 2013
Student Feedback
Students participated in “think-alouds” where they talked through their responses to each question. Students responded thoughtfully:
– “In my class, we learn things that matter to me”: “She made the people who speak Spanish feel more important because we participated… we could teach about our culture. [It] taught us to trust in ourselves.”
– “My teacher knows when we understand the lesson and when we do not”: “I say most of the time… one of my friends didn’t understand, and when she asked if we all understood, he didn’t say anything [and she didn’t know and kept on going].”
Teacher Feedback
Over 1,400 teachers provided input during the survey development process, through focus groups, survey pre-piloting, and online feedback forums.
We took feedback about the instrument very seriously: items were changed or eliminated for specific reasons. For example, between the fall and spring administrations:
– The item “I get bored in this class” was removed because many teachers found it troubling.
– “Schoolwork in this class is too easy” was removed because it was not related to students’ responses on other items (the pattern an item-total correlation check, sketched below, is designed to catch).
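The removal rationale for the “too easy” item (“not related to students’ responses on other items”) is what a corrected item-total correlation flags in standard item analysis. Here is a minimal sketch, assuming hypothetical 1-4 Likert data and a hypothetical flagging threshold, not the analysis the developers actually ran:

```python
import numpy as np

def corrected_item_total(responses):
    """Correlation of each item with the sum of all *other* items.

    responses: students x items array of numeric answers (e.g., a 1-4 scale).
    """
    totals = responses.sum(axis=1)
    return np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])

# Hypothetical data: 200 students x 4 items; items 0-2 move together,
# item 3 is pure noise (like an item unrelated to the rest of the survey).
rng = np.random.default_rng(0)
data = rng.integers(1, 5, size=(200, 4))
data[:, :3] = np.clip(data[:, [0]] + rng.integers(-1, 2, size=(200, 3)), 1, 4)

print(corrected_item_total(data).round(2))  # item 3's value sits near zero
# Items below a chosen threshold (say 0.2) would be flagged for review or removal.
```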
Revisions to the Colorado SPS
– Removed all negatively-stated items: the fall 2012 instrument included a handful of negatively-worded items; all were removed from the final SPS instrument.
– Redefined the organizing elements (four, mapped to the Teacher Quality Standards).
– Included an open-ended question.
Summary of Findings: Teacher-Level
[Figure 1. Overall teacher mean score vs. percent favorable]
A sketch of how these two metrics are computed from the same responses follows below.
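Figure 1 compares two ways of summarizing the same responses at the teacher level. The sketch below shows both computations, assuming a hypothetical numeric coding (1 = Never, 2 = Sometimes, 3 = Most of the Time, 4 = Always) with 3 and 4 counted as favorable; the coding is an assumption for illustration:

```python
def mean_score(responses):
    """Mean of numeric item responses on the 1-4 scale."""
    return sum(responses) / len(responses)

def percent_favorable(responses):
    """Percent of responses coded 3 (Most of the Time) or 4 (Always)."""
    return 100 * sum(r >= 3 for r in responses) / len(responses)

responses = [4, 3, 2, 4, 1, 3, 4, 2]
print(mean_score(responses))        # 2.875
print(percent_favorable(responses)) # 62.5
# The metrics track each other but are not identical: the mean preserves
# gradations within the favorable and unfavorable ranges, while percent
# favorable collapses them, which is why the figure plots one against the other.
```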