1
Resource Orientation Hi there! I’m Doug from Panorama’s Teaching & Learning team. Incorporate the following slides at the outset of your school’s survey work, or use them to answer questions as they arise throughout the school year. Consider the language in the notes section when voicing over these slides to an audience. Happy facilitating!
2
“Can we trust this data?”
3
The Question: Can we trust this data?
This is a common and reasonable question. If we expect teachers, whose time is already limited, to take action on survey results, we’ll want to ensure they understand where the data comes from and why we believe it’s valid. At Panorama, we ask this same question. A central commitment of our organization is ensuring the validity of our surveys. Our mission of radically improving outcomes for students depends on it.
4
Framing to keep in mind when considering your data
In completing this survey, our students/staff/families are asking us to hear them. Reports include numbers, but those numbers represent student voice. Consider survey data in the context of additional, anecdotal information. Student/staff/family voice should complement, not replace, other information sources such as the experience and opinions of educators in the classroom. Keep in mind that Panorama topic scores aggregate many responses to several questions, so concerns about an individual student’s response to a single question shouldn’t prevent meaningful analysis of topic-level information.
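These slides don’t spell out Panorama’s exact scoring rules, but a minimal sketch can illustrate why any one response to any one question carries limited weight in a topic score. The sketch below assumes a hypothetical aggregation rule (topic score = mean of each student’s item means on a 1–5 scale); the names and data are invented for illustration only.

```python
from statistics import mean

# Hypothetical responses: three students each answer three items in one topic (1-5 scale).
student_responses = {
    "student_a": [4, 5, 3],
    "student_b": [2, 4, 4],
    "student_c": [5, 5, 4],
}

# Assumed aggregation rule: average each student's items, then average across students.
per_student_means = [mean(items) for items in student_responses.values()]
topic_score = mean(per_student_means)
print(f"Topic score: {topic_score:.2f}")
# Changing any single answer by one point shifts this topic score by only ~0.11,
# which is why one puzzling individual response shouldn't derail topic-level analysis.
```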
5
The Panorama Survey Development Process
6
The Development Process
(Process diagram: Extensive Literature Review → Focus Groups → Synthesize → Survey Item Creation → Expert Review → Cognitive Pretesting → Quantitative Review → Valid and Reliable Data from Surveys)
Our pursuit of reliable and valid data is baked into our development process. Our research team conducts an extensive review of the literature to understand best practices in survey design. We run focus groups with potential respondents to learn how they actually think and communicate about the constructs we’re assessing. Then we write the items; there’s a science here (e.g., make it a question rather than a statement, offer enough but not too many answer options). We have experts in the field review our work: essentially, they take a survey on the surveys to ensure we’re adhering to industry best practices, and we revise accordingly. We go back to the potential respondent pool (students, staff, or families) for cognitive pretesting, asking “How do you understand this question?” and revising or discarding misunderstood questions. Once a survey is administered, we get a wealth of data points which, taken together, can go a long way toward validating an item or flagging a problem with it.
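The quantitative review step is described only at a high level here. As one illustration of the kind of check a research team could run on administered survey data, the sketch below computes Cronbach’s alpha, a standard internal-consistency statistic, on hypothetical item responses; it is not a description of Panorama’s actual analysis pipeline.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal-consistency estimate for a set of survey items.

    item_scores: 2-D array, rows = respondents, columns = items in one topic.
    """
    k = item_scores.shape[1]                         # number of items
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: five students, four items, 1-5 scale.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(responses):.2f}")
```

Values closer to 1 indicate that the items hang together as a measure of a single construct, which is one signal (among many) that an item set is behaving as intended.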
7
Technical Reports on the Validity and Reliability of Panorama Surveys
8
Yes, we can trust this data!
Links to further research:
Preliminary Report: Reliability and Validity of Panorama’s Social-Emotional Learning Measures
Validity Brief: Panorama Student Survey