Thanks for the reports…
…but what good can they do for me, my team, and my districts? 5/2/17, Jason Altman, TerraLuna Collaborative
This morning’s schedule
Introductions / Jason makes the case for value (What? Why?)
How can we use the reports / Peer sharing (How?)
Workshop (When? Where? With whom?):
Simulation exercise
Region report dive
Region review
Region action plan
State recap
Q and A
Introduction to cover questions like:
What are the reports? Why should I be motivated to use them?
Our aim was increased utility
Our vision: you, in a virtual or in-person meeting with your closest contacts in each of the buildings that you serve, with the reports facilitating a deep and wide-ranging conversation.
Reports fueled by a new survey
We made a number of modifications (in reality, we created a new tool) as a result of:
Our conversations with you about your needs
Our continued monitoring of survey dynamics like validity and reliability, and known issues with Type I and Type II errors
As a result of the changes to the survey, we've moved away from an instrument that measures one main idea!
And our new survey is valid AND reliable
According to Widaman (1985), values above 0.9 on the following indicate a valid survey and scales.

Scale                                       Alpha   Maximum Alpha
Effective Teaching and Learning Practices   0.84
Common Formative Assessment                 0.79    0.82
Data-based Decision-making                  0.85
Leadership                                  0.89
Professional Development

For alpha, we are looking for values above 0.7 but below 0.9 (above 0.9 would mean we have more items than we need), and for values that are close to the maximum. The items also performed well on another part of the confirmatory factor analysis, in testing the null hypothesis that the true value of each item is zero. In other words, the items add value to the scales.
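For those curious how the alpha values in the table are computed, here is a minimal sketch of Cronbach's alpha in Python. This is an illustration of the standard formula, not the analysis pipeline actually used for these reports; the function name and demo data are invented for the example.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]
    sum_item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)        # sample variance of total scores
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Perfectly correlated items yield the maximum alpha of 1.0
demo = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
print(round(cronbach_alpha(demo), 2))  # 1.0
```

Alpha rises as items co-vary more strongly, which is why values just below 0.9 suggest a coherent scale without redundant items.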
So what do we have to work with in the reports?
Participation Details
Overall Performance by Scale
Individual Response Plot
Scale Performance (by item)
Recent Progress (around 4 themes)
What exactly does it look like?
Individuals who were present in the building during the school year will be interested to know how folks rate change. On the figure below, any values exceeding 50 indicate positive momentum. Individual dives into scale performance investigate educator responses by item* and, in the chart above, show the percentage of educators answering "most of the time" or "agree."
*While scale performance stabilizes at about 8 responses, be careful about how you communicate individual items for schools with low numbers of responses.
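The "percentage favorable" figure in the charts can be illustrated with a short sketch. The function name and the favorable categories below are assumptions for illustration; they are not taken from the actual report code.

```python
def percent_favorable(responses, favorable=("agree", "most of the time")):
    """Percentage of responses falling in the favorable categories."""
    hits = sum(1 for r in responses if r.strip().lower() in favorable)
    return 100.0 * hits / len(responses)

print(percent_favorable(["Agree", "Disagree", "Most of the time", "Never"]))  # 50.0
```

With only a handful of respondents, a single response shifts this percentage by a large margin, which is why the low-response caveat above matters.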
That’s all great, but what was on the survey again?
As a reminder, our new instrument measures the implementation of the underpinnings of the Collaborative Work process. It is sensitive to issues with self-reporting in most areas, and provides scale scores for:
Effective Teaching and Learning Practices (8 items)
Common Formative Assessment (4 items)
Data-based Decision-making (5 items)
Leadership (4 items)
Professional Development (4 items)
Please follow the QR code above to access the survey.
Peer sharing: How can we use the reports?
Workshop (When?, Where?, and With whom?)
Simulation exercise – 30m
Region report dive – 30m
Region review – 10m
Region action plan – 10m
State recap – 10m
Q and A – until lunch
Region report dive – 30m
Region review – 10m
Q5) If I could change one thing in the reports, or in the process of preparing the reports, to help facilitate more meaningful conversation at the local level, it would be…
Region action plan – 10m
State Recap – 10m
Question and Answer Session – Until lunch
Contact Details: