MONITORING: THINK BIG, ACT SMALL
System Implementation and Monitoring (SIM), June 2016
Monitoring for Improvement and Learning: Think Big, Act Small
Learning Intentions
- Monitoring overview
- Where are we going? Establishing success criteria
- How are we going? Data collection processes
- Where to next? Data analysis
What Do We Mean By Monitoring?
Monitoring is not about ‘after the fact’. It’s built into the learning experience. It’s about identifying what’s working and doing more of it. (Adapted from Monitoring by M. Fullan)
What are we learning as a result of the actions we’ve taken to improve student achievement and engagement in Mathematics?
https://youtu.be/6OhYEFsIsIE
Where are we going?
“I think it’s very important to have a feedback loop, where you’re constantly thinking about what you’ve done and how you could be doing it better.” - Elon Musk
Needs Assessment (Tozer and Holmes, 2005)
Learning Goals
At the beginning of each learning period, learning goals provide explicit answers to the following questions: Where am I going? What am I expected to learn?
From Current to Desired
[Diagram: moving from current knowledge and skills to desired knowledge and skills, with the learning goal reached through scaffolding]
Success Criteria
Success criteria describe what learners look for during the learning and what it looks like once they have learned it. They help learners answer “How am I going?” Success criteria identify “look-fors” that enable learners to monitor their progress towards reaching their goals.
Reflective Questions: “Where are we going?”
- What baseline data do you have?
- How do you determine your greatest student need? What evidence/data do you use?
- Does this student need define the teacher/leader learning need?
- What am I expected to learn?
Share your thinking in the chat pod.
How are we going?
Monitoring alerts us to the possibility, even the likelihood, that the impact of our strategic action in Mathematics is beginning to emerge and take hold in the instructional core.
Monitoring: from Current State (baseline) to Desired State (future baseline)
- What’s becoming increasingly more “visible” in our school environment (classroom, instructional core)?
- What appears to be working and what’s not working?
- What do the initial feedback and results reveal about progress towards achieving our Mathematics goals?
Monitoring with Success Criteria
- What would success in Mathematics look like in our classrooms and schools? What are the look-fors?
- How might the emergence of success criteria help to support and focus monitoring in Mathematics?
Evidence as Boulders, Pebbles and Sand
What’s becoming visible in your sandbox?
Reflective Questions: “How are we going?”
- What are we learning about our students? What evidence supports our thinking?
- What are we learning about our practice to address the student learning need? What evidence supports our thinking?
- How might our monitoring/data plan be improved? What do we suggest doing?
Share your thinking in the chat pod.
Where to next?
What story does the data tell us? How can we use it to improve student learning?
“The goal is to turn data into information and information into insight.” - Carly Fiorina
Moving From Outcomes To Impact
Qualitative
■ Data with description
■ Data that can be observed
■ Match with success criteria/outcomes
■ Seeks to explain (why, how)
■ Interviews, focus groups, observations, SWS
Quantitative
■ Data with numbers
■ Data that can be measured
■ Uses statistical data analysis
■ May be generalized to a larger population
■ Surveys, rating scales, student assessments
Triangulation of Data
“If we want to glean real insights, Big Data and Small Data are partners in a dance.”
“Big Data is about finding correlations, but Small Data is all about finding the causation, the reason why.”
Martin Lindstrom, Small Data: The Tiny Clues That Uncover Huge Trends
Effective Use of Data:
- Highlights the urgent learning need (who/what is improving and who/what is not, and why not).
- Provides evidence to support aligning resources and decisions about instruction with priorities.
- Leads to deliberate leadership practices.
- Closes the feedback loop.
- Highlights how we learn through this process.
- Collaborative work with colleagues builds a context in which educators can interpret and use data more effectively.
How Do We Analyze Data To Inform Next Steps?
Processes and protocols from the field:
- Pedagogical Documentation
- Inquiry Cycle (Plan, Act, Assess, Reflect)
- Tagging key concepts (Google Folders)
- SWST (coding using success criteria)
- Superintendent School Visits
- School Self-Assessment/District Reviews (SEF)
- Large-Scale Assessment Analysis (EQAO)
- Transcripts of Adobe sessions
- Twitter hashtags
Reflective Questions: “Where to next?”
- Does the evidence (P/C/O) we have collected link directly to the success criteria/intended outcomes we are trying to achieve?
- How does the evidence measure what is working and what is not working?
- How are we using classroom-level data to inform school decisions? School-level data to inform the system?
- How are we learning from our work? What needs, variations, and/or adaptations are emerging?
Find all the resources from this session at https://sim.abel.yorku.ca