Imagine the possibilities Peg Balachowski Everett Community College

Using Improvement Science to Improve Student Outcomes Anthony Bryk is the current president of the Carnegie Foundation for the Advancement of Teaching, where he is leading work on transforming educational research and development by more closely joining researchers and practitioners to improve teaching and learning. "There is no blueprint. We are inventing this together as we go along." – Anthony Bryk

Thanks to Lawrence Morales, Associate for Improvement Science, and Jane Muhich, Managing Director for Community College Programs and Director of Productive Persistence, for the use of their work.

got persistence?

Outline: Improvement Science; the Model for Improvement; tools in Improvement Science (Driver Diagrams, PDSA Cycles, Run Charts).

Carnegie’s Networked Improvement Community Language & Literacy Productive Persistence Improvement Science Advancing Teaching Analytics

Carnegie’s Pathways Addressing the Developmental Math challenge with a Pathways approach: Create a high quality Instructional System Attend to Language and Literacy issues Advance Quality Teaching Sustain a Networked Improvement Community Promote Productive Persistence

Improvement Science A field designed to provide a framework for research that is focused on improvement. It poses questions about a change being tested, proposes predictions about how the test will go, uses a testing mechanism to answer questions and verify predictions, collects data to support conclusions, and analyzes the data to inform later iterations of tests.

Langley, G. J., et al (2009). The improvement guide: a practical approach to enhancing organizational performance (2nd ed.). San Francisco: Jossey-Bass.

The Model for Improvement (MFI): the Aim, the Measures, and the "Change Ideas."

Three Questions for Guiding Work: What are we trying to accomplish? How will we know that a change is an improvement? What changes can we make that will result in an improvement? One Method for Testing Change: the Plan-Do-Study-Act (PDSA) Cycle.

Tools in Improvement Science 1. Driver Diagrams 2. PDSA Cycles 3. Run Charts

Productive Persistence The tenacity plus good strategies that math students need in order to be successful.

Tool 1: The Driver Diagram. AIM: What are we trying to do? (What? Who? By when?) MEASURES: How will we know a change is an improvement? All three essential questions are on the driver diagram, a clear framework.

The Driver Diagram. AIM: What are we trying to do? MEASURES: How will we know a change is an improvement? Primary drivers (Driver 1, Driver 2, Driver 3) branch into secondary drivers (Driver A through Driver E).

The Driver Diagram. CHANGE: What changes can we test? Change ideas attach to the secondary drivers, completing the diagram: aim and measures, primary drivers, secondary drivers, and change ideas.

Aim: Students continue to put forth effort during challenges, and when they do so they use effective strategies. We have five primary drivers of productive persistence: Skills & Habits, Mindsets, Value, Social Ties, and Faculty. The list is not meant to be exhaustive; these are high-leverage drivers: if we got traction in those areas we would make a difference in our desired outcome.

Aim: Students continue to put forth effort during challenges, and when they do so they use effective strategies. Primary drivers: Skills and Habits (students have the skills, habits, and know-how to succeed in a college setting); Mindsets (students believe they are capable of learning math); Value (students believe the course has value); Social Ties (students feel socially tied to peers, faculty, and the course); Faculty (faculty and the college support students' skills and mindsets).

Aim: Students continue to put forth effort during challenges, and when they do so they use effective strategies.
Skills and Habits: Students have the skills, habits, and know-how to succeed in a college setting. Secondary drivers: have accurate knowledge about succeeding in the course and navigating the institution; have the know-how and self-discipline to set and prioritize long- and short-term goals over short-term desires and distractions; use learning strategies that are appropriate for the academic challenge they are facing; have strategies for regulating anxiety; see that math isn't just a set of algorithms to be memorized but a connected set of concepts that can be understood and applied.
Mindsets: Students believe they are capable of learning math. Secondary drivers: believe they can actively grow their math ability with effort, help, and good strategies; view math success as something "people like them" do, and not something "other people" do.
Value: Students believe the course has value. Secondary drivers: students see how completion of this course is relevant to goals for degree/certificate completion; students believe the knowledge from the course is relevant to a personal or socially valued goal; students feel as though they are completing academic tasks for personal reasons.
Social Ties: Students feel socially tied to peers, faculty, and the course. Secondary drivers: students feel that the professor cares that they, personally, succeed in the course and in college; students feel comfortable asking questions; students do not feel stigmatized due to membership in a negatively stereotyped group; students do not question whether they belong.
Faculty: Faculty and the college support students' skills and mindsets. Secondary drivers: faculty believe students can succeed if they develop more productive skills and mindsets; faculty know how to promote productive skills and mindsets; faculty see helping their students productively persist as part of their role as an instructor; faculty integrate PP principles in how they talk to students and in the curriculum they assign.
Possible measures: attendance, time on task, strategy use, help-seeking, revising work, challenge-seeking.

Aim: Students continue to put forth effort during challenges, and when they do so they use effective strategies. The five primary drivers (Skills & Habits, Mindsets, Value, Social Ties, Faculty) each have secondary drivers. Let's look specifically at the Social Ties driver.

Primary driver: Social Ties. Secondary drivers: students feel the professor cares that they succeed; students feel comfortable asking questions; students do not doubt their belonging in the course.

Change ideas for the first secondary driver (students feel the professor cares that they succeed): email routines, use-of-names routines, your ideas?

Change ideas for the second secondary driver (students feel comfortable asking questions): answering routines, a data collection routine, your ideas?

Change ideas for the third secondary driver (students do not doubt their belonging in the course): group role routine, group noticing routine, your ideas? CHANGE: How do we know these work?
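The hierarchy of a completed branch (aim, primary driver, secondary drivers, change ideas) maps naturally onto a nested dictionary. A minimal sketch in Python using the Social Ties branch above; the data structure itself is illustrative, not a Carnegie artifact:

```python
# Driver diagram as nested data: aim -> primary driver -> secondary drivers -> change ideas.
social_ties_branch = {
    "aim": ("Students continue to put forth effort during challenges "
            "and use effective strategies when they do"),
    "primary_driver": "Social Ties",
    "secondary_drivers": {
        "Students feel the professor cares that they succeed": [
            "Email routines", "Use-of-names routines",
        ],
        "Students feel comfortable asking questions": [
            "Answering routines", "Data collection routine",
        ],
        "Students do not doubt their belonging in the course": [
            "Group role routine", "Group noticing routine",
        ],
    },
}

# Flatten out every change idea queued for testing under this driver.
change_ideas = [
    idea
    for ideas in social_ties_branch["secondary_drivers"].values()
    for idea in ideas
]
print(change_ideas)  # six change ideas, two per secondary driver
```

Keeping the diagram as data makes it easy to enumerate the change ideas that still need a PDSA test.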

Tool 2: PDSA Cycles “A way to turn change ideas into action and connect action to learning.”* A disciplined way to test ideas. Avoids implementing too soon. *Langley GL, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance (2nd edition). San Francisco: Jossey-Bass Publishers; 2009.

PDSA Cycles. CHANGE: What changes can we test? PLAN: What's the change? What's your prediction? Plan to conduct the test. DO: Execute the test; collect data and document observations. STUDY: Compare results to the prediction. What did you learn? ACT: Next steps: adapt, adopt, or abandon. "A way to turn change ideas into action and connect action to learning."
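The discipline of writing up each cycle can be supported by a fixed record structure. A hypothetical sketch; the field names and the example cycle are invented for illustration, not a Carnegie form:

```python
from dataclasses import dataclass


@dataclass
class PDSACycle:
    """A single Plan-Do-Study-Act record: one change idea, one prediction,
    and what was learned from testing it."""
    change_idea: str   # PLAN: what's the change?
    prediction: str    # PLAN: what do we expect to happen?
    observations: str  # DO: data collected and observations documented
    learning: str      # STUDY: results compared against the prediction
    decision: str      # ACT: "adapt", "adopt", or "abandon"


# An invented example cycle (not from the talk).
cycle = PDSACycle(
    change_idea="Hand one student a slip for tallying questions asked in class",
    prediction="The student can tally questions without disrupting the class",
    observations="The slip was filled out, but the layout confused the student",
    learning="The idea works; the form needs simpler wording",
    decision="adapt",
)
print(cycle.decision)
```

Requiring every field before a cycle counts as "done" enforces the prediction-before-test habit that distinguishes a PDSA cycle from simply trying things.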

PDSA Cycles in Productive Persistence Aim: To measure student sense of belonging, develop an efficient way to gather data on the number of students asking questions Cycle 1: Feedback from ONE student. Cycle 2: Test with ONE student in class. Cycle 3: Test with MORE students in class. Cycles 4-6: Further testing with more students, more than one student at a time, and regular implementation. Cycle 7: Inform whole class more fully. Next Step?

References Provost, L. P., & Murray, S. K. (2011). The Health Care Data Guide: Learning from Data for Improvement. San Francisco, CA: Jossey-Bass.

Tool 3: Run Charts A graphical display of data that allows you to track outcomes over time.
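A run chart of this kind takes only a few lines to draw. A sketch using matplotlib (assumed available), with invented weekly attendance values rather than data from the talk:

```python
import statistics

import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Invented weekly attendance rates, for illustration only.
weeks = list(range(1, 12))
attendance = [0.90, 0.88, 0.85, 0.70, 0.84, 0.86, 0.83, 0.80, 0.87, 0.89, 0.88]

median = statistics.median(attendance)

fig, ax = plt.subplots()
ax.plot(weeks, attendance, marker="o")  # the outcome, plotted over time
ax.axhline(median, linestyle="--", label=f"median = {median:.2f}")
ax.set_xlabel("Week")
ax.set_ylabel("Attendance rate")
ax.set_title("Run chart: attendance by week")
ax.legend()
fig.savefig("run_chart.png")
```

The dashed median line is what makes this a run chart rather than a plain time series: the signal rules in the slides that follow are all stated relative to the median.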

Run Chart Scenario Around the 4th week of the term, an instructor notices that a lot of students are missing class. She brainstorms and eventually has an idea for how to reduce absences. Her plan is to implement the idea in Week 8 of the semester and then compare data from Week 4 and Week 11 to see if it worked. Here are her data (a bar chart of the Week 4 data, the Week 8 implementation, and the Week 11 data).

Questions to Ponder (Week 8: implementation): Did this change lead to an improvement? What questions do you have about the data? What other factors might be at play?

Group Activity 1) Look at the following 5 Run Charts: Note: Each one has the same data for Weeks 4 & 11 as the bar chart. 2) Decide what conclusions you can make for each case: Did the change lead to improvement? Why or why not?

Case 1 Conclusion? The change appears to have had no effect: improvement was already happening steadily before the Week 8 implementation, and the data reflect this.

Case 2 Conclusion? Week 4 appears to be an outlier; otherwise there is normal variation.

Case 3 Conclusion? The improvement appears to have taken place in Week 5. What happened? The change in Week 8 did not lead to the improvement.

Case 4 The change appears to have led to an improvement, but after Week 11 the improvement seems to have faded. Conclusion?

Case 5 The change led to an improvement that was sustained but below the median. Conclusion?

How do we know there’s improvement? In all of these Cases, we need to consider the question: “When do the data give us a signal that we have improved on our outcome measure?” Signal is detected through use of one of 4 basic rules. If you have a signal, tells you you have change not attributed to normal variation. These 6 different cases show that you can have the same two data points for week 4 and 11, but they may represent completely different stories. With a run chart, we are able to more accurately whether a change leads to sustained improvement. Data in a bar graph (as in slide 2) could appear to show improvement, but they do not account for the variable possibilities demonstrated through these run chart cases.

Case 5 Revisited Does this count as an improvement? How do we know that the pattern in the data can be attributed to the change, and can we confirm that it is an improvement? The gains held for a while, but are there enough data points to draw that conclusion?

Run Chart Rules for Signals Shift: six or more consecutive points all above or all below the median (skip points on the median). Trend: five or more consecutive points all going up (increasing) or all going down (decreasing). Astronomical point: a value that is obviously different from the others in the data set.
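The shift and trend rules are mechanical enough to automate. A minimal sketch: astronomical points are a visual judgment call and are left out, and since tie-handling conventions vary, this sketch simply resets the trend count on ties:

```python
import statistics


def detect_signals(values):
    """Flag run-chart signals ("shift", "trend") in a sequence of data points."""
    median = statistics.median(values)
    signals = set()

    # Shift: >= 6 consecutive points all above or all below the median.
    # Points exactly on the median are skipped rather than breaking the run.
    run_len, run_side = 0, None
    for v in values:
        if v == median:
            continue
        side = v > median
        run_len = run_len + 1 if side == run_side else 1
        run_side = side
        if run_len >= 6:
            signals.add("shift")

    # Trend: >= 5 consecutive points all increasing or all decreasing.
    inc = dec = 1
    for prev, cur in zip(values, values[1:]):
        inc = inc + 1 if cur > prev else 1
        dec = dec + 1 if cur < prev else 1
        if inc >= 5 or dec >= 5:
            signals.add("trend")

    return signals


print(detect_signals([1, 2, 3, 4, 5, 4, 3]))  # five increasing points: a trend
```

For example, six alternating high values followed by six alternating low values trigger the shift rule, while steady week-over-week growth across five points triggers the trend rule.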

Example 1: Are there signals? Note: three of the four following examples contain real data; the y-axis units have been changed.

Example 1: Analysis Shifts? Trends? Astronomical points? None; no signals. There are no signals of improvement: the number of runs (6) is within the expected limits for this chart (13 points not on the median gives an expected range of 4 to 11 runs), so this is considered normal variation around the median, and there are no shifts, trends, or astronomical points.
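The runs count used in this analysis can be computed by tracking median crossings. A minimal sketch; note that the expected range (such as 4 to 11 runs for 13 off-median points) comes from a published table in the Provost & Murray reference and is not reproduced here:

```python
import statistics


def count_runs(values):
    """Count runs about the median: a run is a maximal stretch of consecutive
    points on the same side of the median (points on the median are skipped).
    Too few or too many runs signals non-random variation."""
    median = statistics.median(values)
    sides = [v > median for v in values if v != median]  # True = above median
    if not sides:
        return 0
    # A new run starts whenever the side of the median changes.
    return 1 + sum(1 for a, b in zip(sides, sides[1:]) if a != b)


# Example: median is 3.5, so the sides are below/above/above/below/below/below/above/above,
# which is 4 runs.
print(count_runs([3, 4, 5, 1, 2, 1, 5, 4]))
```

Comparing the count to the table's limits answers the "is this just normal variation?" question without eyeballing the chart.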

Example 2

Example 2 First two weeks: a downward trend from Jan 1 to just past Jan 8, with five consecutive decreasing points (plus one point equal to the previous data point). Middle of the month (1/8 to after 1/15): a shift, with six consecutive points below the median line. Through the rest of January: an upward trend, with five consecutive increasing points.

Example 3 Median = 0.83 The shift here is later in the data set. In the context of a community college classroom, this might be a pattern faculty usually sense in their classes, as it occurs far enough into the term that students may be tired and attendance drops off. According to the data alone, though, the shift indicates a non-random pattern. Having actual data helps back up these intuitions about attendance patterns and helps us think about how you might reverse that tendency and what you could try.

Example 4 Median = 0.78

Example 4 Median = 0.78 In Faculty E's run chart, there is a shift in the most recent data, from 10/1 until 10/8. We see that the shift happens after an exam, which might lead you to think about possible changes to test before or after the next exam to address a possible drop in attendance. If you tested a change and continued to collect data, you could compare attendance around that exam with the attendance on the run chart here. Even without collecting data, you would probably sense the lower attendance in your class; having actual data to back this up allows you to think about what to do next.

Reflections from users "From hearing the group discussions in the last couple of modules, they were teaching themselves. The amount that I had to come to the group[s] and make sure they were on task went down. They were becoming self-learners…. It was wonderful to see." (Productive Persistence faculty) "It helped me focus on what was going on with the students, particularly on who was engaged in class. I enjoyed doing the improvement work and I feel like it really helped me focus on teaching in a different way than I did before." "Writing up the [PDSA] cycles was helpful for me. I reflected more deeply on what I was trying to do, and it helped with planning for how I would modify it, what I would expect out of it, and what I would hope to see." (Aaron)

Thanks for your time! Questions? Peg Balachowski, mbalachowski@everettcc.edu