Teacher Quality, Quality Teaching, and Student Outcomes: Measuring the Relationships
Heather C. Hill
Deborah Ball, Hyman Bass, Merrie Blunk, Katie Brach, Charalambos Charalambous, Carolyn Dean, Séan Delaney, Imani Masters Goffney, Jennifer Lewis, Geoffrey Phelps, Laurie Sleep, Mark Thames, Deborah Zopf

Measuring teachers and teaching
Traditionally done at entry to the profession (e.g., PRAXIS) and later "informally" by principals
Increasing push to measure teachers and teaching for specific purposes:
Paying bonuses to high-performing teachers
Letting go of under-performing (pre-tenure) teachers
Identifying specific teachers for professional development
Identifying instructional leaders, coaches, etc.
I am not going to give the long history of measuring teachers here, but I wanted to give a sense of the history, what is out there, and what is coming down the pike. In Boston, 97% of teachers received the highest rating.

Methods for identification
Value-added scores
The average of a teacher's students' performance this year, differenced from the same students' performance last year
In a super-fancy statistical model
Typically used for pay-for-performance schemes
Problems
Self-report / teacher-initiated
Typically used for leadership positions, professional development
However, poor correlation with mathematical knowledge (r = 0.25)
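The gain-score idea behind value-added scores can be sketched in a few lines. This is a deliberately stylized illustration, not the "super-fancy" model the slide refers to; the classrooms and scores below are invented.

```python
# Stylized value-added sketch: a teacher's score is the average of her
# students' gains (this year's score minus last year's score).
# Real models add covariates, shrinkage, and random effects.

def value_added(students):
    """students: list of (last_year_score, this_year_score) pairs."""
    gains = [this - last for last, this in students]
    return sum(gains) / len(gains)

# Hypothetical classrooms (all numbers invented for illustration):
classroom_a = [(48, 55), (60, 64), (52, 61)]
classroom_b = [(50, 51), (62, 60), (55, 58)]

print(round(value_added(classroom_a), 2))  # mean gain for teacher A
print(round(value_added(classroom_b), 2))  # mean gain for teacher B
```

Note that nothing here adjusts for which students a teacher was assigned; that is precisely the problem the fancier statistical machinery tries to address.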

Identification: Alternative Methods
Teacher characteristics
NCLB's definition of "highly qualified"
More direct measures
Educational production function literature
Direct measures of instruction
CLASS (UVA): general pedagogy
Danielson, Saphier, TFA: ditto
But what about mathematics-specific practices?

Purpose of talk
To discuss two related efforts at measuring mathematics teachers and mathematics instruction
To highlight the potential uses of these instruments: research; policy?

Begin With Practice
Clips from two lessons on the same content, subtracting integers (middle school, southwestern district in the US)
What do you notice about the instruction in each mathematics classroom?
How would you develop a rubric for capturing differences in the instruction?
What kind of knowledge would a teacher need to deliver this instruction? How would you measure that knowledge?
I want to show you these because it will mimic our own process of developing this instrument.

Bianca
Teaching the material for the first time (Connected Mathematics)
Began the day by solving 5 − 7 with chips: red chips are a negative unit; blue chips are positive
Then moved to 5 − (−7)
Set up the problem, asked students to use chips
Gave students work time

Question
What seems mathematically salient about this instruction?
What mathematical knowledge is needed to support this instruction?

Mercedes
Early in her teaching career
Also working on integer subtraction with chips, from CMP
Mercedes started this lesson the previous day and returns to it again

Find the missing part for this chip problem. What would be a number sentence for this problem?
(Chart on slide: Start With | Rule | End With, with the rules "Add 5" and "Subtract 3")

Questions
What seems salient about this instruction?
What mathematical knowledge is needed to support this instruction?

What is the same about the instruction?
Both teachers can correctly solve the problems with chips
Both teachers have well-controlled classrooms
Both teachers ask students to think about the problem and try to solve it for themselves

What is different?
Mathematical knowledge
Instruction

Observing practice…
Led to the genesis of "mathematical knowledge for teaching"
Led to "mathematical quality of instruction"

Mathematical Knowledge for Teaching Source: Ball, Thames & Phelps, JTE 2008

MKT Items
From 2001 to 2008 we created an item bank for K-8 mathematics in specific areas (see www.sitemaker.umich.edu/lmt); thanks, NSF
About 300 items
Items mainly capture the subject-matter-knowledge side of the egg
We provide items to the field to measure professional growth of teachers, NOT for hiring, merit pay, etc.

MKT Findings
Cognitive validation, face validity, content validity
Have successfully shown growth as a result of professional development
Connections to student achievement (SII):
Questionnaire consisting of 30 items (scale reliability .88)
Model: student Terra Nova gains predicted by student descriptors (family SES, absence rate) and teacher characteristics (math methods/content, content knowledge)
Teacher MKT significant
Small effect (< 1/10 standard deviation): 2-3 weeks of instruction
But student SES has about the same size effect on achievement (Hill, Rowan, and Ball, AERJ, 2005)
What is the connection to mathematical quality of instruction?

History of Mathematical Quality of Instruction (MQI)
Originally designed to validate our mathematical knowledge for teaching (MKT) assessments
Initial focus: how is teachers' mathematical knowledge visible in classroom instruction?
Transitioning to: what constitutes quality in mathematics instruction? Disciplinary focus
Two-year initial development cycle (2003-05); two versions since then

MQI: Sample Domains and Codes
Richness of the mathematics: e.g., presence of multiple (linked) representations, explanation, justification, multiple solution methods
Mathematical errors or imprecisions: e.g., computational errors, misstatement of mathematical ideas, lack of clarity
Responding to students: e.g., being able to understand unusual student-generated solution methods; noting and building upon students' mathematical contributions
Cognitive level of student work
Mode of instruction

Initial study: Elementary validation
Question: do higher MKT scores correspond with higher-quality mathematics in instruction?
NOT about "reform" vs. "traditional" instruction; instead, we are interested in the mathematics that appears

Method
10 K-6 teachers took our MKT survey
Videotaped 9 lessons per teacher: 3 lessons each in May, October, May
Associated post-lesson interviews, clinical interviews, general interviews

Elementary validation study
Coded tapes blind to teacher MKT score
Coded each code every 5 minutes, two coders per tape
Also generated an "overall" code for each lesson: low, medium, or high knowledge use in teaching
Also ranked teachers prior to uncovering MKT scores

Projected Versus Actual Rankings of Teachers
Projected ranking of teachers vs. actual ranking of teachers (using MKT scores): correlation of .79 (p < .01)
Hill, H.C. et al. (2008), Cognition and Instruction

Correlations of Video Code Constructs to Teacher Survey Scores

Construct (scale)          Correlation to MKT scores
Responds to students       0.65*
Errors total              -0.83*
Richness of mathematics    0.53

*significant at the .05 level

One of the next steps was to correlate the video code scale scores (what Heather earlier referred to as constructs) to teachers' multiple-choice measure scores. Here I have listed some of the scales you have heard mentioned, along with their correlations to the measure scores. Although only one scale listed here is significantly related to the measure scores, all of these correlations are pretty big on the grand scale of educational measurement. All our other scales are of similar magnitude and are described further in my paper. Again, these correlations suggest that the survey measures and the video codes are both assessing mathematical knowledge for teaching.
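Correlations like those in the table are ordinary Pearson coefficients between a video-code scale and the MKT score across teachers. A minimal self-contained sketch, using invented scores rather than the study's actual data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented example: five teachers' MKT scores and a hypothetical
# "responds to students" scale score for each (illustration only).
mkt = [-1.2, -0.3, 0.1, 0.8, 1.5]
responds_to_students = [1.0, 1.8, 2.1, 2.4, 3.0]
print(round(pearson_r(mkt, responds_to_students), 2))
```

With only 10 teachers, as in the elementary study, even correlations of this magnitude can miss conventional significance thresholds, which is why the speaker stresses effect size over p-values.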

Validation Study II: Middle School
Recruited 4 schools by value-added scores: high (2), medium, low
Recruited every math teacher in each school; all but two participated, for a total of 24
Data collection: student scores ("value-added"), teacher MKT/survey, interviews, six classroom observations
Four observations are required to generalize MQI; we used 6 to be sure

Validation study II: Coding
Revised instrument contained many of the same constructs: rich mathematics, errors, responding to students
Lesson-based guess at MKT for each lesson (averaged)
Overall MQI for each lesson (averaged to the teacher)
G-study reliability: 0.90

Validation Study II: Value-added scores
Fit to all district middle school teachers (n = 222), using a model with random teacher effects and no school effects
Thus teachers are normed vis-à-vis the performance of the average student in the district; scores are analogous to ranks
Ran additional models; similar results
Our study teachers' value-added scores were extracted from this larger dataset
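A random-teacher-effects model shrinks each teacher's raw mean residual toward zero in proportion to how little data she contributes. A stylized empirical-Bayes sketch of that idea follows; the residuals and variance components are invented, and the district's actual model is more elaborate:

```python
# Empirical-Bayes flavor of a random-teacher-effects value-added model:
# the teacher effect is the raw mean of her students' residuals, shrunk
# toward 0 by the factor n / (n + sigma2_error / sigma2_teacher).
# Teachers with fewer students are shrunk harder.

def shrunken_effect(residuals, sigma2_teacher, sigma2_error):
    n = len(residuals)
    raw_mean = sum(residuals) / n
    shrinkage = n / (n + sigma2_error / sigma2_teacher)
    return shrinkage * raw_mean

# Invented residuals (student score minus model prediction) and variances:
small_class = [4.0, 6.0]            # n = 2, raw mean 5
large_class = [4.0, 6.0] * 10       # n = 20, same raw mean 5
print(shrunken_effect(small_class, sigma2_teacher=1.0, sigma2_error=4.0))
print(shrunken_effect(large_class, sigma2_teacher=1.0, sigma2_error=4.0))
```

Both teachers have the same raw mean gain, but the small class is pulled much closer to the district average, which is one reason such scores behave more like ranks than point estimates.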

Results: Correlations

                    MKT      MQI      Lesson-based MKT
MKT                 1.0
MQI                 0.53**
Lesson-based MKT    0.72**   0.85**
Value-added score   0.41*    0.45*    0.66**

*significant at p < .05; **significant at p < .01

Source: Hill, H.C., Umland, K. & Kapitula, L. (in progress). Validating Value-Added Scores: A Comparison with Characteristics of Instruction. Harvard GSE: Authors.

Additional Value-Added Notes
Correlations between value-added and the average of:
Connecting classroom work to math: 0.23
Student cognitive demand: 0.20
Errors and mathematical imprecision: -0.70**
Richness: 0.37*
As you add covariates to the model, most associations decrease, probably a result of the nesting of teachers within schools
Our results show a very large amount of "error" in value-added scores

Lesson-based MKT vs. VAM score

Proposed Uses of Instrument
Research: determine which factors associate with student outcomes; correlate with other instruments (PRAXIS, Danielson)
Instrument included as part of the National Center for Teacher Effectiveness, Math Solutions DRK-12, and Gates value-added studies (3)
Practice?? Pre-tenure reviews, rewards; putting the best teachers in front of the most at-risk kids; self or peer observation, professional development

Problems
Instrument still under construction and not finalized
G-study with master coders indicates we could agree more among ourselves
Training only done twice, with mixed (excellent / needs work) results
Even with strong correlations, a significant amount of "error" remains
Standards required for any non-research use are high
KEY: not yet a teacher evaluation tool

Next
Constructing a grade 4-5 student assessment to go with the MKT items
Keeping an eye on use and its complications
Questions?