James Ladwig: What NAPLAN and MySchool don’t address (but could, and should). A presentation to the AEU, AGPPA, ASPA National Symposium, 23 July 2010, Sydney.


Intro: just one point
If we are serious about improving student academic learning outcomes, we must start addressing within-school curriculum differentiation (understanding pedagogy and assessment as enacted curriculum).

What we know about the sources of student outcomes: what schools can affect
Multi-level analyses have shown us that variance in student outcomes does not match the ‘school difference’ focus of current policy directions. If we use only three levels, for example:
– Differences between schools account for (roughly) % of student achievement variance
– Differences within schools account for roughly 40–45% of student achievement variance
– Differences between individual students account for roughly 40–50% of student achievement variance
Please note estimates vary from study to study and model to model – and just because something occurs at any one level doesn’t mean that’s where it comes from (e.g. individual differences between kids at the start of school are not JUST about the individual kid).
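
To make concrete what ‘variance at a level’ means here, the following is a minimal simulation sketch in Python. It is not the analysis behind the slide, and the 15/40/45 split plugged in below is just one illustrative set of values consistent with the rough ranges above.

```python
# Minimal sketch of a three-level variance decomposition (school / class / student).
# Not the author's analysis: the 0.15 / 0.40 / 0.45 variance components are
# illustrative values chosen to sit inside the rough ranges quoted on the slide.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_schools, classes_per_school, students_per_class = 200, 6, 25

rows = []
for s in range(n_schools):
    u_school = rng.normal(0, np.sqrt(0.15))           # school-level component
    for c in range(classes_per_school):
        u_class = rng.normal(0, np.sqrt(0.40))        # class-within-school component
        for _ in range(students_per_class):
            u_student = rng.normal(0, np.sqrt(0.45))  # individual student component
            rows.append((s, f"{s}-{c}", u_school, u_class, u_student,
                         u_school + u_class + u_student))

df = pd.DataFrame(rows, columns=["school", "class_id",
                                 "u_school", "u_class", "u_student", "score"])

# Because the components are independent, their variances sum (approximately)
# to the total score variance, so each share prints close to the assumed split.
total = df["score"].var()
for label, col in [("between schools", "u_school"),
                   ("within schools, between classes", "u_class"),
                   ("between individual students", "u_student")]:
    print(f"{label}: {df[col].var() / total:.0%} of score variance")
```

In real analyses the components are not observed; they have to be estimated with a multi-level model fitted to achievement data, which is one reason the estimates vary from study to study and model to model.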

What we know about the sources of student outcomes: what schools can affect (continued)
In simple terms, we know the things schools can most directly leverage are those sources of student achievement that are themselves most closely connected to student learning outcomes: the ‘enacted curriculum’ (the learning experiences of kids in the classrooms, halls, playgrounds and excursions of schooling).
As an aside: we also know that any serious systematic attempt to improve these things has cascading effects beyond schooling – if we ever get serious about this we will have to dramatically change current and future teaching, the career structures and work conditions of teachers, university budgets, and government priorities relating to social support, job creation, health, science and research funding, etc.

Current conditions of significant consequence for the problem
– Since the ‘devolution’ policies of the 80s and 90s, most (all?) systems of schooling in Australia ditched the main means of monitoring the quality of classroom practice: the inspectorate.
– Since the 90s (since Metherell in NSW) there has been no systemic alternative for monitoring and improving classroom practice.
– In most of the state and territory systems we have only recently been able to develop testing and data systems that can actually do full-population monitoring of achievement at all.

Current policy landscape
We have inherited a very limited interpretation of the knowledge about within-school variance: a near-singular focus on teachers as the ostensible source of within-school variance. And we have a recent policy interest in doing something about the social exclusion effects of economic inequities and the history of racial disparity – which, in education, has been addressed at the school level (between-school difference).

Let’s take a closer look…

PISA 2006 School Residuals – means and 95% confidence interval – science

To underline what this tells us
– Differences between schools (in terms of student outcomes) are small compared to within-school differences.
– It is very sketchy to make comparisons except at ‘the extremes’ of the range.
– There is a LOT of variance within schools. For this one unweighted PISA outcome:
  Between schools: 19% of the overall variance
  Within schools: 81% of the overall variance
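
In variance-decomposition terms this is a routine intraclass-correlation calculation applied to the two percentages above (not an additional result from the slides):

\[
\rho \;=\; \frac{\sigma^2_{\text{between}}}{\sigma^2_{\text{between}} + \sigma^2_{\text{within}}} \;=\; \frac{0.19}{0.19 + 0.81} \;=\; 0.19,
\]

i.e. knowing which school a student attends accounts for only about a fifth of the variation in this PISA science outcome; the rest sits within schools.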

Within-school variance of the enacted curriculum: in-class pedagogy – SIPA observational data (means, 95% CIs)

Variance of Pedagogy
From SIPA observations, using all three dimensions of the QT pedagogy model combined (QT Total); 322 teachers in 35 schools:
– Variance between schools = 14%
– Variance between teachers = 86%
Of this:
– ≈ 1% of school-level variance due to socio-economic status
– ≈ 4% of school-level variance due to the percentage of ATSI students in the student population
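
A two-level model of the kind that yields such estimates can be sketched as follows. This is an illustrative outline only, not the SIPA analysis itself: the file and column names (sipa_observations.csv, qt_total, ses, atsi_pct, school) are hypothetical placeholders, and attributing shares to individual covariates (as the slide does) would involve adding them one at a time rather than jointly as here.

```python
# Illustrative two-level (teachers within schools) variance decomposition.
# Hypothetical data file and column names; not the SIPA analysis itself.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("sipa_observations.csv")  # hypothetical: one row per observed teacher

# Null model: how much QT Total variance sits between schools vs between teachers?
null = smf.mixedlm("qt_total ~ 1", data=df, groups=df["school"]).fit()
between = float(null.cov_re.iloc[0, 0])    # school-level variance component
within = null.scale                        # teacher-level (residual) variance
print("between schools:", round(between / (between + within), 2))
print("between teachers:", round(within / (between + within), 2))

# Conditional model: add school-level covariates and see how much of the
# school-level variance component they account for.
cond = smf.mixedlm("qt_total ~ ses + atsi_pct", data=df, groups=df["school"]).fit()
between_cond = float(cond.cov_re.iloc[0, 0])
print("school-level variance explained by covariates:", round(1 - between_cond / between, 2))
```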

But is this just ‘teacher’ difference? SIPA – 113 class averages

What this scatterplot shows
– The quality teaching measure here is a coding of the quality of the assessment tasks students completed.
– There is a very big range of average class prior achievement.
– Note the empty space in the ‘low prior achievement’ – ‘high quality pedagogy’ quadrant.
– The link between the quality of the enacted curriculum and average class prior achievement is very strong.
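
The strength of that link is an ordinary class-level association. A minimal sketch of how it (and the empty quadrant) could be checked on data shaped like the SIPA class averages – the file and column names (sipa_class_averages.csv, class_prior_mean, task_quality) are hypothetical:

```python
# Illustrative check of the class-level link between prior achievement and
# coded task quality. Hypothetical file and column names; not the SIPA analysis.
import pandas as pd

classes = pd.read_csv("sipa_class_averages.csv")  # hypothetical: one row per class

# Strength of the link between average class prior achievement and task quality.
r = classes["class_prior_mean"].corr(classes["task_quality"])
print(f"Pearson r = {r:.2f}")

# The 'empty quadrant': classes with low prior achievement that nonetheless
# received high-quality tasks. A count near zero matches the scatterplot.
low_prior = classes["class_prior_mean"] < classes["class_prior_mean"].median()
high_quality = classes["task_quality"] > classes["task_quality"].quantile(0.75)
print("low-prior / high-quality classes:", int((low_prior & high_quality).sum()))
```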

Very Open Question
Clearly there is a large amount of pedagogical variance within schools; however:
– Not all of this variance is simply ‘between teachers’; much of it is between classes, and
– We also know there is strong ‘co-linearity’ between social backgrounds and ‘prior achievement’ (often misrecognised as ‘ability’).
This raises the question: what are the practices of class grouping based on prior achievement?
– What is the social distribution within ‘prior achievement’ groups?
– How mutable are the groupings?
– For how long do students stay in the same groups?
– How much do these groups vary from subject to subject?

We have some good indication of how common this is – PISA 2006 school survey results

Very big, very open question
What are the effects of this ‘ability’ grouping (streaming / tracking) in Australia?

Conclusion
– Much of the current educational debate, and policy focus, is not really addressing the main issue if we are going to pull our main levers for improving student outcomes: within-school curriculum and pedagogy differentiation.
– While there is a huge amount known internationally, we know very little about how this differentiation plays out in Australia… lots of opinion and anecdote, very, very little rigorous research.
– And that is only the beginning – if international experience holds here, changing this differentiation is a much bigger challenge than measuring students and schools.
So… while we debate current policies, let us not lose sight of the main game.