1
Examining high-performing systems
Barry McGaw
Professorial Fellow, University of Melbourne
Co-Director, Australian National Development Index (ANDI) Project

Barry McGaw is a Vice-Chancellor’s Fellow at the University of Melbourne and was Foundation Chair of the Australian Curriculum, Assessment and Reporting Authority. He returned to Australia at the end of 2005 from Paris, where he had been Director for Education at the Organisation for Economic Co-operation and Development (OECD). He had previously been Executive Director of the Australian Council for Educational Research (ACER) and Professor of Education at Murdoch University in Perth, Western Australia. He began his professional life as a science and mathematics teacher.

Assessment Training Program, Department of Examinations, Sri Lanka
University of Melbourne, 2 October 2017
2
Quality of student achievements
3
Mean reading results (PISA 2000)
Australia tied for 2nd with 8 others among 42 countries. The figure above shows the mean performances of countries in reading literacy in 2000 in the Program for International Student Assessment (PISA) conducted by the Organisation for Economic Co-operation and Development (OECD). Reading literacy assessed in PISA is the capacity to use, interpret and reflect on written material. The line in the middle of the box for each country gives the mean performance of the sample of 15-year-olds tested in the country. The size of a box reflects the precision with which the mean for the population of the country’s 15-year-olds was estimated. Where the boxes overlap on the vertical dimension, there is no significant difference between the population means for the countries. (Further details are given in the PISA report indicated at the foot of the figure.) The results reveal marked variations in performance levels among the 42 participating countries – ranging from Finland, significantly better than all others at the top, to Peru, significantly worse than all others at the bottom. Australia ranked in 4th place but its mean was not significantly different from those of two countries above it or six below it. It is, therefore, appropriate to say that Australia ranked between 2nd and 10th or that Australia tied in 2nd place with eight other countries among the 42 participating. OECD (2003), Literacy skills for the world of tomorrow: Further results from PISA 2000, Fig. 2.5, p.76.
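The "overlapping boxes" logic in the notes above is the usual rule that two country means differ significantly when the gap between them exceeds what sampling error can explain. A minimal sketch of that test follows; the means and standard errors used are invented for illustration, not actual PISA figures.

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z=1.96):
    """Two sample means differ significantly (at roughly the 95% level)
    when their gap exceeds z times the standard error of the difference."""
    se_diff = math.sqrt(se_a**2 + se_b**2)
    return abs(mean_a - mean_b) > z * se_diff

# Hypothetical reading means with standard errors:
print(significantly_different(546, 2.6, 528, 3.5))  # widely separated means -> True
print(significantly_different(528, 3.5, 525, 2.7))  # close means -> False
```

This is why a country can "rank 4th" yet be statistically tied with countries above and below it: whenever the intervals overlap, the ranking difference is not significant.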
4
Countries ahead of Australia in PISA 2015
Countries or economies significantly ahead of Australia in PISA 2015:

Reading: Singapore, Hong Kong, Canada, Finland, Ireland, Estonia, Korea, Japan, Norway, New Zealand
Mathematics: Singapore, Hong Kong, Macao, Taiwan, Japan, B-S-J-G (China)*, Korea, Switzerland, Estonia, Canada, Netherlands, Denmark, Finland, Slovenia, Belgium, Germany, Poland, Ireland, Norway
Science: Singapore, Japan, Estonia, Taiwan, Finland, Macao, Canada, Vietnam, Hong Kong

The table above lists the countries, or regions within countries, that were significantly ahead of Australia among the 70 participating in PISA 2015. Australia remains significantly above the OECD average in reading, mathematics and science, but only just so in mathematics, where 16 are ahead of Australia. In reading there are 9 significantly ahead and in science there are 7 significantly ahead. In PISA 2012 only Shanghai participated from China and stood out as the top performer. In PISA 2015, four regions in China participated, as shown above, with their combined performance lower than that of Shanghai alone. Together, they were significantly ahead of Australia only in mathematics. In the development of the new national Australian curriculum, attention was given to the curricula in higher-performing countries, where comparisons can be made, to determine whether expectations of students might be set too low in existing state and territory curricula within Australia. In mathematics, for example, particular attention was given to the curricula in Finland, Hong Kong and Singapore, and this resulted in an Australian curriculum that generally sets higher expectations of our students than the state and territory curricula it has replaced.

*Beijing, Shanghai, Jiangsu, Guangdong
OECD (2016), PISA 2015 results: excellence and equity in education, Vol. 1, Fig. I.2.13, p.67, Fig. I.4.1, p.149, Fig. I.5.1, p.177.
5
Comparison of changes in PISA performances
Significant decline:
Reading: Australia, Czech Republic, Ireland, Sweden
Mathematics: Australia, Belgium, Canada, Czech Republic, Denmark, Finland, France, Hungary, Iceland, Netherlands, New Zealand, Slovak Republic, Sweden
Science: Australia, Austria, Czech Republic, Finland, Greece, Hong Kong, Hungary, Iceland, New Zealand, Netherlands, Slovak Republic, Sweden

No significant change: 21 countries (reading), 12 countries (mathematics), 22 countries (science)
Significant improvement: 13 countries (reading), 9 countries (mathematics), 2 countries (science)

Reading was the main domain of assessment in PISA 2000 and again in PISA 2009. Over that period, 13 countries improved their mean performance, 21 showed no significant change and four, Australia among them, showed a significant decline. Mathematics was assessed in PISA 2000 but, when it was the main domain of assessment in PISA 2003, the scale was refined and better defined. Mathematics was assessed as a minor domain, but on the new scale, in PISA 2006 and PISA 2009, and it was the main domain of assessment again in PISA 2012. Over the period 2003 to 2012, 9 countries improved, 12 showed no significant change and 13, Australia among them, showed a significant decline. Science was assessed in PISA 2000 and PISA 2003 but was the main domain of assessment in PISA 2006, when the scale was somewhat redefined. Science was the main domain of assessment again in PISA 2015. Over the period 2006 to 2015, 2 countries improved, 22 showed no significant change and 12, Australia among them, showed a significant decline.
6
Countries provoked to change by comparisons
7
You need to assess only a sample of students.
8
A low-performing country challenged to improve
9
Trends in PISA reading means
The comparisons above show the trends in PISA reading mean scores for Finland, the highest-performing country in reading in PISA 2000; Australia; and Poland, the OECD country showing the most dramatic improvement from a relatively low performance in PISA 2000. Within three years, Poland had passed the OECD mean and caught Australia and, by PISA 2012, had matched Finland, which had shown some decline but remained well above the OECD mean. Since 2012, Finland has improved while both Poland and Australia have dropped.
10
PISA trends for Australia and Poland
(Panels: Reading, Mathematics, Science; series: Finland, Australia, Poland.)

The graphs above show the results for reading, mathematics and science for Poland, Australia and Finland in the PISA assessments from PISA 2000 to PISA 2015. Poland achieved a remarkable improvement in the performance of its 15-year-olds in all three domains up to 2012 but declined somewhat in 2015 in all domains. It has caught up to Australia in all domains.
11
How did Poland do it? 1999 Reform
Ceased separating students into academic and vocational tracks before age 15.
Strengthened role of local government.
Increased autonomy for schools.
Introduced national exams for accountability.

Details on the reforms in Poland can be found at
12
Variation in reading performance (PISA 2000)
(Figure components: variation of performance within schools; variation of performance between schools, split into the parts explained and not explained by SES.)

A further way to examine equity is to examine the variation in student performance within and between schools. In the countries on the left of the graph, there is substantial variation between schools and relatively little within. In these countries, students are separated into schools of different kinds as early as 11 years of age on the basis of their educational achievement. In the countries at the right, schooling for 15-year-olds remains comprehensive, without selection, and there is very little difference among schools. In these countries, parents need not worry much about selection of school. In the mid-range are countries like Australia, the US and the UK, where there is no formal streaming of students into schools of different kinds but a range of influences, such as demographic differences and private provision, creates differences between schools. The variation between schools is divided into two components: variation that can be accounted for in terms of differences in the social backgrounds of students in the schools, and variation that cannot be accounted for in these terms. In Australia, 68 per cent of the variation between schools could be accounted for in terms of differences between schools in the social background of their students. These are differences due to whom the school enrols, not what the schools do. In Finland, where there was in any case little difference between schools, only 16 per cent of it could be explained in terms of differences in students’ social backgrounds. In PISA 2000, Poland was among the countries on the left, with substantial variation between schools and less within schools.

OECD/UNESCO (2003), Literacy skills for tomorrow’s world: further results from PISA 2000, Table 7.1a, p.357.
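The within/between-school split discussed above is a standard variance decomposition: total score variance equals the variance of school means around the grand mean plus the average variance of students around their own school mean. A minimal sketch with invented scores (not PISA data):

```python
# Within/between-school decomposition of score variance, on made-up data.
from statistics import mean, pvariance

schools = [
    [480, 510, 495, 505],   # hypothetical school A scores
    [520, 540, 530, 550],   # hypothetical school B scores
    [470, 460, 490, 480],   # hypothetical school C scores
]

all_scores = [s for school in schools for s in school]
grand_mean = mean(all_scores)

# Between-school variance: spread of school means around the grand mean,
# weighted by school size.
between = sum(len(sch) * (mean(sch) - grand_mean) ** 2 for sch in schools) / len(all_scores)
# Within-school variance: spread of students around their own school mean.
within = sum(sum((x - mean(sch)) ** 2 for x in sch) for sch in schools) / len(all_scores)

# The two components add up exactly to the total (population) variance.
assert abs((between + within) - pvariance(all_scores)) < 1e-9
print(f"between-school share of variance: {between / (between + within):.0%}")
```

Countries on the left of the graph are those where the between-school share is large; in comprehensive systems like Finland's, almost all the variance is within schools.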
13
Variation in mathematics performance (PISA 2003)
Between 2000 and 2003, Poland stopped selecting students into schools of different kinds and created comprehensive secondary schools. As a consequence, the variation between schools in Poland decreased and the variation within schools increased. The extent of the changes is reflected in the figure above for PISA 2003: Poland was at the right of the graph, among the Scandinavian countries. The shift of Poland from the left to the right in a graph such as that above shows only that Poland did what it said it had done, that is, remove selective entry to the secondary schools and make the schools comprehensive. A more interesting and important question is whether the policy change had any impact on the achievement levels of students.

OECD (2004), Learning for tomorrow’s world: First results from PISA 2003, Table 4.1a, p.383.
14
How did Poland do it? 1999 Reform
Ceased separating students into academic and vocational tracks before age 15.
Strengthened role of local government.
Increased autonomy for schools.
Introduced national exams for accountability.

Curriculum Reform
Comprehensive education extended from 8 to 11 years.
Academic studies added to vocational education.
Universal preschool education introduced.

CONCLUSIONS
Effective reform takes time.
Without data:
You don’t know what you need to do.
You don’t know if you are succeeding.

Details on the reforms in Poland can be found at
15
A high-performing country challenged to improve
16
Percent of students at each PISA 2000 reading level
The figure above shows, for PISA 2000 reading, the percentages of 15-year-olds at each of the levels on the reading scale, from level 5 down to level 1, and also below level 1, for the nine countries with the highest mean scores. The Republic of South Korea had the smallest percentages of students at or below level 1, yet ranked only 6th out of 9 on mean scores. The reason that South Korea’s mean score was not higher was that it had a relatively low percentage of students at the highest levels (5 and 4). This compression in the range of scores in South Korea was also evident in comparisons of the standard deviations of the distributions of scores: South Korea had the smallest standard deviation.
17
Changes in top nine in PISA 2000 reading to 2003
Finland, Canada, New Zealand, Australia, Ireland, Korea, Japan, Sweden, Austria: the changes in mean scores in reading from PISA 2000 to PISA 2003 are shown in the figure above for these nine highest-performing countries in PISA 2000. South Korea was the only one among them to achieve a higher mean score in PISA 2003 than it had in PISA 2000.
18
How did South Korea do it?
Shift in distribution of results:
PISA 2000 – Korea had the smallest variation in student performances.
PISA 2003 – Korea’s variation was at the OECD average, with more high-performing students.

YES, BUT how did Korea increase the spread?
New curriculum.
More emphasis on essay tests.
More use of essays in assessments for university entrance.

CONCLUSIONS
This reform worked more quickly than Poland’s:
It was a smaller change.
The country was starting from a high base.
Without data:
You don’t know what you need to do.
You don’t know if you are succeeding.
19
Equity of student achievements
20
Social background & reading literacy
Two indices of the relationship:
Social gradient: the magnitude of the increment in achievement associated with an increment in social background (on average).
Correlation, or variance accounted for: how well the regression line summarises the relationship.

The 15-year-olds in PISA provide information on their economic and social background – parents’ education and occupation, cultural artefacts in the home – that permits the construction of an index of social background ranging from socially disadvantaged to socially advantaged. This scale is comparable across countries. The relationship between social background and reading literacy in PISA 2000 is shown in the figure above. There are two indices with which to summarise the nature of a relationship such as this: the slope of the regression line, and the extent of variation around the regression line. The slope of the regression line (which is referred to as the ‘social gradient’ when the horizontal axis is a measure of socio-economic status) shows the magnitude of change in performance (in the case above, reading literacy in PISA 2000) associated with a particular increment in socio-economic status, on average. The correlation summarises the extent to which the data are spread around the line and thus expresses how well the regression line summarises the relationship. The correlation in this case is relatively high (around 0.45) for measures such as these, though that indicates that variation in socio-economic status accounts for only just over 20% of the variation in reading literacy scores.

(Figure axes: PISA index of social background, from low to high social advantage, against reading literacy.)

Source: OECD (2001), Knowledge and skills for life, Appendix B1, Table 8.1, p.308.
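The two indices described above fall straight out of an ordinary least-squares fit: the social gradient is the regression slope, and the squared correlation is the share of score variance accounted for. A minimal sketch with invented SES and score values (not PISA data):

```python
# Social gradient (regression slope) and correlation, on illustrative data.
from statistics import mean

ses    = [-1.2, -0.8, -0.3, 0.0, 0.4, 0.9, 1.3]   # hypothetical SES index values
scores = [430, 455, 470, 500, 505, 540, 560]       # hypothetical reading scores

mx, my = mean(ses), mean(scores)
sxy = sum((x - mx) * (y - my) for x, y in zip(ses, scores))
sxx = sum((x - mx) ** 2 for x in ses)
syy = sum((y - my) ** 2 for y in scores)

gradient = sxy / sxx             # score points gained per unit of the SES index
r = sxy / (sxx * syy) ** 0.5     # correlation between SES and score
print(f"social gradient: {gradient:.1f} points per SES unit")
print(f"correlation {r:.2f} -> variance accounted for: {r * r:.0%}")
```

This is also why a correlation of about 0.45, as quoted in the notes, corresponds to social background explaining only just over 20% of the score variance: 0.45 squared is roughly 0.20.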
21
Social gradients for reading (PISA 2009)
(Quadrants: high quality/low equity, high quality/high equity, low quality/low equity, low quality/high equity. Horizontal axis: social gradient, expressed as the OECD regression slope minus the country regression slope.)

In the figure, mean performances of countries in reading in PISA 2009 are represented on the vertical axis, the grey band highlighting countries with means not significantly different from Australia’s. The slope of the regression line for social equity on reading literacy is represented on the horizontal axis as the difference between the slope for the OECD as a whole and a country’s own slope. This places to the left countries where the slope is steeper than in the OECD as a whole (that is, countries in which there are bigger differences in educational achievement associated with differences in social background) and to the right countries where the slope is less steep than that for the OECD as a whole (that is, countries in which there are smaller differences in educational achievement associated with differences in social background). Countries with slopes significantly less steep than the OECD’s are shown in blue; those with lines significantly steeper are shown in red; and those with lines not significantly different in slope from the overall OECD line are shown in grey. Countries high on the page are high-quality and those to the far right are high-equity. The graph is divided into four quadrants on the basis of the OECD average on the two measures. The presence of countries in the ‘high-quality, high-equity’ quadrant (top right) demonstrates that it is possible to achieve quality and equity together. Australia is in the top-left quadrant as a ‘high-quality, low-equity’ country, with a high average performance but a relatively steep regression line.

There are many countries to the left of Australia in this graph (and thus with less equitable results), but the ones on which Australia should focus are those in the grey band containing countries equal to Australia in quality, those above that band and, in particular, those to the right of it.

OECD (2010), PISA 2009 Results: overcoming social background, Fig. II.3.2, p.55.
22
Correlations for reading (PISA 2009)
As indicated in the notes on slide 9, the slope of the regression line (or ‘social gradient’) provides one perspective on the relationship between social background and educational achievement. It is an indicator of the magnitude of the average differences in educational achievement associated with particular differences in social background. The correlation (or the squared correlation, as a measure of the percentage of variance in achievement accounted for by differences in social background) is an indicator of how well the regression line summarises the relationship for a particular country. On this indicator of equity, Australia appears somewhat better than with the social gradient indicator. It is in the ‘high-quality, high-equity’ quadrant, as shown above, indicating that it has more exceptions (disadvantaged students performing relatively well and advantaged students performing relatively poorly) than the countries to the left of it. On that measure, however, Australia is not significantly different from the overall OECD result. Finland, Canada and Korea all significantly outperformed Australia in quality (mean performance) but were also significantly ahead of the OECD as a whole in equity on this measure. Japan matched Australia in quality but was ahead of it in equity.

(Horizontal axis: variance in reading accounted for by social background, OECD minus country.)

OECD (2010), PISA 2009 Results: overcoming social background, Fig. II.3.2, p.55.
23
Orientation of OECD’s PISA
24
What to assess
25
Deciding what to assess...
Looking back at what students were expected to have learned: for IEA studies, the curriculum is the focus.
OR
Looking ahead to what students can do with what they have learned: for PISA, the OECD countries chose the future orientation.
26
PISA assessments
Reading literacy: using, interpreting and reflecting on written material.
Mathematical literacy: recognising problems that can be solved mathematically, representing them mathematically, and solving them.
Scientific literacy: identifying scientific questions, recognising what counts as scientific evidence, and using evidence to draw conclusions about the natural world.
Problem solving: using cognitive processes to resolve real situations where the solution path is not immediately obvious and where the competencies required are not within a single discipline.
27
Development of assessments
Frameworks developed by international experts.
Assessment materials:
submitted by countries
developed by research consortium
screened for cultural bias by countries and an expert international panel
items with prima facie cultural bias removed at this stage
translated from English and French originals
trialled to check that items work consistently in all countries
Final tests:
items shown in the trial to be culturally biased removed
best items chosen for final tests
balanced to reflect the framework
range of difficulties
range of item types (constructed response, multiple choice)
28
Sources and source languages for PISA 2000
Materials submitted from 20 countries.
Final forms included materials with the following source languages: English, Spanish, Greek, Finnish, French, Swedish, German, Dutch, Czech, Korean, Norwegian.
Test developers from Australia, Netherlands, Japan.
29
Change in Reading Rank: Non-English versus English Items
(Figure legends: ‘Ranked higher on English items’ / ‘Ranked higher on non-English items’.)
30
Whom to assess
31
Deciding whom to assess...
Grade-based sample: for IEA studies, the grade in which most students of a particular age are enrolled is chosen.
OR
Age-based sample: for PISA, the OECD countries chose an age-based sample, selecting 15-year-olds in school as the population.
32
PISA sampling requirements
Population: all 15-year-olds in school
excludes 15-year-olds out of school
includes 15-year-olds in special education institutions
could exclude up to 5% of 15-year-olds in school:
difficult to reach (e.g. remote schools)
non-participation

Sample:
minimum of 150 schools per country
two random samples: schools and replacement schools
if a school declines, the replacement school is invited
stringent requirements set by countries

Source: OECD (2004), Learning for tomorrow’s world: First results from PISA 2003, Table A3.3, p.327.
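The sampling rules above can be sketched as code: draw a primary random sample of schools plus a matched sample of replacement schools, and substitute a replacement whenever an invited school declines. The school list, the 15% decline rate and the function name are all invented for illustration; actual PISA sampling is stratified and more elaborate.

```python
# Sketch of school sampling with pre-drawn replacement schools (illustrative only).
import random

def sample_with_replacements(schools, n, decline_prob=0.15, seed=0):
    """Draw n primary schools and n matched replacements;
    substitute the replacement whenever a primary school declines."""
    rng = random.Random(seed)
    shuffled = rng.sample(schools, len(schools))        # random order, no repeats
    primary, replacements = shuffled[:n], shuffled[n:2 * n]
    final = []
    for main, backup in zip(primary, replacements):
        if rng.random() < decline_prob:                 # the invited school declines...
            final.append(backup)                        # ...so its replacement is invited
        else:
            final.append(main)
    return final

schools = [f"school_{i:03d}" for i in range(400)]       # hypothetical national school list
selected = sample_with_replacements(schools, n=150)     # 150 schools, the PISA minimum
print(len(selected))
```

Because replacements are drawn in advance and matched to primary schools, a declining school is swapped for a similar one rather than re-sampled ad hoc, which is why the response-rate checks on the following slides compare declining and replacement schools directly.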
33
Weighted response rates of schools: PISA 2000
(Chart markers: target response rate; minimum before replacement.)
UK included – declining schools’ and replacement schools’ examination results did not differ significantly.
US included – students’ social backgrounds in declining and replacement schools did not differ significantly.
Netherlands excluded – response rate too low to assure that the sample was not biased.

Source: OECD (2001), Knowledge and skills for life: First results from PISA 2000, Table A3.2, p.235.
34
Weighted response rates of schools: PISA 2003
(Chart markers: target response rate; minimum before replacement.)
Netherlands satisfied requirements.
US included – students’ social backgrounds in declining and replacement schools did not differ significantly, but the response rate was poor, and worse than in 2000.
UK excluded – declining and replacement schools’ examination results did not differ significantly, but the student response rate within schools was less than 80%.

Source: OECD (2004), Learning for tomorrow’s world: First results from PISA 2003, Table A3.3, p.327.
35
Summary
36
Summary: Australian 15-year-olds do relatively well internationally
Weaker in mathematics and declining (out of top 10 but still in top 20):
7 countries significantly better in PISA 2003
16 countries significantly better in PISA 2012 (3 of them not in PISA 2003)
Better in reading but declining (though still just in top 10):
1 country significantly better in PISA 2000
9 countries significantly better in PISA 2012 (3 of them not in PISA 2000)
Best in science but declining (though still well in top 10):
3 countries significantly better in PISA 2006
7 countries significantly better in PISA 2012 (2 of them not in PISA 2006)
Declining in achievement while others are holding or improving.

Differences in students’ social background:
more strongly related to achievement than in other high-performing countries
account for much of the differences among schools
37
Thank you.