The characteristics of high-performing education systems: Lessons from international comparisons. Barry McGaw, Melbourne Graduate School of Education, University of Melbourne.


1 The characteristics of high-performing education systems: Lessons from international comparisons
Barry McGaw, Melbourne Graduate School of Education, University of Melbourne

Professor Barry McGaw has been a half-time Professorial Fellow in the Melbourne Graduate School of Education at the University of Melbourne since January, and was for a time Director of the Melbourne Education Research Institute. He is currently Executive Director of the international Assessment and Teaching of 21st Century Skills project, established and funded by Cisco, Intel and Microsoft, which has its headquarters at the University of Melbourne. In the other part of his professional life he is a consultant, though that part has been dominated since 2008 by his role as Chair of the Interim National Curriculum Board and now as Chair of the Board of the Australian Curriculum, Assessment and Reporting Authority, which has replaced the Interim National Curriculum Board.

Prior to returning to Australia at the end of 2005, he was Director for Education at the Organisation for Economic Co-operation and Development (OECD). He had earlier been Executive Director of the Australian Council for Educational Research (ACER), Professor of Education at Murdoch University, Head of the Research and Curriculum Branch in the Queensland Department of Education and, originally, a science teacher in Queensland secondary schools. He holds BSc, DipEd and BEd(Hons) degrees from the University of Queensland and a PhD from the University of Illinois.

Professor McGaw is a Fellow of the Academy of the Social Sciences in Australia, the Australian Psychological Society, the Australian College of Educators and the International Academy of Education. He has been President of the Australian Association for Research in Education, the Australian Psychological Society, the Australian College of Educators and the International Association for Educational Assessment. He received an Australian Centenary Medal in 2003 and was appointed an Officer in the Order of Australia. He was the 2005–06 recipient of the University of Illinois Alumni Award for Exceptional Achievement.

Revolutions, Revelations & Reality: PDN School Leaders' Conference 2009, Gold Coast, 7 August 2009

2 There is a rising demand for high-level skills

3 Changed demand for skills in the US
The dilemma for schools: the skills that are easiest to teach and test are also the ones that are easiest to digitise, automate and outsource.

The figure shows a decline in labour involving physical tasks that can be well described using deductive or inductive rules. It also shows a decline in labour involving physical tasks that cannot be well described as following a set of "If-Then-Do" rules because they require optical recognition or fine muscle control that have proven extremely difficult to program computers to carry out. The decline in the demand for manual work has been widely discussed.

However, much less public attention has been devoted to the significant decline in routine cognitive task input, involving mental tasks that are well described by deductive or inductive rules. Because such tasks can be accomplished by following a set of rules, they are prime candidates for computerisation, and the figure above shows that demand for this task category has seen the steepest decline over the last decade. Furthermore, rules-based tasks are also easier to offshore to foreign producers than other kinds of work: … By the same token, when a process can be reduced to rules, it is much easier to monitor the quality of output. This highlights the concern that if students learn merely to memorise and reproduce knowledge and skills, they risk being prepared only for jobs that are in fact increasingly disappearing from labour markets.

… In contrast, the figure displays sharp increases in the demand for task input requiring complex communication, which involves interacting with humans to acquire information, explain it or persuade others of its implications for action. … Similar increases have occurred in the demand for expert thinking, which involves solving problems for which there are no rule-based solutions. … These situations require what is referred to as pure pattern recognition: information processing that cannot now be programmed on a computer.
While computers cannot substitute for humans in these tasks, they can complement human skills by making information more readily available.

OECD (2007), PISA 2006: Science competencies for tomorrow's world, Volume 1: Analysis. Paris: Author, p.34.
Autor, D., Levy, F. and Murnane, R.J. (2003), "The skill content of recent technological change", Quarterly Journal of Economics, 118, M.I.T. Press, Cambridge.
Levy, F. and Murnane, R.J. (2006), "How Computerized Work and Globalization Shape Human Skill Demands", working paper.

4 The storyline so far… There is a growing labour market demand for higher level skills.

5 How good is Australian school education?

6 Using international comparisons

7 Making international comparisons of achievement requires decisions about...
what to assess, whom to assess. Designing a survey of student achievement, whether internationally or on a more limited scale, requires decisions about what to assess and whom to assess. Different decisions can be made. The Program for International Student Assessment (PISA) conducted by the Organisation for Economic Co-operation and Development (OECD) and the studies conducted by the International Association for the Evaluation of Educational Achievement (IEA) such as the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS) are based on different decisions in response to both questions.

8 Deciding what to assess...
looking back at what they were expected to have learned OR looking ahead to what they can do with what they have learned. The IEA studies have a strong curriculum focus. They typically commence with a detailed examination of what is common in the curricula of countries participating in a study and then deliberately focus on the things that the curricula have in common. That can result in differential validity, since what finally becomes the focus of the assessment may be a more substantial part of the curriculum in some countries than in others. A curriculum focus, however, does offer the potential of a stronger connection between the findings and the teaching in schools than a more general focus on what students can do with what they have learned. For PISA, the OECD countries resolved not to focus on the specifics of the curricula involved and then measure whether students had learned what it was intended they should. Instead, the focus is 'forward-looking', measuring students' capacity to use the knowledge and skills they are expected to have developed by the time they are being tested. This orientation is particularly relevant given the focus on 15-year-old students and, thus, for most countries, students close to the end of compulsory schooling. According to Jones and Olkin (Eds.) (2004), The Nation's Report Card: Evolution and Perspectives, Bloomington, IN: Phi Delta Kappa Educational Foundation, the original intention of the designers of the US National Assessment of Educational Progress (NAEP) was to use tests more like those now used in PISA, but gradually the tests took on a more direct curriculum focus. OECD/PISA chose the latter. IEA studies have chosen the former. NAEP's intention was the latter (as for PISA), but it became more curriculum-oriented (as for IEA).

9 PISA defines science performance in terms of a student’s:
For example When reading about a health issue, can students separate scientific from non-scientific aspects of the text, apply knowledge and justify personal decisions? Scientific knowledge and use/extrapolation of that knowledge to… … identify scientific issues, … explain scientific phenomena, and … draw evidence-based conclusions about science-related issues Understanding of the characteristic features of science as a form of human knowledge and enquiry Awareness of how science and technology shape our material, intellectual and cultural environments Willingness to engage with science-related issues The orientation in PISA to measure students' capacity to use what they have learned rather than just whether they have learned it is evident in the way in which PISA defines student performance, as shown in this and the three following slides.

10 PISA defines science performance in terms of a student’s:
Scientific knowledge and use/extrapolation of that knowledge to… … identify scientific issues, … explain scientific phenomena, and … draw evidence-based conclusions about science-related issues Understanding of the characteristic features of science as a form of human knowledge and enquiry Awareness of how science and technology shape our material, intellectual and cultural environments Willingness to engage with science-related issues For example Can students distinguish between evidence-based explanations and personal opinions?

11 PISA defines science performance in terms of a student’s:
Scientific knowledge and use/extrapolation of that knowledge to… … identify scientific issues, … explain scientific phenomena, and … draw evidence-based conclusions about science-related issues Understanding of the characteristic features of science as a form of human knowledge and enquiry Awareness of how science and technology shape our material, intellectual and cultural environments Willingness to engage with science-related issues For example Can individuals recognise and explain the role of technologies as they influence a nation's economy? Or are they aware of environmental changes and the effects of those changes on economic/social stability?

12 PISA defines science performance in terms of a student’s:
Scientific knowledge and use/extrapolation of that knowledge to… … identify scientific issues, … explain scientific phenomena, and … draw evidence-based conclusions about science-related issues Understanding of the characteristic features of science as a form of human knowledge and enquiry Awareness of how science and technology shape our material, intellectual and cultural environments Willingness to engage with science-related issues Interest in science, support for scientific enquiry, responsibility for the environment This addresses the value students place on science, both in terms of topics and in terms of the scientific approach to understanding the world and solving problems.

13 Example question: Sunscreens
Mimi and Dean wondered which sunscreen product provides the best protection for their skin. Sunscreen products have a Sun Protection Factor (SPF) that shows how well each product absorbs the ultraviolet radiation component of sunlight. A high SPF sunscreen protects skin for longer than a low SPF sunscreen. Mimi thought of a way to compare some different sunscreen products. She and Dean collected the following: • two sheets of clear plastic that do not absorb sunlight; • one sheet of light-sensitive paper; • mineral oil (M) and a cream containing zinc oxide (ZnO); and • four different sunscreens that they called S1, S2, S3, and S4. This slide and the three that follow provide a sample PISA science task and details on how responses to two questions on it are scored.

14 Example question: Sunscreens
Mimi and Dean included mineral oil because it lets most of the sunlight through, and zinc oxide because it almost completely blocks sunlight. Dean placed a drop of each substance inside a circle marked on one sheet of plastic, then put the second plastic sheet over the top. He placed a large book on top of both sheets and pressed down. Mimi then put the plastic sheets on top of the sheet of light-sensitive paper. Light-sensitive paper changes from dark grey to white (or very light grey), depending on how long it is exposed to sunlight. Finally, Dean placed the sheets in a sunny place.

15 Question: Which one of these statements is a scientific description of the role of the mineral oil and the zinc oxide in comparing the effectiveness of the sunscreens? A. Mineral oil and zinc oxide are both factors being tested. B. Mineral oil is a factor being tested and zinc oxide is a reference substance. C. Mineral oil is a reference substance and zinc oxide is a factor being tested. D. Mineral oil and zinc oxide are both reference substances.

16 Question: The light-sensitive paper is a dark grey and fades to a lighter grey when it is exposed to some sunlight, and to white when exposed to a lot of sunlight. Which one of these diagrams (A, B, C or D) shows a pattern that might occur? Explain why you chose it.

Full credit: A, with an explanation that the ZnO spot has stayed dark grey (because it blocks sunlight) and the M spot has gone white (because mineral oil absorbs very little sunlight). For example: "A. ZnO has blocked the sunlight as it should and M has let it through." "I chose A because the mineral oil needs to be the lightest shade while the zinc oxide is the darkest."

Partial credit: A, with a correct explanation for either the ZnO spot or the M spot, but not both. For example: "A. Mineral oil provides the lowest resistance against UVL. So with other substances the paper would not be white." "A. Zinc oxide absorbs practically all rays and the diagram shows this." "A because ZnO blocks the light and M absorbs it."

17 Deciding whom to assess...
grade-based sample OR age-based sample. OECD/PISA chose the age-based sample because of differences in school starting age and differences in grade retention/promotion policies. IEA studies have chosen grade-based samples. The two options are a grade-based sample and an age-based sample. For PISA, the OECD countries opted for an age-based sample, directly rejecting the model of grade-based sampling used in the IEA studies such as the Third (now Trends in) International Mathematics and Science Study (TIMSS). This rejection was based on concerns about the comparability of grades (given different starting ages in different countries) and the comparability of the populations in later grades (given differences in grade promotion and retention policies for students not performing well). As Lyle Jones reports in Jones & Olkin (2004), The nation's report card: Evolution and perspectives, NAEP was established with an age-based sample and an explicit rejection of a grade-based sample, for the same reasons that persuaded those developing PISA 30 years later. Grade-based samples are administratively more convenient for schools, of course, so in the 1980s NAEP introduced a grade-based sample, initially alongside and then instead of the age-based sample. Grade-based samples also have the benefit of enabling investigations of between-class, within-school differences if more than one class per grade is sampled. That benefit is rather diminished, however, if it is gained at the cost of less valid comparisons between schools, jurisdictions and countries, where grade-based comparisons are rendered invalid by differences in school starting age and in student promotion practices between grades. NAEP started as age-based (as for PISA) but became grade-based in the 1980s (as for IEA).

18 Problem of grade-based sample (IEA/PIRLS)
Countries that perform best are those that test older students. Results from the IEA Progress in International Reading Literacy Study (PIRLS) illustrate the difficulties faced with grade-based samples. The graph above shows the relationship between countries' mean scores on PIRLS and the mean age of the students tested, with Turkey excluded as an outlier. The correlation is 0.55, which means that 30% of the variation in mean scores on PIRLS can be accounted for in terms of the average age of the students tested. There is a clear tendency for the countries with the highest mean scores to be those testing older students. R = 0.55, R² = 0.30.
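The relationship between the correlation quoted on this slide (R = 0.55) and the "variance accounted for" figure (R² = 0.30) can be sketched in a few lines. The (age, score) pairs below are invented for illustration; they are not the PIRLS data.

```python
# Hypothetical illustration (invented numbers, not the PIRLS data):
# how a correlation between mean test age and mean score yields
# the "variance accounted for" figure quoted on the slide.
import math

# (mean age tested, country mean score) for some imaginary countries
data = [(9.7, 528), (10.1, 541), (10.3, 545), (10.9, 554), (11.2, 561),
        (9.9, 512), (10.6, 549), (11.0, 537), (10.2, 520), (10.8, 558)]

ages = [a for a, _ in data]
scores = [s for _, s in data]
n = len(data)
mean_a = sum(ages) / n
mean_s = sum(scores) / n

cov = sum((a - mean_a) * (s - mean_s) for a, s in data) / n
sd_a = math.sqrt(sum((a - mean_a) ** 2 for a in ages) / n)
sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in scores) / n)

r = cov / (sd_a * sd_s)   # Pearson correlation
r_squared = r ** 2        # share of score variance "explained" by age

print(f"r = {r:.2f}, R^2 = {r_squared:.2f}")
# With r = 0.55, as in the PIRLS figure, 0.55**2 = 0.3025,
# i.e. about 30% of the variation in country means.
```

Squaring the correlation is all that links the two numbers on the slide: 0.55² ≈ 0.30.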

19 OECD’s PISA assessment of the knowledge and skills of 15-year-olds
Coverage of world economy: 86%, 87%, 85%, 77%, 81%, 83%. The coverage of PISA has gradually expanded to include many countries beyond the 30 members of the OECD. PISA began in 2000 with 28 of the then 29 OECD members and four others; a further 11 administered the PISA 2000 assessments subsequently. By 2006, the countries involved in PISA represented 87% of the world economy.

20 What do international comparisons tell us?

21 Mean reading results (PISA 2000)
Australia tied for 2nd with 8 others among 42 countries. The figure above shows the mean performances of countries in reading literacy in PISA 2000. Reading literacy as assessed in PISA is the capacity to use, interpret and reflect on written material. The line in the middle of the box for each country gives the mean performance of 15-year-olds in the country. The size of a box reflects the precision with which a country's mean is estimated. Where the boxes overlap on the vertical dimension, there is no significant difference between the means for the countries. (Further details are given in the PISA report indicated at the foot of the figure.) The results reveal marked variations in performance levels among the 42 participating countries, ranging from Finland, significantly better than all others at the top, to Peru, significantly worse than all others at the bottom. Australia ranked in 4th place but its mean is not significantly different from those of the two countries above it or the six below it. It is, therefore, appropriate to say that Australia ranked between 2nd and 10th, or that Australia tied in 2nd place with eight other countries among the 42 participating. OECD (2003), Literacy skills for the world of tomorrow: Further results from PISA 2000, Fig. 2.5, p.76.
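The logic behind the overlapping boxes can be illustrated with a minimal sketch: two country means differ "significantly" only when the difference is large relative to its standard error. The means and standard errors below are invented for illustration, not taken from the PISA tables.

```python
# A rough sketch (invented numbers) of the significance test implied
# by the overlapping boxes in the PISA figure.
import math

# hypothetical country means and standard errors on the PISA scale
mean_a, se_a = 528, 3.5   # e.g. a country like Australia
mean_b, se_b = 534, 3.3   # e.g. a nearby country in the ranking

diff = mean_b - mean_a
se_diff = math.sqrt(se_a ** 2 + se_b ** 2)  # SE of the difference
z = diff / se_diff

# |z| < 1.96 means the difference is not significant at the 5% level,
# so the two countries are "tied" in the ranking sense used here.
significant = abs(z) > 1.96
print(f"difference = {diff}, z = {z:.2f}, significant: {significant}")
```

With these invented numbers a 6-point gap is not significant, which is exactly why a country ranked 4th can legitimately be described as tied for 2nd.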

22 Australia’s ranking in OECD/PISA Reading
Reading ranks PISA 2000: 4th but tied for 2nd PISA 2003: 4th but tied for 2nd PISA 2006: 7th but tied for 6th PISA 2000 PISA 2003 PISA 2006 Ahead of Australia Same as Australia Behind Australia Finland Korea Canada NZ Hong Kong Korea Canada NZ Hong Kong Finland Finland Korea Canada NZ Hong Kong Australia's relative position in the three PISA assessments slipped from 2nd in 2000 and 2003 to 6th in 2006. The reason is that four countries that were at the same level as Australia, or even behind in the case of Hong Kong in 2003, were significantly ahead of Australia in 2006. PISA expresses results on the same scale on each occasion so, as shown in the next slide, it is possible to see how the changes in rank order relate to shifts in levels of mean performance.

23 Trends in reading performance
Higher performers in Korea improved. Korea Finland Lower performers in HK improved. Hong Kong China Canada New Zealand Australia's rank dropped because the Australian mean performance declined to 513 in 2006, from 528 in 2000 and 525 in 2003. This decline, which was statistically significant, occurred primarily because of a decline in performances at the highest level. The reasons for this are not immediately evident from the data, but one suggestion is that schools are focusing more on basic achievement levels and not so much on the development of sophisticated reading of complex text. Korea, on the other hand, significantly improved its mean performance and did so by raising its performances at the highest levels. The sources of this improvement appear to be a new curriculum with more emphasis on essay tests and expanded use of essays in assessments for university entrance. Hong Kong raised its mean performance by raising the performance of its poorer performing students and attributes this primarily to teacher development. There were no significant changes for Finland, Canada and New Zealand. Australia Changes for Finland, Canada & New Zealand are not significant. OECD (2007), PISA 2006: Science competencies for tomorrow's world, Vol. 1: Analysis, Fig. 6.21, p.319.

24 Trends in Australian reading performances
95th %ile 90th %ile 75th %ile Mean 25th %ile The decline in Australia's mean performance is shown again in the graph above, together with the trends in performance levels of Australia's 15-year-olds at the 95th, 90th, 75th, 25th, 10th and 5th percentiles. (The 5th percentile is the score below which the performance levels of 5 per cent of the Australian students lay, and so on for the other percentiles.) Performance levels at the lower percentiles did not drop, while those at the higher percentiles did. This shows that the significant drop in Australia's mean performance was due to a decline among high performers. 10th %ile 5th %ile OECD (2007), PISA 2006: Science competencies for tomorrow's world, Vol. 1: Analysis, Fig. 6.21, p.319.
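The percentile curves the slide describes can be sketched as follows. The cohort of scores below is randomly generated for illustration only; the nearest-rank percentile definition used here is one common convention, not necessarily the one used in the PISA reports.

```python
# A small sketch of what the percentile curves track: the 5th percentile
# is the score below which 5% of students fall. Scores are invented.
import random

random.seed(1)
scores = [random.gauss(520, 90) for _ in range(10000)]  # hypothetical cohort

def percentile(values, p):
    """Score at the p-th percentile (nearest-rank method)."""
    ordered = sorted(values)
    k = max(0, min(len(ordered) - 1, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

for p in (5, 10, 25, 50, 75, 90, 95):
    print(f"{p:>2}th percentile: {percentile(scores, p):.0f}")
```

Tracking each of these values across PISA cycles is what the figure does: a falling 95th-percentile line with a flat 5th-percentile line means the decline is concentrated among high performers.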

25 Mean mathematics results (PISA 2003)
Australia tied for 5th with 8 others among 40 countries. In PISA 2003, mathematics was the main domain of assessment. In this case, Australia ranked 11th overall out of the 40 participants but was not significantly different from six immediately above it or two immediately behind it. Australia, therefore, tied in 5th place with these eight other countries. The countries significantly ahead of Australia were Hong Kong-China, Finland, Korea and the Netherlands. PISA assesses whether 15-year-olds can use the mathematics they have learned in school. It does not focus primarily on the curriculum content to determine whether students have learned exactly what they were intended to learn. Instead, it assesses whether students can recognise that a problem can be solved mathematically, are able to ‘mathematise’ it (i.e. represent it mathematically) and then solve it. OECD (2004), Learning for tomorrow’s world: first results from PISA 2003, Fig. 2.16b, p.92.

26 Australia’s ranking in OECD/PISA Mathematics
Mathematics ranks PISA 2000: 6th but tied for 3rd PISA 2003: 11th but tied for 5th PISA 2006: 13th but tied for 9th PISA 2000 PISA 2003 PISA 2006 Ahead of Australia Hong Kong Japan Finland Korea Switzerland Canada Finland Hong Kong Korea Netherlands Switzerland Canada Macao Japan Taiwan Finland Hong Kong Korea Netherlands Switzerland Canada Macao Australia's rank in PISA mathematics dropped from 3rd in 2000, to 5th in 2003 and 9th in 2006. Some of the shift is due to new countries joining: the Netherlands in 2003, performing significantly better than Australia on both occasions; Macao, also in 2003, performing at Australia's level in 2003 but significantly better in 2006; and Taiwan in 2006, performing significantly better than Australia. Some other countries which were equivalent to Australia in 2000 or 2003 were significantly ahead in 2006. Only Japan, among countries that were significantly better than Australia in the earlier PISA assessments, has slipped back to equivalence with Australia. Same as Australia Japan

27 None of those that moved ahead of Australia in mathematics from 2003 to 2006 improved significantly.
Australia did not move significantly from 2003 to 2006. Australia’s significant shift in ranking is a consequence of cumulative non-significant shifts in different directions.

28 Trends in Australian mathematics performances
95th %ile 90th %ile 75th %ile Mean 25th %ile While Australia's 2006 mean was not significantly different from its 2003 mean, there was a decline in performance at the top end (as in reading literacy) but an offsetting improvement at the bottom end. 10th %ile 5th %ile OECD (2007), PISA 2006: Science competencies for tomorrow's world, Vol. 1: Analysis, Fig. 6.21, p.319.

29 Mean science results (PISA 2006)
Australia tied for 4th with 7 others among 57 countries. Science was the main domain of assessment in PISA 2006 and results are reported on a new scale that does not permit direct comparisons of absolute performance levels with those in previous PISA cycles. They do permit comparisons of relative positions. OECD (2007), PISA 2006: Science competencies for tomorrow's world, Vol. 1: Analysis, Fig. 2.11b, pp

30 Australia’s ranking in OECD/PISA Science
Science ranks PISA 2000: 8th but tied for 3rd PISA 2003: 6th but tied for 4th PISA 2006: 8th but tied for 4th PISA 2000 PISA 2003 PISA 2006 Ahead of Australia Same as Australia Hong Kong Canada Finland Japan Korea Japan Korea Finland Hong Kong Canada Japan Korea Finland Hong Kong Canada Australia’s ranks in science were high on all three occasions at 3rd in PISA 2000 and 4th in PISA 2003 and PISA 2006. There is a relatively small number of countries that have outperformed Australia in science but none on all three occasions.

31 The storyline so far… There is a growing labour market demand for higher level skills. International comparisons on quality in education show that: Australian students are relatively high performing. The competition is not standing still.

32 The impact of raising expectations of low performers

33 Variation in reading performance (PISA 2000)
Variation of performance within schools The figure above divides the variation in student performance in reading in PISA 2000 for each country into a component due to differences among students within schools, shown above the zero line, and a component due to differences between schools shown below that line. In Iceland, Finland and Norway there is very little variation in scores between schools. Choice of school is not very important because there is so little difference among schools. Among the countries in which there is a large component of variation between schools, there are some in which this occurs by design. In Belgium, Germany and Hungary, for example, students are sorted into schools of different types according to their school performance as early as age 12. The intention is to group similar students within schools differentiated by the extent of academic or vocational emphasis in their curriculum. This is intended to minimise variation within schools in order then to provide the curricula considered most appropriate for the differentiated student groups. It has the consequence of maximising the variation between schools. In some other countries, the grouping of students is less deliberate but, nevertheless, results in substantial between-school variation. In the United States, for example, 30 per cent of the overall variation is between-schools. In Korea, 37 per cent is between schools. In Australia, 19 per cent is between schools. Variation of performance between schools OECD, UNESCO (2003), Literacy skills for tomorrow’s world: further results from PISA 2000, Table 7.1a, p.357.
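The within-school/between-school split described above can be sketched as a simple variance decomposition. The three schools and their scores below are invented for illustration; real PISA analyses use far larger samples and survey weights.

```python
# Sketch (invented data) of splitting total variance in student scores
# into within-school and between-school components, as in the figure.
schools = {
    "A": [480, 510, 530, 495, 520],
    "B": [560, 545, 590, 570, 555],
    "C": [430, 455, 440, 470, 450],
}

all_scores = [s for scores in schools.values() for s in scores]
n = len(all_scores)
grand_mean = sum(all_scores) / n

total_var = sum((s - grand_mean) ** 2 for s in all_scores) / n

# between-school: variance of school means, weighted by school size
between = sum(len(v) * ((sum(v) / len(v)) - grand_mean) ** 2
              for v in schools.values()) / n

# within-school: average squared deviation from each school's own mean
within = sum(sum((s - sum(v) / len(v)) ** 2 for s in v)
             for v in schools.values()) / n

print(f"between-school share: {between / total_var:.0%}")
print(f"within-school share:  {within / total_var:.0%}")
```

The two components always sum to the total variance (the law of total variance), which is why the figure can stack one above and one below the zero line: a streamed system like Belgium's shifts variance into the between-school component, a comprehensive system like Finland's keeps almost all of it within schools.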

34 Variation in mathematics performance (PISA 2003)
Variation of performance within schools The figure above shows the within-schools and between-schools variation in student achievement in PISA 2003 mathematics. The pattern is essentially the same as for PISA 2000 reading, shown in the previous slide, except for Poland. For Poland, in PISA 2000, 63 per cent of the variation in reading was between-schools whereas in PISA 2003 in mathematics only 13 per cent was between schools. This remarkable difference was due to a reform in which early streaming of students into schools of different types was abandoned in favour of comprehensive schools for students up to the age at which PISA measures their performance. Variation of performance between schools OECD (2004), Learning for tomorrow’s world, Table 4.1a, p.383.

35 Trends in reading performance
Korea Finland Hong Kong China Canada New Zealand Australia Poland Lower and higher performers in Poland improved. Lower performers in Poland improved. This figure adds Poland to the graph in slide 15. It shows that Poland not only reduced its between-school variation but dramatically improved its average performance, to the point where it is now no longer significantly behind Australia in reading. In fact, Poland was the only country to improve its average performance significantly on all measures used in both PISA 2000 and PISA 2006. It did so largely by raising the achievement levels of its poorer performing students. No longer isolating them in separate schools with other low performers seems to have effectively set higher expectations for them and raised their performance levels. Changes for Finland, Canada & New Zealand are not significant. OECD (2007), PISA 2006: Science competencies for tomorrow's world, Vol. 1: Analysis, Fig. 6.21, p.319.

36 The storyline so far… There is a growing labour market demand for higher level skills. International comparisons on quality in education show that: Australian students are relatively high performing. The competition is not standing still. International comparisons on equity in education show that: Setting high expectations for all can improve low performers.

37 Elite student achievement is important too.

38 Scientific excellence of 15-year-olds and countries’ research intensity
It is not possible to predict to what extent the performance of today's 15-year-olds in science will influence a country's future performance in research and innovation. However, the figure portrays the close relationship between a country's proportion of 15-year-olds who scored at Levels 5 and 6 on the PISA science scale and the current number of full-time equivalent researchers per thousand employed. In addition, the correlations between the proportion of 15-year-olds who scored at Levels 5 and 6 and the number of triadic patent families relative to total population, and the gross domestic expenditure on research and development (two other important indicators of the innovative capacity of countries), both exceed 0.5. The corresponding correlations with the PISA mean scores in science are of a similar magnitude. The existence of such correlations does not, of course, imply a causal relationship, as there are many other factors involved. R² = 0.70. OECD (2007), PISA 2006: Science competencies for tomorrow's world, Volume 1: Analysis. Paris: Author, p.51.

39 Equity matters too.

40 % at each reading proficiency level: PISA 2000
Korea has a relatively high mean but with few very high performers and very few low performers. Australia's mean is high because of its relatively high percentage of very high-performing students. (Chart bands: Level 5, Level 4, Level 3, Level 2, Level 1, Below Level 1.) In the main domains of assessment in PISA, there is sufficient information to establish and describe well-defined levels of performance on the relevant scale. In PISA 2000, five levels of performance were defined on the reading scale, with an additional lower domain not well measured and described only as 'below Level 1'. Students at this level may be literate in the sense of being able to decode printed words and to read text, but they do not have a level of literacy sufficient for further study and learning. Even those at Level 1 are highly likely to be deficient in this respect. The figure above shows the percentage of students at each level in each country. Countries are arranged in order of their mean performance, with those around Australia covered by the grey box being the ones with mean performances not significantly different from Australia's. Australia stands out in two important respects from some of the other high-performing countries around it. Australia has a considerably higher proportion of students at the highest level (Level 5). It also has a rather larger percentage at Level 1 or below than some of the others. There is, thus, a slightly higher proportion of poorer performers in reading in Australia than in some of the other countries that are similarly high performing on average. Korea provides an interesting contrast. It has a considerably smaller proportion of high achievers but a correspondingly small proportion of very low achievers. In fact, Korea has the most narrowly dispersed student performances. Australia has somewhat more low performing students than some high-performing countries around it. Source: OECD, UNESCO (2003), Literacy skills for the world of tomorrow, Table 2.1a, p.274.

41 % at each science proficiency level: PISA 2006
In science in PISA 2006, seven levels of performance were defined on the science scale; descriptions of each level are provided in the report. The figure above groups students whose performance is deficient (at or below Level 1) together below the zero line. Countries are arranged in order of their mean performance, with those covered by the grey box being the ones with means not significantly different from Australia's. It is sometimes claimed that, while Australia does well in the international comparisons on average, it has a relatively long 'tail' of low performers. That is marginally the case in reading, but it is not the case in science, as the figure shows: Australia's percentage of low-performing students in science is similar to those of the other relatively high-performing countries around it. Source: OECD (2007) PISA 2006: science competencies for tomorrow's world, Vol 1 – analysis, Fig. 2.11a, p. 49.

42 Socioeconomic status & reading literacy (PISA 2000)
[Figure: reading literacy plotted against the PISA index of social background (low to high social advantage), with a fitted regression line. Two indices of the relationship: the social gradient, and the correlation or variance accounted for.] The 15-year-olds in PISA provide information on their economic and social background – parents' education and occupation, cultural artefacts in the home – that permits the construction of an index of social background ranging from socially disadvantaged to socially advantaged. This scale is comparable across countries. The relationship between social background and reading literacy in PISA 2000 is shown in the figure above. There are two indices with which to summarise such a relationship: the slope of the regression line, and the extent of variation around it. The slope (referred to as the 'social gradient' when the horizontal axis is a measure of socio-economic status) shows the magnitude of the change in reading literacy associated, on average, with a given increment in socio-economic status. The correlation summarises the extent to which the data are spread around the line and thus expresses how well the regression line summarises the relationship. The correlation in this case is relatively high for measures such as these (around 0.45), though it indicates that variation in socio-economic status accounts for only just over 20% of the variation in reading literacy scores. Source: OECD (2001) Knowledge and skills for life, Appendix B1, Table 8.1, p. 308
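Both indices can be computed directly from paired observations of social background and reading score. A minimal sketch in Python; the SES values and scores below are invented purely for illustration:

```python
import math

def gradient_and_correlation(ses, scores):
    """Least-squares regression of scores on the SES index.

    Returns (slope, r): slope is the 'social gradient' (score points
    per unit of SES); r**2 is the share of score variance accounted
    for by SES.
    """
    n = len(ses)
    mx = sum(ses) / n
    my = sum(scores) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(ses, scores))
    sxx = sum((x - mx) ** 2 for x in ses)
    syy = sum((y - my) ** 2 for y in scores)
    return sxy / sxx, sxy / math.sqrt(sxx * syy)

# Hypothetical data: SES index vs reading score.
ses = [-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5]
scores = [420, 455, 470, 500, 515, 540, 560]
slope, r = gradient_and_correlation(ses, scores)
print(f"social gradient: {slope:.1f} points per SES unit")
print(f"correlation: {r:.2f}; variance accounted for: {r * r:.0%}")
```

This also makes the 20% figure in the notes concrete: a correlation of 0.45 squared is about 0.20, i.e. just over 20% of the variance accounted for.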

43 Steeper slope = less equitable results
[Figure: social gradients for reading literacy (PISA 2000) – regression lines for Finland, Canada, Australia and Germany, reading literacy against the PISA index of social background. Callout: the gap at the disadvantaged end is in the order of 3 years of schooling.] An examination of the relationship between social background and reading achievement country by country reveals marked differences among countries. The figure above shows the results for four countries. The lines for Finland and Canada are significantly less steep than the one for the OECD as a whole shown in the previous slide. Increased social advantage in these countries is associated with a smaller increase in educational achievement than in the OECD as a whole. The results in these countries are more equitable than those of the OECD overall: students differ in achievement, but not in a way that is so substantially related to their social background. The lines for Australia and Germany are both significantly steeper than the one for the OECD as a whole, as are those for the US and the UK, which are not shown in the figure. In all of these countries, social background is more substantially related to educational achievement than in the OECD as a whole. Their results are inequitable in the sense that differences among students in their literacy levels reflect, to a marked extent, differences in their social background. The differences between these four lines at the left-hand end are substantial: socially disadvantaged students do very much worse in some of these countries. The gap in educational achievement between similarly socially disadvantaged students in Germany and Finland represents around three years of schooling. Similarly disadvantaged students in Australia fall about half-way between, around 1½ years behind their counterparts in Finland. More detailed analysis of the German data shows the pattern to be strongly related to the organisation of schooling.
From age 11, students are separated into vocational and academic schools of various types on the basis of the educational future judged most appropriate for them. Students from socially disadvantaged backgrounds generally end up in low-status vocational schools and achieve poor educational results; students from socially advantaged backgrounds are directed to high-status academic schools where they achieve high-quality results. The schooling system largely reproduces the existing social arrangements, conferring privilege where it already exists and denying it where it does not. Source: OECD (2001) Knowledge and skills for life, Appendix B1, Table 8.1, p. 308
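The 'years of schooling' comparison rests on converting a score-point gap using an assumed points-per-school-year equivalence. The 39 points per year used below is an assumption for illustration, not a figure taken from this source:

```python
# Convert a PISA score-point gap into approximate years of schooling.
# POINTS_PER_YEAR is an illustrative assumption, not an official figure.
POINTS_PER_YEAR = 39.0

def gap_in_years(score_gap_points: float) -> float:
    """Express a score-point gap as approximate years of schooling."""
    return score_gap_points / POINTS_PER_YEAR

# Under this assumption, a hypothetical 117-point gap between similarly
# disadvantaged students in two countries corresponds to 3 school years:
print(round(gap_in_years(117), 1))
```

On the same assumption, a gap about half that size corresponds to the roughly 1½ years separating disadvantaged Australian students from their Finnish counterparts.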

44 Social gradients for reading literacy (PISA 2000)
[Figure: quadrant chart of mean reading literacy (vertical axis) against social gradient relative to the OECD (horizontal axis), with quadrants labelled high-quality/low-equity (top left), high-quality/high-equity (top right), low-quality/low-equity (bottom left) and low-quality/high-equity (bottom right).] If lines for more countries were added to the figure on the previous slide, the pattern would become difficult to discern. The figure above provides a clearer picture for all OECD countries. Mean performance in reading literacy is represented on the vertical axis; the grey band highlights the countries with means not significantly different from Australia's. The slope of the regression line of reading literacy on social background is represented on the horizontal axis as the difference between the slope for the OECD as a whole and a country's own slope. This places to the left countries where the slope is steeper than in the OECD as a whole (that is, countries in which social background is more substantially related to educational achievement) and to the right countries where the slope is less steep (that is, countries in which social background is less related to achievement). Countries with slopes significantly less steep than the OECD's are shown in blue, those with significantly steeper lines in red, and those with lines not significantly different in slope from the overall OECD line in black. Countries high on the page are high-quality, and those to the far right are high-equity. The graph is divided into four quadrants on the basis of the OECD averages on the two measures. The presence of countries in the 'high-quality, high-equity' quadrant (top right) demonstrates that there is no necessary trade-off between quality and equity: it is possible to achieve both together. Korea, Japan, Finland and Canada are among them. As indicated on the previous slide, Australia is a 'high-quality, low-equity' country, with a high average performance but a relatively steep regression line. It is in the top-left quadrant along with the United Kingdom and New Zealand.
The United States is only average in quality but low in equity. Germany, as a low-quality, low-equity country, is in the bottom-left quadrant along with a number of other countries that also begin to separate students into schools of different types as early as age 11. Source: OECD (2001) Knowledge and skills for life, Table 2.3a, p. 253.
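The quadrant assignment itself is mechanical: compare a country's mean with the OECD average, and its gradient with the OECD gradient. A sketch with hypothetical values (OECD_MEAN, OECD_SLOPE and the country figures are all invented for illustration):

```python
# Classify a country into the four quality-equity quadrants of the
# figure. All numeric values here are hypothetical, for illustration.
OECD_MEAN = 500    # assumed OECD average score
OECD_SLOPE = 40    # assumed OECD-average social gradient (points per SES unit)

def quadrant(mean_score: float, slope: float) -> str:
    """Label a country by quality (mean vs OECD) and equity (gradient vs OECD)."""
    quality = "high-quality" if mean_score >= OECD_MEAN else "low-quality"
    # A gradient flatter than the OECD's counts as more equitable.
    equity = "high-equity" if slope <= OECD_SLOPE else "low-equity"
    return f"{quality}, {equity}"

print(quadrant(546, 30))  # Finland-like:   high-quality, high-equity
print(quadrant(528, 46))  # Australia-like: high-quality, low-equity
print(quadrant(484, 52))  # Germany-like:   low-quality, low-equity
```

The statistical tests mentioned in the notes refine this picture (a country whose slope is not significantly different from the OECD's sits near the quadrant boundary), but the basic classification is just these two comparisons.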

45 Social gradients for science literacy (PISA 2006)
[Figure: quadrant chart of mean science performance against social gradient relative to the OECD, with the same four quality-equity quadrants as the previous slide.] The figure above shows the relationship between the slope of countries' regression lines and their average performance in science in PISA 2006. In this case, the line for Australia is significantly steeper than the line for the OECD as a whole, and Australian performances are again much less equitable than those in Canada, Finland and Korea. There are many countries to the left of Australia in this graph (and thus with less equitable results), but again the ones on which we should focus are those in the grey band containing countries equal to Australia in quality, and those above that band. In particular, we should aspire to match Finland, Canada and Korea. Source: OECD (2007) PISA 2006: science competencies for tomorrow's world, Vol 1 – analysis, Figure 4.6, p. 184.

46 SES-science literacy correlations (PISA 2006)
[Figure: quadrant chart of mean science performance against the SES-science literacy correlation, with the same four quality-equity quadrants.] As indicated in the notes on slide 42, the slope of the regression line (the 'social gradient') provides one perspective on the relationship between social background and educational achievement: it is an indicator of the magnitude of the average difference in achievement associated with a particular difference in social background. The correlation (or the squared correlation, as a measure of the percentage of variance in achievement accounted for by differences in social background) is an indicator of how well the regression line summarises the relationship for a particular country. On this indicator of equity, Australia appears somewhat better than on the social gradient indicator: it is in the 'high-quality, high-equity' quadrant, as shown above. Finland, Canada and Korea, however, all outperform Australia on this indicator as well. Source: OECD (2007) PISA 2006: science competencies for tomorrow's world, Vol 1 – analysis, Figure 4.6, p. 184.

47 The storyline so far… There is a growing labour market demand for higher level skills. International comparisons on quality in education show that: Australian students are relatively high performing. The competition is not standing still. International comparisons on equity in education show that: Setting high expectations for all can improve low performers. Low performers are not left further behind than in other high-performing countries except to some extent in reading. The disadvantaged are over-represented among low performers.

48 What do international comparisons tell us about Australian students’ engagement?

49 Australian students’ engagement with science learning
Australia's rank among 57 countries on two measures – level of engagement, and the difference in achievement associated with a 1-unit difference in engagement ('–' marks cells not recoverable from the slide):

Aspect of engagement | Level of engagement | Achievement difference per unit of engagement
General value of science | 41st | 2nd
Personal value of science | 37th | 1st
General interest in science | 54th | 13th
Enjoyment of science | 45th | –
Self-efficacy in science | 13th | 4th
Self-concept in science | 43rd | –
Instrumental motivation to learn | 32nd | 3rd
Future-oriented motivation to learn | 42nd | –
Involvement in science-related activities | 53rd | –

PISA 2006 gathered data on students' attitudes and engagement with science in four areas: support for scientific enquiry, self-belief as science learners, interest in science, and responsibility towards resources and environments. The table above displays some of the key results for Australia. As seen earlier, Australian students rank near the top among countries in their performance in science, yet the table shows that they rank low in their level of engagement. The one exception is their sense of self-efficacy in science: they rank 13th in their sense of how good they are, which is not too different from their performance rank of 8th. On all the other measures of engagement, Australian students rank low among the 57 countries for which these data are available. International comparisons of self-reported engagement may be difficult because of systematic differences in the ways students in different countries use the scales, but there is within-country evidence of a strong relationship between performance and engagement. An indicator is the magnitude of the difference in performance scores associated with a difference of one unit on the engagement scales. Countries are ranked in the PISA 2006 report on the magnitude of this difference; the results for Australia are shown in the rightmost column of the table above.
Apart from general interest in science, the relationships between differences in levels of engagement and levels of performance are stronger in Australia than in almost all other countries, with ranks ranging from 1st to 4th. There is no need to infer a causal relationship for this finding to be considered important: both performance and engagement are important outcomes, and both will potentially influence whether students move on to advanced study in science. Source: OECD (2007) PISA 2006: science competencies for tomorrow's world, Vol 1 – analysis, Chapter 3.

50 The storyline so far… There is a growing labour market demand for higher level skills. International comparisons on quality in education show that: Australian students are relatively high performing. The competition is not standing still. International comparisons on equity in education show that: Setting high expectations for all can improve low performers. Low performers are not left further behind than in other high-performing countries, except to some extent in reading. The disadvantaged are over-represented among low performers. Australian students' engagement with (science) learning: They think they are good at science, as they are. But they don't like it, don't see the value of it, and don't engage outside school.

51 Characteristics of the best performing systems

52 Autonomy and standards
[Figure: PISA science performance by school autonomy (in teacher selection) and the presence of standards-based external examinations.]

53 Characteristics of Finland
High-quality teachers
– More difficult to enrol in teacher education than in medicine
– All teachers 6-year trained, with a Masters degree on entry
Schools responsible for all their students
– No grade repetition
– No school differentiation (before age 15)
Early intervention for students needing it
– The largest percentage of students given extra support is in Grade 1

54

