Trends in international assessments

Presentation transcript:

Trends in international assessments I want to start with a brief overview of the objectives and origins of PISA, then show you where the US stands and what the most effective school systems show can be achieved, and conclude with what all of this means for education policy. I need to remind you that the material remains under embargo until next Tuesday. Moscow, 14 April 2017 Andreas Schleicher

OECD instruments Who is assessed? Student assessment Classroom How? Methods and procedures, mix of criteria and instruments Mapping of feedback to different units Teacher appraisal For what? E.g. accountability, improvement Who is assessed? School School evaluation System The approaches to assessment that countries are pursuing to get there differ markedly. Many systems seek to figure out how effectively education systems, or parts of them, function as a whole. To do that, they typically assess a sample of schools and students rather than every student in every school. Of course, you then cannot compare the performance of individual schools and build accountability systems around that. On the other hand, you can put more resources into testing fewer students better, rather than sacrificing validity for efficiency. Other assessment systems focus on school performance. And yet others seek to evaluate teachers and students in classrooms. All of these systems are complementary, and they typically look at what students are able to do. Assessment systems, and the philosophies around which they are built, also vary by their purposes. Is the primary purpose accountability, with results made publicly available and linked to some form of consequences? Those purposes are common in England, the United States and Latin America. Or is the focus diagnosis and improvement, with the results feeding back directly to teachers and learners? That tends to be the primary focus in Northern Europe or Scotland, for example. Again, these are complementary rather than competing objectives, even if there is some tension between objectivity in an assessment and its relevance. There are also differences in who is in charge of preparing the evaluation, conducting it and using the results. Systems involve very different actors and stakeholders in their assessment processes, often in ways shaped by the political economy of maximising the utility and acceptance of those processes.
Similarly, the types of instruments differ widely, both within and across countries. When you think about an assessment in the United States, you may imagine a multiple-choice test, but there are other countries where multiple-choice tests are not used at all; they may use open-ended assessments or even oral examinations. Or they may use classroom observations, student or teacher portfolios or other instruments. I will come back to this aspect. And then, of course, assessments differ by what they assess, whether the focus is on inputs, processes or outputs. System assessment

The ‘big’ trends Multi-layered, coherent assessment systems from classrooms to schools to regional to national to international levels that… Support improvement of learning at all levels of the education system Are largely performance-based Make students’ thinking visible and allow for divergent thinking Are adaptable and responsive to new developments Add value for teaching and learning by providing information that can be acted on by students, teachers, and administrators Are part of a comprehensive and well-aligned continuum, communicate what is expected and hold relevant stakeholders accountable. But there are also some common trends, most importantly the move towards multi-layered assessment systems that coherently extend from students to schools to states, nations and the international level. These assessments seek not to take learning time away from students, but to enhance the learning of students, teachers, school administrators and policy makers by building frameworks for lateral accountability. The assessments underline that successful learning is as much about the process as it is about facts and figures; they emphasise that success is not about the reproduction of subject-matter content, but about the capacity to integrate, synthesise and creatively extrapolate from what you know, and to apply that knowledge in novel situations. They try to provide a window into students’ understandings and the conceptual strategies a student uses to solve a problem. They provide dynamic task contexts in which prior actions may stimulate unpredictable reactions that in turn influence subsequent strategies and options. They try to add value for teaching and learning: the process of responding to assessments can enhance student learning if assessment tasks are well crafted to incorporate principles of learning and cognition.
For example, assessment tasks can incorporate transfer and authentic applications, and can provide opportunities for students to organise and deepen their understanding through explanation and the use of multiple representations. They try to generate information that can be acted upon and that provides productive, usable feedback for all intended users. Teachers need to be able to understand what an assessment reveals about students’ thinking. And school administrators, policymakers, and teachers need to be able to use this assessment information to determine how to create better opportunities for student learning. Last but not least, these assessments do not operate in a vacuum but are part of a comprehensive set of instruments that extends to instructional material as well as teacher training.

Some criteria
Coherence: built on a well-structured conceptual base (an expected learning progression) as the foundation for both large-scale and classroom assessments; consistency and complementarity across administrative levels of the system and across grades.
Comprehensiveness: using a range of assessment methods to ensure adequate measurement of intended constructs, and measures of different grain size to serve different decision-making needs; providing productive feedback, at appropriate levels of detail, to fuel accountability and improvement decisions at multiple levels.
Continuity: a continuous stream of evidence that tracks progress.

Measuring learning outcomes at school PISA

PISA 2015 OECD Partners

Recognise, offer and evaluate explanations for a range of natural and technological phenomena. Describe and appraise scientific investigations and propose ways of addressing questions scientifically. Analyse and evaluate data, claims and arguments in a variety of representations and draw appropriate scientific conclusions. Competencies Explain phenomena scientifically Evaluate and design scientific enquiry Interpret data and evidence scientifically Let me show you another example from PISA here. This is how we defined science

Competencies Knowledge Explain phenomena scientifically Each of the scientific competencies requires content knowledge (knowledge of theories, explanatory ideas, information and facts), but also an understanding of how such knowledge has been derived (procedural knowledge) and of the nature of that knowledge (epistemic knowledge) “Epistemic knowledge” reflects students’ capacity to think like a scientist and distinguish between observations, facts, hypotheses, models and theories Competencies Explain phenomena scientifically Evaluate and design scientific enquiry Interpret data and evidence scientifically Knowledge Content knowledge Knowledge of methodological procedures used in science Knowledge of the epistemic reasons and ideas used by scientists to justify their claims Let me show you another example from PISA here. This is how we defined science

Competencies Knowledge Attitudes Explain phenomena scientifically People’s attitudes and beliefs play a significant role in their interest, attention and response to science and technology. PISA distinguishes between attitudes towards science (e.g. interest in different content areas of science) and scientific attitudes (e.g. whether students value scientific approaches to enquiry). Competencies Explain phenomena scientifically Evaluate and design scientific enquiry Interpret data and evidence scientifically Knowledge Attitudes Content knowledge Knowledge of methodological procedures used in science Knowledge of the epistemic reasons and ideas used by scientists to justify their claims Let me show you another example from PISA here. This is how we defined science Attitudes to science Scientific attitudes

Context Competencies Knowledge Attitudes Personal, local, global Personal, local/national and global issues, both current and historical, which demand some understanding of science and technology Context Personal, local, global Current and historical Competencies Explain phenomena scientifically Evaluate and design scientific enquiry Interpret data and evidence scientifically Knowledge Attitudes Content knowledge Knowledge of methodological procedures used in science Knowledge of the epistemic reasons and ideas used by scientists to justify their claims Let me show you another example from PISA here. This is how we defined science Attitudes to science Scientific attitudes

Measuring learning outcomes at school Broadening learning outcomes

OECD Learning Framework 2030

PISA
2012: Financial literacy
2015: Social skills (collaborative problem solving)
2018: Global competency (skills, knowledge, understanding)
2021: Creative thinking
PISA for schools

Global competency in PISA

Measuring learning outcomes at school Understanding learning strategies

Memorisation is less useful as problems become more difficult (OECD average) [Chart: odds ratios of success on an easy and a difficult problem for students using memorisation strategies] Memorisation is associated with a lower chance of success as problems become more difficult. Notes: Statistically significant odds ratios are marked in a darker tone. Chile and Mexico are not included in the OECD average. Odds ratios are calculated across 48 education systems. Source: Figure 4.3

Control strategies are always helpful, but less so as problems become more difficult (OECD average) [Chart: odds ratios of success on an easy and a difficult problem for students using control strategies] Using control strategies is associated with a greater chance of success, but less so as problems become more difficult. Notes: Statistically significant odds ratios are marked in a darker tone. Chile and Mexico are not included in the OECD average. Odds ratios are calculated across 48 education systems. Source: Figure 5.2

Elaboration strategies are more useful as problems become more difficult (OECD average) [Chart: odds ratios of success on an easy and a difficult problem for students using elaboration strategies] Using elaboration strategies is associated with a greater chance of success as problems become more difficult. Notes: Statistically significant odds ratios are marked in a darker tone. Chile and Mexico are not included in the OECD average. Odds ratios are calculated across 48 education systems. Source: Figure 6.2
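The odds-ratio measure used in the charts above can be made concrete with a minimal sketch. The success rates below are illustrative, made-up numbers, not actual PISA figures:

```python
def odds(p):
    """Convert a probability of success into odds."""
    return p / (1 - p)

def odds_ratio(p_strategy, p_baseline):
    """Odds ratio comparing students who report relying on a learning
    strategy (success rate p_strategy) with those who do not
    (p_baseline). Values above 1 mean the strategy is associated with
    greater success; values below 1, with less."""
    return odds(p_strategy) / odds(p_baseline)

# Illustrative (made-up) success rates on an easy and a difficult item
# for students who rely heavily on memorisation vs. those who do not.
easy = odds_ratio(0.80, 0.78)   # close to 1: little association
hard = odds_ratio(0.20, 0.30)   # below 1: linked to less success
print(round(easy, 2), round(hard, 2))  # prints: 1.13 0.58
```

A declining odds ratio across problem difficulty, as in the memorisation chart, is exactly this pattern: near 1 for easy items, well below 1 for difficult ones.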

Teaching and learning strategies in mathematics United Kingdom New Zealand Ireland Australia France Japan Hong Kong-China Hungary Macao-China Korea Shanghai-China Croatia Vietnam Chinese Taipei

Approaches to teaching [Chart: student-oriented vs. teacher-directed approaches, plotted against better learning outcomes and better engagement and career expectations]

Measuring early learning

International Early Learning and Child Well-being Study

International Early Learning and Child Well-being Study Learning Context

Measuring adult skills

Numeracy proficiency levels

Labour productivity and the use of reading skills at work This slide shows the positive relationship between labour productivity (GDP per hour worked) at the country level and the average intensity of the use of reading skills at work. This relationship holds even when controls are added for literacy proficiency. This relationship will to a degree reflect the industrial and occupational structure of the countries concerned.
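"Adding controls" here corresponds to a multiple regression of productivity on skill use and literacy proficiency together. A minimal sketch with synthetic, noise-free country data (illustrative values only, not OECD figures; the coefficients 10 and 0.05 are chosen for the example) shows the idea:

```python
import numpy as np

# Synthetic country-level data (illustrative only, not OECD figures):
# average literacy proficiency and intensity of reading-skill use at work.
literacy = np.array([250.0, 260.0, 270.0, 280.0, 290.0, 300.0])
use = np.array([1.8, 2.3, 2.0, 2.6, 2.4, 2.9])

# Construct productivity so that, by design, skill use matters
# over and above literacy proficiency.
productivity = 10 * use + 0.05 * literacy

# Regress productivity on skill use while controlling for literacy:
# the coefficient on 'use' is its association net of proficiency.
X = np.column_stack([np.ones(len(use)), use, literacy])
coef, *_ = np.linalg.lstsq(X, productivity, rcond=None)
print(f"effect of skill use, net of literacy: {coef[1]:.2f}")
```

If the coefficient on skill use stays positive once literacy is in the model, as the slide claims for the real data, the productivity link is not just proficiency in disguise.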

Digital problem-solving skills (PIAAC) Young adults (16-24 year-olds) Older adults (55-65 year-olds)

Some methodological challenges There are some tough methodological challenges that need to be addressed:
Can we sufficiently distinguish the role of context from that of the underlying cognitive construct?
Do new types of items that are enabled by computers and networks change the constructs that are being measured?
Can we drink from the firehose of increasing data streams that arise from new assessment modes?
Can we utilise new technologies and new ways of thinking about assessment to gain more information from the classroom without overwhelming the classroom with more assessments?
What is the right mix of crowd wisdom and traditional validity information?
How can we create assessments that are activators of students’ own learning?

Thank you Find out more about our work at www.oecd.org/pisa All publications The complete micro-level database Email: Andreas.Schleicher@OECD.org Twitter: SchleicherOECD WeChat: AndreasSchleicher And remember: without data, you are just another person with an opinion.