
1 Measuring Learning and Improving Education Quality: International Experiences in Assessment. John Ainley. South Asia Regional Conference on Quality Education for All, New Delhi, India, October 24-26, 2007

2 Quality education for all
- Shift from provision to outcomes
- Emergence of large-scale assessment programs
- Developments in methods and reporting
- Developments in applications
Assessments are used to:
- Monitor variations over time, in relation to established standards / criteria and to changes in policy and practice
- Map variations within countries to establish action targets: regions and sub-regions; sub-groups of students
- Contextualise national patterns, in relation to international patterns and to comparable countries

3 Large-scale assessment surveys
- Conducted at various levels: international, regional, national, sub-national (state or province)
- Provide information at various levels: system, school, classroom, parent and student
- Indicate what is valued
- Impact teaching and learning
- Drive change in policy and practice

4 International assessment studies
OECD PISA
- Population and samples: 15-year-olds in school; PPS school sample (sketched below); random selection of students
- Domains: reading literacy, mathematics literacy, science literacy
- Cycle: three years, since 2000
IEA
- Populations and samples: Grade 4, Grade 8, Grade 12; PPS school sample; random selection of classrooms
- Domains: reading (PIRLS, Grade 4); mathematics and science (TIMSS)
- Cycle: TIMSS every four years since 1994/95, with antecedents back to 1964; PIRLS every five years since 2001
- Other studies: ICCS 1999 and 2009
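Both programs first draw a PPS (probability proportional to size) sample of schools, then sample students or classrooms within them. The sketch below shows systematic PPS selection in its simplest form; the school names, enrolment sizes and sample size are invented for illustration, and operational designs also involve stratification and replacement schools.

```python
# A minimal, illustrative sketch of systematic PPS school sampling.
# School names, sizes and the sample size are made up for the example.
import random

def pps_systematic_sample(schools, sizes, n, seed=1):
    """Select n schools with probability proportional to size (systematic method)."""
    total = sum(sizes)
    interval = total / n                        # sampling interval on the size scale
    start = random.Random(seed).uniform(0, interval)
    targets = [start + k * interval for k in range(n)]
    selected, cumulative, t = [], 0.0, 0
    for school, size in zip(schools, sizes):
        cumulative += size
        while t < n and targets[t] <= cumulative:
            selected.append(school)             # very large schools can be hit more than once
            t += 1
    return selected

if __name__ == "__main__":
    size_rng = random.Random(0)
    schools = [f"School-{i:02d}" for i in range(1, 21)]
    sizes = [size_rng.randint(20, 400) for _ in range(20)]
    print(pps_systematic_sample(schools, sizes, n=5))
```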

5 International assessment studies
OECD (PISA)
- Framework: expert development, consultation, future needs
- Domain coverage: rotated booklet design
- Data sources: school, student; teacher option in 2009
- Psychometrics: one-parameter IRT (compared with the IEA model in the sketch below)
- Reporting: scale with SD = 100; proficiency bands
IEA (TIMSS and PIRLS)
- Framework: curriculum analysis (opportunity to learn, OTL); common elements of what is taught
- Domain coverage: rotated booklet design
- Data sources: school, student, teacher
- Psychometrics: three-parameter IRT
- Reporting: scale with SD = 100; proficiency bands
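The slide contrasts the one-parameter (Rasch) scaling used for PISA with the three-parameter model used for TIMSS and PIRLS. The sketch below shows the two item response functions side by side with invented item parameters; it illustrates the models themselves, not either program's operational scaling software.

```python
# Item response functions for the 1PL (Rasch) and 3PL models.
# The item parameters used below are illustrative, not real PISA/TIMSS values.
import math

def p_1pl(theta, b):
    """Probability of a correct response under the Rasch / one-parameter model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def p_3pl(theta, a, b, c):
    """Probability of a correct response under the three-parameter model:
    a = discrimination, b = difficulty, c = guessing (lower asymptote)."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

if __name__ == "__main__":
    for theta in (-2.0, -1.0, 0.0, 1.0, 2.0):
        print(f"theta={theta:+.1f}  1PL(b=0): {p_1pl(theta, 0.0):.3f}  "
              f"3PL(a=1.2, b=0, c=0.2): {p_3pl(theta, 1.2, 0.0, 0.2):.3f}")
```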

6 Regional assessment studies
Latin America
- Latin American Laboratory for Assessment of the Quality of Education (LLECE)
- Second International Comparative Study (SERCE): language, mathematics, science
Africa
- Southern Africa Consortium for Monitoring Educational Quality (SACMEQ), supported through the IIEP

7 National assessment studies
- NAEP (USA): sequences over many years
- Key stage assessments (United Kingdom)
- Latin America (Puryear, 2007): rare in 1980, common by 2005
- Vietnam: 2001, 2007
- Australia

8 Sub-national assessments
- Typically in federal systems
- Australian state assessments: equating at benchmark levels; transition to a national assessment in 2008
- Germany
- Canada (Ontario)

9 Issues in national and international assessment surveys
- Domains and sub-domains assessed
- Census or sample
- Analysis
- Reporting

10 Assessment domains
- Typically language (literacy, reading) and mathematics (numeracy); sometimes science
- Coverage within domains: multiple matrix designs; rotated booklets to ensure coverage (sketched below)
- Other domains: sample studies
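Multiple matrix sampling spreads a large item pool across booklets so that each student takes only part of it while the whole domain is still covered. The sketch below uses a deliberately simple cyclic rotation with hypothetical block names to make the idea concrete; it is not the balanced designs actually used by PISA or TIMSS.

```python
# A simple cyclic rotated-booklet layout: every block is administered, but each
# student sees only a subset of the full item pool. Block names are hypothetical.

def rotated_booklets(blocks, blocks_per_booklet):
    """Booklet b starts at block b and takes the next blocks_per_booklet blocks,
    wrapping around, so every block appears in the same number of booklets."""
    n = len(blocks)
    return [[blocks[(start + i) % n] for i in range(blocks_per_booklet)]
            for start in range(n)]

if __name__ == "__main__":
    blocks = ["M1", "M2", "M3", "R1", "R2", "S1", "S2"]   # hypothetical item blocks
    for i, booklet in enumerate(rotated_booklets(blocks, 3), start=1):
        print(f"Booklet {i}: {booklet}")
```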

11 Grades or ages assessed
- Define the population by age or by grade
- One grade or several?
  - One grade: end of the common period of schooling
  - Multiple grades: end of primary school, end of common secondary school, mid-primary school

12 Sample or census
Advantages of a census
- Reporting to schools, teachers and parents
- Enough data to identify disadvantaged groups
- Enough data to identify regional variations
Advantages of sample studies
- Cost effective
- Minimal disruption to school teaching programs
- Cover a wider range of areas
Combinations of census and sample
- Census for literacy and numeracy; samples for other domains

13 Analysis issues
Item response theory
- Development of a common scale for student performance and item difficulty
- Difference in detail
Vertical equating (long scales)
- Overlapping common items
- Common-person equating studies
Horizontal equating (equating over time)
- Common items in each cycle (linking sketch below)
- Common-person equating studies
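One common way to carry out the common-item linking referred to above is mean-sigma equating: the difficulties of the link items, estimated separately in two calibrations, define a linear transformation that places the new scale onto the old one. The sketch below uses invented difficulties and is only an illustration of the idea, not the operational PISA or TIMSS linking procedure.

```python
# Mean-sigma linking through common items (illustrative values only).
from statistics import mean, stdev

def mean_sigma_link(b_old, b_new):
    """Return (A, B) such that theta_on_old_scale = A * theta_new + B."""
    A = stdev(b_old) / stdev(b_new)
    B = mean(b_old) - A * mean(b_new)
    return A, B

if __name__ == "__main__":
    # Difficulties of five link items from the previous and current calibrations.
    b_old = [-1.20, -0.40, 0.10, 0.75, 1.30]
    b_new = [-1.05, -0.30, 0.20, 0.90, 1.45]
    A, B = mean_sigma_link(b_old, b_new)
    theta_new = 0.50                      # an ability estimate on the new calibration
    print(f"A={A:.3f}  B={B:.3f}  linked theta={A * theta_new + B:.3f}")
```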

14 Reporting assessment data
Reporting scales
- Typically the mean for one grade is fixed (e.g. 400), with a standard deviation of 100
- Examine distributions for different groups
Proficiency bands (standards referenced) (sketched below)
- Defined in terms of item difficulties, with bands of equal width in difficulty
- Describe what is represented by the items in a band
- Report percentages in each band
Standard setting exercises
- Define standards (e.g. proficient standard, minimum competency) using panels of expert judges
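A minimal sketch of the reporting conventions just listed: latent estimates (mean 0, SD 1 in the reference grade) are rescaled to a metric with a fixed mean (400, following the slide's example) and SD of 100, then assigned to proficiency bands. The cut scores and band names below are hypothetical.

```python
# Rescaling latent estimates to a reporting metric and assigning proficiency bands.
# Cut scores and band labels are hypothetical, not from any real assessment.
CUTS = [(550, "Band 4"), (450, "Band 3"), (350, "Band 2"), (float("-inf"), "Band 1")]

def to_reporting_scale(theta, scale_mean=400.0, scale_sd=100.0):
    """Map a latent estimate (mean 0, SD 1) onto the reporting scale."""
    return scale_mean + scale_sd * theta

def proficiency_band(score):
    """Return the first band whose cut score the student reaches."""
    for cut, band in CUTS:
        if score >= cut:
            return band

if __name__ == "__main__":
    for theta in (-1.5, -0.2, 0.6, 1.8):
        score = to_reporting_scale(theta)
        print(f"theta={theta:+.1f} -> score={score:.0f} ({proficiency_band(score)})")
```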

15 Reporting scale scores: TIMSS Maths Grade 8

16 Describing distributions: Writing (Australia)

17 Scale descriptions
- Provide an interpretation of scores
- Monitor student development
- Identify developmental continua
- Plan student learning
- Progress maps at state and school level

18 From item data to described scales: Computer literacy

19

20 PISA Maths Profile: Selected level descriptions

21 Profile distribution: Reading literacy (Australia)

22 Establishing expected standards
- Consultation: what should a student be able to do?
- Different standards: minimum competency, proficient, advanced
- Provide a basis for simple comparisons

23 Percentage of students at the benchmark standard for reading, by sub-group
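The calculation behind a chart like this is the share of each sub-group scoring at or above the benchmark. The sketch below uses made-up records and an assumed benchmark of 400, and ignores the sampling weights and standard errors a real survey analysis would apply.

```python
# Percentage of students at or above a benchmark, by sub-group (made-up data,
# unweighted; real survey estimates would use sampling weights).
from collections import defaultdict

def pct_at_benchmark(records, benchmark):
    """records: iterable of (sub_group, score). Returns {sub_group: % at/above benchmark}."""
    counts, at_or_above = defaultdict(int), defaultdict(int)
    for group, score in records:
        counts[group] += 1
        if score >= benchmark:
            at_or_above[group] += 1
    return {g: 100.0 * at_or_above[g] / counts[g] for g in counts}

if __name__ == "__main__":
    data = [("Girls", 455), ("Girls", 390), ("Boys", 410), ("Boys", 360),
            ("Indigenous", 370), ("Indigenous", 430)]
    for group, pct in pct_at_benchmark(data, benchmark=400).items():
        print(f"{group}: {pct:.0f}% at or above the benchmark")
```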

24 Achievement in relation to:
- Most students in the state
- The system average
- A defined benchmark

25

26 Uses of assessment
Public information
- About the system overall
- About sections of the education system
- Accountability
Directing resources and interventions
- Groups of students
- Levels of schooling
- Schools
- Individual students
Defining learning progress
- Establishing progress maps
- Establishing standards
- Providing examples of student work at different levels
Evaluating programs and research
- Understanding "what works"

27 Public information
- Stimulating demand for education
- Identifying areas of need: Indigenous students; boys' reading; how wide is the gap?
- Providing comparisons internationally: staying the same; relative change

28 Directing interventions
Identifying disadvantaged students
- Based on social characteristics
- Based on diagnostic information (requires a census)
Allocating funds (sketched below)
- Chile: bottom 10% of schools
- Australian states: bottom 15% of schools
- Focus on the early years
Providing a basis for intervention
- In most education systems
- Use of consultants to work with schools; easier with census assessment
- Education action zones
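A minimal sketch of the funding rules mentioned above: rank schools by mean score and take the lowest 10% or 15%. The school identifiers and means are invented, and real allocation formulas use richer indicators than a single mean score.

```python
# Identify the lowest-performing share of schools by mean score (invented data).
import math

def bottom_share(school_means, share):
    """Return school ids in the lowest `share` (e.g. 0.10) of mean scores."""
    ranked = sorted(school_means.items(), key=lambda kv: kv[1])
    k = max(1, math.ceil(share * len(ranked)))
    return [school for school, _ in ranked[:k]]

if __name__ == "__main__":
    means = {"S01": 512, "S02": 455, "S03": 478, "S04": 430, "S05": 501,
             "S06": 467, "S07": 520, "S08": 441, "S09": 495, "S10": 488}
    print("Bottom 10%:", bottom_share(means, 0.10))
    print("Bottom 15%:", bottom_share(means, 0.15))
```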

29 Evaluation and research
Evaluating what works
- Starting school; approaches in early childhood
- Impact of policy interventions
Using data longitudinally
- What contributes to enhanced growth
- Value-added measures (NSW Smart Schools) (see the sketch below)
- Studying later progress (e.g. PISA longitudinal studies)
Uses of assessment data
- Linkage to other data about schools
- Literacy and numeracy in the middle years; literacy development of boys; effective teaching for literacy
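As one concrete reading of the value-added point, the sketch below computes a residual-gain measure: regress later scores on earlier scores across all students and average each school's residuals. The data are invented, and operational value-added models (including those behind the NSW work mentioned) are multilevel and adjust for intake characteristics, so treat this only as an illustration of the basic idea.

```python
# Residual-gain "value added": average each school's residuals from a regression of
# later scores on earlier scores. Scores below are invented for the example.
from statistics import mean
from collections import defaultdict

def value_added(records):
    """records: (school, prior_score, later_score). Returns {school: mean residual}."""
    xs = [r[1] for r in records]
    ys = [r[2] for r in records]
    mx, my = mean(xs), mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    residuals = defaultdict(list)
    for school, x, y in records:
        residuals[school].append(y - (intercept + slope * x))
    return {school: mean(res) for school, res in residuals.items()}

if __name__ == "__main__":
    data = [("A", 420, 470), ("A", 500, 540), ("B", 430, 455), ("B", 510, 525)]
    for school, va in value_added(data).items():
        print(f"School {school}: value added = {va:+.1f} scale points")
```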

30 Concerns at different levels

31

32 Conclusions
Assessment programs have grown
- International, regional, national and sub-national
- Have begun to impact on policy and practice
- Complementary roles at different levels
Emergent design principles
- Described scales and standards referencing
- Higher-order skills and thinking
- Domain coverage
- Varied methods and formats
Enhancing application
- Report meaningfully and provide interpretation
- Balance pressure and support

33 Questions? Comments! Discussion

