Small group consensus discussion tasks: CA-driven criteria


Small group consensus discussion tasks: CA-driven criteria
Chris Heady, INTO Newcastle University

Task example

What is the task? Candidates must:
- Give their views
- Be able to respond to the task and the materials
- Be able to respond appropriately to others in the group
- Discuss their views and the views of others
- Work towards a consensus (not necessarily achieve one?)
- Show they can work collaboratively
- Be able to listen actively to others
- Show they can make useful, relevant and sensible points
- And more ……..

Conceptual assumptions
- Construct: small group study discussions (in class or outside)
- Weir (2005) model of test validity: criterion, scoring, consequential
- Learning on the Foundation programme
- Learning in School(s) at Newcastle University
- Co-construction (cf. IELTS, FCE, CAE?)
- University: co-construction commonplace

Process 1: dissertation
‘Features of spoken interaction in a peer-group oral English test and evidence of differential performance’
- Foundation Architecture EAP module
- Video recordings, middle vs upper band: 5.0-5.5 vs 6.5-8.0
- CA transcription and analysis = criterial features which support evidence of discrimination

Summary

Upper band (6.5-8.0):
- Listenership (McCarthy, 2003): clarifications, confirmations, back-channelling, overlaps, turn completion, comments
- Mix of long and short turns
- Able to pick up and develop other group members’ points, from turn to turn and across turns
- Collaborative points (Galaczi, 2008)
- Complexity of points

Middle band (5.0-5.5):
- Listenership: fewer overlaps and back-channelling, rare turn completions
- Shorter turns
- Responds to others but little development of points
- Agrees and disagrees: responding (but not always developing)
- Separate opinions = parallel (Galaczi, 2008)
- Some task management
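The CA step in Process 1, transcribing discussions and tallying interactional features that discriminate between bands, can be sketched as a toy count. Everything below (the back-channel token set, the mini transcript, the `listenership_profile` function) is invented for illustration and is not taken from the dissertation or its coding scheme:

```python
from collections import Counter

# Hypothetical token set: single-word response tokens treated as back-channels.
BACKCHANNELS = {"mm", "yeah", "right", "uhuh", "ok"}

# Toy CA-style transcript; '[' marks the start of overlapping talk.
transcript = [
    ("A", "i think the first option is cheaper"),
    ("B", "mm"),
    ("B", "yeah but it takes longer [doesn't it"),
    ("A", "[it does yes"),
    ("C", "right so maybe we balance cost and time"),
]

def listenership_profile(turns):
    """Per-speaker counts of turns, words, back-channel turns and overlaps."""
    profile = {}
    for speaker, text in turns:
        stats = profile.setdefault(
            speaker, Counter(turns=0, words=0, backchannels=0, overlaps=0))
        words = text.replace("[", "").split()
        stats["turns"] += 1
        stats["words"] += len(words)
        stats["overlaps"] += text.count("[")
        if set(words) <= BACKCHANNELS:  # turn made up only of response tokens
            stats["backchannels"] += 1
    return profile

profile = listenership_profile(transcript)
print(profile["B"]["backchannels"])  # → 1
```

A real analysis would of course work from full CA transcription conventions rather than plain text, but even a tally like this makes the contrast in the summary slide (more overlaps and back-channelling in the upper band, shorter turns in the middle band) countable rather than impressionistic.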

Process 2: criteria development
- Collaborative process: what should we rate? What can we rate?
- Watch, observe, identify and reflect: narrowing of features
- Consensus
- Balance between interactional features and linguistic features
- Trialled with old and new sets of criteria (three examiners)
- Standardisation and on-task moderation
- All assessments videoed for EE consideration
- User guide and student-facing guide

Criteria v1

Rater feedback
- Concerns about listenership and task preparation
- Language effectiveness sometimes problematic
- Positive about ease of use and a movement away from adverbs!
- Positive about use of the criteria as a teaching aid
- ‘Listen and respond’ is an excellent addition and really differentiates this assessment from the presentation
- It disadvantages our higher-level students who really strive to reach top marks of 90; they dislike ending with a lower score than the one they entered with
- Accuracy of language is missing (links with the point above)

Further development:
- More clarification needed for assessors on ‘specialist or topic vocabulary’
- Some disagreement amongst teachers over the overlapping/finishing turns section
- The language aspects can be difficult to keep track of, e.g. emphatic language and longer noun phrases

Opportunities / limitations
- Multimodality
- Paralinguistic features
- Washback
- Consequential validity: evidence collection
- Task achievement?
- Reference outside the test context?
- Scoring debates: should turn completion over-ride other criteria?
- Can students prepare for back-channelling etc.?

Very selected bibliography
Bachman, L. (1990) Fundamental Considerations in Language Testing. Oxford: Oxford University Press.
Bonk, W. J. and G. J. Ockey (2003) A many-facet Rasch analysis of the second language group oral discussion task. Language Testing 20(1), 89-110.
Brooks, L. (2009) Interacting in pairs in a test of oral proficiency: co-constructing a better performance. Language Testing 26(3), 341-366.
Galaczi, E. (2008) Peer-peer interaction in a speaking test: the case of the First Certificate in English examination. Language Assessment Quarterly 5(2), 89-119.
Gan, Z. (2010) Interaction in group oral assessment: a case study of higher- and lower-scoring students. Language Testing 27(4), 585-602.
Lazaraton, A. (1998) An analysis of differences in linguistic features of candidates at different levels of the IELTS Speaking Test. Report prepared for the EFL Division, University of Cambridge Local Examinations Syndicate, Cambridge.

McCarthy, M. (2003) Talking back: ‘small’ interactional response tokens in everyday conversation. Research on Language and Social Interaction 36(1), 33-63.
May, L. (2011) Interactional competence in a paired speaking test: features salient to raters. Language Assessment Quarterly 8(2), 127-145. Available at http://dx.doi.org/10.1080/15434303.2011.565845 (accessed 6 June 2014).
Seedhouse, P. (2012) What kind of interaction receives high and low ratings in Oral Proficiency Interviews? English Profile Journal 3 (August 2012). Available at http://journals.cambridge.org/EPJ (accessed 23 May 2014).
Van Moere, A. (2006) Validity evidence in a university group oral test. Language Testing 23(4), 411-440. Available at http://ltj.sagepub.com (accessed 6 June 2014).
Van Moere, A. and M. Kobayashi (2003) Who speaks most in this group? Does that matter? Paper presented at the Language Testing Research Colloquium.
Weir, C. J. (2005) Language Testing and Validation. Basingstoke: Palgrave Macmillan.