Bureau for International Language Coordination Julie J. Dubeau BILC Secretary Istanbul, Turkey May 24, 2010.

Outline of Presentation
- General information
- LNA report
- BAT report

What does BILC do?
- Annual Conference in spring:
  - June 2009, hosted by Italy: "Bridging the Gap: Language Requirements vs. Language Reality"
  - May 2010, hosted by Turkey: "Mapping the Road: Success in Language Training"
- Professional Seminar in fall:
  - October 2009, hosted by Denmark: "The 21st Century Classroom: Keeping up with the Times!"
  - October 2010, hosted by Bulgaria: "Aligning Training and Testing in Support of Interoperability"

Responsible for STANAG 6001
- Ratified by the nations and promulgated by NSA in February 2009
- English and French versions can be downloaded from the BILC website
- The Steering Committee will discuss a proposal for an administrative change during the conference

What else does BILC do?
- Language Training Assessments
- Assistance to National Testing Programmes
- Language Testing Seminars: LTS & ALTS
- Special Projects: Benchmark Advisory Test (BAT), Language Needs Analysis (LNA)

General Information
- ACT is tasked by the IMS to investigate the feasibility of introducing a computer-based learning tool for use by nations to familiarize themselves with NATO terminology…
- BILC is investigating the requirement through the JSSG

Complementary ADL Solutions
IMS report dated November, Bratislava; follow-up tasking on Counter-Insurgency (COIN):
- "ACT should develop ADL solutions that will complement nations' efforts in language training, understanding of different local government structures, and understanding familial, clan and tribal cultures."
- ACT has requested BILC's assistance
- Please contact the BILC Secretariat if you are interested in providing an ADL product

LANGUAGE NEEDS ANALYSIS OF NATO CRISIS ESTABLISHMENT (CE) POSTS
November 2008 Chairmen's Meeting of the NTG in Brussels
The aim of this study was to show whether language requirements appeared to be set at appropriate levels to enable military personnel to perform their duties adequately in the NATO operations context, in this case ISAF.

LANGUAGE NEEDS ANALYSIS OF NATO CRISIS ESTABLISHMENT (CE) POSTS
Aim: to broadly identify whether the mandatory SLP requested for each post was:
- At Level (A): from the tasks described in the JD, the profile appeared to be adequate;
- High (H): the SLP requested for the post appeared to be higher than the functions defined in the job description; or
- Low (L): the SLP requested for the post appeared to be lower than the functions defined in the job description.
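The three ratings, together with the H/L-Mixed category that appears in the summary, amount to a per-skill comparison of the requested profile against the profile the tasks appear to need. A minimal sketch of that scheme in Python, with a hypothetical `rate_slp` helper and illustrative profiles that are not taken from the study:

```python
def rate_slp(requested: str, needed: str) -> str:
    """Compare a requested SLP (four digits: listening, speaking,
    reading, writing) against the profile the JD tasks appear to
    need, returning the study's rating codes."""
    marks = set()
    for req, need in zip(requested, needed):
        if int(req) > int(need):
            marks.add("H")   # requested higher than the functions need
        elif int(req) < int(need):
            marks.add("L")   # requested lower than the functions need
    if not marks:
        return "A"           # at level in every skill
    if marks == {"H", "L"}:
        return "H/L"         # mixed: high in some skills, low in others
    return marks.pop()       # uniformly High or uniformly Low

# Illustrative only: a post asking for 3333 when its tasks need 2222
print(rate_slp("3333", "2222"))  # H
print(rate_slp("3322", "3322"))  # A
```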

Summary of Results (Total = 609 CE Post JDs)
- A - At Level: 355 (58.2%)
- H - High: 203 (33.3%)
- L - Low: 41 (6.7%)
- H/L - Mixed: 10 (1.6%)
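As a quick check on the figures above, the counts and percentages can be reproduced in a few lines of Python (counts copied from the slide; note that 355/609 rounds to 58.3%, so the slide's 58.2% appears to be truncated rather than rounded):

```python
# LNA ratings of the 609 CE post job descriptions (counts from the slide).
ratings = {"A - At Level": 355, "H - High": 203, "L - Low": 41, "H/L - Mixed": 10}

total = sum(ratings.values())
assert total == 609  # matches the stated total

for label, count in ratings.items():
    print(f"{label}: {count} ({100 * count / total:.1f}%)")
```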

Results: Levels 1 & 2
- Job Descriptions requiring SLPs of Level 1: 30 JDs required SLP 1111, and all were High.
- Job Descriptions requiring mixed SLPs of Levels 1 & 2: 4 required SLP 2211 and all were at level; 2 required SLP 2221 and both were at level.
- Job Descriptions requiring SLPs of Level 2: 61 JDs required Level 2 SLPs; 16/61 (26.2%) were at level and 43/61 (70.4%) were High.

Results: Levels 2 & 3
- Job Descriptions requiring mixed SLPs of Levels 2 & 3: 156 (25.6%) JDs required mixed SLPs of Levels 2 and 3. Of these 156, 66 (42.3%) had SLPs that were considered at level.
- Further breakdown:
  - 1 required SLP 3221 (High/Low depending on skill)
  - 9 required SLP 3222 (7/9 High)
  - 1 required SLP 3223 (High)
  - 4 required SLP 3232 (3 High / 1 Low)
  - 118 required SLP 3322 (47 at level (39.8%), 56 High (47.4%) & 3 Low (2.5%))
  - 23 required SLP 3332 (17 at level, 6 Low; note that these 6 also required Dari)

Results: Level 3
- Job Descriptions requiring SLPs of Level 3: 301 (49.4%) required a Level 3 SLP.
- 79.9% of these were considered at level.
- 44/301 (14.6%) were considered High.
- 13 were considered Low for the functions described (all required expert language knowledge, with 3s in Dari and Pashto).

Results: Levels 3 & 4
- Job Descriptions requiring mixed SLPs of Levels 3 & 4: 43 profiles of 4343, of which 22/43 (51.1%) were at level and 21 (48.8%) were not scored at level (some High, some Low).
- 1 profile of 4344 (Low); 1 profile of 4443 (High).

Results: Level 4
- Job Descriptions requiring SLPs of Level 4: 8 profiles of 4444; 6/8 were considered at level, with comments referring to the functions as requiring expert language ability and diplomatic or sophisticated language use; 2 were High.
- Out of the 53 JDs requiring some skills at Level 4, 10 (18.8%) also required languages other than English, such as French, Dari, and Pashto. Out of these 10, 6 (60%) were at level.

Recommendations
- NATO CE post JD SLPs should be reviewed.
- SLPs should be based on an analysis of the tasks performed by incumbents in relation to STANAG 6001 Ed. 3.
- It is strongly recommended that the next analysis of requirements be done with the assistance of BILC language experts.
- The CE posts master sheet should include the SLPs requested for the posts.

What's next for interoperability?
IMS tasking on Language Proficiency and Education on NATO Standards & Terminology, 12 February 2010:
- Analysis of CE/PE posts: setting realistic requirements; a review of the linguistic requirement for officers for operationally deployed duties has been initiated
- Benchmarking

Benchmark Advisory Test (BAT): Purpose
- To provide an external measure against which nations can compare their national STANAG test results
- To promote relative parity of scale interpretation and application across national testing programmes
- To standardize what is tested and how it is tested

Benchmark Advisory Test (BAT)
- Allocation to 11 nations (200 'free' tests)
- Tests administered by LTI, the ACTFL Testing Office, via the Internet and telephone
- January 2010: end of benchmarking process
- Positive feedback from nations

Benchmark Advisory Test (BAT) Results

Nation  | Listening | Speaking | Reading  | Writing
Black   | (11) 64%  | (18) 72% | (10) 60% | (11) 55%
White   | (18) 61%  | (18) 56% | (18) 94% | (18) 39%
Red     | (18) 89%  | (18) 83% | --       | (18) 50%
Blue    | (20) 85%  | (19) 47% | (20) 55% | (20) 60%
Teal    | (8) 0%    | --       | (8) 75%  | (8) 38%
Maroon  | (16) 69%  | (15) 47% | (14) 64% | (16) 44%
Pink    | (18) 67%  | (18) 50% | (18) 78% | (18) 67%
Purple  | (12) 8%   | --       | (13) 54% | (13) 62%
Orange  | (13) 77%  | (13) 46% | (13) 54% | (13) 62%
Yellow  | (17) 24%  | (18) 0%  | (18) 33% | (18) 0%

The Green nation did not release national test scores, but indicated that, on average, they were higher than those attained on the BAT.
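To give a sense of how such a table can be summarized, a short Python sketch (Listening column only, values transcribed from the table; '(n)' read as the number of tests, which is an assumption about the table's notation) computes a test-weighted average percentage:

```python
# Listening column: (number of tests, percentage) per nation, from the table.
# The '(n)' counts are assumed to be numbers of tests administered.
listening = {
    "Black": (11, 64), "White": (18, 61), "Red": (18, 89), "Blue": (20, 85),
    "Teal": (8, 0), "Maroon": (16, 69), "Pink": (18, 67), "Purple": (12, 8),
    "Orange": (13, 77), "Yellow": (17, 24),
}

tests = sum(n for n, _ in listening.values())
# Weight each nation's percentage by its number of tests.
weighted = sum(n * pct for n, pct in listening.values()) / tests
print(f"{tests} listening tests, test-weighted average {weighted:.1f}%")
```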

FACTORS CONTRIBUTING TO NON-ALIGNMENT
- Test purpose
- Testing method
- Use of plus ratings
- Alignment of author purpose, text type, and reader/listener task
- Inadequate tester/rater norming (productive skills)
- Inconsistencies in the interpretation of STANAG 6001
- Cut-off score setting, etc.

BAT: Way Ahead
- The project was successful.
- Demand for administrations will dictate future development needs and modes.
- The BILC Steering Committee is to formulate recommendations.
- Nations that require BILC support (post-BAT) can request it.

QUESTIONS? Enjoy the Conference!