Language testing: Current trends and future needs, 24–25 October 2014, Centro de Lenguas Modernas, Granada University. PaedDr. Miroslava Jurenková, Mgr. Alžbeta Palacková, Mgr. Michaela Ujházyová, PhD.


Language testing: Current trends and future needs, 24–25 October 2014, Centro de Lenguas Modernas, Granada University. PaedDr. Miroslava Jurenková, Mgr. Alžbeta Palacková, Mgr. Michaela Ujházyová, PhD. NATIONAL INSTITUTE FOR CERTIFIED EDUCATIONAL MEASUREMENTS. Modern education for the knowledge society / The project is co-financed by the European Union.

NÚCEM was established in 2008 and is responsible for: the external part of the Maturita examination, testing of 5th and 9th grade pupils, international projects (e.g. PISA, PIRLS, TIMSS), and national projects co-financed by the ESF (e.g. "E-testing": Increasing the quality of education at primary and secondary schools with the use of electronic testing, 2013–2015).

What will we be talking about? Some information about the Slovak national project "E-testing". E-testing: new ways of test and item development. Answer keys: how to prepare answer keys for paper-based (printed) tests and for e-tests.

Activity 1.1: Development of tests and items for school testing (school item bank) and for national testing (NUCEM item bank). The aim is to develop about 30,000 new items and 130 tests, in cooperation with approximately 600 teachers and experts (item writers, local supervisors, reviewers...), to assess students' knowledge and skills in various subjects. Training of teachers in the development of e-tests and e-items.

Activity 1.2: Implementation of electronic testing. Cooperation with IBM: developing software for item and test development, e-testing, and item banking. Training of schools' IT administrators in e-testing. Providing research analyses and reports based on schools' measurable outputs. Evaluating and comparing the quality of schools and monitoring value-added indicators.

Paper-based testing vs. e-testing: what are the differences between them? We have paper, we have computers. It looks simple... But using e-testing tools can transform the entire test development life cycle.

E-testing and Item banking
Advantages:
o E-testing offers more response options, enabling new types of test items (drag and drop, interactive items...).
o Items can include multimedia content (audio, video, animations, rich graphics, etc.).
o Item banks can be easily updated.
o Tests can be easily created by rearranging items.
o Tests of listening and reading comprehension (receptive skills) can easily be implemented and delivered via e-testing.
o Most question types can be scored instantly; students are more motivated if they get their scores immediately after taking a test (immediate feedback).
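Several of these advantages follow from treating the item bank as structured data: if each item carries metadata, tests can be assembled by filtering and rearranging. As a rough illustration (a minimal sketch, not the project's actual data model; all names and fields here are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Item:
    item_id: str
    item_type: str     # e.g. "multiple_choice", "drag_and_drop"
    subject: str
    difficulty: float  # 0.0 (easy) .. 1.0 (hard)

# A toy bank; the real banks in the project hold thousands of items.
bank = [
    Item("SK-001", "multiple_choice", "english", 0.3),
    Item("SK-002", "drag_and_drop", "english", 0.6),
    Item("SK-003", "multiple_choice", "math", 0.5),
    Item("SK-004", "ordering", "english", 0.8),
]

def assemble_test(bank, subject, max_difficulty, length):
    """Create a test by selecting and rearranging items from the bank."""
    pool = [i for i in bank if i.subject == subject
            and i.difficulty <= max_difficulty]
    pool.sort(key=lambda i: i.difficulty)  # easiest first
    return pool[:length]

test = assemble_test(bank, "english", 0.7, 2)
print([i.item_id for i in test])  # ['SK-001', 'SK-002']
```

Updating the bank (adding, retiring, or re-calibrating an item) then automatically affects every test assembled afterwards, which is what makes e-tests "easy to update" compared with printed booklets.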

E-testing and Item banking
Advantages:
o Some programs permit only multiple-choice items and some open-ended items (completion, essay, fill-in); the program developed within the Slovak national project supports 12 item types, with others in development.
o Building a system for computer-based testing later enables adaptive tests, tailored to the individual ability of the test-taker (better estimates of ability with shorter tests).
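The adaptive testing mentioned above can be sketched in a few lines. The toy example below (assuming a Rasch-style response model and a simple bisection-style ability update; it is not NUCEM's implementation) always administers the item whose difficulty is closest to the current ability estimate, which is why adaptive tests can reach a usable estimate with fewer items:

```python
import math

def p_correct(theta, b):
    """Rasch model: probability of a correct answer given
    ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def run_cat(item_difficulties, answers, start_theta=0.0, step=1.0):
    """Tiny CAT sketch: pick the unused item closest to the current
    ability estimate, then move the estimate up (correct) or down
    (incorrect), halving the step each round."""
    theta = start_theta
    remaining = list(item_difficulties)
    administered = []
    for correct in answers:
        item = min(remaining, key=lambda b: abs(b - theta))
        remaining.remove(item)
        administered.append(item)
        theta += step if correct else -step
        step /= 2.0
    return theta, administered

# Two correct answers then a wrong one push the estimate to 1.25.
theta, items = run_cat([-2, -1, 0, 1, 2], answers=[True, True, False])
print(theta, items)  # 1.25 [0, 1, 2]
```

Real CAT engines replace the halving step with a maximum-likelihood or Bayesian update of theta, but the item-selection idea is the same.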

E-testing and Item banking
Challenges:
o Transforming some item types used in traditional paper-based testing into e-form is sometimes difficult.
o With many item writers involved in the process, test items vary in style, format, and difficulty, so a guide with item development standards had to be prepared.
o Preparing answer keys for open-ended questions and scoring them is, for now, quite difficult (human or computer?).
o Test development is more time-consuming for those who develop the tests.
o Computer- or Internet-based exams represent some risk, especially with regard to exam security (March 2015: the national online testing, Maturita and T9).

E-testing and Item banking
Challenges:
o Additional skills are required of test writers and test administrators (training is needed).
o Taking a test on a computer requires some IT skills (typing, mouse navigation, key combinations); test takers who rarely work with computers may find electronic exams difficult.
o Reading from a screen is more exhausting, especially for long passages that require scrolling, and the amount of material that can be presented on a screen at once is limited. This limitation could be a problem for language testing if we want to develop a reading test based on relatively long passages.

Types of items
o Multiple choice (with one response)
o Multiple choice (with multiple responses)
o Fill (one blank)
o Fill (multiple blanks) – Custom Fill
o Ordering
o Single choice matrix
o Multiple choice matrix
o Marking words in the text
o Drag & Drop
o True/false
o Hotspot
o File upload

Multiple choice (with one response)

Multiple choice (with multiple response)

Fill (multiple blanks)

Ordering

Multiple choice matrix

Single choice matrix

Marking text

Drag and drop

True – False

Hotspot

File Upload

The answer key for open-ended items
1. Before the pilot testing: the test writer develops the first version of the answer key; test coordinators check the key and discuss it with professionals, and further possible answers are added.
2. After the pilot testing: students' responses are collected and judged by reviewers; new possible responses are added based on the students' answers in the pilot testing; if needed, the instructions are edited to make the task more comprehensible.

Scoring open-ended questions/items
Human factor (paper testing): the teachers scoring the responses do not always respect the answer key:
1. marking a response as correct instead of incorrect,
2. marking a response as incorrect instead of correct,
3. marking a response as incorrect/correct instead of no answer,
whether through not knowing the rules and instructions for marking, accidentally, or intentionally.

Scoring open-ended questions/items
PC factor (e-testing): the system scoring the responses sticks to the key, with no exceptions.
1. The system marks a response as incorrect instead of correct, because the answer key does not include all of the possible answers. Automatic computer-based marking of open-ended responses operates at a basic level of character or rule recognition; the pre-defined key cannot predict all the specificities of test takers' responses (double spaces, wrong apostrophes, typing errors, capital letters, quotation marks, extra words, etc.). For this reason, we are still dependent on human marking.
2. The system marks a response as incorrect instead of no answer, because it cannot distinguish an incorrect response from an irrelevant one.
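The character-level matching described above amounts to normalizing a response and testing membership in the key. A minimal sketch (hypothetical, not the project's marking engine) shows both why normalization absorbs harmless variation such as double spaces or curly apostrophes, and why genuine typing errors still fail:

```python
import re
import unicodedata

def normalize(response):
    """Reduce harmless variation (case, repeated spaces, curly quote
    and apostrophe variants) before comparing against the key."""
    text = unicodedata.normalize("NFC", response)
    # unify apostrophe and quotation-mark variants
    for variant in ("\u2019", "\u2018", "\u00b4"):
        text = text.replace(variant, "'")
    for variant in ("\u201c", "\u201d"):
        text = text.replace(variant, '"')
    text = re.sub(r"\s+", " ", text).strip()
    return text.lower()

def score(response, accepted):
    """Return 1 if the normalized response matches any accepted answer."""
    return 1 if normalize(response) in {normalize(a) for a in accepted} else 0

print(score("London  ", ["london"]))  # 1: extra spaces and case ignored
print(score("Lodnon", ["london"]))    # 0: a typing error still fails
```

Note that an empty or irrelevant response also scores 0, which is exactly the second problem above: the machine cannot tell "wrong" from "not really an answer".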

Paper testing vs. e-testing

Paper testing: more subjective; opportunities to cheat (students and evaluators). How to avoid problems: giving feedback to the National School Inspection, establishing test centers coordinated by NUCEM, relying on the e-system.

E-testing: more objective and fast; reduced opportunities to cheat. How to avoid problems: developing new types of items with instructions adapted to the e-system, utilizing the wide data from pilot testing (responses) to make answer keys more precise, relying on people.

Sample types of incorrect responses
1. typing errors and double spaces
2. more words than required
3. irrelevant response
4. several errors in one response

Answer key for open-ended items in paper testing: English language. Teachers who score the responses are trained in how to work with the instructions provided in the answer key. Item type: Custom Fill.

"To key" or "not to key"? E-test form of the answer key would be:

Answer key for open-ended items in paper testing: Slovak language and Slovak literature. Teachers who score the responses are trained in how to work with the instructions provided in the answer key. Item type: Fill.

To key or not to key? The e-test form of the answer key would be:
Item no. 55: ako, ako/ako ako/ako – ako/ako/ako tie, ako tie/ako tie ako tie/ako tie – ako tie/a, a /a a /a – a /ako tys’ pekná, krajina moja, ako mladistvosť milá mi tvoja/ako tie Božie plamene, ako tie kvety na chladnej zemi/ako tie Božie plamene, ako tie kvety na chladnej zemi, ako tie drahé kamene/ako tie kvety na chladnej zemi, ako tie drahé kamene/a pekný život tie kvety žili, a diamant v hrudi nezhnije/...
Item no. 54: tys/tys´/tys‘
Item no. 32: Tak, že, za/tak, že, za/Tak že za/tak že za/za, že, tak/za že tak/za, tak, že/za tak že/že, za, tak/že za tak/že, tak, za/že tak za/...
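Keys like these, with accepted variants separated by slashes and an open-ended "..." tail, could be loaded into an e-test system roughly as follows. This is an illustrative sketch only; the "..." tail, which stands for further variants not listed on the slide, is deliberately dropped rather than guessed at:

```python
def parse_key(raw):
    """Parse a slash-delimited answer key, e.g. item 54's
    "tys/tys\u00b4/tys\u2018", into a set of accepted answers.
    The "..." marker (further unlisted variants) is discarded."""
    variants = (v.strip() for v in raw.split("/"))
    return {v for v in variants if v and v != "..."}

key_54 = parse_key("tys/tys\u00b4/tys\u2018")
print(sorted(key_54))  # three accepted spellings of the same word
```

The sheer number of variants in items 55 and 32 is what motivates the next two slides: rather than enumerating every permutation in a key, it is easier to change the item type.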

Changing the type of an item: item 54. Type of item changed from Fill to Marking text. The item was prepared for paper-based testing; in the e-testing environment we want to avoid problems with many possible responses.

Changing the type of an item: item 32. Type of item changed from Fill to Marking text. The item was prepared for paper-based testing; in the e-testing environment we want to avoid problems with many possible responses.

Granada: IATEFL conference E-testing presentation