Published by Albert Allen. Modified over 9 years ago.
DESIGNING CLASSROOM TESTS
TSL3112 LANGUAGE ASSESSMENT
PISMP TESL, SEMESTER 6, IPGKDRI
STAGES OF TEST CONSTRUCTION
– What is the purpose of the test?
– What are the objectives of the test?
– How will the test specifications reflect both the purpose and the objectives?
– How will the test item types (tasks) be selected and the separate items arranged?
– In administering the test, what details should I attend to in order to help students achieve optimal performance?
– What kind of scoring, grading, and/or feedback is expected?
DETERMINING THE PURPOSE OF A TEST
Consider the overall purpose of the exercise that students are about to perform. For example:
– Why create the test?
– Its significance relative to the course – to evaluate overall proficiency or to place a student in a course.
– The importance of the test compared to other measures of student performance.
– The impact of the test, before and after administration, on both teachers and students.
Bachman & Palmer (1996) refer to the purpose of an assessment as test usefulness – the use to which teachers will put an assessment.
DESIGNING CLEAR, UNAMBIGUOUS OBJECTIVES
Begin by taking a careful look at everything that students should know and be able to do, based on the material they are responsible for. In other words, examine the objectives for the unit being tested.
Consider:
– What you want to find out.
– Establishing appropriate objectives involves a range of issues, from relatively simple ones (e.g., the forms and functions covered in a course unit) to more complex ones (e.g., the constructs to be represented on the test).
– The language abilities to be assessed.
DRAWING UP TEST SPECIFICATIONS
An outline of the test – what it will look like. Whether designing or evaluating a test, make sure it has a structure that logically follows from the unit or lesson it is testing. The class objectives should be present in the test through appropriate task types and weights, a logical sequence, and a variety of tasks.
A blueprint of the test that includes:
– a description of its content.
– item types (methods, such as multiple-choice, cloze, etc.).
– tasks (e.g., written essay, reading a short passage).
– skills to be included.
– how the test will be scored.
– how results will be reported to students.
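The blueprint above can be sketched as a simple data structure. This is a minimal illustration only; the field values (unit name, item types, point allocations) are hypothetical examples, not taken from the course materials.

```python
# A sketch of classroom test specifications as a Python dict.
# All field values below are hypothetical, chosen for illustration.
test_spec = {
    "content": "Unit 4: narrative and descriptive writing",
    "item_types": ["multiple-choice", "cloze"],
    "tasks": ["written essay", "reading a short passage"],
    "skills": ["reading", "writing"],
    "scoring": {"multiple-choice": 30, "cloze": 20, "essay": 50},  # points per section
    "reporting": "letter grade plus rubric-based feedback",
}

# Sanity check: the scoring plan should total 100 points.
total_points = sum(test_spec["scoring"].values())
assert total_points == 100
print("total points:", total_points)  # → total points: 100
```

Laying the spec out this way makes it easy to verify that every listed skill has a corresponding task and that the point allocations add up before the test is written.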
For classroom purposes, the specifications (specs) are a guiding plan for designing an instrument that effectively fulfills the desired principles, especially validity (Davidson & Lynch, 2002). Spaan (2006) notes that for large-scale standardised tests that are widely distributed, and therefore broadly generalised, test specifications are much more formal and detailed. They are also usually confidential, to ensure the validity of subsequent forms of a test.
DEVISING TEST ITEMS
– The tasks need to be practical.
– For content validity, tasks should mirror those of the course, lesson, or segment.
– Tasks should be authentic, with a progression biased for best performance.
– Tasks should be able to be evaluated reliably by the teacher/scorer.
Test development is not always a clear, linear process; test design usually involves a number of loops to address problems and shortcomings.
ADMINISTERING THE TEST
Once the test is ready to administer, students need to feel well prepared to perform. Reduce unnecessary anxiety, raise their confidence, and help them view the test as an opportunity to learn.
Pre-test considerations (the day before the in-class essay):
– Provide pre-test information on:
– the conditions for the test (e.g., time limit, no portable electronics, breaks).
– materials to bring.
– test item types.
– suggested strategies for optimal performance.
– evaluation criteria (rubrics; show benchmark samples).
– Offer a review of the components of narrative and descriptive essays.
– Give students a chance to ask questions, and provide responses.
Test administration details:
– Arrive early and see to it that classroom conditions (lighting, temperature, a clock, furniture arrangement, etc.) are conducive.
– Try out any audio/video or other technology needed for administration in advance.
– Have extra paper, writing instruments, or other response materials on hand.
– Start on time.
– Distribute the test itself.
– Remain quietly seated at the teacher's desk, available for questions from students as they proceed.
– For a timed test, warn students when time is about to run out, and encourage them to complete their work.
SCORING, GRADING, AND GIVING FEEDBACK
SCORING:
– The scoring plan reflects the relative weight placed on the items in each section.
– Greater weight should be placed on tasks in accordance with their significance – e.g., tasks that represent more general or integrative language ability.
– Classroom teachers may decide to revise the scoring plan for the course the next time they teach it.
– At that point, teachers may have valuable information about how easy or difficult the test was, whether the time limit was reasonable, students' affective reactions to it, and their general performance.
– Finally, teachers will have an intuitive judgement about whether the test correctly assessed the students.
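The weighting principle described under SCORING can be illustrated with a short sketch. The section names, weights, and raw scores below are hypothetical, chosen only to show how a scoring plan can give greater weight to the more integrative task (here, the essay).

```python
# A minimal sketch of a weighted scoring plan.
# Section names, weights, and scores are hypothetical examples.
sections = {
    "listening": {"earned": 8,  "max": 10, "weight": 0.2},
    "reading":   {"earned": 12, "max": 15, "weight": 0.3},
    "essay":     {"earned": 18, "max": 25, "weight": 0.5},  # integrative task, weighted most
}

def weighted_total(sections):
    """Combine section percentages according to their weights (weights sum to 1)."""
    return sum(s["earned"] / s["max"] * s["weight"] for s in sections.values()) * 100

print(round(weighted_total(sections), 1))  # → 76.0
```

Making the weights explicit like this also makes the scoring plan easy to revise the next time the course is taught: adjusting one weight changes the emphasis without rewriting the test.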
GRADING:
– Grading is a thorny issue! The assignment of letter grades to the test is a product of:
– the country, culture, and context of the English classroom.
– institutional expectations (most of them unwritten).
– explicit and implicit definitions of grades that you have set forth.
– the relationship you have established with the class.
– student expectations engendered by previous tests and quizzes in the class.
GIVING FEEDBACK:
– Feedback is normally beneficial to students.
– A few of the many possible manifestations of feedback associated with tests follow.
In general, scoring/grading for a test:
a. a letter grade
b. a total score
c. subscores (e.g., of separate skills or sections of a test)
For responses to listening and reading items:
a. indication of correct/incorrect responses
b. diagnostic set of scores (e.g., scores on certain grammatical categories)
c. checklist of areas needing work and strategic options
For oral production tests:
a. scores for each element being rated
b. checklist of areas needing work and strategic options
c. oral feedback after performance
d. post-interview conference to go over the results
On written essays:
a. scores for each element being rated
b. checklist of areas needing work and suggested strategies/techniques for improving writing
c. marginal and end-of-essay comments and suggestions
d. post-test conference to go over work
Additional/alternative feedback for a test:
a. peer conferences on results, for all or selected parts of a test
b. whole-class discussion of the results of the test
c. individual conferences with each student to review the complete test
d. self-assessment in various manifestations
CONCLUSION
– DETERMINE PURPOSE/USEFULNESS
– STATE OBJECTIVES
– DRAW UP SPECIFICATIONS
– SELECT TASKS AND ITEM TYPES AND ARRANGE THEM SYSTEMATICALLY
– IN ADMINISTERING THE TEST, HELP STUDENTS TO ACHIEVE OPTIMAL PERFORMANCE
– CONSTRUCT A SYSTEM OF SCORING/GRADING AND PROVIDING STUDENT FEEDBACK
STAGES OF TEST CONSTRUCTION (COURSE PRO FORMA)
– Assessing clear, unambiguous objectives
– Drawing up test specifications
– Item writing
– Item moderation
– Pre-testing
– Analysis
– Training
– Reporting