Introduction to Assessment
ESL Materials and Testing, Week 8
What is Assessment?
Not the same as testing!
An ongoing process to ensure that the course/class objectives and goals are met.
A process, not a product.
A test is a form of assessment (Brown, 2004, p. 5).
Informal and Formal Assessment
Informal assessment can take a number of forms: unplanned comments, verbal feedback to students, observing students perform a task or work in small groups, and so on.
Formal assessments are exercises or procedures that are systematic and give students and teachers an appraisal of student achievement; tests are one example.
Traditional Assessment
Multiple-choice
True-false
Matching
Norm-referenced and criterion-referenced tests
Norm-Referenced and Criterion-Referenced Tests
Norm-referenced tests: standardized tests (College Board, TOEFL, GRE) that place test-takers on a mathematical continuum in rank order.
Criterion-referenced tests: give test-takers feedback on specific objectives (the "criteria") and test the objectives of a course; this feedback is known as "instructional value".
Authentic Assessment
Authentic assessment reflects student learning, achievement, motivation, and attitudes on instructionally relevant classroom activities (O'Malley & Valdez, 1996).
Examples: performance assessment, portfolios, self-assessment.
Purposes for Assessment
Diagnose students' strengths and needs
Provide feedback on student learning
Provide a basis for instructional placement
Inform and guide instruction
Communicate learning expectations
Motivate and focus students' attention and effort
Provide practice applying knowledge and skills
Purposes (continued)
Provide a basis for evaluation for the purpose of:
Grading
Promotion/graduation
Program admission/selection
Accountability
Gauge program effectiveness
Assessment Instruments
Pre-assessment (diagnostic): pretests, observations, journals/logs, discussions, questionnaires, portfolios
Formative (ongoing): quizzes, discussions, assignments, projects, observations, journals/logs
Summative (final): teacher-made tests, portfolios, projects, standardized tests, interviews
Discussion
How would you document a student's performance during a discussion?
Which types of assessments noted in the chart could be considered authentic assessment?
Principles of Language Assessment
Practicality
Reliability
Validity
Authenticity
Washback
Practicality
An effective test is practical:
Is not excessively expensive
Stays within appropriate time constraints
Is relatively easy to administer
Has a scoring/evaluation procedure that is specific and time-efficient
Reliability
A reliable test is consistent and dependable: if you give the same test to the same students on two different occasions, it should yield similar results.
Student-related reliability
Rater reliability
Test administration reliability
Test reliability
Student-Related Reliability
The most common issues in student-related reliability are temporary illness, fatigue, a "bad day," anxiety, and other physical and psychological factors that may make an "observed" score deviate from a "true" score.
Rater Reliability
Human error, subjectivity, and bias may enter into the scoring process.
Inter-rater reliability suffers when two or more scorers yield inconsistent scores on the same test, possibly because of a lack of attention to scoring criteria, inexperience, inattention, or even a preconceived bias toward a particular "good" or "bad" student.
Test Administration Reliability
Test administration reliability deals with the conditions under which the test is administered:
Street noise outside the building
Bad equipment
Room temperature
The condition of chairs and tables
Photocopying variation
Test Reliability
Sources of unreliability in the test itself:
The test is too long
Test items are poorly written or ambiguous
Validity
A test is valid if it actually assesses the objectives and what has been taught.
Content validity
Criterion validity (tests the objectives)
Construct validity
Consequential validity
Face validity
Content Validity
A test has content validity if the teacher can clearly define the achievement that he or she is measuring.
A test of tennis competency that asks someone to run a 100-yard dash lacks content validity.
If a teacher uses the communicative approach to teach speaking and then uses the audiolingual method to design test items, the test will lack content validity.
Criterion-Related Validity
The extent to which the objectives of the test have been measured or assessed. For instance, if you are assessing reading skills such as scanning and skimming, how are the exercises designed to test those objectives?
In other words, the test is valid if the objectives taught are the objectives tested and the items actually test these objectives.
Construct Validity
A construct is an explanation or theory that attempts to explain observed phenomena.
If you are testing vocabulary and the lexical objective is to use the lexical items for communication, having students write definitions on the test will not match the construct of communicative language use.
Consequential Validity
Consequential validity encompasses the consequences of a test, including:
Its accuracy in measuring the intended criteria
Its impact on the preparation of test-takers
Its effect on the learner
The social consequences of a test's interpretation (e.g., an exit exam for pre-basic students at El Colegio, the College Board)
Face Validity
Face validity refers to the degree to which a test looks right and appears to measure the knowledge or ability it claims to measure:
A well-constructed, expected format with familiar tasks
A test that is clearly doable within the allotted time limit
Directions that are crystal clear
Tasks that relate to the course (content validity)
A difficulty level that presents a reasonable challenge
Authenticity
The language in the test is as natural as possible
Items are contextualized rather than isolated
Topics are relevant and meaningful for learners
Some thematic organization to items is provided
Tasks represent, or closely approximate, real-world tasks
Washback
Washback refers to the effect tests have on instruction in terms of how students prepare for them; "cram" courses and "teaching to the test" are examples.
In some cases the student may learn while working on a test or assessment.
Washback can be positive or negative.
Alternative Assessment Options
Self- and peer-assessment:
Oral production: student self-checklists, peer checklists, offering and receiving holistic ratings of an oral presentation
Listening comprehension: listening to TV or radio broadcasts and checking comprehension with a partner
Writing: revising your own work, peer editing
Reading: reading textbook passages followed by self-check comprehension questions, self-assessment of reading habits
(Brown, 2001, p. 416)
Authentic Assessment
Performance assessment: any form of assessment in which the student constructs a response orally or in writing. It requires the learner to accomplish a complex and significant task, bringing to bear prior knowledge, recent learning, and relevant skills to solve realistic or authentic problems (O'Malley & Valdez, 1996; Herman et al., 1992).
Examples of Authentic Assessment
Portfolio assessment
Student self-assessment
Peer assessment
Student-teacher conferences
Oral interviews
Writing samples
Projects or exhibitions
Experiments or demonstrations
Characteristics of Performance Assessment
Constructed response
Higher-order thinking
Authenticity
Integrative
Process and product
Depth versus breadth
Journals
Specify to students the purpose of the journal
Give clear directions on how to get started (for instance, prompts such as "I was very happy when…")
Give guidelines on the length of each entry
Be clear yourself on the principal purpose of the journal
Help students process your feedback, and show them how to respond to your responses
Conferences
Commonly used when teaching writing
One-on-one interaction between teacher and student
Conferences are formative rather than summative assessment: they are meant to provide guidance and feedback, not a final grade.
Portfolios
Commonly used with the communicative language teaching (CLT) approach
A portfolio is a collection of a student's work that demonstrates, to the student and others, effort, progress, and achievement in a given area. You can have a reading portfolio or a writing portfolio, for instance.
You can also have a reflective or assessment portfolio, as opposed to collecting every piece of evidence for each objective achieved in the course.
Portfolio Guidelines
Specify the purpose of the portfolio
Give clear directions to students on how to get started
Give guidelines on acceptable materials or artifacts
Collect portfolios on pre-announced dates and return them promptly
Help students process your feedback
Establish a rubric for evaluating the portfolio and discuss it with your students
Cooperative Test Construction
Cooperative test construction involves students contributing to the design of test items. It is based on the concept of collaborative and cooperative learning, in which students are involved in the process (Brown, 2001, p. 420).