Shawna Williams
BC TEAL Annual Conference
May 24, 2014
Agenda
- Introduction
- Assessment terminology – definitions
- Principles of Assessment
My assessment history…
Language Assessment: Principles and Classroom Practices
Assessment Terminology
Assessment ≠ Testing
Assessment is…
- “appraising or estimating the level or magnitude of some attribute of a person.” (Mousavi, 2009)
- “an ongoing process of collecting information about a given object of interest according to procedures that are systematic and substantively grounded.” (Brown & Abeywickrama, 2010)
Assessment
“A good teacher never ceases to assess students, whether those assessments are incidental or intended.” (Brown & Abeywickrama, 2010)
Assessment and Learning
[Diagram relating tests, measurement, assessment, evaluation, and teaching] (Brown & Abeywickrama, 2010, p. 6)
Function of Assessment
Informal: incidental, unplanned comments; coaching; impromptu feedback on homework; nonjudgmental.
Formal: systematic and planned; gives teacher and students an appraisal of achievement; tests and assignments.
Function of Assessment
Formative: “evaluating students in the process of ‘forming’ their competencies and skills with the goal of helping them to continue that growth process”; feedback on performance; future continuation of learning.
Summative: “aims to measure, or summarize, what the student has grasped”; end of course or unit; looking back and taking stock.
(Brown & Abeywickrama, 2010, p. 7)
How do you know if an assessment task is effective, appropriate, useful, or... “good”?
- Practicality
- Reliability
- Validity
- Authenticity
- Washback
A PRACTICAL TEST...
- budgetary limits
- appropriate time constraints
- clear directions for administration
- appropriately utilizes human resources
- does not exceed available material resources
- considers time and effort for design and scoring
(Brown & Abeywickrama, 2010, p. 26)
A RELIABLE TEST...
- consistent across two or more administrations
- clear directions for scoring/evaluation
- uniform rubrics for scoring/evaluation
- consistent application of rubrics by scorer
- unambiguous to the test-taker
(Brown & Abeywickrama, 2010, p. 27)
Reliability (cont’d)
- Rater Reliability: Inter-Rater Reliability, Intra-Rater Reliability (a rough numeric sketch follows)
- Student-Related Reliability
- Test Administration Reliability
- Test Reliability
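The slides name these facets without showing how they might be checked in practice. As a rough illustration only (not from the presentation), rater reliability is often quantified by correlating two sets of scores. The Python sketch below uses invented scores and Pearson's r; both the numbers and the choice of statistic are assumptions made here for illustration, not Brown & Abeywickrama's prescribed method.

    # Rough sketch: quantifying rater reliability with Pearson's r.
    # All scores are invented for illustration; they are not from the talk.
    from statistics import correlation  # standard library, Python 3.10+

    # Inter-rater reliability: two raters score the same six essays.
    rater_a = [4, 3, 5, 2, 4, 3]
    rater_b = [4, 3, 4, 2, 5, 3]

    # Intra-rater reliability: one rater re-scores the same essays later.
    first_pass = [4, 3, 5, 2, 4, 3]
    second_pass = [4, 4, 5, 2, 4, 2]

    print(f"Inter-rater: r = {correlation(rater_a, rater_b):.2f}")
    print(f"Intra-rater: r = {correlation(first_pass, second_pass):.2f}")

An r near 1 suggests the two score sets rank students consistently; lower values point to rubric or rater-training problems, echoing the scoring items on the reliable-test list above.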
A VALID TEST...
- measures exactly what it proposes to measure
- does not measure irrelevant or “contaminating” variables
- relies on empirical evidence (performance)
- performance that samples the test’s criterion (objective)
- useful, meaningful information about the test-taker’s ability
- supported by theoretical rationale or argument
(Brown & Abeywickrama, 2010, p. 30)
Validity (cont’d)
- Content-Related Evidence
- Criterion-Related Evidence
- Construct-Related Evidence
- Consequential Validity (Impact)
- Face Validity
AN AUTHENTIC TEST...
- language as natural as possible
- items are contextualized rather than isolated
- meaningful, relevant, interesting topics
- thematic organization
- real-world tasks
(Brown & Abeywickrama, 2010, p. 37)
A TEST THAT PROVIDES BENEFICIAL WASHBACK...
- positively influences teachers’ teaching
- positively influences learners’ learning
- learners can adequately prepare
- feedback for language development
- more formative than summative
- conditions for peak performance
(Brown & Abeywickrama, 2010, p. 38)
Applying Principles to Creation of Assessment Tools
1. Practical test procedures?
2. Test is reliable?
3. Rater reliability?
4. Content validity?
5. Impact has been accounted for?
6. Procedure is “biased for best”?
7. Test tasks are authentic?
8. Test offers beneficial washback?
See Brown & Abeywickrama, Chapter 2.
Shawna Williams
swilliams@listn.info