Testing Writing Rio Darmasetiawan 69100077

The testing problem: We have to set writing tasks that are properly representative of the population of tasks. The tasks should elicit valid samples of writing. It is essential that the samples of writing can and will be scored validly and reliably.

Representative tasks: Specify all possible content. In order to judge whether the tasks we set are representative of the tasks that we expect students to be able to perform, we have to be clear at the outset just what these tasks are that they should be able to perform.

Representative tasks: Include a representative sample of the specified content. The more tasks (within reason) that we set, the more representative, and therefore the more valid, the totality of the samples of the candidate's ability we obtain will be.

Elicit a valid sample of writing ability: Set as many separate tasks as is feasible. We have to offer candidates as many 'fresh starts' as possible, and each task can represent a fresh start. By doing this, we will achieve greater reliability and so greater validity.

Elicit a valid sample of writing ability: Test only writing ability and nothing else. This advice assumes that we do not want to test anything other than the ability to write. Therefore, for the sake of validity, we should not set tasks which also measure other abilities, such as creativity or imagination.

Elicit a valid sample of writing ability: Restrict candidates. Writing tasks should be well defined: candidates should know what is required of them, and they should not be allowed to go too far astray. One last thing to say about tasks is that they should not only fit well with the specifications, but they should also be made as authentic as possible.

Ensure valid and reliable scoring: Set tasks which can be reliably scored. Set as many tasks as possible: the more scores for each candidate, the more reliable the total score should be. Restrict candidates: the greater the restrictions imposed on the candidates, the more directly comparable the performances of different candidates will be. Give no choice of tasks: making the candidates perform all tasks also makes comparisons between candidates easier. Ensure long enough samples: the samples of writing that are elicited have to be long enough for judgments to be made reliably.
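
The point that more tasks and more scores make the total score more reliable is the intuition behind the Spearman-Brown prophecy formula from classical test theory. The sketch below is an illustration only, not part of the original slides, and the single-task reliability of 0.55 is an assumed value.

```python
def spearman_brown(single_task_reliability: float, n_tasks: int) -> float:
    """Predicted reliability of a total score built from n comparable tasks."""
    r = single_task_reliability
    return (n_tasks * r) / (1 + (n_tasks - 1) * r)

# Assumed reliability of 0.55 for a single writing task, for illustration only.
for n in (1, 2, 4, 6):
    print(n, round(spearman_brown(0.55, n), 2))
# Output: roughly 0.55, 0.71, 0.83, 0.88 -- more tasks, more reliable total.
```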

Ensure valid and reliable scoring: Create appropriate scales for scoring. Holistic scoring: holistic scoring involves the assignment of a single score to a piece of writing on the basis of an overall impression of it. This kind of scoring has the advantage of being rapid. Analytic scoring: methods of scoring which require a separate score for each of a number of aspects of a task are said to be analytic.
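
To make the contrast concrete, here is a minimal sketch; the aspect names and weights are illustrative assumptions rather than a prescribed rubric. A holistic score is one overall impression mark, while an analytic score is assembled from separate marks for each aspect.

```python
# Holistic vs. analytic scoring of one script (all marks on a 0-10 scale).
# Aspect names and weights below are assumptions for illustration only.

holistic_score = 7  # single impression mark

analytic_marks = {
    "content": 8,
    "organisation": 7,
    "vocabulary": 6,
    "grammar": 5,
    "mechanics": 7,
}
weights = {"content": 0.3, "organisation": 0.2, "vocabulary": 0.2,
           "grammar": 0.2, "mechanics": 0.1}

analytic_score = sum(analytic_marks[a] * weights[a] for a in analytic_marks)
print(holistic_score, round(analytic_score, 1))  # 7 vs. 6.7
```

The analytic total takes longer to produce, but it shows where marks were lost, which is useful when diagnostic feedback matters.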

Ensure valid and reliable scoring: Calibrate the scale to be used. Any scale that is to be used should first be calibrated, which means collecting samples of performance under test conditions that cover the full range of the scale. Select and train scorers: they should be sensitive to language and have had experience of teaching writing and marking written work. It is also helpful if they have had training in testing.

Ensure valid and reliable scoring: Follow acceptable scoring procedures. Each task of each student should be scored independently by two or more scorers (as many scorers as possible should be involved in the assessment of each student's work). A senior member of the team should collate the scores and identify discrepancies in the scores awarded to the same piece of writing. Once scoring is completed, it is also useful to carry out simple statistical analyses to discover whether anyone's scoring is unacceptably aberrant.
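
One simple analysis of that kind, sketched here as an assumption rather than a procedure from the slides, is to compare each scorer's average mark with the panel average for the same scripts and flag anyone who drifts too far.

```python
from statistics import mean

# Hypothetical marks: each scorer marked the same five scripts on a 0-10 scale.
scores = {
    "scorer_A": [6, 7, 5, 8, 6],
    "scorer_B": [6, 6, 5, 7, 6],
    "scorer_C": [9, 9, 8, 10, 9],  # noticeably more lenient than the others
}

panel_mean = mean(m for marks in scores.values() for m in marks)
for scorer, marks in scores.items():
    drift = mean(marks) - panel_mean
    flag = "check" if abs(drift) > 1.5 else "ok"  # 1.5 is an arbitrary cut-off
    print(f"{scorer}: mean {mean(marks):.1f}, drift {drift:+.1f} -> {flag}")
```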