
Chapter 3

High Quality Assessment
Focus on the use and consequences of assessment results; results then promote specific, targeted learning goals.
Basic criteria:
- Clear and appropriate learning targets
- Alignment of assessment methods to learning targets
- Validity and reliability
- Fairness and positive consequences
- Alignment to standards and benchmarks and/or the Common Core
- Practical, usable, and efficient
- Understood by teacher and students

High Quality Assessment
Basic types of assessment:
- Selected response (multiple choice, true/false, matching)
- Constructed response (fill in the blank, short answer, essay, "show all your work")
- Observation
- Self-evaluation (questionnaire, inventory, survey)
Matching targets with assessment methods (refer to Figure 3.3)

High Quality Assessment
What is validity? Validity refers to how well a test measures what it purports to measure. To determine validity, consider:
1) Content-related validity (importance in instruction equals importance in assessment)
2) Criterion-related validity (will another assessment design provide the same results?)
3) Instructional validity (does it assess what was taught?)
What is reliability? Reliability is the degree to which an assessment tool produces stable and consistent results; the opposite of a reliable assessment is an unreliable one.
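As an illustrative aside (not from the slides), both criterion-related validity and test-retest reliability are commonly estimated as correlation coefficients. The sketch below assumes hypothetical student scores and uses NumPy; every value and variable name is made up for illustration.

```python
import numpy as np

# Hypothetical data (assumed for illustration): the same ten students take
# the classroom test twice, and also have scores on an external criterion.
test_form_a = np.array([78, 85, 62, 90, 71, 88, 67, 95, 74, 81])
test_form_b = np.array([75, 88, 65, 92, 70, 85, 70, 93, 72, 84])
criterion   = np.array([72, 90, 60, 94, 68, 86, 63, 97, 70, 80])

# Test-retest reliability: correlation between two administrations.
reliability = np.corrcoef(test_form_a, test_form_b)[0, 1]

# Criterion-related validity: correlation with the external criterion measure.
validity = np.corrcoef(test_form_a, criterion)[0, 1]

print(f"Test-retest reliability estimate: {reliability:.2f}")
print(f"Criterion-related validity estimate: {validity:.2f}")
```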

High Quality Assessment
1) A mile race timed with a stopwatch should yield a reliable result.
2) A disposition, attitude, or "feelings" inventory may yield an unreliable result.
3) An assessment may still be considered reliable even though some assessment error is present. Error can come from internal sources (anxiety, motivation, mood) or external sources (test interruptions, scoring, test-room conditions).
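A minimal sketch of the idea behind point 3, assuming the classical test theory model (observed score = true score + error) and made-up numbers: the larger the error from sources like those above, the lower the reliability coefficient. Nothing below comes from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical test theory: observed score = true score + error.
true_scores = rng.normal(loc=75, scale=10, size=1000)   # assumed "true" ability
small_error = rng.normal(scale=3, size=1000)             # e.g., stopwatch timing slip
large_error = rng.normal(scale=12, size=1000)            # e.g., mood, interruptions

observed_stable = true_scores + small_error
observed_noisy = true_scores + large_error

def reliability(observed, true):
    """Reliability = proportion of observed-score variance due to true scores."""
    return np.var(true) / np.var(observed)

print(f"Reliability with small error: {reliability(observed_stable, true_scores):.2f}")
print(f"Reliability with large error: {reliability(observed_noisy, true_scores):.2f}")
```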

Bias in Assessment
Bias may be present in assessment and may distort performance due to a variety of factors, including race, gender, age, and religion.
Bias may appear as (a) offensiveness in the nature of a question, or (b) unfair penalization tied to the factors above as well as geography.
Bias may be unintentional; the example discussed in class (a man smoking a pipe versus a man chopping wood) may reflect a superficial observation.
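To make item bias concrete, here is a deliberately simplified screen under assumed data: compare how often each group answers an item correctly and flag large gaps for review. Formal methods such as differential item functioning also control for overall ability, which this toy example does not; all group names, item names, and numbers are hypothetical.

```python
# Hypothetical item responses (1 = correct, 0 = incorrect) for two groups.
responses = {
    "group_a": {"item_1": [1, 1, 0, 1, 1, 1, 0, 1], "item_2": [1, 0, 1, 1, 0, 1, 1, 1]},
    "group_b": {"item_1": [1, 1, 1, 0, 1, 1, 1, 0], "item_2": [0, 0, 1, 0, 0, 1, 0, 0]},
}

def difficulty(scores):
    """Proportion of students answering the item correctly."""
    return sum(scores) / len(scores)

for item in ["item_1", "item_2"]:
    p_a = difficulty(responses["group_a"][item])
    p_b = difficulty(responses["group_b"][item])
    flag = "review for possible bias" if abs(p_a - p_b) > 0.20 else "no large gap"
    print(f"{item}: group_a={p_a:.2f}, group_b={p_b:.2f} -> {flag}")
```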

What are the positive consequences of assessment?
1) Students adapt their study habits to reflect the teacher's assessment style (essay questions will demand a different style of studying).
2) Appropriate assessment may increase student motivation because students understand the assessment style and the feedback provided by the teacher.
3) Motivation will also increase when assessment designs are restructured appropriately.
4) The teacher will structure instruction to reflect the upcoming assessment designs.

Assessment and Alignment with Standards
AERA asks:
1) Does the content of the assessment match the current standards?
2) Does the content of the assessment match the required content used in instruction?
3) Do the standards allow for a variety of assessment options?
4) Does the level of the assessments allow for upper-level performance (based upon a normal distribution)?
5) Does the assessment reflect relevant instruction and not extraneous instruction?
Refer to Table 3.13.
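The first two AERA questions amount to a coverage check between assessment items and standards. A minimal sketch, assuming a hypothetical item-to-standard mapping; none of the identifiers below come from the chapter or Table 3.13.

```python
# Hypothetical standards and item-to-standard mapping for illustration only.
standards = {"3.OA.1", "3.OA.2", "3.OA.3", "3.OA.4"}

item_alignment = {
    "item_1": "3.OA.1",
    "item_2": "3.OA.1",
    "item_3": "3.OA.3",
    "item_4": "PRIOR-UNIT",   # content not part of the current standards
}

covered = {std for std in item_alignment.values() if std in standards}
uncovered_standards = standards - covered
extraneous_items = [item for item, std in item_alignment.items() if std not in standards]

print("Standards with no items:", sorted(uncovered_standards))
print("Items testing extraneous content:", extraneous_items)
```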

Biased questions (?)
1) STRAWBERRY : RED (A) peach : ripe (B) leather : brown (C) grass : green (D) orange : round (E) lemon : yellow
2) Two students solve the problem A + B = C. Kim Lee states B + A = C, and Kareem states C - A = B. Who is correct? (Kim Lee) (Kareem) (Both students) (Neither student)
3) Where can a woman place a cup of coffee once she is finished? (1) Table (2) Saucer (3) Both are correct (4) Neither is correct

Biased questions (?)
1) Which word(s) is correctly spelled? (1) Affect (2) Effect (3) Honor (4) Honour (5) All are correct
2) What are the ways a man can cross a river? (1) Bridge (2) Boat (3) Walking (4) Bridge and boat (5) Bridge, boat, and walking
3) A team scores two touchdowns and one field goal. How many points has the team scored?
4) What will you call your father's brother?
5) Fill in the blank: boat : marina as race car : __________ ; subway : turnstile as taxicab : __________

Design of the GNED 302 Midterm
1) What type of assessment design is the most appropriate?
2) How will the assessment be administered?
3) How will the assessment be made "fair" for all sites?
4) Who will design the assessment (instructor, text test bank, text chapters)?
5) What will affect the results?
- Style and design
- Length
- Submission design
- Value for grade