1 Innovations in assessment and evaluation 1 Martin Valcke

2 Structure
‣ Advance organizer: evidence base for evaluation and assessment
‣ Activity 1: What is assessment and evaluation?
‣ Trends in assessment & evaluation
‣ Self- and peer assessment
‣ Activity 2: Develop a self-assessment activity
‣ Rubrics
‣ Activity 3: Develop a rubric

3 Conclusions
‣ Consider the differences between measuring, evaluating and scoring
‣ Reconsider the responsibilities for assessment and evaluation
‣ Develop reflective competencies of staff and students

4 Advance organizer Where is the evidence about assessment and evaluation?

5 The importance of assessment & evaluation

7 Activity 1: What is assessment & evaluation? Write down elements that define what assessment and/or evaluation comprise.

8 What is assessment & evaluation? Three elements:
‣ Measure: get data from the student about his/her answer or behavior (test, exam, task)
‣ Evaluate: what is the value of the "answer/behavior"?
‣ Score: what score will be attributed to a quality level in the answer or behavior?
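A minimal sketch can make the three distinct steps concrete. Everything in it is invented for the illustration: the item-scored answer, the 50% pass threshold and the 20-point scale are hypothetical choices, not part of the deck.

```python
# Hypothetical illustration of the three steps on one exam answer.
# The answer is a list of item results (1 = correct, 0 = wrong).

def measure(answer):
    """Step 1: collect data about the student's behavior."""
    return {"correct_items": sum(answer), "total": len(answer)}

def evaluate(measurement):
    """Step 2: judge the value of that behavior (invented 50% threshold)."""
    return measurement["correct_items"] / measurement["total"] >= 0.5

def score(measurement, scale=20):
    """Step 3: attach a mark to the quality level (invented 20-point scale)."""
    return round(scale * measurement["correct_items"] / measurement["total"])

m = measure([1, 1, 0, 1])      # 3 of 4 items correct
print(evaluate(m), score(m))   # → True 15
```

Separating the three functions mirrors the point of the slide: the same measurement can feed different evaluation and scoring rules, and each step can be handed over to a different actor.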

9 Question What responsibility would you hand over to students?
‣ Measure: get data from the student about his/her answer or behavior (test, exam, task)
‣ Evaluate: what is the value of the "answer/behavior"?
‣ Score: what score will be attributed to a quality level in the answer or behavior?

10 Trends in Assessment & Evaluation Major changes in the place, role, function, focus, approach, tools, … in higher education. Among the trends: a major shift in WHO is responsible for the evaluation and in the TOOLS being used.

11 Trends in assessment & evaluation Shared characteristics:
‣ Focus on "behavior"
‣ Focus on "authentic" behavior
‣ Focus on "complex" behavior
‣ Explicit "criteria"
‣ Explicit "standards"
‣ Need for concrete feedback
‣ Focus on "consequential validity"
Gielen, Dochy & Dierick (2003)

12 Trends in Assessment & Evaluation Trends according to Fant et al. (1985, 2000):
‣ Assessment centres
‣ Self and peer assessment
‣ Portfolio assessment
‣ Logbooks
‣ Rubrics

13 Trend 1: Self and peer assessment Trends according to Fant et al. (1985, 2000):
‣ Assessment centres
‣ Self and peer assessment
‣ Portfolio assessment
‣ Logbooks
‣ Rubrics

14 [Diagram with labels: individual learner, group learner, teachers, expert teacher, assessment system, external institution, institutional level]

15 Definition of self-assessment Self-assessment can be defined as "the evaluation or judgment of 'the worth' of one's performance and the identification of one's strengths and weaknesses with a view to improving one's learning outcomes" (Klenowski, 1995, p. 146).

16 Definition of peer assessment Peer assessment can be defined as "an arrangement in which individuals consider the amount, level, value, worth, quality, or success of the products or outcomes of learning of peers of similar status" (Topping, 1998, p. 250).

17 Self- and peer assessment Learn about your own learning process. Schmitz (1994): "assessment-as-learning" ~ self-corrective feedback.

18 See the experiential learning cycle of Kolb. Boekaerts (1991): self-evaluation as a competency. Development of metacognitive knowledge and skills (see Brown, Bull & Pendlebury, 1998, p. 181). Freeman & Lewis (1998, pp. 56-59): developing pro-active learners.

19 The Learning Cycle Model

20 Self- and Peer Assessment in Medical Education: Some Studies

21–76 [Image slides reproducing study excerpts; the surviving theme labels are: Accuracy; Tool for self-assessment; Attitudes; Attitudes 2; Reliability; Accuracy 2; Confidence/performance; Accuracy 3; Assessment ability; Longitudinal; Follow-up; Review: Accuracy; Review: Effectiveness; PA enhances performance; PA longitudinal stability; PA & rater selection; PA, formative & multiple observations; PA, hard to generalize]

77 Is it possible? Group evaluations tend to fluctuate around the mean.

78 Learning to evaluate
‣ Develop checklists
‣ Give criteria
‣ Ask learners to look for quality indicators
‣ Analyse examples of good and less good practice: develop a quality "nose"

79 Learning to evaluate Freeman & Lewis (1998, p. 127):
‧ Learner develops a list of criteria.
‧ Pairs of learners compare their listed criteria.
‧ Pairs develop a criterion checklist.
‧ Individual application of the checklist.
‧ Use of the checklist to evaluate the work of another learner.
‧ Individual reworks his/her work.
‧ Final result checked by the teacher and compared to the learner's evaluation.
‧ Pairs recheck their work on the basis of teacher feedback.

80 Learning to evaluate Peer evaluation is not the same as peer grading: the final score is given by the teacher! Part of the score could build on the accuracy of the self/peer evaluation and self-correction. Example: 1st-year course Instructional Sciences.
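The idea that part of the score builds on the accuracy of the self-evaluation, while the teacher keeps grading authority, can be sketched as follows. The bonus weight, the 20-point scale and the cap are hypothetical choices for the illustration, not the actual scheme of the course mentioned above.

```python
def final_score(teacher_score, self_score, max_points=20, bonus_weight=0.1):
    """Teacher's mark plus a small bonus for accurate self-assessment.

    The bonus grows with the closeness of the learner's self-estimate to
    the teacher's mark (hypothetical weighting: up to 10% of max_points).
    """
    accuracy = 1 - abs(teacher_score - self_score) / max_points
    raw = teacher_score + bonus_weight * max_points * accuracy
    return round(min(raw, max_points), 1)  # never exceed the scale maximum

# Self-estimate (15/20) close to the teacher's mark (14/20) earns most of the bonus
print(final_score(14, 15))  # → 15.9
```

The design keeps the incentive structure of the slide: students are rewarded for judging their own work accurately, not for grading themselves high.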


87 Information processing

88 Activity 2: develop a self-assessment exercise Develop the basic instructions for a self-assessment exercise.

89 Importance of Feedback
‣ Where am I going? (feed up)
‣ How am I going? (feed back)
‣ Where to next? (feed forward)
(Hattie & Timperley, 2007)

90 Trend 2: Rubrics Trends according to Fant et al. (1985, 2000):
‣ Assessment centres
‣ Self and peer assessment
‣ Portfolio assessment
‣ Logbooks
‣ Rubrics


95–97 Rubrics [image slides]

98 Rubrics A rubric is a scoring tool for the qualitative assessment of a complex authentic activity.
‣ A rubric builds on criteria that are enriched with a scale that helps to determine mastery levels.
‣ For each mastery level, standards are available.
‣ A rubric helps both staff and students see what is expected at process/product level.
‣ Rubrics are used for "high-stakes assessment" and for "formative assessment" (in view of learning). (Arter & McTighe, 2001; Busching, 1998; Perlman, 2003)
Rubrics focus on the relationship between competencies, criteria and indicators, and are organized along mastery levels (Morgan, 1999).

99 Rubrics Holistic – Analytic; Task-specific – Generic

100 Assumptions about rubrics
‣ Larger consistency in scores (reliability)
‣ More valid assessment of complex behavior
‣ Positive impact on subsequent learning activity

101 Performance assessment Rubrics focus on the relationship between competencies, criteria and indicators, and are organized along mastery levels (Morgan, 1999).

102 Doubts?
‣ Approach marred by beliefs of staff/students about evaluation (see Chong, Wong, & Lang, 2004; Joram & Gabriele, 1998)
‣ Validity of criteria and indicators (Linn, 1990)
‣ Reliability when used by different evaluators (Flowers & Hancock, 2003)

103 Activity 3: develop a rubric

104 Research on rubrics Review article of 75 studies on rubrics: Jonsson, A., & Svingby, G. (2007). The use of scoring rubrics: Reliability, validity and educational consequences. Educational Research Review, 2, 130–144.
‣ (1) The reliable scoring of performance assessments can be enhanced by the use of rubrics, especially if they are analytic, topic-specific, and complemented with exemplars and/or rater training;
‣ (2) rubrics do not facilitate valid judgment of performance assessments per se; however, valid assessment could be facilitated by using a more comprehensive framework of validity;
‣ (3) rubrics seem to have the potential to promote learning and/or improve instruction, mainly because they make expectations and criteria explicit, which also facilitates feedback and self-assessment.

105 Conditions for effective usage
‣ Develop an assessment frame of reference
‣ Training in usage
‣ Interrater usage
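One way to make the interrater condition operational is to compute an agreement statistic for pairs of raters scoring the same products. The sketch below implements Cohen's kappa (chance-corrected agreement); the sample ratings on a 0–3 rubric scale are invented.

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters beyond chance.

    rater_a and rater_b are equal-length lists of category labels
    (here: mastery levels) assigned to the same set of products.
    """
    n = len(rater_a)
    # Observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)

# Two raters scoring ten products on a 0-3 rubric scale (invented data)
a = [0, 1, 1, 2, 2, 2, 3, 3, 3, 3]
b = [0, 1, 2, 2, 2, 1, 3, 3, 3, 2]
print(round(cohen_kappa(a, b), 2))  # → 0.58
```

Values near 1 indicate strong agreement; values near 0 mean the raters agree no more than chance would predict, a signal that the frame of reference or the rater training needs work.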

106 Developing a rubric
‣ Choose criteria for the expected behavior: 4 to 15 statements describing each criterion
‣ Determine the bandwidth of quality differences: e.g. 0 to 5 qualitative levels
‣ Describe each value in the quality scale: concrete, observable qualifications
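The development steps above suggest a simple data structure: criteria, each enriched with one observable standard per mastery level. The sketch below assumes analytic scoring (summing the level reached on each criterion); the essay criteria and their standards are invented examples, not taken from the deck.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """One rubric criterion with one observable standard per mastery level."""
    name: str
    standards: list  # index = mastery level, value = observable qualification

def score_product(rubric, observed_levels):
    """Analytic scoring: sum the mastery level reached on each criterion."""
    return sum(observed_levels[c.name] for c in rubric)

# Invented example: a two-criterion essay rubric with levels 0-3
essay_rubric = [
    Criterion("Argument", ["No claim", "Claim only", "Claim with evidence",
                           "Claim, evidence and rebuttal"]),
    Criterion("Sources", ["None cited", "One source", "Several sources",
                          "Several sources, critically weighed"]),
]

print(score_product(essay_rubric, {"Argument": 2, "Sources": 3}))  # → 5
```

Because every level carries a concrete standard, two raters applying the same structure have a shared, observable basis for their judgments, which is exactly what the reliability claims about rubrics rest on.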


109 Critical thinking rubric

110 Informative websites (link URLs were not preserved in this transcript)
‣ Overview of tools, examples, theory, background, research
‣ Critical thinking rubrics
‣ Rubric generators
‣ Intro to interesting rubric sites
‣ Rubric for an APA research paper
‣ General intro and overview

111 Conclusions
‣ Consider the differences between measuring, evaluating and scoring
‣ Reconsider the responsibilities for assessment and evaluation
‣ Develop reflective competencies of staff and students

112 Innovations in assessment and evaluation Martin Valcke