ASSESSMENT: DIVERSITY OF STRATEGIES Chris Rust Oxford Brookes University

Student learning & assessment

“Assessment is at the heart of the student experience” (Brown, S. & Knight, P., 1994)

“From our students’ point of view, assessment always defines the actual curriculum” (Ramsden, P., 1992)

“Assessment defines what students regard as important, how they spend their time and how they come to see themselves as students and then as graduates. If you want to change student learning then change the methods of assessment” (Brown, G. et al., 1997)

Session outline: 6 strategies

- Change the criteria
- Change the task
- Mechanise assessment*
- Assess the process
- Assess groups
- Involve the students*

plus Strategic Programme Decisions*

* can reduce staff workload

Purposes of assessment (adapted from Brown, G. et al., 1997)

To:
- motivate students
- diagnose a student's strengths and weaknesses
- help students judge their own abilities
- provide a profile of what each student has learnt
- provide a profile of what the whole class has learnt
- grade or rank a student
- permit a student to proceed
- select for future courses
- license for practice
- select, or predict success, in future employment
- provide feedback on the effectiveness of the teaching
- evaluate the strengths and weaknesses of the course
- achieve/guarantee respectability and gain credit with other institutions and employers

Purposes of assessment 2

1. Motivation
2. Creating learning activities
3. Providing feedback
4. Judging performance (to produce marks, grades, degree classifications; to differentiate; gatekeeping; qualification)
5. Quality assurance

1, 2 & 3 concern learning and perform a largely formative function; they should be fulfilled frequently.
4 & 5 are largely summative functions; they need to be fulfilled infrequently but well.

Formative vs summative

Formative: the focus is to help the student learn.
Summative: the focus is to measure how much has been learnt.

The two are not necessarily mutually exclusive, but summative assessment tends to:
- come at the end of a period or unit of learning
- focus on judging performance, grading, differentiating between students, gatekeeping
- be of limited or even no use for feedback

Constructive alignment & issues of validity

“The fundamental principle of constructive alignment is that a good teaching system aligns teaching method and assessment to the learning activities stated in the objectives so that all aspects of this system are in accord in supporting appropriate student learning” (Biggs, 1999)

Constructive alignment: 3-stage course design

- What are the “desired” outcomes?
- What teaching methods require students to behave in ways that are likely to achieve those outcomes?
- What assessment tasks will tell us if the actual outcomes match those that are intended or desired?

This is the essence of ‘constructive alignment’ (Biggs, 1999)

Change the criteria

- Essay: library and journals example
- Laboratory reports: Information Communication Technology* skills example
  (* spreadsheets, statistical packages, word-processing, graphics, etc.)

Task: Fill in the skills checklist for the average student at a particular stage on one of your courses. Where you have high importance ratings but low skills ratings, consider ways those desired skills could be highlighted by changing the criteria.

Change the task

- Traditional assessment samples a narrow range of abilities
- Validity
- Transferability
- Relevance/interest/motivation
- Plagiarism and cheating

NB: sense of ‘audience’ and ‘real’ purpose

Change the task - task Take the most traditional assessment from one of your courses and invent as many different assessment tasks as possible. Especially keep in mind the issue of validity and the learning outcome/s being assessed, and try to ensure that each new task has a sense of ‘real’ audience and purpose.

Mechanise assessment

1. Statement banks
2. Computer-aided assessment
3. Assignment attachment sheets

Mechanise assessment 1 – statement banks

Write out frequently used feedback comments, for example:
1. I like this sentence/section because it is clear and concise
2. I found this paragraph/section/essay well organised and easy to follow
3. I am afraid I am lost. This paragraph/section is unclear and leaves me confused as to what you mean
4. I would understand and be more convinced if you gave an example/quote/statistic to support this
5. It would really help if you presented this data in a table
6. This is an important point and you make it well
etc.
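
A minimal, hypothetical sketch (in Python, not part of the original workshop material) of how a statement bank might be held so that a marker records short comment codes instead of retyping the same sentences; the dictionary contents echo the examples above, while the assemble_feedback helper and its names are purely illustrative.

    # Hypothetical sketch: a statement bank of frequently used feedback
    # comments, keyed by short codes so feedback can be assembled quickly.
    STATEMENT_BANK = {
        1: "I like this sentence/section because it is clear and concise.",
        2: "I found this paragraph/section/essay well organised and easy to follow.",
        3: "I am afraid I am lost. This paragraph/section is unclear and leaves me confused as to what you mean.",
        4: "I would understand and be more convinced if you gave an example/quote/statistic to support this.",
        5: "It would really help if you presented this data in a table.",
        6: "This is an important point and you make it well.",
    }

    def assemble_feedback(codes, personal_note=""):
        """Turn a list of comment codes into a feedback sheet for one student."""
        lines = [STATEMENT_BANK[c] for c in codes]
        if personal_note:
            lines.append(personal_note)
        return "\n".join("- " + line for line in lines)

    # Example: the marker records codes 2 and 4 against an essay.
    print(assemble_feedback([2, 4], personal_note="See me about the referencing."))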

Weekly CAA testing – case study data

[Table: weekly computer-aided assessment test results for students A–G across Weeks 1–7; the individual scores are not reproduced in this transcript] (Brown, Rust & Gibbs, 1994)

CAA quizzes

Scenario:
- First-term, first-year compulsory law module
- A new subject for most (75%) students
- High failure rate (25%), poor general results (28% 3rd class, 7% 1st)

Solution:
- Weekly optional WebCT quizzes (50% take-up)

Outcome:
- Quiz takers: 4% fail, 14% 3rd class, 24% 1st
- Non-quiz takers: same pattern as before
- Overall: 14% fail (approx. half the previous figure), 21% 3rd class, 14% 1st (double the previous figure)
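
A hedged sketch of the kind of auto-marking behind computer-aided assessment quizzes: responses are checked against an answer key and the student gets an instant score plus a pointer back to the relevant material. The answer key, revision pointers and function name are illustrative assumptions, not details of the WebCT quizzes described above.

    # Hypothetical sketch of auto-marking a weekly multiple-choice quiz.
    ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a", "q4": "c"}
    REVISION_POINTERS = {
        "q1": "Revisit section 1 of the course notes.",
        "q2": "Revisit section 2 of the course notes.",
        "q3": "Revisit section 3 of the course notes.",
        "q4": "Revisit section 4 of the course notes.",
    }

    def mark_quiz(responses):
        """Return (score, per-question revision pointers) for one student's responses."""
        score, pointers = 0, {}
        for question, correct in ANSWER_KEY.items():
            if responses.get(question) == correct:
                score += 1
            else:
                pointers[question] = REVISION_POINTERS[question]
        return score, pointers

    score, pointers = mark_quiz({"q1": "b", "q2": "a", "q3": "a", "q4": "c"})
    print("Score: %d/%d" % (score, len(ANSWER_KEY)))
    for question, tip in pointers.items():
        print(question, "-", tip)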

Assess groups – major issues

Reasons:
- active learning/engagement/exploratory talk
- to develop interpersonal/group skills
- to produce a bigger, more complex product/outcome
- pragmatic, logistical reasons (e.g. staff time/limited resources)

Scale:
- size of group (pairs, triads, 4-6)
- length of time
- size/complexity of outcome

Composition of the group, and how it is chosen

Need for preparation, ‘training’ and/or guidance

Assessment:
- none
- process vs product
- formative/feedback only
- summative (N.B. fairness)

Assess groups – preparation, training and guidance

- Reflection on previous groupwork experience/s
- Negative brainstorm, leading to a set of guidelines and possibly a contract
- Definition/allocation of roles
- Guidelines on process – e.g. minutes, project plan, etc.
- Consideration of how to deal with problems
- Team skills development checklist

Assessing groups – ‘yellow card’ system

The assignment referee (or dealing with dysfunctional group members)

White card: the offence has been noted and the group/tutors have voted for a white card. A recorded warning, but no further penalty.

Green card: further offence(s) have been recorded. A green card has been voted for by the group and seconded by tutors. 5 penalty points; further offence(s) will incur a yellow card.

Yellow card: further offence(s) have been recorded. A yellow card has been voted for by the group and seconded by tutors. 10 further penalty points; further offence(s) will incur a red card.

Red card: exclusion from the group and 0 marks for the project (this means you would be required to re-take the 7410 module). This individual has been judged:
a) not to have made any meaningful contribution to the group over term 2 and at least half of term 3
b) their behaviour has seriously disrupted the efforts of the rest of the group

(Retail Management Field, Oxford Brookes Business School)
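
To make the arithmetic of the card system concrete, here is a hypothetical sketch of how its penalty points might be applied to an individual's share of a group project mark. The card point values come from the slide above; the rule that one penalty point removes one mark, and the function and variable names, are assumptions made purely for illustration.

    # Hypothetical sketch: applying 'yellow card' penalty points to a project mark.
    CARD_PENALTIES = {"white": 0, "green": 5, "yellow": 10}

    def individual_mark(group_mark, cards):
        """Adjust one member's project mark for the cards recorded against them."""
        if "red" in cards:
            return 0  # exclusion from the group: zero marks, module must be retaken
        penalty_points = sum(CARD_PENALTIES[card] for card in cards)
        return max(group_mark - penalty_points, 0)  # assumed: 1 point = 1 mark off

    print(individual_mark(68, []))                  # 68 - no cards recorded
    print(individual_mark(68, ["white", "green"]))  # 63 - warning plus 5 penalty points
    print(individual_mark(68, ["green", "yellow"])) # 53 - 15 penalty points in total
    print(individual_mark(68, ["red"]))             # 0  - excluded from the group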

Involve the students – 1: self-assessment

Self-assessment sheet prompts:
- Strengths of this piece of work
- Weaknesses in this piece of work
- How this work could be improved
- The grade it deserves is…
- What I would like your comments on

It is the interaction between both believing in self-responsibility and using assessment formatively that leads to greater educational achievements (Brown & Hirschfeld, 2008)

Involve the students – 2: peer marking using model answers (Forbes & Spence, 1991)

Scenario:
- Engineering students had weekly maths problem sheets marked, plus problem classes
- Increased student numbers meant marking became impossible and problem classes were big enough to hide in
- Students stopped doing the problems
- Exam marks declined (average fell from 55% to 45%)

Solution:
- Course requirement to complete 50 problem sheets
- Peer assessed at six lecture sessions, but the marks do not count
- Exams and teaching unchanged

Outcome: exam marks increased (average rose from 45% to 80%)

Involve the students – 3: peer feedback

Scenario:
- Geography students wrote two essays, but showed no apparent improvement from one to the other despite a lot of tutor time spent writing feedback
- Increased student numbers made the tutor workload impossible

Solution:
- Only one essay, but a first draft is required part way through the course
- Students read and give each other feedback on their draft essays
- Students rewrite the essay in the light of the feedback
- In addition to the final draft, students also submit a summary of how the 2nd draft has been altered from the 1st in the light of the feedback

Outcome: much better essays

Involve the students – 4: peer feedback (Zeller, 2000*)

The Praktomat system allows students to read, review, and assess each other’s programs in order to improve quality and style. After a successful submission, the student can retrieve and review a program of some fellow student selected by Praktomat. After the review is complete, the student may obtain reviews and re-submit improved versions of his program. The reviewing process is independent of grading; the risk of plagiarism is narrowed by personalized assignments and automatic testing of submitted programs. In a survey, more than two thirds of the students affirmed that reading each other’s programs improved their program quality; this is also confirmed by statistical data. An evaluation shows that program readability improved significantly for students that had written or received reviews.

[*Available at: ]
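
A hedged sketch of the allocation step this description implies: each student is given some fellow student's submission to review, never their own. This is an illustration of the idea only, with invented names; it is not the actual Praktomat implementation, whose selection logic is not described here.

    import random

    # Hypothetical sketch: allocate peer reviews so that every student reviews a
    # fellow student's submission and nobody reviews their own (needs >= 2 students).
    def allocate_reviews(students, seed=None):
        """Return a dict mapping each reviewer to the author whose work they review."""
        rng = random.Random(seed)
        authors = list(students)
        while True:
            rng.shuffle(authors)
            if all(reviewer != author for reviewer, author in zip(students, authors)):
                return dict(zip(students, authors))

    print(allocate_reviews(["ana", "ben", "chloe", "dev"], seed=42))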

Assessing a selection

Scenario:
- Weekly lab reports submitted for marking
- Increased student numbers meant a heavy staff workload and an increasingly lengthy gap before reports were returned, so the feedback was of limited or no use

Solution:
- Weekly lab reports still submitted
- A sample is looked at, and generic feedback is e-mailed to all students within 48 hours
- At the end of the semester, only three weeks’ lab reports are selected for summative marking

Outcome: better lab reports and significantly less marking
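
A hypothetical sketch of the two sampling decisions this approach involves: which reports to read each week for the quick generic feedback, and which three weeks to count towards the summative mark at the end of the semester. The sample size, number of weeks and function names are illustrative assumptions, not figures from the case study.

    import random

    def weekly_sample(submissions, sample_size=5, seed=None):
        """Pick a few reports to read so generic feedback can go out within 48 hours."""
        rng = random.Random(seed)
        return rng.sample(submissions, min(sample_size, len(submissions)))

    def weeks_to_mark(total_weeks=10, marked_weeks=3, seed=None):
        """Choose which weeks' lab reports count towards the summative mark."""
        rng = random.Random(seed)
        return sorted(rng.sample(range(1, total_weeks + 1), marked_weeks))

    # Example: read five of this week's reports, and pick the marked weeks at random.
    print(weekly_sample(["report_%02d" % i for i in range(1, 61)], seed=1))
    print(weeks_to_mark(total_weeks=10, marked_weeks=3, seed=1))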

Final task – review and plan

Individually:
- Review everything that has been covered in this workshop
- Make a note of things that you intend to do as a result

In pairs:
- Tell your partner what you are intending to do