Improving student success through implementation of weekly, student unique, CAA tutorial sheets. Mark Russell & Peter Bullen, University of Hertfordshire.

Why listen to me? This CAA implementation has made a difference! –Student attendance is up. –Exam performance is up. –Retention of students is likely to be up. –This project is transportable to other areas.

Background. First-year module in Fluid Mechanics and Thermodynamics. ~150 students. 4 teaching staff on the team. A variety of teaching and learning settings. Explicit use of the University's MLE (StudyNET).

So why bother? Poor exam performance. Possible causes include: –The language of the subject is probably new to many students. –There is a need for some mathematical ability. –Attendance at lectures and tutorials is patchy. The exam performance is particularly concerning given the development of structured learning materials and a real desire to support the students using StudyNET. We believed we were doing our bit!

The perceived problem. Students did not always expend the required effort. Tutorial questions remained unanswered. Revision time became the primary learning time! Did the assessment process actively support the learning?

What do we want to do? Actively encourage (force!) students to engage with all the materials. Why? –Consolidates learning. –Helps apply the new language. –The maths is problem oriented. –Develops confidence in students' ability. –Forces a structured study pattern.

How did we do it? Developed an integrated, summative assessment regime. Its key features were: –Weekly. –Student unique. –Forcing.

The integrated assessment regime included: A student unique Weekly Assessed Tutorial Sheet (WATS). An evolving automated marking sheet. A manual nudge to non-participants. A full worked solution (uploaded to StudyNET after student submissions). A report on the group's weekly performance. Some generic notes on issues observed.

How did we do it? ‘Word’ sets the generic question. ‘Excel’ creates randomised numeric data. ‘Mail merge’ combines the generic question with the randomised numeric data to create a student unique WATS. StudyNET is used to deliver the questions to the students.
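
As a rough illustration of this generation step (a sketch only, not the authors' actual Word/Excel/mail-merge tooling), the snippet below produces a student unique question and expected answer from randomised numeric data. The question text, parameter ranges, student IDs and file name are all invented for the example.

```python
import csv
import math
import random

# Generic question template; the placeholders are filled per student,
# mirroring the role of the mail merge.
TEMPLATE = ("Water flows through a pipe of diameter {d_mm} mm at {q_lps} litres/s. "
            "Calculate the mean velocity in m/s.")

def make_wats(student_ids, seed=1):
    """Create one unique question (and its expected answer) per student."""
    rng = random.Random(seed)                     # fixed seed: the sheet is reproducible
    rows = []
    for sid in student_ids:
        d_mm = rng.choice(range(50, 205, 5))      # randomised pipe diameter, mm
        q_lps = round(rng.uniform(2.0, 20.0), 1)  # randomised flow rate, litres/s
        area = math.pi * (d_mm / 1000) ** 2 / 4   # pipe cross-section, m^2
        answer = (q_lps / 1000) / area            # mean velocity, m/s
        rows.append({"student": sid,
                     "question": TEMPLATE.format(d_mm=d_mm, q_lps=q_lps),
                     "answer": round(answer, 3)})
    return rows

if __name__ == "__main__":
    sheets = make_wats(["ab12aaa", "cd34bbb", "ef56ccc"])
    # The tutor keeps the answer column; only the questions go out to the students.
    with open("wats_week1.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["student", "question", "answer"])
        writer.writeheader()
        writer.writerows(sheets)
```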

What did it look like? 11 WATS were developed. –Each WATS had parent and child questions. –Each WATS had to be done within one week. –The students had to submit a hard copy of their results sheet. –A supplementary Excel marking sheet was developed to help the MANUAL marking! –Marking and analysis of the group's performance was prompt: within 1 day of submission.
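
A minimal sketch of what such a marking aid might do, assuming each student's expected answer is taken from their own randomised data and that answers within a small relative tolerance count as correct (the tolerance and data structures are illustrative assumptions, not the module's actual marking rules).

```python
def mark_submission(expected, submitted, tolerance=0.02):
    """Accept a numeric answer if it lies within a relative tolerance of the
    expected value computed from that student's unique data."""
    if expected == 0:
        return submitted == 0
    return abs(submitted - expected) / abs(expected) <= tolerance

def mark_sheet(answer_key, submissions, tolerance=0.02):
    """Both arguments are dicts keyed by student ID, mapping to numeric answers."""
    results = {}
    for sid, expected in answer_key.items():
        if sid not in submissions:
            results[sid] = "no submission"   # a candidate for the manual nudge
        else:
            ok = mark_submission(expected, submissions[sid], tolerance)
            results[sid] = "correct" if ok else "incorrect"
    return results

# Example: two submissions received, one student missing.
key = {"ab12aaa": 1.273, "cd34bbb": 0.912, "ef56ccc": 2.051}
subs = {"ab12aaa": 1.27, "cd34bbb": 0.80}
print(mark_sheet(key, subs))
```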

As time went on… WATS 9-11 inclusive had an automated student submission sheet. Unfortunately, the timing of the development and server security issues did not allow widespread deployment; we had to settle for loading it on one PC only. Students were issued passwords and were only allowed one submission per WATS. A plea to their good nature not to go looking for things to delete also helped!
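
The submission rule described here (issued passwords, one attempt per WATS) could be enforced along these lines; the storage format, student IDs and passwords are invented for illustration.

```python
submissions = {}                                       # (student ID, WATS number) -> answer
passwords = {"ab12aaa": "s3cret", "cd34bbb": "p4ss"}   # as issued to the students

def submit(student_id, password, wats_number, answer):
    """Record a submission only if the password matches and this is the
    student's first attempt at this WATS."""
    if passwords.get(student_id) != password:
        return "rejected: bad password"
    if (student_id, wats_number) in submissions:
        return "rejected: only one submission per WATS is allowed"
    submissions[(student_id, wats_number)] = answer
    return "accepted"

print(submit("ab12aaa", "s3cret", 9, 1.27))   # accepted
print(submit("ab12aaa", "s3cret", 9, 1.30))   # rejected: second attempt
```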

Why did we choose this approach? It did not suffer the more obvious potential drawbacks of an MCQ CAA approach, i.e.: –Answers for Q1 are not all the same! –Fairness of the test/equality of questions in a question bank is not a concern. –Does not tie students to a PC. –Does not inadvertently create bias. –Does not inadvertently give hints. –A ‘chance’ answer is not an issue.

Key benefits. Stops solution sharing at source. Students could still help each other. Students get the feedback they so often like. Forces a structured study pattern. ‘Not cool to be studious’ is not an issue with this approach. Attempts to engage with everybody. Allows students to see where they went wrong.

Critical success factors. Attendance at lectures and tutorials was improved. More tutorial questions were tackled by the students. But what about exam performance?

WATS & exam performance.

Overlaying the WATS & exam data

2002 vs exam performance. [Table comparing the two cohorts: mean, median, S.D., percentage of students scoring over 35%, and population, with the % increase in each.]

2002 vs exam performance.

Can WATS predict exam grade? 52% of the students had a difference of 15% or less between their WATS mark and their exam grade.

Correlating the exam to the WATS.
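
The checks behind these slides amount to a banded-agreement count and a correlation. A small sketch, assuming both the WATS average and the exam mark are held as percentages per student (the marks below are invented, not the module's data).

```python
from statistics import mean

def agreement_and_correlation(wats, exam, band=15):
    """Return the fraction of students whose WATS and exam marks differ by no
    more than `band` percentage points, and the Pearson correlation."""
    within = sum(1 for w, e in zip(wats, exam) if abs(w - e) <= band) / len(wats)
    mw, me = mean(wats), mean(exam)
    cov = sum((w - mw) * (e - me) for w, e in zip(wats, exam))
    var_w = sum((w - mw) ** 2 for w in wats)
    var_e = sum((e - me) ** 2 for e in exam)
    return within, cov / (var_w * var_e) ** 0.5

# Invented marks for six students: (WATS average %, exam %).
wats_marks = [72, 55, 90, 40, 63, 81]
exam_marks = [65, 58, 78, 35, 70, 74]
print(agreement_and_correlation(wats_marks, exam_marks))
```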

What did the students think (1/7)? +1.1

What did the students think (2/7)? +0.9

What did the students think (3/7)? +1.42

What did the students think (4/7)? +0.15

What did the students think (5/7)? -0.09

What did the students think (6/7)? -0.61

What did the students think (7/7)? +0.8

Where next? Build on the lessons learned, with greater emphasis on automating more of the processes. There now exists a one-stop C++ program for entering the WATS submissions.

More where next? More analysis of results. Incorporate automated nudges (see the sketch below). –Provides additional student care and individualised contact. Consider adopting a competence pass threshold. –May help close the learning cycle. Provide student unique additional study material. –Match material to individual weaknesses.
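
An automated nudge could be as simple as comparing the class list against the submission record each week; a sketch with invented names, IDs and wording.

```python
def nudge_list(class_list, submissions, wats_number):
    """Return (student ID, message) pairs for everyone on the class list who
    has not yet submitted this week's WATS."""
    nudges = []
    for sid, name in class_list.items():
        if (sid, wats_number) not in submissions:
            msg = (f"Dear {name}, we have not yet received your answers for "
                   f"WATS {wats_number}. Please submit before the deadline.")
            nudges.append((sid, msg))
    return nudges

class_list = {"ab12aaa": "Sam", "cd34bbb": "Alex", "ef56ccc": "Jo"}
submitted = {("ab12aaa", 5): 1.27}            # only one student has submitted WATS 5
for sid, message in nudge_list(class_list, submitted, 5):
    print(sid, "->", message)
```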

Why automate? Reduces staff time. Marking rules can be set up and applied to all students. No matter how fair! Will help with the move towards a competence pass structure. Allows implementation without it becoming too time consuming. The approach is already likely to be borrowed by an electrical science module.

Conclusions. The WATS has improved exam performance. The WATS has improved attendance. The WATS will help with student retention. This WATS approach would not have been feasible without exploiting the use of computers.

Conclusions. The students liked the whole experience. We will be looking to export this approach to other modules. There are still some outstanding issues to investigate. The application of CAA to this module has been a remarkable success.

Acknowledgements. The authors wish to thank LTSN Engineering for their support of this work.