#jiscassess Assessment and Feedback programme, 24th April 2012
#JISCASSESS Overview of programme, strands and deliverables
Programme overview
– Strand A: 8 projects, 3 years
– Strand B: 8 projects, 6 months to 2 years
– Support and Synthesis Project
– Strand C: 4 projects, 9 months to 2 years
Locations
Programme level outcomes
Increased usage of appropriate technology-enhanced assessment and feedback, leading to:
– Change in the nature of assessment
– Efficiencies, and improvement of assessment quality
– Enhancement of the student and staff experience
– Clearly articulated business cases
– Models of sustainable institutional support, and guidance on costs and benefits
– Evidence of impact – on staff and students, workload and satisfaction
Strand A goals and objectives
– Improved student learning and progression
– Increased efficiency
– Enhanced learning and teaching practice
– Integrated strategies, policies and processes
Overarching goals from Strand A projects, synthesised from their bid documents.
Deliverables
Strand A:
– Baseline report
– Summary of previous work in the area
– Evaluation report
– Range of assets – evidence of impact
– Guidance and support materials
Strand B:
– Evaluation report
– Range of assets – evidence of impact
– Short briefing paper summarising the innovation and benefits
Strand C:
– Description of user scenarios
– Descriptions of the technical model
– Open source widgets and code
– Developer guidelines
– Documentation for users
– Active community of users
– Short summary of the innovation
Technologies
Themes and challenges
#JISCASSESS Programme and support team
Programme Support Team
– Critical Friends
– Evaluation Support
– Synthesis
– Programme Team
– Support Co-ordinator
#JISCASSESS What are we learning about technology-enhanced assessment and feedback practices?
Why baseline? Programme level
– View of landscape and direction of travel
– Validate aims and rationale
– Shared understanding
– Identify synergies with other work
– Deliver effective support
Why baseline? Project level
– View of landscape and direction of travel
– Validate scope
– Confirm/identify challenges
– Identify stakeholders
– Manage and communicate scope
– Challenge myths
– Identify readiness for change
– Show evidence of improvement
– Important stage of engagement/ownership
Sources of baseline evidence
– structured and semi-structured interviews (some video)
– workshops and focus groups
– process maps
– rich pictures
– institutional (and devolved) strategy and policy documents
– institutional QA documentation
– reports by QAA, Ofsted and external examiners
– course evaluations
– student surveys
– quantitative analysis of key data sets
– data from research projects
– questionnaires
Differences in emphasis
Are our projects typical of the landscape?
Issues: strategy / policy / principles
– Formal strategy/policy documents lag behind current thinking
– Educational principles are rarely enshrined in strategy/policy
– Devolved responsibility makes it difficult to achieve parity of learner experience
Issues: stakeholder engagement
– Learners are not often actively engaged in developing practice
– Assessment and feedback practice does not reflect the reality of working life
– Administrative staff are often left out of the dialogue
Findings: assessment and feedback practice
– Traditional forms such as essays/exams still predominate
– Timeliness of feedback is an issue
– Curriculum design issues inhibit longitudinal development
#JISCASSESS Key resources
Transforming Assessment & Feedback: Assessment & Feedback hub pages
– Peer assessment & review
– Assessment management
– Feedback & feed forward
– Authentic assessment
– Longitudinal & ipsative assessment
– Effectiveness & efficiency in assessment
– Assessment for learning
– Work-based learning & assessment
– Employability & assessment
Activity
– Decide if you agree or disagree with each of the statements made on the previous slides (as being representative of mainstream practice in the sector)
– If you agree, state examples of what can be done about it
– If you disagree, state examples of evidence to the contrary
© HEFCE 2012 The Higher Education Funding Council for England, on behalf of JISC, permits reuse of this presentation and its contents under the terms of the Creative Commons Attribution-Non-Commercial-No Derivative Works 2.0 UK: England & Wales Licence.
Evidence and evaluation projects – Strand B
– EBEAM – University of Huddersfield
– EEVS – University of Hertfordshire
– EFFECT – University of Dundee
– The Evaluation of Assessment Diaries and GradeMark – University of Glamorgan
– OCME – University of Exeter
– MACE – University of Westminster
– SGC4L – University of Edinburgh
Timings
– 11.35: Participants move round all 3 rooms to look at the 7 posters and have short introductory discussions with projects; identify 3 projects you'd like to know more about
– 11.50: Discussion with Project 1
– 12.05: Discussion with Project 2
– 12.20: Discussion with Project 3
Rooms
– Proceed: Evaluating the Benefits of Electronic Assessment Management (EBEAM project), Cath Ellis, University of Huddersfield
– Online Coursework Management Evaluation (OCME project), Anka Djordjevic, University of Exeter
– The Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan, Karen Fitzgibbon and Sue Stocking, University of Glamorgan
– Propel 1 – Making Assessment Count Evaluation (MACE project), Gunter Saunders and Peter Chatterton, University of Westminster; Mark Kerrigan, University of Greenwich; and Loretta Newman-Ford, Cardiff Metropolitan University
– Evaluating feedback for e-learning: centralised tutors (EFFECT project), Aileen McGuigan, University of Dundee
– Propel 2 – Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L project), Judy Hardy, University of Edinburgh
– Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS project), Amanda Jefferies, University of Hertfordshire