Assessment and Feedback programme, 24th April 2012
www.jisc.ac.uk/assessmentandfeedback #jiscassess
Overview of programme, strands and deliverables
Programme overview
Strand A: 8 projects, 3 years (2011-2014)
Strand B: 8 projects, 6 months to 2 years (2011-2013)
Support and Synthesis Project
Strand C: 4 projects, 9 months to 2 years (2011-2013)
Locations
Programme level outcomes
- Increased usage of appropriate technology-enhanced assessment and feedback, leading to:
  - Change in the nature of assessment
  - Efficiencies, and improvement of assessment quality
  - Enhancement of the student and staff experience
- Clearly articulated business cases
- Models of sustainable institutional support, and guidance on costs and benefits
- Evidence of impact on staff and students, workload and satisfaction
Strand A goals and objectives
- Improved student learning and progression
- Increased efficiency
- Enhanced learning and teaching practice
- Integrated strategies, policies & processes
Overarching goals of the Strand A projects, synthesised from their bid documents.
Deliverables
Strand A:
- Baseline report
- Summary of previous work in the area
- Evaluation report
- Range of assets - evidence of impact
- Guidance and support materials
Strand B:
- Evaluation report
- Range of assets - evidence of impact
- Short briefing paper summarising the innovation and benefits
Strand C:
- Description of user scenarios
- Descriptions of the technical model
- Open source widgets and code
- Developer guidelines
- Documentation for users
- Active community of users
- Short summary of the innovation
Technologies
Themes and challenges
Programme and support team
Programme Support Team
- Critical Friends
- Evaluation Support
- Synthesis
- Programme Team
- Support Co-ordinator
What are we learning about technology-enhanced assessment and feedback practices?
Why baseline? Programme level
- View of landscape & direction of travel
- Validate aims & rationale
- Shared understanding
- Identify synergies with other work
- Deliver effective support
Why baseline? Project level
- View of landscape & direction of travel
- Validate scope
- Confirm/identify challenges
- Identify stakeholders
- Manage & communicate scope
- Challenge myths
- Identify readiness for change
- Show evidence of improvement
- Important stage of engagement/ownership
Sources of baseline evidence
- Structured and semi-structured interviews (some video)
- Workshops and focus groups
- Process maps
- Rich pictures
- Institutional (and devolved) strategy & policy documents
- Institutional QA documentation
- Reports by QAA, OFSTED & external examiners
- Course evaluations
- Student surveys
- Quantitative analysis of key data sets
- Data from research projects
- Questionnaires
Differences in emphasis
Are our projects typical of the landscape?
Issues: strategy / policy / principles
- Formal strategy/policy documents lag behind current thinking
- Educational principles are rarely enshrined in strategy/policy
- Devolved responsibility makes it difficult to achieve parity of learner experience
Issues: stakeholder engagement
- Learners are not often actively engaged in developing practice
- Assessment and feedback practice does not reflect the reality of working life
- Administrative staff are often left out of the dialogue
Findings: assessment and feedback practice
- Traditional forms such as essays/exams still predominate
- Timeliness of feedback is an issue
- Curriculum design issues inhibit longitudinal development
Key resources
http://www.jisc.ac.uk/assessment
http://www.netvibes.com/jiscinfonet#%23jiscassess
http://jiscdesignstudio.pbworks.com
Assessment & Feedback hub pages (http://tinyurl.com/jiscafds)
- Transforming Assessment & Feedback
- Peer assessment & review
- Assessment management
- Feedback & feed forward
- Authentic assessment
- Longitudinal & ipsative assessment
- Effectiveness & efficiency in assessment
- Assessment for learning
- Work-based learning & assessment
- Employability & assessment
Activity
- Decide if you agree or disagree with each of the statements made on the previous slides (as being representative of mainstream practice in the sector)
- If you agree, state examples of what can be done about it
- If you disagree, state examples of evidence to the contrary
© HEFCE 2012. The Higher Education Funding Council for England, on behalf of JISC, permits reuse of this presentation and its contents under the terms of the Creative Commons Attribution-Non-Commercial-No Derivative Works 2.0 UK: England & Wales Licence. http://creativecommons.org/licenses/by-nc-nd/2.0/uk
Evidence and evaluation projects (Strand B)
- EBEAM, University of Huddersfield
- EEVS, University of Hertfordshire
- EFFECT, University of Dundee
- The Evaluation of Assessment Diaries and GradeMark, University of Glamorgan
- OCME, University of Exeter
- MACE, University of Westminster
- SGC4L, University of Edinburgh
Timings
- 11.15-11.35: Participants move round all 3 rooms to look at the 7 posters and have short introductory discussions with projects; identify 3 projects you'd like to know more about
- 11.35-11.50: Discussion with Project 1
- 11.50-12.05: Discussion with Project 2
- 12.05-12.20: Discussion with Project 3
Rooms
Proceed:
- Evaluating the Benefits of Electronic Assessment Management (EBEAM project), Cath Ellis, University of Huddersfield
- Online Coursework Management Evaluation (OCME project), Anka Djordjevic, University of Exeter
- The Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan, Karen Fitzgibbon and Sue Stocking, University of Glamorgan
Propel 1:
- Making Assessment Count Evaluation (MACE project), Gunter Saunders and Peter Chatterton, University of Westminster; Mark Kerrigan, University of Greenwich; and Loretta Newman-Ford, Cardiff Metropolitan University
- Evaluating feedback for e-learning: centralized tutors (EFFECT project), Aileen McGuigan, University of Dundee
Propel 2:
- Student-Generated Content for Learning: Enhancing Engagement, Feedback and Performance (SGC4L project), Judy Hardy, University of Edinburgh
- Evaluating Electronic Voting Systems for Enhancing Student Experience (EEVS project), Amanda Jefferies, University of Hertfordshire