The evaluation life-cycle
Tristram Hooley – Postgraduate Training Co-ordinator
Rob Shaw – Learning Technologist
Project aims
1. To produce and evaluate a high-quality online portal to provide training in online research methods;
2. To act as a self-supporting online resource to enhance understanding of both the theoretical and practical aspects of online research methods, including web-based questionnaires and virtual synchronous and asynchronous interviews;
3. To draw on a wide range of successful good-practice case studies, cover the associated ethical issues of online research, and provide important resource links and technical guidance.
The Team
Project initiated by: Clare Madge (Lecturer), Henrietta O'Connor (Lecturer), Jane Wellens (Educational Developer).
Key work undertaken by: Rob Shaw (Learning Technologist).
Project supported by: Tristram Hooley (Postgraduate Training Co-ordinator), Lisa Barber (Cartographer), Bill Hickin (Senior Computer Officer), Julia Meek (Evaluation Consultant).
Problems with on-line resources
- Expensive;
- Rarely used;
- Difficult to use;
- Inflexible – rarely meet our exact needs;
- Frequently out of date;
- Anxiety about the reliability of the information;
- Frequently poorly referenced.
[Diagram] Conventional classroom: two-way communication between teacher and learners.
[Diagram] On-line learning environment: the material may be "Brilliant! Fantastic! Academically rigorous! Interesting!" – but publishing on-line gives only one-way communication.
Addressing These Problems Through Evaluation
- Academic content: academic peer review
- Graphic design/look: user feedback
- Usability: heuristic (expert) evaluation; watching sample users (cognitive walkthrough)
- Flexibility: classroom evaluation
Designing Evaluation
- Ask the experts – speak to an evaluation consultant;
- Evaluate early and often;
- Use a range of evaluation techniques;
- Try to evaluate with a range of different users who are drawn from your target audiences;
- Act on your evaluation – redraft, redesign and re-evaluate.
[Flow diagram: Development and Evaluation – project idea; design; project group evaluation; expert evaluation; usability studies; user feedback; classroom testing; academic peer review; complete project]
Stages of Evaluation
- Project group evaluation: ongoing – regular written feedback and discussion in team meetings.
Evaluation activities completed so far (initial usability studies, October – November):
- Expert evaluation – heuristic evaluation: focus on the design; prioritised report produced.
- User feedback – pilot with a group of potential users: feedback anecdotal and via evaluation questionnaire.
- Usability studies – cognitive walkthrough: observation of use of the site by potential users and eliciting of immediate feedback.
[Screenshots: first draft and most recent draft of the portal]
Informal team evaluation
Positives:
- Swift response, so glaring problems are likely to be picked up early and not embedded;
- A consensus is usually right;
- Forum for debate and discussion of possibilities;
- Helpful for the whole team to have an insight into the process and the ongoing development.
Negatives:
- Can be a tendency to go round in circles;
- Ability to see the site with a fresh eye diminishes over time – over-familiarity with the rules of the site;
- Can be difficult to be objective.
[Screenshots: developments inspired by the team – before and after]
Heuristic evaluation
- Heuristic: a process or method for attempting the solution of a problem.
- Heuristic evaluation carried out by experts in web design and human–computer interaction.
- Focus on the design, navigation and accessibility of the site rather than content.
- Report produced, with any problems given a severity rating from 0–4.
[Extract from the heuristic evaluation report]
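The severity scale is what makes the report actionable: sorting findings from 4 down to 0 yields a prioritised fix list. As a minimal sketch – the issue fields and example findings below are hypothetical, not taken from the project's report; only the 0–4 scale comes from the slides:

```python
# A minimal sketch of recording and prioritising severity-rated findings.
# The Issue fields and example findings are hypothetical illustrations;
# only the 0-4 severity scale is taken from the slides (0 = least severe,
# 4 = most severe is an assumed convention).
from dataclasses import dataclass

@dataclass
class Issue:
    heuristic: str   # design rule the problem violates
    location: str    # page or feature where it was observed
    severity: int    # expert's rating, 0 to 4

report = [
    Issue("Consistency of navigation", "module menu", 2),
    Issue("Visibility of system status", "quiz feedback", 4),
    Issue("Error prevention", "registration form", 3),
]

# Sorting by severity turns raw expert findings into a prioritised fix list.
for issue in sorted(report, key=lambda i: i.severity, reverse=True):
    print(f"[severity {issue.severity}] {issue.location}: {issue.heuristic}")
```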
Heuristic evaluation
Positives:
- Navigation and design flaws highlighted;
- External input is more objective;
- Severity rating allows easy setting of priorities;
- Provides reassurance about things that are going well.
Negatives:
- Need to strike a balance between carrying out formal evaluation early enough to inform the design process and maximising cost-effectiveness;
- Some of the points made referred to known issues.
[Screenshots: heuristic evaluation – before and after]
Cognitive walkthrough
- A structured way in which user behaviour can be observed.
- Aims to discover aspects of the site that the user finds difficult to understand.
- The user uses the site; the development team watch them and ask them to describe what they see.
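As a minimal sketch of how walkthrough observations might be captured so that sessions can be compared afterwards – the task names and log structure below are hypothetical; the slides describe only watching users and eliciting immediate feedback:

```python
# A minimal sketch of recording cognitive walkthrough observations.
# Task names and the log structure are hypothetical illustrations.
from datetime import datetime

def log_observation(session_log, task, note):
    # Append a timestamped observer note for the task the user is attempting.
    session_log.append({
        "time": datetime.now().isoformat(timespec="seconds"),
        "task": task,
        "note": note,
    })

session = []  # one log per observed user
log_observation(session, "find the ethics module", "hesitated over the home-page menu")
log_observation(session, "open a case study", "did not notice the 'next' link")

for entry in session:
    print(f"{entry['time']}  {entry['task']}: {entry['note']}")
```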
Cognitive walkthrough
Positives:
- Real insights gained into potential issues with new users;
- Extremely practical and easy to implement;
- Encourages team members to view the site with fresh eyes;
- Throws up surprises.
Negatives:
- Can be subject to personal quirks;
- Not always easy to draw generalisable conclusions;
- Danger of over-reaction.
[Screen shots: before and after]
User-group pilot
- Introduced to a group of students on a research methods training programme.
- Students work through a mini-package of material from one section of the training package in their own time.
- Feedback primarily via questionnaire.
[Example questionnaire]
Pilot with group of potential users
- Can provide a more measured response, which is useful in tandem with the rapid feedback from the cognitive walkthrough;
- Allows compilation of records and a more accurate means of analysis and comparison (see the sketch after this list);
- Depends on people giving their time to help;
- Only the more motivated are likely to respond, given the time required;
- Can identify individuals who are interested in getting involved or finding out more.
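As a minimal sketch of the kind of compilation and comparison the questionnaire makes possible – the questions and the 1–5 rating scale below are hypothetical; the slides say only that feedback came via an evaluation questionnaire:

```python
# A minimal sketch of compiling questionnaire ratings for comparison across
# pilot rounds. Questions and the 1-5 scale are hypothetical illustrations.
from statistics import mean

responses = {
    # question -> ratings from the pilot group (1 = poor ... 5 = excellent)
    "ease of navigation": [4, 3, 5, 4],
    "clarity of content": [5, 4, 4, 5],
    "usefulness of examples": [3, 2, 4, 3],
}

# Listing questions from lowest to highest average flags where to focus next,
# and the per-question averages give a record that later rounds can be
# compared against.
for question, ratings in sorted(responses.items(), key=lambda kv: mean(kv[1])):
    print(f"{mean(ratings):.2f}  {question} (n={len(ratings)})")
```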
Stages of Evaluation
Future evaluation activities:
- Case studies with users (February – April 2005): small groups of target users will be followed using the site, keeping diaries on ease of use/content; focus groups.
- Content evaluation (February – April 2005): subject experts to evaluate the content of completed modules (e.g. Chris Mann).
- Formal user study (Autumn 2005): users will be observed using the package and asked to provide feedback through questionnaires.
Conclusions
To avoid unpleasant surprises:
- Use a range of methods and target groups;
- Evaluate as often as you can;
- Act quickly upon the results.