GAN-MVL 2.1 Heuristic Evaluation


GAN-MVL 2.1 Heuristic Evaluation
Silvia Gabrielli, Roberto Ranon, Markus Hodapp

TOC
- Introduction: the heuristic evaluation method
- The scenario used for the evaluation
- Main results
- Proposal for future evaluations (lab simulations)

Heuristic Evaluation
Expert review: checks system compliance with a set of usability heuristics [Nielsen '94]:
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help users recognize, diagnose, and recover from errors
- Help and documentation
Quick turnaround for the iterative design process. Typically 3-5 experts are involved, detecting 75-80% of system flaws.
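The 75-80% figure for 3-5 evaluators is consistent with Nielsen and Landauer's model, in which each evaluator independently finds a fixed share of the problems. A minimal sketch, assuming a per-evaluator detection rate of about 0.31 (Nielsen's reported average across projects, not a number from this evaluation):

```python
# Expected share of usability problems found by n independent evaluators,
# following Nielsen & Landauer's model: found(n) = 1 - (1 - lam)^n.
# lam = 0.31 is an assumed average detection rate per evaluator.
def share_found(n, lam=0.31):
    return 1 - (1 - lam) ** n

for n in (1, 3, 5):
    print(f"{n} evaluators: {share_found(n):.0%}")
# 1 evaluator finds ~31%; 3 find ~67%; 5 find ~84%
```

This is why adding evaluators beyond five yields diminishing returns: each new expert mostly rediscovers problems the others already found.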

GAN-MVL HE
- 5 usability experts involved (University of Udine & University of Mannheim)
- GAN-MVL prototype 2.1 [http://gan.grid.elettra.trieste.it:8080/gridsphere]
- Types of functionality tested:
  a) Help, download and installation of VI plugins
  b) Collaboration with a remote expert via Skype
  c) Remote access to an instrument (FuncGen) via high-resolution video
  d) Remote control of a VI
  e) Interaction with Logbook, Calendar, Resource Browser, Http Wizard, Tunnel Monitor, User Profile
- Experts were provided with a HE template to prepare their report

GAN-MVL HE
Scenario 1a: Access to GAN-MVL as a user. Tasks performed:
- Registration as a new user
- Activation of an account and change of password
- Browsing of the Knowledge Management area and of the Help system to find and read the User Guide
- In the GAN Portal, Skype contact with the GAN-MVL administrator to get support and access to the Function Generator instrument
- Download and installation of the high-resolution viewer (from the download area)
- Remote check of the value of the signal generated by the Function Generator instrument
- Request (to the administrator) to change the current account from user to administrator
- Booking of an event in the calendar
Scenario 1b: Use of GAN-MVL as administrator. Tasks performed:
- Download and installation of the LabView Runtime Engine software for remote control of the Function Generator instrument
- Modification of the value of the signal generated by the instrument
- Editing of a message in the Logbook about the operation previously performed
- Use of the HTTP Wizard to add a new instrument to the Control Room toolbar
- Access (from the GAN node) to the Tunnel Monitor to monitor (and, if necessary, drop) the currently active connections (as in the case of an emergency)
- Logout

HE results
Synthesis of the main flaws detected, reporting for each issue:
- Issue title
- Description of the issue
- Where the issue occurs
- Number of experts who reported the issue
- Severity ranking assigned (Low / Medium / High)
- Recommendation(s) provided
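Findings structured this way are easy to triage programmatically: sort by severity first, then by how many experts flagged the issue. A hypothetical sketch; the field names and sample issues below are illustrative, not taken from the GAN-MVL report:

```python
# Triage heuristic-evaluation findings: highest severity first,
# ties broken by the number of experts who reported the issue.
SEVERITY_ORDER = {"High": 0, "Medium": 1, "Low": 2}

issues = [  # invented examples, not actual GAN-MVL findings
    {"title": "No feedback after plugin install", "severity": "Medium", "experts": 3},
    {"title": "Password change path unclear",     "severity": "High",   "experts": 4},
    {"title": "Inconsistent toolbar labels",      "severity": "Low",    "experts": 2},
]

issues.sort(key=lambda i: (SEVERITY_ORDER[i["severity"]], -i["experts"]))
for i in issues:
    print(i["severity"], i["experts"], i["title"])
```

Issues flagged by several experts independently are usually the safest candidates to fix first, since single-expert reports may reflect individual preference rather than a genuine flaw.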

Laboratory simulations & tests
- In-depth analysis of key issues identified by the HE (and SOA)
- Experimental observation of breakdowns in system use
- Based on predefined collaborative scenarios of use
- Participation of a confederate user to elicit key situations
- (Longitudinal) log data and observations collected

Laboratory simulations & tests
Proposal for a future study on:
- GAN-MVL support for remote troubleshooting activities
- Expert-operator audio/video interaction during troubleshooting
- Expert-operator sharing of bidirectional 2D representations of instruments (LabView VI)
- Expert-operator sharing and control of BD visual representations of VIs
Analysis of:
- Time taken to arrive at a clear definition of the problem
- Time taken to establish a common referent, to provide instructions, and to solve the problem
- Behavioural aspects of the interaction, communication patterns, and interactional difficulties observed
- Post-session feedback and participants' observations
- Conclusions and insights for further testing
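The timing measures above can be derived from a timestamped session log. A minimal sketch, assuming log events named after the milestones in the list; the event names and timestamps are invented for illustration:

```python
from datetime import datetime

# Hypothetical session log: milestone -> wall-clock timestamp.
log = {
    "session_start":   "10:00:00",
    "problem_defined": "10:04:30",
    "common_referent": "10:06:00",
    "problem_solved":  "10:15:00",
}

def seconds_between(start_event, end_event):
    """Elapsed seconds between two logged milestones."""
    fmt = "%H:%M:%S"
    delta = datetime.strptime(log[end_event], fmt) - datetime.strptime(log[start_event], fmt)
    return delta.seconds

print("time to problem definition:", seconds_between("session_start", "problem_defined"), "s")
print("time to solution:", seconds_between("session_start", "problem_solved"), "s")
```

Behavioural and communication measures (turn-taking, repair sequences, interactional difficulties) would instead require coding the recorded audio/video, which this sketch does not cover.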

Laboratory simulations & tests
Support from WP8 partners is needed for:
- Defining a realistic and complex troubleshooting scenario to observe
- Setting up appropriate devices/instruments for the troubleshooting scenario
- Identifying participants to involve in the test (expert and operator roles)
Results:
- Will inform future lab simulations & tests
- Will inform the GAN-MVL final release
- Will support the design of the field tests