Actionable Information: Tools and Techniques to Help Design Effective Intranets Frank Cervone Assistant University Librarian for Information Technology.


Actionable Information: Tools and Techniques to Help Design Effective Intranets Frank Cervone Assistant University Librarian for Information Technology Northwestern University Evanston, IL, USA Darlene Fichter Data Librarian University of Saskatchewan Saskatoon, SK, Canada

Overview
– Why heuristic testing?
– What is heuristic testing?
– Heuristics applied to the web
– Using heuristic testing for your intranet

Why?
– Evaluators who are experts both in usability and in the domain in which the software is applied will find 81%–90% of usability problems [1]
– Evaluators who know nothing about usability find only 22%–29% of usability problems [1]
– A single evaluator found only 35% [2]

1) Jakob Nielsen, "Finding usability problems through heuristic evaluation." Proceedings of ACM CHI '92 (3–7 May 1992).
2) Jakob Nielsen.

Heuristic evaluation?
What
– A usability inspection method
– One or more expert evaluators systematically inspect a user interface design
– They judge its compliance with recognized usability principles
When
– At any point in the design process
Who
– More evaluators are better
– Best results with 3–5 evaluators
– But 1 is better than none!

Yes, more is better
(Chart courtesy of useit.com)

How?
1) Evaluators review the interface individually
– Report problems to a coordinator
– Assign severity ratings
2) The coordinator combines the problem reports
– Removes duplicates
3) Evaluators review the combined list
– Optionally assign severity ratings as a group
4) The coordinator averages the ratings
– Ranks problems by severity
5) The web team looks for patterns and finds solutions
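The coordinator's combine-average-rank steps can be sketched in a few lines of Python. The report data here is hypothetical, and it assumes duplicates have already been merged so that the same finding shares one problem ID across evaluators:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-evaluator reports: (problem_id, severity on the 0-4 scale).
reports = {
    "evaluator_1": [("P1", 4), ("P2", 2)],
    "evaluator_2": [("P1", 3), ("P3", 1)],
    "evaluator_3": [("P2", 3), ("P3", 1)],
}

# Combine: group every severity rating by the problem it refers to.
ratings = defaultdict(list)
for findings in reports.values():
    for problem_id, severity in findings:
        ratings[problem_id].append(severity)

# Average the ratings and rank problems by severity, worst first.
ranked = sorted(
    ((pid, mean(scores)) for pid, scores in ratings.items()),
    key=lambda item: item[1],
    reverse=True,
)
for pid, avg in ranked:
    print(f"{pid}: {avg:.1f}")
```

The web team would then work down the ranked list from the top, looking for patterns among the highest-severity items first.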

Why?
A good method for finding both major and minor problems in a user interface
– Finds major problems quickly
– Results tend to be dominated numerically by minor problems
– So it is important to rate errors and rank them
Complements user testing
– Not a replacement for it
– Finds different types of errors: things an "expert" user would notice

Rating errors
Frequency
– Is the problem common or rare?
Impact
– Is it easy or difficult for users to overcome?
Persistence
– A one-time problem users can overcome once they know about it?
– Or will they be bothered by it repeatedly?
Market impact
– Certain usability problems can have a devastating effect, even if they are quite easy to overcome

Rating scale
0 = Not a problem
– I don't agree that this is a usability problem at all
1 = Cosmetic problem only
– Need not be fixed unless extra time is available
2 = Minor usability problem
– The fix should be given low priority
3 = Major usability problem
– Important, so the fix should be given high priority
4 = Usability catastrophe
– Imperative to fix before release
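The scale can be kept as a lookup table for reporting, with a helper that maps an averaged group rating back onto the nearest scale point. The labels are verbatim from the slide; the rounding rule (half rounds up) is an assumption for illustration:

```python
# The deck's 0-4 severity scale as a lookup table.
SEVERITY_LABELS = {
    0: "Not a problem",
    1: "Cosmetic problem only",
    2: "Minor usability problem",
    3: "Major usability problem",
    4: "Usability catastrophe",
}

def label_for(average: float) -> str:
    """Map an averaged rating to the nearest scale point (half rounds up)."""
    return SEVERITY_LABELS[min(4, int(average + 0.5))]
```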

Heuristics (1-5)
1) Visibility of system status
2) Match between system and the real world
3) User control and freedom
4) Consistency and standards
5) Error prevention

Heuristics (6-10)
6) Recognition rather than recall
7) Flexibility and efficiency of use
8) Aesthetic and minimalist design
9) Error recovery
10) Help and documentation

That’s great, but…
So, how do you apply this in the real world? There are several possibilities:
– Unstructured evaluation
– Structured evaluation

Unstructured evaluation
– Let the experts find the problems as they occur
– Provides greater "free-form" discovery of problems
– More appropriate when working with usability experts

Edmonton Public Library site
– 3 evaluators reviewed the site
– 2 passes through the site
– 1½ to 2 hours

Report summary
– More than 100 unique violations
– Over 60 violations of "consistency and standards"
– The next most frequently violated heuristic was "match between the system and the real world," due to poor labels, jargon, and ordering of items
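A per-heuristic frequency tally like the one above is a one-liner with `collections.Counter`. The violation log below is hypothetical, standing in for the evaluators' merged findings:

```python
from collections import Counter

# Hypothetical merged violation log: each entry names the heuristic violated.
violations = [
    "Consistency and standards",
    "Consistency and standards",
    "Match between system and the real world",
    "Consistency and standards",
    "Visibility of system status",
]

# Tally violations per heuristic and print the most frequent first.
frequency = Counter(violations)
for heuristic, count in frequency.most_common():
    print(f"{count:3d}  {heuristic}")
```

Sorting by frequency this way is what surfaces the "frequency by heuristic" view in the report.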

Frequency by heuristic

Problems by area

Link colors
– Main links in purple
– Smaller links in blue
– Other areas had different link colors altogether

Different menus
– No 'Home' button
– 'Borrower Services' not found as a main page heading
– Menu options are the search links for the Song Index
– No side navigation menu offered

Labels, language, and ambiguity
– Overlap
– Mismatch between headings and items
– Vague headings
– Audience-specific areas are scattered

Structured evaluation
– Develop a list of specific questions related to the issues at hand, tied back to the heuristic principles
– Provides greater direction of problem-solving energy
– More appropriate when relying on "subject" experts

Sample questions at Northwestern
1) Did you feel that you were able to tell what was going on with the system while you were working?
2) Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
3) Did you notice inconsistencies in the way things were referred to?
4) Were you able to navigate and use the site without having to refer back to other pages for needed information?
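Tying each structured question back to a heuristic keeps the later analysis organized. The mapping below is a hypothetical illustration (the slide does not state which heuristic each question targets), pairing the question numbers above with the heuristic each most plausibly probes:

```python
# Hypothetical mapping from question number to the heuristic it probes.
QUESTION_TO_HEURISTIC = {
    1: "Visibility of system status",
    2: "Match between system and the real world",
    3: "Consistency and standards",
    4: "Recognition rather than recall",
}

def heuristic_for(question_number: int) -> str:
    """Look up which heuristic a structured question is meant to test."""
    return QUESTION_TO_HEURISTIC[question_number]
```

With a mapping like this, answers to each question can be filed directly under a heuristic when the coordinator combines the feedback.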

Feedback: what people said
Question: Did the language on the site make sense to you? Were you able to understand what the pages were trying to communicate?
– "No, sentences are too long. Use numbers to mark each choice."
– "Some of the language seemed a bit like 'library-ese,' i.e. terms like 'descriptor,' etc."
– "Most of the language makes sense, in the sense that it is not jargon (except for 'NUcat'), but as I said the contents of the categories are not always clear."

Long sentences Jargon

Interesting observations
– "There are too many choices which are hard to distinguish between."
– "It seems like the info organization probably reflects the internal structures of the library more than the user's point of view."
– "I generally felt lost on the site. It was unclear where I needed to go to actually find anything I needed."
– "Too much information on one page."


How is heuristic evaluation relevant to usability testing?
– Allows us to fix big problems before user testing
– Provides clues to problem areas
– Can be the basis for determining usability-testing questions

How is this different from usability testing?
– Analyzing the user interface is the responsibility of the evaluator
– The observer can answer questions from the evaluators during the session
– Evaluators can be provided with hints on using the interface

Other types of evaluation techniques
– Heuristic evaluation
– Heuristic estimation
– Cognitive walkthrough
– Pluralistic walkthrough
– Feature inspection
– Consistency inspection
– Standards inspection
– Formal usability inspection

Questions?
Frank Cervone, Assistant University Librarian for Information Technology, Northwestern University, Evanston, IL, USA
Darlene Fichter, Data Librarian, University of Saskatchewan, Saskatoon, SK, Canada