Usability and taste
- Taste is subjective, but not necessarily trivial
- Taste is subject to fashion
  - Changes over time
  - Influenced by other people
- What people are accustomed to or expect at any given time
- What people think reflects competence and skill at any given time

Usability assessment is like editing
- Someone who looks at the design (text) through the eyes of the user (reader)
- Helping the design (text) do (say) what it is trying to do (say)
- Encouraging the designer (author) to let go of favorite ideas that don't help
- Streamlining, tightening, and clarifying for the sake of usability (comprehensibility)

Usability inspection
- Different methods have different goals
- They vary in how judgment is derived and in the criteria applied
- Defining characteristic: reliance on judgment, rather than more elaborate testing

Inspection methods
- Heuristic evaluation
- Guidelines review
- Consistency inspections
- Standards inspections
- Feature inspections
- Cognitive walkthroughs
- Pluralistic walkthroughs

Components of a walkthrough
- One or more task scenarios
- Explicit assumptions about the user population and contexts of use
- The sequence of actions a user is likely to perform to complete the task
- A prototype of some sort
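The four components above can be captured as a simple plan structure that a team fills in before the walkthrough begins. A minimal sketch in Python; the class and field names are illustrative, not part of the lecture material:

```python
from dataclasses import dataclass, field

@dataclass
class WalkthroughPlan:
    """Everything a walkthrough needs in hand before it starts."""
    task_scenarios: list      # one or more task scenarios
    user_assumptions: list    # explicit assumptions about users and contexts of use
    action_sequence: list     # actions a user is likely to perform to complete the task
    prototype: str            # pointer to a prototype of some sort (URL, file, paper sketch)
    findings: list = field(default_factory=list)  # problems noted as the walkthrough proceeds

    def is_ready(self) -> bool:
        """The walkthrough can begin only when all four components are present."""
        return bool(self.task_scenarios and self.user_assumptions
                    and self.action_sequence and self.prototype)
```

Making the plan explicit like this forces the team to state its user assumptions up front instead of discovering mid-session that reviewers had different users in mind.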

Cognitive walkthrough
- A systematic look at any product, by inspection, with an eye toward ease of learning
- Using prototypes, task flows, and scenarios:
  - Walk in your users' shoes through your Web site
  - Try out parts of the design, following a task flow or scenario
  - Look for problems

Pluralistic walkthrough
- Involves several different groups, typically:
  - Users
  - Product developers
  - Usability experts
- Together they walk through a scenario using a prototype

Benefits and limits
- Short and fairly simple
- Lets developers hear users' concerns with the system directly, early enough to do some good
- Questions of validity, given the constrained setting and tasks
- Scenarios don't readily reflect the full gamut of possible uses and users

Competitive usability study
- Purpose: gather insights from related sites
  - Look for both problems and good ideas
- Understand the context within which users will be working
  - The other choices available to them
  - What they are used to
  - What is salient for users: whatever was sacrificed often becomes important in the next round
- Caution: it is easy to focus on trivia

How to do a competitor study
- Usability inspection by developers or usability specialists
- Usability testing by users
  - With directed tasks
  - With their own tasks

Selecting sites for a competitive usability study
- Sites that have good reputations
- Sites that have interesting features or designs
- The market leaders
- Sites your users may be familiar with
- Sites that have bad reputations, for mistakes to avoid
- Sites that are considered cutting-edge
- Ask users!

What to look for
- The targeted users
- The user goals and tasks the site supports
- Content, functionality, navigation, design
- Things the site does well, and why and how
- Things it does poorly, and why and how
- Ideas to adopt
- Things to avoid
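A note sheet mirroring this list keeps every reviewer capturing the same things, and makes it easy to merge notes across sites afterward. A small sketch; the field names and the merged output format are assumptions for illustration:

```python
def new_note(site):
    """Blank per-site note sheet mirroring the 'what to look for' list."""
    return {"site": site, "targeted_users": [], "supported_tasks": [],
            "strengths": [], "weaknesses": [], "adopt": [], "avoid": []}

def summarize(notes):
    """Merge per-site notes into combined adopt/avoid lists for the design team."""
    adopt, avoid = [], []
    for n in notes:
        adopt += [(n["site"], idea) for idea in n["adopt"]]
        avoid += [(n["site"], mistake) for mistake in n["avoid"]]
    return adopt, avoid
```

Tagging each item with the site it came from preserves the "why and how" context when the team later debates which ideas to adopt.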

IBM's guidelines for rating competitors' sites
- Is the purpose of the site clear?
- Does the site clearly address a particular audience?
- Is the site useful and relevant to its audience?
- Is the site interesting and engaging?
- Does the site enable users to accomplish all the tasks they need or want to accomplish?
- Can these tasks be accomplished easily?
- Is information organized in a way users will expect and understand?
- Is the most important information easiest to find?
- Is textual information clear, grammatically correct, and easy to read?
- Do you have a clear idea of what the site contains?
- Do you always know where you are and how to get where you want to go?
- Is the presentation attractive?
- Do pages load quickly enough?
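One way to apply a checklist like this consistently across several reviewers is to score each question on a fixed scale and compare averages per site. A minimal sketch; the 1-to-5 scale and the averaging scheme are assumptions for illustration, not part of IBM's guidelines:

```python
# A few of the checklist questions; reviewers rate each on a 1 (poor) to 5 (excellent) scale.
GUIDELINE_QUESTIONS = [
    "Is the purpose of the site clear?",
    "Does the site clearly address a particular audience?",
    "Can the key tasks be accomplished easily?",
    "Is the most important information easiest to find?",
]

def score_site(ratings):
    """Average one reviewer's 1-5 ratings over the questions they answered."""
    if not ratings:
        raise ValueError("No ratings given")
    for question, rating in ratings.items():
        if not 1 <= rating <= 5:
            raise ValueError(f"Rating for {question!r} must be 1-5, got {rating}")
    return sum(ratings.values()) / len(ratings)
```

The average is only a rough comparison device; the qualitative "why and how" notes behind each rating are what actually inform the redesign.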

Competitive evaluation: Information Schools
- Sites:
- To whom are these sites addressed?
- What is your first impression of the school?
- As a prospective student, what is your reaction? As a Masters applicant? As a PhD applicant?
- As an employer?
- As an alumnus or alumna?
- Walk through the process of (1) investigating whether you want to apply, and (2) applying

Conclusions
- Content and functionality are paramount; other factors of usability are secondary.
- Different users have different needs; for whom is this site optimized?
- For new users in particular, the combination of content, functionality, design, and navigation creates an overall impression of the organization and the site.
- That initial impression may determine whether users are willing to go further when they have a choice.
- In an inspection, it is hard to get past initial, surface impressions.
- With further user testing or use, other kinds of usability issues and criteria become important.
- Guidelines and heuristics can help us understand how and why a site does or does not work, but they are means to an end.