Evaluate—Qualitative Methods (October 2, 2007)


1 Evaluate—Qualitative Methods
October 2, 2007
NEEDS → DESIGN → IMPLEMENT → EVALUATE

2 Evaluation
A little out of sequence due to scheduling
Will get more implementation over the next two weeks
Imagine you've implemented your application
These are the techniques you will need to design your user study (end of project)

3 Methods for evaluating a system
Qualitative
– Rich, subjective
– Exploring concepts
– More useful for early input
Quantitative
– Precise, objective, repeatable
– Demonstrating claims
– More useful for documenting improvement
– Can be expensive

4 For your project
Will require aspects of both qualitative and quantitative methods
– Qualitative: How do users react to the project? What are their perceptions?
– Quantitative: How do users perform with the project?
What would you improve in the next iteration?
– Perhaps users' perceptions of performance matter more than the actual values
– Elevator waiting story

5 Design evaluation methods!
The most important aspect of evaluation is upfront design!
– Expensive to line up users and collect data
– Design to collect the right information
Pick the appropriate method for what you want to learn

6 Applying an evaluation method
Determine the activity to observe
Develop the method
Obtain human subjects review approval
Pilot test the method
Recruit participants
Collect the data
Inspect & analyze the data
Draw conclusions to resolve design problems; reflect on what you learned
Redesign and implement the revised interface

7 Demographic information
No matter what method you use, collect demographic data:
– Age, gender, culture
– Task expertise, experience
– Motivation
– Frequency of use
– Education, literacy, training

8 Environmental information
Besides info on the user, you may also need info on the operating environment:
– Windows, Mac, Linux?
– Firefox, Internet Explorer, Safari?
– Wired Ethernet, wireless, modem?
– Morning, afternoon, night?
– Office, mobile, home?

9 Qualitative methods
"Discount" usability methods
– Heuristic evaluation
– Cognitive walkthrough
Questionnaire / survey
Think-aloud protocol
Co-discovery
Semi-structured interview
Deploy and observe in use

10 "Discount" usability methods
Enable evaluation at an early stage, before a prototype is implemented
Conducted quickly and inexpensively
Early evaluation investment saves downstream development costs
– Heuristic evaluation
– Cognitive walkthrough

11 Heuristic Evaluation
A fancy way to describe expert review
– HCI expert
– Domain expert
Expert review identifies usability issues before implementation
Our grades on your homework are a form of heuristic evaluation

12 Evaluation heuristics
Visibility of system status
Match between system and the real world
User control and freedom
Consistency and standards
Error prevention
Recognition rather than recall
Flexibility and efficiency of use
Aesthetic and minimalist design
Help users recognize, diagnose, and recover from errors
Help and documentation

13 Heuristic evaluation method
Multiple experts individually review (around 5 experts find ~75% of the problems)
Observer records issues, answers questions, gives hints
Conduct using a low-fidelity prototype, or a task analysis with storyboards and scenarios
Generate a list of usability problems, noting which heuristic each one compromises
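The "around 5 experts find ~75% of problems" rule of thumb falls out of modeling each evaluator as an independent detector of problems. A minimal sketch of that model (the per-evaluator detection rate `l` is an assumed parameter, not a measurement; Nielsen and Landauer reported rates averaging roughly 0.3 across studies, and `l = 0.24` reproduces the slide's 75% figure):

```python
# Problem-discovery model: if each evaluator independently finds a given
# problem with probability l, the expected fraction of problems found by
# n evaluators is 1 - (1 - l)^n.

def proportion_found(n_evaluators: int, l: float = 0.24) -> float:
    """Expected fraction of usability problems found by n independent evaluators."""
    return 1 - (1 - l) ** n_evaluators

for n in (1, 3, 5, 10):
    print(n, round(proportion_found(n), 2))  # 5 evaluators -> ~0.75
```

Note the diminishing returns: going from 5 to 10 evaluators buys far less than going from 1 to 5, which is why small expert panels are considered cost-effective.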

14 Heuristic Evaluation analysis
After creating the list of problems:
– Rank severity
– Estimate fixability
– Suggest possible fixes
Analysis may involve a larger team
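The analysis step above can be sketched as a small ranking exercise. The problems and ratings below are invented for illustration; the 0-4 severity scale follows Nielsen's common convention (0 = not a problem, 4 = usability catastrophe), and the fixability scale is an assumption:

```python
# Rank a heuristic-evaluation problem list: most severe first, and
# among equally severe problems, the easiest fixes first.
problems = [
    {"problem": "No feedback after Save", "heuristic": "Visibility of system status",
     "severity": 3, "fixability": 1},   # fixability: 1 = easy ... 3 = hard
    {"problem": "Jargon in error text", "heuristic": "Match between system and the real world",
     "severity": 2, "fixability": 1},
    {"problem": "No undo for delete", "heuristic": "User control and freedom",
     "severity": 4, "fixability": 3},
]

ranked = sorted(problems, key=lambda p: (-p["severity"], p["fixability"]))
for p in ranked:
    print(p["severity"], p["fixability"], p["problem"])
```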

15 Heuristic Evaluation as rigorous design review
You can make a living doing Heuristic Evaluation
– There is a substantial consulting market for conducting Heuristic Evaluations
You may pay a consultant to do a Heuristic Evaluation
– Know what you're paying for
– Especially the severity, fixability, and potential-fix aspects

16 Learning more about Heuristic Evaluation
You can learn to do a Heuristic Evaluation

17 Cognitive Walkthrough
Have the user imagine walking through the process of using the system
Can use a low-fidelity prototype or a partially implemented prototype
Can use a target user rather than an expert
– A pluralistic walkthrough uses experts, users, and developers
Like a code walkthrough
C. Wharton et al., "The cognitive walkthrough method: a practitioner's guide," in J. Nielsen & R. Mack (eds.), Usability Inspection Methods

18 Walkthrough procedure
Give the user a representation of the interface and a task
– Can they discover how to accomplish the goal from the description of the interface?
– Can ask "From here, how would you like to accomplish…?"
Step through the interface
– User takes an action, system provides a response
– Describe actions not depicted in the interface representation
– Somewhat like Wizard of Oz

19 Stepping through the interface
Will the user try to achieve the right goal?
– Conceptual model of goals and tasks
Will the user notice that the correct action is available?
– Visibility
– Understandability
Will the user associate the correct action with the goal to be achieved?
– Aligning goals with the sequence of actions
If the correct action is performed, will the user see progress toward the solution?
– Feedback

20 Next assignment
Testing a storyboard with one user
– Effectively, this is a cognitive walkthrough
– Create a storyboard
– Define a task
– Step through it with one user

21 Questionnaires & surveys
User responses to specific questions
Preparation is expensive; administration is relatively cheap
Oral vs. written
– Oral provides interaction and follow-up, but takes more time
– Written is more efficient and can provide quantitative data

22 Designing questions
Design questions with analysis in mind
– Closed format: more precise, easier to analyze; converts qualitative → quantitative measures (you give the categories to the users)
– Open-ended questions: richer feedback, but longer to analyze (the users give the categories to you)

23 Designing survey questions
Multiple choice
– Collecting information
Ordinal ranking
– Expressing relative preferences
Likert scales
– Expressing personal reactions

24 Closed format styles
Multiple choice: Which social networking systems do you use?
facebook / LinkedIn / Orkut / MySpace / Other_____________
Ordinal ranking: Rank frequency of use from 5 (most frequent) to 1 (least frequent), 0 = unused:
___ facebook ___ MySpace ___ LinkedIn ___ Orkut ___ Other__________
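Ordinal-ranking responses like the slide's example aggregate naturally into per-item summaries. A sketch, with invented respondent data (one reasonable choice made here is keeping 0 = "unused" in the average; excluding it would be equally defensible):

```python
# Aggregate ordinal rankings (5 = most frequent ... 1 = least, 0 = unused)
# into a mean rank per system, reported highest first.
from statistics import mean

responses = [
    {"facebook": 5, "MySpace": 2, "LinkedIn": 3, "Orkut": 0},
    {"facebook": 4, "MySpace": 0, "LinkedIn": 5, "Orkut": 0},
    {"facebook": 5, "MySpace": 1, "LinkedIn": 4, "Orkut": 2},
]

means = {s: mean(r[s] for r in responses) for s in responses[0]}
for system, m in sorted(means.items(), key=lambda kv: -kv[1]):
    print(system, round(m, 2))
```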

25 Likert scales
Ask users to rate on a numeric scale
An odd-numbered scale allows a neutral midpoint (5- or 7-point scale)
An even-numbered scale forces taking a position (4- or 6-point scale)
"Anchors" give examples of points along the scale
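Likert responses are where the qualitative → quantitative conversion becomes concrete. A sketch of the usual summaries, with invented response data (Likert data is ordinal, so median and response counts are safer summaries than the mean, though means are widely reported too):

```python
# Summarize 5-point Likert responses (1 = Not important ... 5 = Very important).
from collections import Counter
from statistics import mean, median

responses = [5, 4, 4, 3, 5, 2, 4, 5, 1, 4]

counts = Counter(responses)
print("counts:", dict(sorted(counts.items())))
print("median:", median(responses))
print("mean:", round(mean(responses), 2))
```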

26 Example question
How important is the Berkeley–Stanford Big Game?
Very Important ———— Not Important
Anchors:
– Very Important: "Most important event this Fall"
– Midpoint: "Maybe I'll go if my friends go"
– Not Important: "Could not care less"

27 Closed Format
Advantages
– Clarify among alternatives
– Easily quantifiable
– Eliminate useless answers
– Relatively quick to administer
Disadvantages
– Must cover the whole range
– All choices should be similarly likely
– Don't get interesting, "different" reactions

28 Questions people can answer about themselves
What they do
How they do it
Opinions about current activities
Complaints about current activities
Comparing one thing with another
How often they have done something in the recent past

29 Questions people cannot answer about themselves
Predicting what they would do / like / want
Imagining a hypothetical scenario
Whether they would like a certain feature or product
Estimating how often they do things

30 What’s most important?

31 Web-based survey tools
Surveymonkey
Zoomerang
Allow free basic analysis; more advanced features for a fee
Can extend reach to a large number of respondents

32 Thinking aloud protocol
Have the subject "think out loud" while performing the task
A technique from psychology used to elicit cognition
Requires a training task
Facilitator actively prompts if the subject falls silent for more than 10 seconds
– "What are you thinking now?"
– "So, you are trying to…?"
– "And now you are…?"

33 Exercise: Volunteer
Never used Photoshop before

34 Co-discovery
Have two people work on a task together (even though the task is normally done by one person)
Coordination with each other naturally elicits cognition

35 Exercise: Two volunteers
Never used Photoshop before

36 Think aloud and co-discovery
Valuable for evaluating tasks that require cognition
Time intensive
Rich feedback
Think aloud requires training

37 Semi-structured interviews
Interactively asking questions (face-to-face, telephone)
Give users a chance to explain "why," complementing "what" they did; the user's subjective viewpoint
Can help with design questions
– "What improvements would you suggest?"
Can be done individually or in groups

38 Semi-structured interviews
Begin with a list of open-ended questions
– Ask all users these questions
– Let users elaborate
– Flexibility to ask follow-up questions
Must audio-record
Interviewer should attend to the user (not a notepad or laptop); use the audio recording for data (note timestamps)

39 Questionnaire Issues
Language
– Beware terminology and jargon
Clarity
– "How effective was the system?" (ambiguous)
Avoid leading questions
– Phrase neutrally rather than positively or negatively: "How easy or hard was it to accomplish the task?"

40 Questionnaire Issues (2)
Prestige bias
– People answer a certain way because they want you to think of them that way
Embarrassing questions
– "What did you have the most problems with?"
Hypothetical questions
"Halo effect"
– When the estimate of one feature affects the estimate of another (e.g., intelligence/looks)
– Aesthetics & usability is one example in HCI

41 Interviews
Disadvantages
– Subjective view
– Interviewer(s) can bias the interview
– Problem of inter-rater or inter-experimenter reliability (agreement)
– Time-consuming
– Hard to quantify

42 Pilot test the observation method
Pilot test the method with some target users
– Debug the questions and methods
– Also debug the logistics
– Don't count pilot data in the analysis
Make changes now, before collecting data (you want the data-collection method to be consistent)

43 Methods used in combination
Mix of closed-format and open-ended questions
Surveys and questionnaires are often used with quantitative performance measures to assess how users feel about interactions

44 Mechanics of user testing
Readings give more detailed nuts and bolts
Common-sense structuring of the experience helps it run smoothly

45 Analyzing qualitative data
Rich, open-ended data
Goal: structure to characterize, describe, and summarize the data
Sounds harder than it is

46 Analyzing qualitative data
An exercise in immersing yourself in the data
– Develop categories to count (range, average)
– Identify common patterns
Counting allows you to identify the interesting, the unusual, the exceptions
Also look for correlations

47 Exercise: Analyzing a conceptual map of Berkeley
An example of rich, qualitative data
See if we can detect some patterns
Characterize a set of qualitative data

48 Berkeley map
Number of features?
Format of map
Common features
– Landmarks
– Roadways
Unusual features
Assessments
Correlations

49 Qualitative analysis
Start with things you can count
– Average, range, median
Look for patterns in common
Recognize features that are unusual or interesting
Look for correlations
Reflect on what the data is saying
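The steps above can be sketched concretely using the Berkeley-map exercise as a stand-in. This assumes each participant's map has already been coded into feature labels; the coded data below is invented for illustration:

```python
# Counting step for coded qualitative data: per-map feature counts,
# common patterns, and the unusual exceptions.
from collections import Counter
from statistics import mean, median

coded_maps = [
    ["Campanile", "Telegraph Ave", "BART", "Sather Gate"],
    ["Campanile", "BART", "Memorial Stadium"],
    ["Telegraph Ave", "Campanile", "Sather Gate", "BART", "Marina"],
]

# Things you can count: features per map, with average/range/median.
sizes = [len(m) for m in coded_maps]
print("avg:", round(mean(sizes), 2), "range:", (min(sizes), max(sizes)),
      "median:", median(sizes))

# Common patterns: features appearing on every map.
freq = Counter(f for m in coded_maps for f in m)
common = sorted(f for f, c in freq.items() if c == len(coded_maps))
print("on every map:", common)

# The unusual and interesting: features mentioned only once.
unusual = sorted(f for f, c in freq.items() if c == 1)
print("unusual:", unusual)
```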

50 Qualitative study of your project
What do you want to learn?
– User reactions, perceptions
– Conceptual model problems
– Areas to improve the design
– Does the design work?

51 Next time
Quantitative methods
Readings
– "A face(book) in the crowd: social searching vs. social browsing"
– "iPod distraction: effects of portable music-player use on driver performance"
Questions on the Project Proposal assignment?