Observation, Interviews, and Questionnaires a.k.a. How to watch and talk to your users.

Agenda
Questions?
Reminder: Part 3 is due NEXT WEEK; bring your prototype to class.
Highly recommended: read up on Heuristic Evaluation and Cognitive Walkthrough and come prepared to discuss them.
Observation
Interviews
Questionnaires
Evaluation plan discussion

Observing Users: Not as easy as you think, but one of the best ways to gather feedback about your interface. Watch, listen, and learn as a person interacts with your system. This is usually what occurs during a "usability test".

Location
Observations may be:
In the lab (perhaps a specially built usability lab): easier to control, and you can have the user complete a set of tasks.
In the field: watch users' everyday actions; more realistic, but harder to control other factors.

Usability Lab: A large viewing area behind a one-way mirror, which includes an angled sheet of glass that improves light capture and prevents sound transmission between the rooms. Doors to the participant and observation rooms are located so that participants are unaware of observers' movements in and out of the observation room.

Observation
Direct: in the same room. Can be intrusive, since users are aware of your presence; you only see the session once, so it relies on good note-taking. A one-way mirror can reduce intrusiveness.
Indirect: video recording or software logging. Reduces intrusiveness, but doesn't eliminate it; gives an archival record, but you can spend a lot of time reviewing it.

Engaging Users
In simple observation you see actions, but not what is going on in users' heads.
Qualitative techniques:
Think-aloud - very helpful
Post-hoc verbal protocol - review the video together
Critical incident logging - positive & negative incidents
Structured interviews - good questions: "What did you like best/least?" "How would you change..?"

Think Aloud
The user verbally describes what s/he is thinking and doing: what they believe is happening, why they take an action, what they are trying to do.
A very widely used, useful technique that helps you better understand the user's thought processes.
Potential problems: it can be awkward for the participant, and thinking aloud can modify the way the user performs the task.

Cooperative approach
Another technique: co-discovery learning (constructive interaction). Join pairs of participants to work together, using think-aloud; perhaps have one person be a semi-expert (coach) and one a novice.
More natural (like a conversation), so it removes some of the awkwardness of individual think-aloud.
Variant: let the coach be from the design team (cooperative evaluation).

Alternative: What if thinking aloud during the session would be too disruptive? Use a post-event protocol: the user performs the session, then watches the video afterwards and describes what s/he was thinking. Recall is sometimes difficult, and it opens the door to interpretation.

What if a user gets stuck? Determine in advance when and how you will offer help Use cooperative approaches: “What are you trying to do..?” “What made you think..?” “How would you like to perform..?” “What would make this easier to accomplish..?” Maybe offer hints

Inputs
Need an operational prototype (could use Wizard of Oz or another simulation).
Need tasks and task descriptions: reflect real tasks; avoid choosing only tasks your design best supports; minimize the necessary background knowledge; pay attention to the time and training required.

Data
Task-based: How do users approach the problem? What problems do users have? Need not be exhaustive; look for interesting cases.
Performance-based: frequency and timing of actions, errors, task completion, etc.
Analyzing the data can be very time consuming!

Capturing a Session 1. Paper & pencil: definitely cheap and easy, but can be slow and you may miss things. (Example log sheet: a grid of times - 10:00, 10:03, 10:08, 10:22 - recorded against Task 1, Task 2, Task 3, …)

Capturing a Session 2. Recording (audio and/or video) Good for think-aloud Hard to tie to interface Multiple cameras may be needed Good, rich record of session Can be intrusive Can be painful to transcribe and analyze

Capturing a Session 3. Software logging: modify the software to log user actions. Can give time-stamped key presses or mouse events. Two problems: the events are too low-level (you want higher-level events), and there is a massive amount of data (you need analysis tools).
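If your toolkit has no logging hook built in, a thin wrapper that timestamps semantic events (not raw key presses) keeps the data analyzable. A minimal sketch in Python; the pipe-delimited format and event names echo the example logs later in these slides, but the function name and file path are illustrative, not part of any particular system:

    import time

    LOG_PATH = "session.log"  # illustrative file name, not from any real system

    def log_event(user, app, event, *details):
        """Append one pipe-delimited, time-stamped record of a higher-level UI event."""
        stamp = time.strftime("%Y-%m-%d %H:%M:%S")
        fields = [user, stamp, app, event] + [str(d) for d in details]
        with open(LOG_PATH, "a") as f:
            f.write("|" + "|".join(fields) + "|\n")

    # Call this from UI event handlers at the level you care about (tab switches,
    # seeks, plays), not for every raw key press or mouse move.
    log_event("hrichter", "MV", "TAB", "AGENDA")
    log_event("hrichter", "MV", "SEEK", "PRESENTATION-A", 566, 604189)

Logging at the level of meaningful actions is one way to sidestep the "too low-level" problem noted above.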

Example: Heather's study. Software: the MeetingViewer interface, fully functional. Criteria - learnability, efficiency, which aspects of the interface get used, what might be missing. Resources - subjects were students in a research group, just me as the evaluator, plenty of time. Wanted a completely authentic experience.

Heather's evaluation. Task: answer questions from a recorded meeting, using my software as desired. Think-aloud; videotaped, with software logs; also a post-session questionnaire. Wrote my own code for log analysis, and watched the video to match behavior to the software logs.

Example materials

Example logs
|hrichter| |MV|START|
|hrichter| |MV|QUESTION|false|false|false|false|false|false|
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|SEEK|PRESENTATION-A|566|604189|
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|SEEK|AGENDA|566|149613|
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION|566|315796|
|hrichter| |MV|PLAY|566|
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|SLIDECHANGE|
|hrichter| |MV|SEEK|PRESENTATION|566|271191|
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|PRESENTATION
|hrichter| |MV|TAB|AV
|hrichter| |MV|TAB|AGENDA
|hrichter| |MV|TAB|AV
|hrichter| |MV|STOP|566|
|hrichter| |MV|END
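The study used custom code for log analysis; as a sketch of what such tooling might look like, the snippet below parses records like these and tallies event types. The field order (user, timestamp, application, event, details) is inferred from the excerpt, so treat it as an assumption:

    from collections import Counter

    def parse_record(line):
        """Split one pipe-delimited record into (user, stamp, app, event, details)."""
        fields = line.strip().strip("|").split("|")
        user, stamp, app, event = fields[:4]
        return user, stamp, app, event, fields[4:]

    def tally_events(path):
        """Count how often each event type (TAB, SEEK, PLAY, ...) occurs in a log file."""
        counts = Counter()
        with open(path) as f:
            for line in f:
                if line.strip():
                    counts[parse_record(line)[3]] += 1
        return counts

    # For the excerpt above, tally_events would report
    # Counter({'TAB': 13, 'SEEK': 6, 'SLIDECHANGE': 3, ...})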

Data analysis Basic data compiled: Time to answer a question (or give up) Number of clicks on each type of item Number of times audio played Length of audio played User’s stated difficulty with task User’s suggestions for improvements More complicated: Overall patterns of behavior in using the interface User strategies for finding information
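For the timing figures, one possible sketch, assuming the log timestamps can be parsed (the excerpt above has that field blanked out) and that each QUESTION event marks the start of a new task; both are assumptions made for illustration:

    from datetime import datetime

    TIME_FMT = "%Y-%m-%d %H:%M:%S"  # assumed timestamp format

    def time_per_question(records):
        """Given (user, stamp, app, event, details) tuples in log order, return the
        elapsed seconds between successive QUESTION events - a rough proxy for the
        time spent answering each question."""
        starts = [datetime.strptime(stamp, TIME_FMT)
                  for _, stamp, _, event, _ in records
                  if event == "QUESTION"]
        return [(later - earlier).total_seconds()
                for earlier, later in zip(starts, starts[1:])]

    def count_plays(records):
        """Number of times playback was started (PLAY events)."""
        return sum(1 for _, _, _, event, _ in records if event == "PLAY")

The more complicated items on this slide (overall behavior patterns, search strategies) still need a human looking at the video and logs together; the scripts only compile the basic counts.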

Data representation example

Data presentation

Some usability conclusions Need fast forward and reverse buttons (minor impact) Audio too slow to load (minor impact) Target labels are confusing, need something different that shows dynamics (medium impact) Need more labeling on timeline (medium impact) Need different place for notes vs. presentations (major impact)

Interviews & Questionnaires
Capture the subjective view of participants.
Quantitative - very structured: questionnaires (often quantitative, but not entirely); structured interviews (a strict set of questions - deviation would compromise the study).
Qualitative - less or no structure: semi-structured interviews (some deviation encouraged); unstructured interviews, i.e. the ethnographic interview (little guide, very exploratory).

Interviews: Potentially lots of detail, since you can vary questions as needed. Inexpensive, but time consuming to perform and analyze; some interpretation is required, and they are subject to interviewer biases.

Questionnaires Expensive to create …but cheap to administer Easier to get quantifiable results Can gather info from many more people Protects participant identity Only as good as the questions asked

Structured Interviews: More similar to questionnaires. Require a lot of training for any hope of inter-interviewer reliability, but that means they tend to give much more repeatable results.

Unstructured Interviews: Have a plan, but keep the interview open to different directions. Get participants to open up and express themselves in their own terms and at their own pace; create interpretations with users, and be sure to use their terminology. They take lots of time, but you learn a lot as well.

Semi-Structured Interviews Predetermine data of interest - know why you are asking questions - don’t waste time Plan for effective question types How do you perform task x? Why do you perform task x? Under what conditions do you perform task x? What do you do before you perform…? What information do you need to…? Whom do you need to communicate with to …? What do you use to…? What happens after you…? See Gordon & Gill, 1992; Graesser, Lang, & Elofson, 1987

Asking Questions Understand your goals Consider the ordering of the questions Avoid complex/long/multiple questions Avoid jargon; talk in participant’s language Be careful of stereotypes, biases

Clarity is important
Questions must be clear, succinct, and unambiguous.
How much time have you spent reading news on the Web recently?
Ambiguous options: Some / A lot / Every day / Rarely / Etc.
Unambiguous options: None / 0 to 5 hours / 6 to 10 hours / 11 to 20 hours / More than 20 hours

Avoid question bias
Leading questions unnecessarily force certain answers.
Do you think parking on campus can be made easier?
What is your overall impression of…  1. Superb  2. Excellent  3. Great  4. Not so Great

Be aware of connotations: Do you agree with the NFL owners' decision to oppose the referees' pay request? Do you agree with the NFL owners' decision regarding the referees' pay demand? Do you agree with the NFL owners' decision regarding the referees' suggested pay?

Leading questions: People want to do well and give you what you are looking for. Be aware of your own expectations before creating questions and while interviewing, and use value-neutral terms: "What do you like about this system?" vs. "Tell me what you thought about this system."

Avoid hypotheticals: Avoid gathering information on uninformed opinions. Subjects should not be asked to consider something they've never thought about (or don't know or understand), e.g., "Would a device aimed at making cooking easier help you?"

Handle personal info carefully Ask questions subjects would not mind answering honestly. What is your age? What is your waist size? If subjects are uncomfortable, you will lose their trust Ask only what you really need to know

What’s wrong with this picture? How much easier is it to use this client than Outlook? I see you chose to use your keyboard shortcuts more than the mouse. Is that faster for you? Your choice of red is different than any other user we saw. Why did you do that?

Planning your interview: Introduction, warmup, main session, cool-off, closing. Record everything exactly, in your participants' own words (and don't forget to test your recording equipment).

The warmup or "grand tour" question
The first question helps set the tone for the interview: it gets the participant used to talking and reassures them that their true opinion does matter.
The question should be easy to answer, but not answered easily - it should require more than just a "yes" or "no" response.
Examples: "Tell me about the work you do." "What made you buy the computer?"

Prompts: "nudge" a participant in a direction, or elicit an additional response.
Silent: remain silent until they say more.
Echo: repeat back what they said and then ask "then what happens?" etc.
Agreeing sounds: you say "uh huh" and the other person continues.
Tell me more: "Could you tell me more about that?"
Clarifying: summarize and ask for confirmation or clarification; this often leads to new discussion.

Contents of a survey: General/background info - demographic data, which also functions as a "warm up" and lets you correlate responses between groups; objective questions; open-ended/subjective questions.

Background examples
Demographic data: age, gender.
Task expertise, e.g., "Have you ever worked in a restaurant?"
Motivation.
Frequency of use: "How often do you…?"
Education/literacy: "What training have you had in…?"

Closed Format
Advantages: clarifies the alternatives, easily quantifiable, eliminates useless answers.
Disadvantages: must cover the whole range, all options should be equally likely, you don't get interesting "different" reactions, and it restricts the set of choices.

Many forms of response Dichotomous Multiple Choice Multiple Response Rank/Match Likert Rating

Questionnaire Styles
Which word processing systems do you use?  LaTeX  FrameMaker  WordPerfect  Word
Rank from: 1 - Very helpful, 2 - Ambivalent, 3 - Not helpful, 0 - Unused
___ Tutorial   ___ On-line help   ___ Documentation

Likert-type scale: A typical scale uses 5, 7, or 9 choices; above that, the choices are hard to discern. An odd number gives a neutral choice in the middle; you may not want to give a neutral option. Example item: "Characters on screen were:"  hard to read … easy to read

What’s wrong with this picture?
2. What is your age? _______________
3. How long have you used the internet?  <1 year  1-3 years  3-5 years  >5 years
4. How do you get information about courses?  Web site  Flyers  Registration booklet  Advisor  Other students
5. How useful is the Internet in getting information about courses? ___________________________________________________________

Online questionnaires
Change checkboxes into dropdowns, etc.
Take advantage of the technology - check input (a small validation sketch follows below).
Ensure it's as accessible as paper (browser and client compatibility).
Ensure confidentiality - how is this different from paper?
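To illustrate the "check input" point, a minimal server-side validation sketch; the field names and allowed values are hypothetical (the hour ranges are borrowed from the clarity example earlier), not part of any survey tool mentioned here:

    ALLOWED_HOURS = {"None", "0 to 5 hours", "6 to 10 hours",
                     "11 to 20 hours", "More than 20 hours"}

    def validate_response(form):
        """Return a list of problems with one submitted response (empty list means valid)."""
        problems = []
        age = form.get("age", "")
        if not age.isdigit() or not 10 <= int(age) <= 120:
            problems.append("age must be a whole number between 10 and 120")
        if form.get("hours_reading_news") not in ALLOWED_HOURS:
            problems.append("hours_reading_news must be one of the listed ranges")
        return problems

    # validate_response({"age": "abc", "hours_reading_news": "Some"})
    # -> ['age must be a whole number between 10 and 120',
    #     'hours_reading_news must be one of the listed ranges']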

Free Web Survey Tools: Zoomerang, SurveyMonkey, phpESP (open-source surveys using PHP).

Analyzing your quantitative data
"Code" open-ended responses or interview answers to make them quantitative: categorize all responses.
Look for trends in the data: count, average, tabulate; make charts, etc.; run statistical analysis (a small tabulation sketch follows below).
Use lo-fi methods (post-its, affinity diagrams, etc.).
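As one way to do the counting and tabulating, a small pandas sketch; the column names, the coding scheme, and the example values are made up for illustration (the complaint codes echo the usability conclusions earlier) and are not actual study data:

    import pandas as pd

    # Hypothetical coded responses, one row per participant.
    responses = pd.DataFrame({
        "participant": ["P1", "P2", "P3", "P4"],
        "ease_of_use": [4, 5, 2, 4],  # Likert item, 1 (hard) to 5 (easy)
        "coded_complaint": ["audio slow", "none", "labels confusing", "audio slow"],
    })

    # Tabulate how often each coded open-ended complaint appears.
    print(responses["coded_complaint"].value_counts())

    # Summarize the Likert item; median and mode are often preferred over the
    # mean for ordinal data (an analysis choice, not something these slides mandate).
    print(responses["ease_of_use"].median(), responses["ease_of_use"].mode().tolist())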

Analyzing qualitative data Find interesting cases, responses Look for patterns of responses Use post-its, affinity diagrams, etc. Look for any useful suggestions, improvements, explanations that help you improve your design Gather illustrative quotes from users that demonstrate your conclusions

Evaluation discussion Someone else should be able to pick up your plan and execute it. Be as SPECIFIC as possible What criteria are important? What tasks EXACTLY? What data? How will you record? What questions will you ask?