Asking users & experts

Interviews
- Unstructured - not directed by a script. Rich but not replicable.
- Structured - tightly scripted, often like a questionnaire. Replicable but may lack richness.
- Semi-structured - guided by a script, but interesting issues can be explored in more depth. Can provide a good balance between richness and replicability.

Basics of interviewing
- Remember the DECIDE framework: goals and questions guide all interviews.
- Two types of questions:
  - 'closed questions' have a predetermined answer format, e.g., 'yes' or 'no'
  - 'open questions' do not have a predetermined format
- Closed questions are quicker and easier to analyze.

Things to avoid when preparing interview questions
- Long questions
- Compound sentences - split them into two
- Jargon and language that the interviewee may not understand
- Leading questions that make assumptions, e.g., 'why do you like …?'
- Unconscious biases, e.g., gender stereotypes

Components of an interview
- Introduction - introduce yourself, explain the goals of the interview, reassure the interviewee about ethical issues, ask permission to record, and present an informed consent form.
- Warm-up - make the first questions easy and non-threatening.
- Main body - present questions in a logical order.
- Cool-off period - include a few easy questions to defuse any tension at the end.
- Closure - thank the interviewee and signal the end, e.g., switch the recorder off.

The interview process
- Use the DECIDE framework for guidance.
- Dress in a similar way to the participants.
- Check recording equipment in advance.
- Devise a system for coding participants' names to preserve confidentiality.
- Be pleasant.
- Ask participants to complete an informed consent form.
- Contextualization.

Probes and prompts
- Probes - devices for getting more information, e.g., 'would you like to add anything?'
- Prompts - devices to help the interviewee, e.g., help with remembering a name.
- Probing and prompting should not create bias; too much of either can encourage participants to try to guess the answer.

Group interviews
- Also known as 'focus groups'.
- Opinions are socially constructed.
- Typically 3-10 participants.
- Provide a diverse range of opinions.
- Need to be managed so that:
  - everyone contributes
  - the discussion isn't dominated by one person
  - the agenda of topics is covered
- Inform the participants in writing about the interview topics.

Aspects of the qualitative research interview (Kvale)
- Life world: the subject matter is the informant's life world.
- Meaning: interpret the significance of central themes in what the informant says.
- Qualitative: seeks qualitative knowledge.
- Descriptive: nuanced descriptions.
- Specific situations, not general opinions.
- Deliberate naivety: openness to new and unexpected phenomena.
- Focus on particular themes.
- Ambiguity: statements can be ambiguous.
- Change: being interviewed can create new insight.
- Sensitivity: different interviewers can elicit different statements on the same theme.
- Interpersonal situation: knowledge is produced in the interpersonal interaction of the interview.
- Positive experience: for both the informant and the interviewer.

Analyzing interview data
- Depends on the type of interview.
- Structured interviews can be analyzed like questionnaires.
- It is best to analyze unstructured interviews as soon as possible, to identify topics and themes in the data.
- Transcribe in full or in part.
- Different qualitative analyses: thick descriptions, categorization.
- Look for patterns and actions.
- Triangulate.

Questionnaires
- Questions can be closed or open.
- Closed questions are easiest to analyze, and the analysis may be done by computer.
- Can be administered to large populations.
- Paper and the web are used for dissemination.
- An advantage of electronic questionnaires is that the data goes into a database and is easy to analyze.
- Sampling can be a problem when the size of the population is unknown, as is common online.

Questionnaire style
- Varies according to the goal, so use the DECIDE framework for guidance.
- Questionnaire formats can include:
  - 'yes'/'no' checkboxes
  - checkboxes that offer many options
  - Likert rating scales
  - semantic scales
  - open-ended responses
- Likert scales have a range of points; 3-, 5-, 7- and 9-point scales are common, and there is debate about which is best.
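As a minimal illustration (not from the slides), Likert responses are usually coded as numbers before any statistics are computed. The labels and function name below are assumptions, using a standard 5-point agreement scale:

```python
# Illustrative sketch: coding 5-point Likert responses as numbers.
# The wording of the scale points is an assumption; real questionnaires
# should use whatever labels were shown to participants.

LIKERT_5 = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def encode(responses):
    """Map textual Likert responses to their numeric scores."""
    return [LIKERT_5[r.lower()] for r in responses]

scores = encode(["Agree", "Neutral", "Strongly agree", "Agree"])
print(scores)  # [4, 3, 5, 4]
```

Once coded this way, the responses can be summarized with the simple statistics discussed below.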

Developing a questionnaire
- Provide a clear statement of purpose and guarantee participants' anonymity.
- Plan the questions - if developing a web-based questionnaire, design it off-line first.
- Decide whether phrases will be all positive, all negative, or mixed.
- Pilot test the questions - are they clear, and is there sufficient space for responses?
- Decide how the data will be analyzed, and consult a statistician if necessary.

Encouraging a good response
- Make sure the purpose of the study is clear.
- Promise anonymity.
- Ensure the questionnaire is well designed.
- Offer a short version for those who do not have time to complete a long questionnaire.
- If mailed, include a stamped addressed envelope.
- Follow up with emails, phone calls, or letters.
- Provide an incentive.
- A 40% response rate is high; 20% is often acceptable.

Advantages of online questionnaires Responses are usually received quickly No copying and postage costs Data can be collected in database for analysis Time required for data analysis is reduced Errors can be corrected easily Disadvantage - sampling problematic if population size unknown Disadvantage - preventing individuals from responding more than once

Problems with online questionnaires Sampling is problematic if population size is unknown Preventing individuals from responding more than once Individuals have also been known to change questions in questionnaires

Questionnaire data analysis & presentation
- Present results clearly - tables may help.
- Simple statistics can say a lot, e.g., mean, median, mode, standard deviation.
- Percentages are useful, but give the population size.
- Bar graphs show categorical data well.
- More advanced statistics can be used if needed.
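The simple statistics above can all be computed with Python's standard `statistics` module. The response data here is invented for illustration; note how the percentage is reported together with the population size, as the slide recommends:

```python
import statistics

# Hypothetical 5-point Likert responses to one questionnaire item.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

print("mean:  ", statistics.mean(responses))    # 3.9
print("median:", statistics.median(responses))  # 4.0
print("mode:  ", statistics.mode(responses))    # 4
print("stdev: ", round(statistics.stdev(responses), 2))  # 0.99

# Percentages are useful, but always give the population size (n):
n = len(responses)
agree = sum(1 for r in responses if r >= 4)
print(f"{agree}/{n} ({100 * agree / n:.0f}%) agreed or strongly agreed")
```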

Established usability questionnaires: SUMI, MUMMS, QUIS.

Asking experts
- Experts use their knowledge of users and technology to review software usability.
- Expert critiques ('crits') can be formal or informal reports.
- Heuristic evaluation is a review guided by a set of heuristics.
- Walkthroughs involve stepping through a pre-planned scenario, noting potential problems.

Heuristic evaluation
- Developed by Jakob Nielsen in the early 1990s.
- Based on heuristics distilled from an empirical analysis of 249 usability problems.
- These heuristics have been revised for current technology, e.g., HOMERUN for the web.
- Heuristics are still needed for mobile devices, wearables, virtual worlds, etc.
- Design guidelines form a basis for developing heuristics.

Nielsen’s heuristics
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose, and recover from errors
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation

HOME-RUN
- H - High-quality content
- O - Often updated
- M - Minimum download/reaction time
- E - Ease of use
- R - Relevant to users' needs
- U - Unique to the online medium
- N - Netcentric: activities and information on the web site are planned, designed, developed, and maintained to take advantage of the web's capabilities, while remaining highly sensitive to the organizational culture of its target users.

Discount evaluation
- Heuristic evaluation is referred to as discount evaluation when five evaluators are used.
- Empirical evidence suggests that, on average, five evaluators identify 75-80% of usability problems.
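The reasoning behind the five-evaluator figure is often modeled with Nielsen and Landauer's curve: if a single evaluator finds any given problem with probability L, then i evaluators together find a proportion 1 - (1 - L)^i of the problems. Reported values of L vary by study; the value 0.26 below is an illustrative assumption chosen so that five evaluators land in the slide's 75-80% range:

```python
# Sketch of the Nielsen-Landauer problems-found model (assumption:
# a single evaluator finds any given problem with probability 0.26).

def proportion_found(evaluators, single_rate=0.26):
    """Proportion of usability problems found by this many evaluators."""
    return 1 - (1 - single_rate) ** evaluators

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {proportion_found(i):.0%}")
# With single_rate=0.26, five evaluators find roughly 78% of problems.
```

The curve flattens quickly, which is the usual argument for running several small evaluations rather than one large one.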

3 stages for doing heuristic evaluation
- Briefing session to tell the experts what to do.
- Evaluation period of 1-2 hours, in which each expert works separately: take one pass to get a feel for the product, then a second pass to focus on specific features.
- Debriefing session, in which the experts work together to prioritize the problems.

Advantages and problems
- Few ethical and practical issues to consider.
- It can be difficult and expensive to find experts; the best experts have knowledge of both the application domain and the users.
- Biggest problems: important problems may get missed, and many trivial problems are often identified.

Cognitive walkthroughs
- Focus on ease of learning.
- The designer presents an aspect of the design and usage scenarios.
- One or more experts walk through the design prototype with the scenario.
- The experts are told the assumptions about the user population, the context of use, and task details.
- The experts are guided by three questions.

The 3 questions
- Will the correct action be sufficiently evident to the user?
- Will the user notice that the correct action is available?
- Will the user associate and interpret the response from the action correctly?
As the experts work through the scenario, they note problems.

Pluralistic walkthrough
- A variation on the cognitive walkthrough theme.
- Performed by a carefully managed team.
- The panel of experts begins by working separately.
- Then there is a managed discussion that leads to agreed decisions.
- The approach lends itself well to participatory design.