1 User-Centered Design and Development Instructor: Franz J. Kurfess Computer Science Dept. Cal Poly San Luis Obispo FJK 2005.

2 Copyright Notice These slides are a revised version of the originals provided with the book “Interaction Design” by Jennifer Preece, Yvonne Rogers, and Helen Sharp (Wiley). I added some material, made some minor modifications, and created a custom show to select a subset. –Slides added or modified by me are marked with my initials (FJK), unless I forgot it … FJK 2005

3 484-W09 Quarter The slides I use in class are in the Custom Show “484-W09”. It is a subset of the whole collection in this file. Week 7 contains slides from Chapters 12 and 13 of the textbook. FJK 2005

4 484-W10 Quarter The set of slides I use in class is close to the one in the PowerPoint Custom Show “484-W09”. Since I’m using mostly Keynote now, I use the “Skip” feature to achieve a similar result.

5 Chapter 13 Asking Users and Experts FJK 2005

6 Chapter Overview Asking Users –Interviews –Questionnaires Asking Experts –Inspections –Walkthroughs FJK 2005

7 Motivation –often it is easier to talk with users than to observe them –users can present their case, instead of using interpretations and guesses derived from observations –it can be better to talk to experts about the expected behavior of users

8 FJK 2005 Objectives be familiar with the main methods of asking users and experts be aware of the differences between various interviewing techniques and types of questionnaires know how to develop basic interviewing strategies and how to design questionnaires know when to employ which method gain experience in obtaining information from users and experts

9 Asking users & experts

10 The aims –Discuss the role of interviews & questionnaires in evaluation. –Teach basic questionnaire design. –Describe how to do interviews, heuristic evaluation & walkthroughs. –Describe how to collect, analyze & present data. –Discuss strengths & limitations of these techniques.

11 Interviews unstructured –not directed by a script –rich but not replicable structured –tightly scripted, often like a questionnaire –replicable but may lack richness semi-structured –guided by a script –interesting issues can be explored in more depth –can provide a good balance between richness and replicability

12 Basics of interviewing DECIDE framework can offer guidance goals and questions –guide all interviews types of questions –‘closed questions’ have a predetermined answer format, e.g., ‘yes’ or ‘no’ quicker to administer easier to analyze may miss valuable information –‘open questions’ do not have a predetermined format may yield valuable information may take longer often more difficult to analyze

13 Things to avoid when preparing interview questions long questions compound sentences –split into two jargon –language that the interviewee may not understand leading questions –make assumptions e.g., why do/don’t you like …? unconscious biases –e.g. gender, age stereotypes

14 Components of an Interview introduction –introduce yourself, explain the goals of the interview, reassure about the ethical issues, ask to record, present an informed consent form. warm-up –make first questions easy and non-threatening. main body –present questions in a logical order a cool-off period –include a few easy questions to defuse tension at the end closure –thank interviewee, signal the end (switch recorder off)

15 Interview Process use the DECIDE framework for guidance dress in a similar way to participants check recording equipment in advance devise a system for coding names of participants to preserve confidentiality be pleasant ask participants to complete an informed consent form

16 Probes and Prompts probes –devices for getting more information –e.g., ‘Would you like to add anything?’ prompts –devices to help the interviewee, e.g., help with remembering a name probing and prompting should not create bias –too much information can encourage participants to try to guess the answer –or the answer they think you would like to hear

17 Group Interviews also known as ‘focus groups’ typically 3-10 participants can provide a diverse range of opinions need to be managed –ensure everyone contributes –discussion isn’t dominated by one person –the agenda of topics is covered –stay focused

18 Analyzing Interview Data depends on the type of interview structured interviews –can be analyzed like questionnaires unstructured interviews –generate data like that from participant observation –analyze as soon as possible to identify topics and themes from the data

19 Questionnaires questions can be closed or open –closed questions are easiest to analyze, and may be done by computer can be administered to large populations –paper, email & the web used for dissemination electronic questionnaires –data goes into a database and is easy to analyze –requires computer skills, access online questionnaires –sampling can be a problem when the size of a population is unknown –may not be representative of the whole group of users
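
The point that electronic questionnaire data “goes into a database and is easy to analyze” can be made concrete with a small sketch. This example is not part of the slides: the table layout, question name, and answers are all hypothetical, using Python’s built-in sqlite3 module.

```python
import sqlite3

# Hypothetical storage sketch: each row holds one respondent's answer
# to one closed question (an in-memory DB keeps the example self-contained).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE response (respondent TEXT, question TEXT, answer TEXT)"
)
answers = [
    ("p1", "q1_satisfied", "yes"),
    ("p2", "q1_satisfied", "no"),
    ("p3", "q1_satisfied", "yes"),
]
conn.executemany("INSERT INTO response VALUES (?, ?, ?)", answers)

# Closed questions are easy to analyze by computer: a one-line count.
counts = dict(conn.execute(
    "SELECT answer, COUNT(*) FROM response "
    "WHERE question = 'q1_satisfied' GROUP BY answer"
))
print(counts)  # {'no': 1, 'yes': 2}
```

With open questions, by contrast, the `answer` column would hold free text that still has to be read and coded by hand, which is the analysis cost the slide alludes to.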

20 Questionnaire Style varies according to goal –use the DECIDE framework for guidance questionnaire format can include: –‘yes’, ‘no’ checkboxes –checkboxes that offer many options (radio buttons) –Likert rating scales –semantic scales –open-ended responses

21 Likert Scales have a range of points –3, 5, 7 & 9 point scales are common –sometimes verbal descriptors are used –there is debate about which is best
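
Because Likert responses are ordinal, the median and mode are the usual summaries. A minimal sketch of coding a 5-point scale; the verbal descriptors and the responses themselves are made up for illustration:

```python
from statistics import median, mode

# Hypothetical 5-point Likert item: 1 = strongly disagree ... 5 = strongly agree
labels = {1: "strongly disagree", 2: "disagree", 3: "neutral",
          4: "agree", 5: "strongly agree"}

# Illustrative responses from ten participants (made-up data).
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

# Ordinal data: summarize with median and mode rather than the mean.
print("median:", labels[int(median(responses))])  # median: agree
print("mode:  ", labels[mode(responses)])         # mode:   agree
```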

22 Developing a Questionnaire Provide a clear statement of purpose & guarantee participants’ anonymity Plan questions –if developing a web-based questionnaire, design off-line first Decide on whether phrases will all be positive, all negative or mixed Pilot test questions –are the questions clear? –is there sufficient space for responses? Decide how data will be analyzed –consult a statistician if necessary

23 Encouraging a good response Make sure purpose of study is clear Promise anonymity Ensure questionnaire is well designed Offer a short version for those who do not have time to complete a long questionnaire If mailed, include a stamped addressed envelope Follow up with emails, phone calls, letters Provide an incentive 40% response rate is high, 20% is often acceptable

24 Advantages of online questionnaires responses are usually received quickly no copying and postage costs data can be collected in a database for analysis time required for data analysis is reduced errors can be corrected easily

25 Problems with online questionnaires Sampling is problematic if population size is unknown –even with a reasonable number of responses, participants may not be representative of the overall population –self-selected –higher motivation in participating –may have a hidden agenda Preventing individuals from responding more than once is difficult Individuals have also been known to change questions in questionnaires –different participants then answer different questions

26 Questionnaire Data Analysis and Presentation present results clearly –tables may help simple statistics –can say a lot, e.g., mean, median, mode, standard deviation –more advanced statistics can be used if needed percentages –useful, but also give the population size bar graphs –show categorical data well
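
The simple statistics named above can be computed directly with Python’s standard statistics module. The data below are invented purely for illustration; the outlier shows why the median is often worth reporting alongside the mean:

```python
from statistics import mean, median, mode, stdev

# Made-up completion times (seconds) from a questionnaire study.
times = [12, 15, 11, 15, 20, 14, 15, 90]  # note the outlier

print(f"mean:   {mean(times):.1f}")    # 24.0 - pulled up by the outlier
print(f"median: {median(times):.1f}")  # 15.0 - robust to the outlier
print(f"mode:   {mode(times)}")        # 15
print(f"stdev:  {stdev(times):.1f}")

# Percentages are useful, but report the population size alongside them.
yes_count, n = 34, 40
print(f"{yes_count}/{n} = {100 * yes_count / n:.0f}% answered 'yes'")
```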

27 Activity: CSC Alumni Questionnaires goal: solicit feedback on the CSC program from alumni a few years after graduation develop a questionnaire for CSC alumni –sample selection –cover letter –questions –administration online, email, phone, in person, … –evaluation and assessment response rate, bias, … –follow-up FJK 2005

28 SUMI, MUMMS, QUIS Additional Information –FAQ about questionnaires FJK 2006

29 SUMI Software Usability Measurement Inventory is a rigorously tested and proven method of measuring software quality from the end user's point of view a consistent method for assessing the quality of use of a software product or prototype can assist with the detection of usability flaws before a product is shipped backed by an extensive reference database embedded in an effective analysis and report generation tool see the SUMI website FJK 2006

30 Asking Experts experts use their knowledge of users & technology to review software usability expert critiques (“crits”) –formal or informal reports heuristic evaluation –a review guided by a set of heuristics walkthroughs –stepping through a pre-planned scenario noting potential problems

31 Heuristic Evaluation developed by Jakob Nielsen in the early 1990s based on heuristics –distilled from an empirical analysis of 249 usability problems revised for current technology –HOMERUN for the Web heuristics still needed for new technologies –mobile devices, wearables, virtual worlds, etc. design guidelines –form a basis for developing heuristics

32 Nielsen’s Heuristics Visibility of system status Match between system and real world User control and freedom Consistency and standards Help users recognize, diagnose, recover from errors Error prevention Recognition rather than recall Flexibility and efficiency of use Aesthetic and minimalist design Help and documentation

33 Discount Evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence: on average 5 evaluators identify 75-80% of usability problems.
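
The 75-80% figure is usually explained with Nielsen and Landauer’s curve, Found(i) = N(1 − (1 − λ)^i), where λ is the fraction of all problems a single evaluator finds. A sketch, assuming λ ≈ 0.26 (λ varies between studies; 0.26 is chosen here only because it reproduces the slide’s range at five evaluators):

```python
def proportion_found(evaluators: int, lam: float = 0.26) -> float:
    """Nielsen & Landauer's model: share of usability problems found by
    `evaluators` independent experts, each finding a fraction `lam` of them.
    The default lam is an illustrative assumption, not a universal constant."""
    return 1 - (1 - lam) ** evaluators

for i in (1, 3, 5, 10):
    print(f"{i:2d} evaluators -> {proportion_found(i):.0%}")
```

The curve flattens quickly, which is the argument for stopping at about five evaluators: each additional expert mostly rediscovers problems already found.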

34 Stages of Heuristic Evaluation briefing session –to tell experts what to do evaluation period –duration of 1-2 hours –each expert works separately –take one pass to get a feel for the product –take a second pass to focus on specific features debriefing session –experts work together to prioritize problems
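
The debriefing step, in which the experts pool their findings and prioritize them, might be recorded as in this sketch. The problems, expert names, and ratings are hypothetical; the 0-4 scale follows Nielsen’s severity-rating convention (4 = usability catastrophe):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical findings: each expert worked separately and rated
# the problems they found on a 0-4 severity scale.
findings = {
    "expert_a": {"no undo on delete": 4, "tiny font": 2},
    "expert_b": {"no undo on delete": 3, "jargon in labels": 3},
    "expert_c": {"tiny font": 1, "no undo on delete": 4},
}

# Debriefing: pool the ratings per problem, then rank by mean severity.
ratings = defaultdict(list)
for expert_ratings in findings.values():
    for problem, severity in expert_ratings.items():
        ratings[problem].append(severity)

ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for problem, scores in ranked:
    print(f"{mean(scores):.2f}  {problem}  (rated by {len(scores)} experts)")
```

How many experts flagged a problem is worth reporting next to the mean severity: a problem found independently by all evaluators is less likely to be one of the trivial false alarms the next slide warns about.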

35 Advantages and problems –few ethical and practical issues to consider –can be difficult and expensive to find experts –best experts have knowledge of application domain and users –biggest problems: important problems may get missed, and many trivial problems are often identified

36 Activity: Heuristic Evaluation of CSC Questionnaires select a sample from the CSC alumni questionnaires, and develop a heuristic evaluation strategy for it –briefing session –evaluation period –debriefing FJK 2005

37 Cognitive walkthroughs focus on ease of learning designer presents an aspect of the design and usage scenarios one or more experts walk through the design prototype with the scenario expert is told the assumptions about user population, context of use, task details experts are guided by 3 questions –see next slide

38 Cognitive Walkthrough Questions Will the correct action be sufficiently evident to the user? Will the user notice that the correct action is available? Will the user associate and interpret the response from the action correctly? As the experts work through the scenario they note problems.
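
One way to note problems while working through the scenario is a simple record sheet per step, covering the three questions above. The action, answers, and note in this sketch are hypothetical:

```python
# Hypothetical record sheet for one cognitive-walkthrough step:
# the three questions above, answered yes/no, plus a problem note.
step = {
    "action": "tap the gear icon to open settings",
    "action_evident": False,   # Q1: will the correct action be evident?
    "action_noticed": True,    # Q2: will the user notice it is available?
    "feedback_clear": True,    # Q3: will the response be interpreted correctly?
    "note": "gear icon is hidden behind an overflow menu",
}

# A step is flagged as a potential problem if any answer is 'no'.
has_problem = not all(
    step[q] for q in ("action_evident", "action_noticed", "feedback_clear")
)
print("problem noted:", has_problem)  # problem noted: True
```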

39 Pluralistic Walkthrough variation on the cognitive walkthrough theme performed by a carefully managed team panel of experts begins by working separately then there is managed discussion that leads to agreed decisions the approach lends itself well to participatory design

40 Key points structured, unstructured, semi-structured interviews, focus groups and questionnaires –closed questions are easiest to analyze and can be replicated –open questions are richer –check boxes, Likert / semantic scales expert evaluation: heuristic and walkthroughs –relatively inexpensive because no users are needed –heuristic evaluation relatively easy to learn –may miss key problems and identify false ones

41 A project for you … Go to Activeworlds.com Develop a questionnaire to test reactions with friends Develop heuristics to evaluate usability and sociability aspects

42 A project for you … The site provides heuristics and a template so that you can evaluate different kinds of systems. More information about this is provided in the interactivities section of the id-book.com website.

43 A project for you … Go to The Pew Internet & American Life Survey (or to another survey of your choice) Critique one of the recent online surveys Critique a recent survey report