1 User-Centered Design and Development Instructor: Franz J. Kurfess Computer Science Dept. Cal Poly San Luis Obispo FJK 2005
2 Copyright Notice These slides are a revised version of the originals provided with the book “Interaction Design” by Jennifer Preece, Yvonne Rogers, and Helen Sharp, Wiley, 2002. I added some material, made some minor modifications, and created a custom show to select a subset. –Slides added or modified by me are marked with my initials (FJK), unless I forgot it … FJK 2005
3 484-W09 Quarter The slides I use in class are in the Custom Show “484-W09”. It is a subset of the whole collection in this file. Week 7 contains slides from Chapters 12 and 13 of the textbook. FJK 2005
4 484-W10 Quarter The set of slides I use in class is close to the one in the PowerPoint Custom Show “484-W09”. Since I’m using mostly Keynote now, I use the “Skip” feature to achieve a similar result.
5 Chapter 13 Asking Users and Experts FJK 2005
6 Chapter Overview Asking Users –Interviews –Questionnaires Asking Experts –Inspections –Walkthroughs FJK 2005
7 Motivation often it is easier to talk with users than to observe them users can present their case, instead of using interpretations and guesses derived from observations it can be better to talk to experts about the expected behavior of users
8 FJK 2005 Objectives be familiar with the main methods of asking users and experts be aware of the differences between various interviewing techniques and types of questionnaires know how to develop basic interviewing strategies and how to design questionnaires know when to employ which method gain experience in obtaining information from users and experts
9 Asking users & experts
10 The aims –Discuss the role of interviews & questionnaires in evaluation. –Teach basic questionnaire design. –Describe how to do interviews, heuristic evaluation & walkthroughs. –Describe how to collect, analyze & present data. –Discuss strengths & limitations of these techniques.
11 Interviews unstructured –not directed by a script –rich but not replicable structured –tightly scripted, often like a questionnaire –replicable but may lack richness semi-structured –guided by a script –interesting issues can be explored in more depth –can provide a good balance between richness and replicability
12 Basics of interviewing DECIDE framework can offer guidance goals and questions –guide all interviews types of questions –‘closed questions’ have a predetermined answer format, e.g., ‘yes’ or ‘no’ quicker to administer easier to analyze may miss valuable information –‘open questions’ do not have a predetermined format may yield valuable information may take longer often more difficult to analyze
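To make the closed/open distinction concrete, here is a minimal Python sketch (the question wording, options, and answers are invented for illustration, not taken from any study). It shows why closed questions are quicker to analyze: their answers come from a fixed set that can be tallied automatically, while open answers are free text that has to be coded by hand.

```python
# Hypothetical interview script items: a closed question carries a fixed answer
# set, an open question accepts free text that must be coded manually later.
closed_question = {
    "text": "Do you use the search feature?",   # example wording
    "type": "closed",
    "options": ["yes", "no"],
}
open_question = {
    "text": "What would you change about the search feature?",
    "type": "open",                             # no predetermined format
}

def tally_closed(question, answers):
    """Count answers against the predetermined options; anything else is flagged."""
    counts = {option: 0 for option in question["options"]}
    for answer in answers:
        if answer in counts:
            counts[answer] += 1
        else:
            counts["other/invalid"] = counts.get("other/invalid", 0) + 1
    return counts

print(tally_closed(closed_question, ["yes", "no", "yes", "maybe"]))
# -> {'yes': 2, 'no': 1, 'other/invalid': 1}
```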
13 Things to avoid when preparing interview questions long questions compound sentences –split into two jargon –language that the interviewee may not understand leading questions –make assumptions e.g., why do/don’t you like …? unconscious biases –e.g. gender, age stereotypes
14 Components of an Interview introduction –introduce yourself, explain the goals of the interview, reassure about the ethical issues, ask to record, present an informed consent form. warm-up –make first questions easy and non-threatening. main body –present questions in a logical order a cool-off period –include a few easy questions to defuse tension at the end closure –thank interviewee, signal the end (switch recorder off)
15 Interview Process use the DECIDE framework for guidance dress in a similar way to participants check recording equipment in advance devise a system for coding names of participants to preserve confidentiality be pleasant ask participants to complete an informed consent form
16 Probes and Prompts probes –devices for getting more information –e.g., ‘Would you like to add anything?’ prompts –devices to help interviewee e.g. help with remembering a name probing and prompting should not create bias too much information can encourage participants to try to guess the answer –or the answer they think you would like to hear
17 Group Interviews also known as ‘focus groups’ typically 3-10 participants can provide a diverse range of opinions need to be managed –ensure everyone contributes –discussion isn’t dominated by one person –the agenda of topics is covered –stay focused
18 Analyzing Interview Data depends on the type of interview structured interviews –can be analyzed like questionnaires unstructured interviews –generate data like that from participant observation –analyze as soon as possible to identify topics and themes from the data
19 Questionnaires questions can be closed or open –closed questions are easiest to analyze, and the analysis may be done by computer can be administered to large populations –paper, email & the web used for dissemination electronic questionnaires –data goes into a database and is easy to analyze –require computer skills and access online questionnaires –sampling can be a problem when the size of the population is unknown –respondents may not be representative of the whole group of users
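As a concrete illustration of “data goes into a database and is easy to analyze”, here is a minimal sketch using Python’s built-in sqlite3 module; the table layout and column names are assumptions for illustration, not a prescribed schema.

```python
import sqlite3

# Minimal sketch: store closed-question responses so they can be queried later.
conn = sqlite3.connect(":memory:")          # use a file path for a real study
conn.execute("""
    CREATE TABLE responses (
        respondent_id TEXT,
        question_id   TEXT,
        answer        TEXT
    )
""")
conn.executemany(
    "INSERT INTO responses VALUES (?, ?, ?)",
    [("p01", "q1", "yes"), ("p02", "q1", "no"), ("p03", "q1", "yes")],
)

# Counting the answers to a closed question is then a one-line query.
for answer, count in conn.execute(
    "SELECT answer, COUNT(*) FROM responses WHERE question_id = 'q1' GROUP BY answer"
):
    print(answer, count)    # e.g. no 1 / yes 2 (row order not guaranteed without ORDER BY)
```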
20 Questionnaire Style varies according to goal –use the DECIDE framework for guidance questionnaire format can include: –‘yes’, ‘no’ checkboxes –checkboxes that offer many options, or radio buttons for mutually exclusive choices –Likert rating scales –semantic scales –open-ended responses
21 Likert Scales have a range of points –3, 5, 7 & 9 point scales are common –sometimes verbal descriptors are used debate about which is best
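A minimal sketch of coding a 5-point Likert item, assuming the common “strongly disagree” to “strongly agree” verbal descriptors; the responses below are invented for illustration.

```python
# Map the verbal descriptors of a 5-point Likert scale to numeric codes.
LIKERT_5 = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
codes = [LIKERT_5[r] for r in responses]

print(codes)                        # [4, 5, 3, 4, 2]
print(sum(codes) / len(codes))      # 3.6 -- simple average of the coded answers
```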
22 Developing a Questionnaire Provide a clear statement of purpose & guarantee participants anonymity Plan questions –if developing a web-based questionnaire, design off-line first Decide on whether phrases will all be positive, all negative or mixed Pilot test questions –are the questions clear –is there sufficient space for responses Decide how data will be analyzed –consult a statistician if necessary
23 Encouraging a good response Make sure the purpose of the study is clear Promise anonymity Ensure the questionnaire is well designed Offer a short version for those who do not have time to complete a long questionnaire If mailed, include a stamped, addressed envelope Follow up with emails, phone calls, letters Provide an incentive A 40% response rate is high; 20% is often acceptable
24 Advantages of online questionnaires responses are usually received quickly no copying and postage costs data can be collected in database for analysis time required for data analysis is reduced errors can be corrected easily
25 Problems with online questionnaires Sampling is problematic if the population size is unknown –hard to judge whether a reasonable number of responses was obtained –participants may not be representative of the overall population self-selected participants –may have higher motivation to participate –may have a hidden agenda Preventing individuals from responding more than once is difficult Individuals have also been known to change questions in email questionnaires –so different participants answer different questions
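One common partial remedy for repeated responses is to key each submission on some respondent identifier (an email address, login, or session token) and keep only the first submission per key. The sketch below assumes such an identifier is available, which is itself a design decision with anonymity trade-offs; the data is invented for illustration.

```python
import hashlib

def deduplicate(submissions):
    """Keep only the first submission per respondent identifier.

    Each submission is an (identifier, answers) pair; the identifier is hashed
    so the stored key is not the raw email address or login name.
    """
    seen = set()
    kept = []
    for identifier, answers in submissions:
        key = hashlib.sha256(identifier.strip().lower().encode()).hexdigest()
        if key not in seen:
            seen.add(key)
            kept.append(answers)
    return kept

submissions = [
    ("alice@example.com", {"q1": "yes"}),
    ("ALICE@example.com ", {"q1": "no"}),   # same person, second attempt is dropped
    ("bob@example.com", {"q1": "no"}),
]
print(deduplicate(submissions))             # [{'q1': 'yes'}, {'q1': 'no'}]
```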
26 Questionnaire Data Analysis and Presentation present results clearly –tables may help simple statistics –can say a lot, e.g., mean, median, mode, standard deviation –more advanced statistics can be used if needed percentages –useful but give population size bar graphs –show categorical data well
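The simple statistics mentioned above can be computed directly with Python’s standard statistics module; the coded rating values below are invented for illustration.

```python
import statistics
from collections import Counter

# Coded answers to one 5-point rating question (illustrative data).
ratings = [4, 5, 3, 4, 2, 4, 5, 3, 4, 4]

print("mean:   ", statistics.mean(ratings))        # 3.8
print("median: ", statistics.median(ratings))      # 4.0
print("mode:   ", statistics.mode(ratings))        # 4
print("std dev:", statistics.stdev(ratings))       # sample standard deviation, ~0.92

# Percentages are only meaningful alongside the population size (n).
n = len(ratings)
for value, count in sorted(Counter(ratings).items()):
    print(f"{value}: {count}/{n} = {100 * count / n:.0f}%")
```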
27 Activity: CSC Alumni Questionnaires goal: solicit feedback on the CSC program from alumni a few years after graduation develop a questionnaire for CSC alumni –sample selection –cover letter –questions –administration online, e-mail, phone, in person, … –evaluation and assessment response rate, bias, –follow-up FJK 2005
28 SUMI, MUMMS, QUIS Additional Information –FAQ about questionnaires: http://www.ucc.ie/hfrg/baseline/questionnaires.html FJK 2006
29 SUMI Software Usability Measurement Inventory is a rigorously tested and proven method of measuring software quality from the end user's point of view a consistent method for assessing the quality of use of a software product or prototype can assist with the detection of usability flaws before a product is shipped backed by an extensive reference database embedded in an effective analysis and report generation tool see http://www.ucc.ie/hfrg/questionnaires/sumi/whatis.html FJK 2006
30 Asking Experts experts use their knowledge of users & technology to review software usability expert critiques (“crits”) –formal or informal reports heuristic evaluation –a review guided by a set of heuristics walkthroughs –stepping through a pre-planned scenario noting potential problems
31 Heuristic Evaluation developed by Jakob Nielsen in the early 1990s based on heuristics –distilled from an empirical analysis of 249 usability problems revised for current technology –HOMERUN for the Web heuristics still needed for new technologies –mobile devices, wearables, virtual worlds, etc. design guidelines –form a basis for developing heuristics
32 Nielsen’s Heuristics Visibility of system status Match between system and real world User control and freedom Consistency and standards Help users recognize, diagnose, recover from errors Error prevention Recognition rather than recall Flexibility and efficiency of use Aesthetic and minimalist design Help and documentation
33 Discount Evaluation Heuristic evaluation is referred to as discount evaluation when 5 evaluators are used. Empirical evidence: on average, 5 evaluators identify 75-80% of usability problems.
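The 75-80% figure is usually explained with Nielsen and Landauer's problem-discovery curve, where the proportion of problems found by i independent evaluators is 1 - (1 - λ)^i and λ is the probability that a single evaluator detects a given problem. The sketch below uses λ = 0.26 purely as an illustrative assumption that reproduces the quoted range; the real value varies from project to project.

```python
# Proportion of usability problems found by i independent evaluators, following
# the Nielsen & Landauer model: found(i) = 1 - (1 - lam) ** i, where lam is the
# chance that one evaluator detects a given problem (0.26 is an assumed value).
def proportion_found(i, lam=0.26):
    return 1 - (1 - lam) ** i

for i in (1, 3, 5, 10):
    print(f"{i} evaluator(s) -> about {proportion_found(i):.0%} of problems")
# One evaluator finds ~26%, five find ~78%, and each additional evaluator adds
# progressively less -- the reasoning behind "discount" evaluation.
```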
34 Stages of a Heuristic Evaluation briefing session –to tell experts what to do evaluation period –duration of 1-2 hours –each expert works separately –take one pass to get a feel for the product –take a second pass to focus on specific features debriefing session –experts work together to prioritize problems
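A minimal sketch of the bookkeeping the debriefing session implies: each expert's findings from the individual passes are merged, and problems reported by more evaluators float to the top of the priority discussion. The problem descriptions, heuristic labels, and tallying rule are assumptions for illustration, not a prescribed procedure.

```python
from collections import Counter

# Findings from each expert's individual pass (illustrative entries):
# (short problem description, heuristic it violates)
evaluator_findings = [
    [("no undo after delete", "user control and freedom"),
     ("status not shown during upload", "visibility of system status")],
    [("no undo after delete", "user control and freedom"),
     ("inconsistent button labels", "consistency and standards")],
    [("no undo after delete", "user control and freedom"),
     ("status not shown during upload", "visibility of system status")],
]

# Merge the individual passes and count how many evaluators hit each problem.
tally = Counter(problem for findings in evaluator_findings for problem in findings)

print("candidate priority order for the debriefing:")
for (problem, heuristic), count in tally.most_common():
    print(f"  {count}/3 evaluators: {problem}  [{heuristic}]")
```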
35 Advantages and problems few ethical and practical issues to consider can be difficult and expensive to find experts best experts have knowledge of application domain and users biggest problems –important problems may get missed –many trivial problems are often identified
36 Activity: Heuristic Evaluation of CSC Questionnaires select a sample from the CSC alumni questionnaires, and develop a heuristic evaluation strategy for it –briefing session –evaluation period –debriefing FJK 2005
37 Cognitive walkthroughs focus on ease of learning designer presents an aspect of the design and usage scenarios one or more experts walk through the design prototype with the scenario experts are told the assumptions about the user population, context of use, and task details experts are guided by 3 questions –see next slide
38 Cognitive Walkthrough Questions Will the correct action be sufficiently evident to the user? Will the user notice that the correct action is available? Will the user associate and interpret the response from the action correctly? As the experts work through the scenario they note problems.
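A minimal sketch of a recording form for one scenario step, assuming the evaluator simply answers the three questions above and keeps a free-text note; the field names and the example step are invented for illustration.

```python
# One row of a walkthrough record: the three standard questions plus notes.
def walkthrough_record(step, action_evident, action_noticed, response_interpreted, notes=""):
    return {
        "step": step,
        "correct action evident?": action_evident,
        "user notices action is available?": action_noticed,
        "response interpreted correctly?": response_interpreted,
        "notes": notes,
    }

record = walkthrough_record(
    step="save the draft before closing",          # example scenario step
    action_evident=False,
    action_noticed=True,
    response_interpreted=True,
    notes="Save lives in an overflow menu; first-time users are unlikely to find it.",
)
print(record)
```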
39 Pluralistic Walkthrough variation on the cognitive walkthrough theme performed by a carefully managed team panel of experts begins by working separately then there is managed discussion that leads to agreed decisions the approach lends itself well to participatory design
40 Key points structured, unstructured, semi-structured interviews, focus groups and questionnaires –closed questions are easiest to analyze and can be replicated –open questions are richer –check boxes, Likert / semantic scales expert evaluation: heuristic and walkthroughs –relatively inexpensive because no users –heuristic evaluation relatively easy to learn –may miss key problems and identify false ones
41 A project for you … Activeworlds.com Questionnaire to test reactions with friends http://www.acm.org/~perlman/question.html http://www.ifsm.umbc.edu/djenni1/osg/ Develop heuristics to evaluate usability and sociability aspects
42 A project for you … http://www.id-book.com/catherb/ –provides heuristics and a template so that you can evaluate different kinds of systems. More information about this is provided in the interactivities section of the id-book.com website.
43 A project for you … Go to The Pew Internet & American Life Survey www.pewinternet.org/ (or to another survey of your choice) Critique one of the recent online surveys Critique a recent survey report