Ch 13. Asking Users & Experts. Team 3: Jessica Herron, Lauren Sullivan, Chris Moore, Steven Pautz.

1 Ch 13. Asking Users & Experts. Team 3: Jessica Herron, Lauren Sullivan, Chris Moore, Steven Pautz

2 Asking Users and Experts: Introduction
We want to find out what users do, want to do, like, or don't like. Several tools and techniques are available:
- Interviews
- Questionnaires
- Heuristic evaluation
- Cognitive walkthroughs
Each has its own strengths and weaknesses.

3 Interviews
- Unstructured: not directed by a script. Rich but not replicable.
- Structured: tightly scripted, often like a questionnaire. Replicable but may lack richness.
- Semi-structured: guided by a script, but interesting issues can be explored in more depth. Can provide a good balance between richness and replicability.

4 Interview Basics
Remember the DECIDE framework: Determine goals, Explore questions, Choose the evaluation paradigm, Identify practical issues, Decide on ethical issues, and Evaluate the data. Goals and questions guide all interviews.
Two types of questions:
- Closed questions have a predetermined answer format, e.g., 'yes' or 'no'. They are quicker and easier to analyze.
- Open questions do not have a predetermined format; they try to elicit the interviewee's views and feelings.

5 Avoid Interview Pitfalls
- Long questions.
- Compound sentences: split them into two.
- Jargon and language that the interviewee may not understand.
- Leading questions that make assumptions, e.g., 'Why do you like ...?' (they might not actually like the product).
- Unconscious biases, e.g., gender stereotypes.

6 Interview Components
- Introduction: introduce yourself, explain the goals of the interview, reassure the interviewee about ethical issues, ask permission to record, present an informed consent form.
- Warm-up: make the first questions easy and non-threatening.
- Main body: present questions in a logical order.
- Cool-off: include a few easy questions to defuse any tension at the end.
- Closure: thank the interviewee and signal the end, e.g., by switching the recorder off.

7 The Interview Process
- Dress in a similar way to the participants.
- Check recording equipment in advance.
- Devise a system for coding participants' names to preserve confidentiality.
- Be pleasant.
- Ask participants to complete an informed consent form.

8 Group Interviews
Also known as focus groups. Typically 3-10 participants. Provide a diverse range of opinions. Need to be managed to:
- ensure everyone contributes
- prevent the discussion being dominated by one person
- cover the agenda of topics and eliminate digressions

9 Other Sources
- Telephone interviews: you can't see the participant's body language or other nonverbal cues.
- Online interviews: asynchronous (e-mail) or synchronous (chat room).
- Retrospective interviews: check with the participant to be sure the interviewer correctly understood their remarks.

10 Analyzing Interview Data
Analysis depends on the type of interview. Structured interviews can be analyzed like questionnaires. Unstructured interviews generate data similar to that from participant observation. It is best to analyze unstructured interviews as soon as possible, to identify topics and themes in the data.
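The theme-identification step above is often done by tagging transcript segments with codes and then tallying them. A minimal sketch, with invented participant IDs and theme tags:

```python
from collections import Counter

# Each pair is (participant, theme tag assigned to one transcript segment).
# The participants and themes here are illustrative, not real data.
coded_segments = [
    ("P1", "navigation"), ("P1", "terminology"),
    ("P2", "navigation"),
    ("P3", "navigation"), ("P3", "speed"),
]

# Count how often each theme was tagged across all interviews.
theme_counts = Counter(theme for _, theme in coded_segments)
for theme, n in theme_counts.most_common():
    print(theme, n)
```

Sorting by frequency gives a first, rough view of which issues recurred across interviewees; it does not replace reading the transcripts.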

11 Links
- UPA (Usability Professionals Association)
- Discovering User Needs
- Sage Services: user research and usability techniques available through Sage

12 Questionnaires
Questions can be closed or open. Closed questions are the easiest to analyze and may be analyzed by computer. Questionnaires can be administered to large populations; paper, e-mail, and the web are used for dissemination. An advantage of electronic questionnaires is that the data goes straight into a database and is easy to analyze. Sampling can be a problem when the size of the population is unknown, as is common online.

13 Questionnaire Style
Questionnaire formats can include:
- 'yes'/'no' checkboxes
- checkboxes that offer many options
- Likert rating scales
- semantic scales
- open-ended responses
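Likert items like those above are typically coded numerically for analysis. A minimal sketch, assuming a 1-5 coding and invented responses:

```python
from collections import Counter

# Hypothetical responses to one Likert item ("The site is easy to use"),
# coded 1 (strongly disagree) .. 5 (strongly agree). Invented data.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

counts = Counter(responses)          # frequency of each scale point
mean = sum(responses) / len(responses)

for score in range(1, 6):
    print(f"{score}: {counts.get(score, 0)}")
print(f"mean = {mean:.2f}")          # mean = 3.90
```

Frequencies per scale point are usually more informative than the mean alone, since Likert data is ordinal rather than truly interval.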

14 Designing Questionnaires
- Make questions clear and specific.
- When possible, ask closed questions and offer a range of answers.
- Consider including a 'no opinion' option for questions that seek opinions.
- Think about the ordering of questions.
- Avoid complex multiple questions.

15 Designing Questionnaires (cont.)
- When scales are used, make sure the range is appropriate and the intervals do not overlap.
- Make sure the ordering of scales is intuitive and consistent.
- Avoid jargon.
- Provide clear instructions on how to complete the questionnaire.
- Strike a balance between white space and the need to keep the questionnaire compact.

16 Encouraging a Good Response
- Ensure the questionnaire is well designed so that participants do not get annoyed.
- Provide a short overview section.
- Include a stamped, self-addressed envelope.
- Contact respondents with a follow-up letter.
- Offer incentives.
- Explain why you need the questionnaire to be completed.

17 Online Questionnaires
Advantages:
- Responses are received quickly.
- Copying and postage costs are lower.
- Data can be transferred immediately into a database.
- The time required for analysis is reduced.
- Errors in the questionnaire can be corrected easily.

18 Online Questionnaires (cont.)
Disadvantages:
- Sampling is problematic if the population size is unknown.
- It is hard to prevent individuals from responding more than once.
- Individuals have also been known to alter the questions in e-mail questionnaires.
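One common mitigation for the repeat-response problem is to store a salted hash of each respondent's e-mail address and reject duplicates. A minimal sketch; the salt value and function name are illustrative assumptions, not from the chapter:

```python
import hashlib

SALT = b"survey-2024"      # illustrative; keep secret in a real deployment
seen: set[str] = set()     # hashes of addresses that have already responded

def accept_response(email: str) -> bool:
    """Return True the first time an address responds, False afterwards."""
    # Normalize before hashing so "Ann@x.com " and "ann@x.com" match.
    normalized = email.strip().lower().encode()
    digest = hashlib.sha256(SALT + normalized).hexdigest()
    if digest in seen:
        return False
    seen.add(digest)
    return True

print(accept_response("ann@example.com"))   # True
print(accept_response("Ann@example.com "))  # False: same address, normalized
```

Storing only the hash keeps the response data itself anonymous, which also helps with the confidentiality requirement mentioned in the web-questionnaire steps. It does not stop someone using several addresses.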

19 Developing a Web-based Questionnaire
Steps:
1. Produce an error-free, interactive electronic version from the original paper-based one.
2. Make the questionnaire accessible from all common browsers, and readable on different-sized monitors and from different network locations.
3. Make sure information identifying each respondent is captured and stored confidentially.
4. User-test the survey with pilot studies before distributing it.

20 Asking Experts
- Experts use their knowledge of users and technology to review software usability.
- Expert critiques ('crits') can be formal or informal reports.
- Heuristic evaluation is a review guided by a set of heuristics.
- Walkthroughs involve stepping through a pre-planned scenario, noting potential problems.

21 Heuristic Evaluation
- Developed by Jakob Nielsen in the early 1990s.
- Based on heuristics distilled from an empirical analysis of 249 usability problems.
- These heuristics have been revised for current technology, e.g., HOMERUN for the web.
- Heuristics are still needed for mobile devices, wearables, virtual worlds, etc.
- Design guidelines form a basis for developing heuristics.

22 Nielsen's Heuristics
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose, and recover from errors
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation

23 HOMERUN
- High-quality content
- Often updated
- Minimal download time
- Ease of use
- Relevant to users' needs
- Unique to the online medium
- Netcentric corporate culture

24 Discount Evaluation
Heuristic evaluation is referred to as discount evaluation when a small number of evaluators (around five) is used. Empirical evidence suggests that, on average, five evaluators identify 75-80% of usability problems.
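The diminishing-returns reasoning behind the five-evaluator figure is often modeled with the curve found(i) = N * (1 - (1 - L)^i), where L is the probability that a single evaluator finds any given problem. A minimal sketch; L = 0.26 here is an illustrative assumption chosen to land in the slide's 75-80% range, not a measured value:

```python
def proportion_found(evaluators: int, L: float = 0.26) -> float:
    """Expected proportion of usability problems found by i evaluators,
    assuming each finds any given problem independently with probability L.
    L = 0.26 is an illustrative assumption."""
    return 1 - (1 - L) ** evaluators

for i in (1, 3, 5, 10):
    print(f"{i} evaluators -> {proportion_found(i):.0%}")
# With L = 0.26, five evaluators find roughly 78% of problems.
```

The curve flattens quickly, which is why adding evaluators beyond five buys relatively little: under this assumption, doubling to ten evaluators raises coverage only from about 78% to about 95%.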

25 Three Stages of a Heuristic Evaluation
1. Briefing session, to tell the experts what to do.
2. Evaluation period of 1-2 hours, in which each expert works separately: one pass to get a feel for the product, then a second pass to focus on specific features.
3. Debriefing session, in which the experts work together to prioritize the problems found.

26 Heuristic Evaluation of Websites
MEDLINEplus, a medical information website, was evaluated by Keith Cogdill, commissioned by the NLM. The evaluation was based on the following heuristics:
- Internal consistency
- Simple dialog
- Shortcuts
- Minimizing the user's memory load
- Preventing errors
- Feedback
- Internal locus of control

27 MEDLINEplus: Suggestions for Improvement
- Arrangement of health topics: arrange topics alphabetically as well as in categories.
- Depth of navigation menus: a higher 'fan-out' in the navigation would enhance usability, i.e., many short menus rather than a few deep ones.

28 Turning Guidelines into Heuristics for the Web
Navigation, access, and information design are the three categories for evaluation.
Navigation:
- Avoid orphan pages.
- Avoid long pages with excessive white space (scrolling problems).
- Provide navigation support.
- Avoid narrow, deep hierarchical menus.
- Avoid non-standard link colors.
- Provide a consistent look and feel.

29 Turning Guidelines into Heuristics for the Web (cont.)
Access:
- Avoid complex URLs.
- Avoid long download times that annoy users (e.g., pictures, .wav files, large software files).
Information design:
- Use good graphical design.
- Avoid outdated or incomplete information.
- Avoid excessive use of color.
- Avoid gratuitous use of graphics and animation.
- Be consistent.

30 Advantages and Problems
- Few ethical and practical issues to consider.
- Can be difficult and expensive to find experts; the best experts have knowledge of both the application domain and the users.
- Biggest problems: important problems may get missed, while many trivial problems are often identified.

31 Walkthroughs
- An alternative approach to heuristic evaluation.
- An expert walks through a task, noting usability problems.
- Users are usually not involved.
- Several forms exist, including the cognitive walkthrough and the pluralistic walkthrough.

32 Cognitive Walkthroughs
- Involve simulating the user's problem-solving process at each step in a task.
- Focus on ease of learning.
- Examine user problems in detail, without needing users or a working prototype.
- Can be very time-consuming and laborious.
- Somewhat narrow focus; adaptations may offer improvements.

33 Cognitive Walkthrough Steps
One or more experts walk through the design description or prototype with a specific task or scenario. For each step in the task, they answer three questions:
- Will users know what to do?
- Will users see how to do it?
- Will users understand from feedback whether the action was correct or not?

34 Cognitive Walkthrough Steps (cont.)
As the walkthrough proceeds, a record of critical information is compiled:
- Assumptions about what would cause problems and why, including explanations of why users would face difficulties.
- Notes about side issues and design changes.
- A summary of the results.
The design is then revised to fix the problems identified.
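The record described above maps naturally onto one entry per task step, with a yes/no answer for each of the three walkthrough questions. A minimal sketch, assuming invented field names and example steps:

```python
from dataclasses import dataclass

@dataclass
class StepResult:
    """One task step in a cognitive walkthrough. Field names are
    illustrative, mirroring the three standard questions."""
    step: str
    knows_what_to_do: bool      # Will users know what to do?
    sees_how_to_do_it: bool     # Will users see how to do it?
    understands_feedback: bool  # Will users understand the feedback?
    notes: str = ""             # side issues, suggested design changes

    def has_problem(self) -> bool:
        # Any "no" answer flags a potential usability problem at this step.
        return not (self.knows_what_to_do
                    and self.sees_how_to_do_it
                    and self.understands_feedback)

# Invented walkthrough of a two-step scenario.
walkthrough = [
    StepResult("Open the search page", True, True, True),
    StepResult("Enter a health topic", True, False, True,
               notes="Search box label is ambiguous"),
]

problems = [r for r in walkthrough if r.has_problem()]
print(f"{len(problems)} of {len(walkthrough)} steps raise problems")
```

Collecting the answers in a structure like this makes the later revision step concrete: the problem steps, with their notes, become the work list for the redesign.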

35 Pluralistic Walkthroughs
- A variation of the cognitive walkthrough.
- Users, developers, and experts work together to step through a scenario of tasks.
- Strong focus on users' tasks.
- Provide quantitative data.
- May be limited by scheduling difficulties and time constraints.

36 Pluralistic Walkthrough Steps
1. Scenarios are developed in the form of a series of screens.
2. Panel members individually determine the sequence of actions they would use to move between screens.
3. Everyone discusses their sequence with the rest of the panel.
4. The panel moves on to the next scenario.

37 Asking Users and Experts: Summary
- Various techniques exist for getting user opinions: structured, unstructured, and semi-structured interviews; focus groups; and questionnaires.
- Expert evaluations include heuristic evaluation and walkthroughs.
- These techniques are often combined for added accuracy and usefulness.

38 Interview: Jakob Nielsen
- "Father of Web Usability".
- Pioneered heuristic evaluation.
- Demonstrated cost-effective methods for usability testing and engineering.
- Author of several acclaimed design books.
- www.useit.com, www.nngroup.com

