Presentation on theme: "Asking Users and Experts" by Yujia ZHU and Yimeng DOU — Presentation transcript:


2 Asking Users and Experts (Yujia ZHU, Yimeng DOU)

3 Asking Users
- Interviews
- Questionnaires

4 Asking Users: Interviews
- Interviews can be thought of as a "conversation with a purpose" (Kahn and Cannell, 1957).
- How much an interview resembles an ordinary conversation depends on the questions to be answered and the type of interview used.

5 Guidelines for developing questions
- Keep the questions short and straightforward, and avoid asking too many:
  – Avoid long questions
  – Avoid compound sentences
  – Avoid jargon
  – Avoid leading questions such as "Why do you like it?"
  – Be alert to unconscious biases

6 Guidelines for planning an interview
- Make the interview as pleasant as possible and help the interviewee feel comfortable. Structure it as:
  – Introduction
  – Warm-up session
  – Main session
  – Cool-off period
  – Closing session

7 Conducting interviews
- The golden rule is to be professional:
  – Dress in a similar way to the interviewees: dress neatly and avoid standing out
  – Prepare an informed consent form and have it signed
  – Make sure your recording equipment works
  – Record answers exactly

8 Four types of interviews
- Open-ended or unstructured
- Structured
- Semi-structured
- Group interviews

9 Unstructured interviews
- Features:
  – Interviewers have less control over the process
  – Open: the format and content of answers are not predefined
- Benefit:
  – Generates rich data
- Disadvantages:
  – The data are very time-consuming and difficult to analyze
  – The process is impossible to replicate

10 Structured interviews
- Features:
  – The interviewers have more control
  – Typically, the questions are closed
- To work best:
  – Questions need to be short and clearly worded
  – Questions should be refined by asking another evaluator to review them and by running a small pilot study
- Choose the type of interview according to the evaluation goals and the questions to be asked

11 Semi-structured interviews
- Features:
  – Combine features of structured and unstructured interviews
  – The interviewer starts with preplanned questions and then probes the interviewee to say more
- Some ways to improve the interview:
  – Neutral probes are a device for getting more information
  – Prompts help the person along
  – Accommodate silences
- Probing and prompting should aim to help the interview WITHOUT introducing biases

12 Group interviews
- Focus group: normally three to ten people are involved
- Benefits:
  – Allows diverse or sensitive issues to be raised
  – High validity
  – Low cost, quick results, and easily scaled
- Disadvantages:
  – Needs a skillful facilitator
  – Difficult to get people together at a suitable location and time

13 Data analysis and interpretation in interviews
- Analysis of unstructured interviews can be time-consuming, though their content can be rich
- Data from structured interviews are usually analyzed quantitatively

14 Asking Users: Questionnaires
- A well-established technique for collecting demographic data and users' opinions
- Questionnaires vs. interviews:
  – Questionnaires can be distributed to a large number of people
  – Interviews are easy and quick to conduct

15 Designing questionnaires
- Start with basic demographic information, then move from general to specific questions
- Advice for designing a questionnaire:
  – Make questions clear and specific
  – Ask closed questions and offer a range of answers
  – Include a "no-opinion" option
  – Think about the ordering of questions
  – Avoid complex multiple questions
  – Provide an appropriate range of answers
  – The ordering of scales should be intuitive and consistent
  – Avoid jargon
  – Give clear instructions
  – Balance the use of white space against the need to keep the questionnaire as compact as possible

16 Question and response formats
- Different types of responses:
  – Discrete responses ("Yes" or "No")
  – Locating the answer within a range
  – A single preferred opinion
- Commonly used formats (see the sketch below):
  – Check boxes and ranges (choose an appropriate range)
  – Rating scales: Likert scales and semantic differential scales
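To illustrate how these response formats might be encoded in an online questionnaire, here is a minimal sketch. It is not from the slides; the question texts, options, ranges, and class names are all invented for illustration.

```python
# Minimal sketch (not from the slides): encoding common closed-question formats.
# All question texts, options, and ranges below are invented for illustration.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class CheckboxQuestion:
    text: str
    options: List[str]             # respondent ticks any that apply

@dataclass
class RangeQuestion:
    text: str
    ranges: List[Tuple[int, int]]  # respondent picks the range their answer falls in

@dataclass
class LikertQuestion:
    text: str
    labels: List[str]              # one label per scale point

questions = [
    CheckboxQuestion("Which browsers do you use?", ["Firefox", "Chrome", "Other"]),
    RangeQuestion("What is your age?", [(18, 25), (26, 40), (41, 65)]),
    LikertQuestion("The use of color is excellent.",
                   ["Strongly agree", "Agree", "OK", "Disagree", "Strongly disagree"]),
]

for q in questions:
    print(type(q).__name__, "-", q.text)
```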

17 Likert scales
- Used for measuring opinions, attitudes, and beliefs
- Widely used for evaluating user satisfaction with products
- Example: "The use of color is excellent."
  Strongly agree | Agree | OK | Disagree | Strongly disagree

18 Designing Likert scales
- Gather a pool of short statements about the features of the product to be evaluated
- Divide the items into groups, with about the same number of positive and negative statements in each group
- Decide on the scale (see the scoring sketch below)
- Select items for the final questionnaire and reword them as necessary to make them clear
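Because the pool mixes positive and negative statements, negatively worded items are usually reverse-coded before scores are combined. The sketch below shows one way to do this on a 5-point scale; it is not from the slides, and the item names and responses are made up.

```python
# Minimal sketch (not from the slides): scoring a 5-point Likert questionnaire
# in which negatively worded items are reverse-coded before averaging.
# Item names and responses are made up for illustration.

SCALE_MAX = 5  # 1 = strongly disagree ... 5 = strongly agree

# True marks a negatively worded statement (e.g. "The colors are distracting"),
# which must be reverse-coded so that higher always means "more satisfied".
items = {
    "color_excellent": False,
    "colors_distracting": True,
}

# One respondent's raw answers on the 1..5 scale.
responses = {
    "color_excellent": 4,
    "colors_distracting": 2,
}

def score(responses, items, scale_max=SCALE_MAX):
    """Return the mean satisfaction score after reverse-coding negative items."""
    adjusted = []
    for item, value in responses.items():
        if items[item]:                    # negatively worded item
            value = scale_max + 1 - value  # e.g. 2 -> 4 on a 5-point scale
        adjusted.append(value)
    return sum(adjusted) / len(adjusted)

print(score(responses, items))  # 4.0 for this respondent
```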

19 Semantic differential scales
- Explore a range of bipolar attitudes about a particular item
- Each pair of attitudes is represented as a pair of adjectives
- Example (evaluation of a homepage):
  Attractive – Ugly
  Clear – Confusing
  Dull – Colorful
  Exciting – Boring
  Annoying – Pleasing
  ...

20 Administering questionnaires
- Two important issues:
  – Reaching a representative sample of participants
  – Ensuring a reasonable response rate
- Some ways to encourage a good response:
  – Ensuring the questionnaire is well designed
  – Providing a short overview section
  – Including a stamped, self-addressed envelope for its return
  – Explaining why you need the questionnaire to be completed and assuring anonymity
  – Contacting respondents through a follow-up letter, phone call, or email
  – Offering incentives such as payments

21 Online questionnaires
- Advantages:
  – Quick responses
  – Low cost (no copying or postage)
  – Immediate transfer of data
  – Data are available for analysis in a short time
  – Errors in questionnaire design can be corrected easily
- Two types: email vs. web-based
  – Email: can target specific users
  – Web-based: more flexible; can use check boxes, pull-down and pop-up menus, help screens, and graphics
- Problems:
  – Obtaining a random sample is difficult
  – Response rates may be lower

22 Developing a web-based questionnaire
- Three steps:
  – Design it on paper, following the general guidelines
  – Develop strategies for reaching the target population
  – Turn the paper version into a web-based version
- When turning it into a web-based version:
  – Produce an error-free interactive electronic version from the paper version
  – Make the questionnaire accessible from all common browsers and readable on different-size monitors and from different network locations
  – Make sure information identifying each respondent is captured and stored confidentially
  – User-test the survey with pilot studies before distributing it

23 Analyzing questionnaire data
- Identify any trends and patterns
  – Use a spreadsheet such as Excel to hold the data
- Simple statistics:
  – Number or percentage of responses in a particular category (see the sketch below)
- Bar charts can be used to display the data graphically
- More advanced statistical techniques, such as cluster analysis, can also be used
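A spreadsheet works well for this, but the same simple statistics can be computed in a few lines of code. The sketch below is not from the slides; the answers and chart title are invented for illustration.

```python
# Minimal sketch (not from the slides): tallying closed-question responses
# into counts and percentages, then plotting a bar chart.
# The answers and question title below are made up for illustration.
from collections import Counter
import matplotlib.pyplot as plt

answers = ["agree", "agree", "strongly agree", "disagree", "agree", "OK"]

counts = Counter(answers)
total = len(answers)
for category, n in counts.items():
    print(f"{category}: {n} responses ({100 * n / total:.0f}%)")

# Simple bar chart of the same counts.
plt.bar(list(counts.keys()), list(counts.values()))
plt.ylabel("Number of responses")
plt.title("The use of color is excellent")
plt.show()
```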

24 Asking Experts
- When users are not accessible, or involving them is too expensive, we can ask experts, or a combination of experts and users, to provide feedback.

25 Various inspection techniques
- Heuristic evaluations
- Walkthroughs
- In both, experts inspect the human-computer interface and predict the problems users would have when interacting with it.

26 Advantages
- Relatively inexpensive
- Easy to learn
- Effective
- Can be used at any stage of a design project
- In heuristic evaluation, five evaluators can usually identify around 75% of the total usability problems (see the sketch below).
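Figures like "five evaluators find about 75% of the problems" are often explained with the Nielsen and Landauer problem-discovery model, found(i) = N(1 - (1 - L)^i), where L is the proportion of problems a single evaluator finds. The sketch below is not from the slides, and the value of L is an assumption chosen to reproduce the 75% figure; reported values vary by study.

```python
# Minimal sketch (not from the slides): the problem-discovery model often used
# to explain figures such as "five evaluators find ~75% of problems":
#   found(i) = N * (1 - (1 - L)**i)
# where L is the proportion of problems one evaluator finds.
# L below is an assumption; reported values vary from study to study.

def proportion_found(i, single_evaluator_rate=0.24):
    """Proportion of all usability problems found by i independent evaluators."""
    return 1 - (1 - single_evaluator_rate) ** i

for i in range(1, 11):
    print(f"{i} evaluators: {proportion_found(i):.0%} of problems")
# With L = 0.24, five evaluators find about 75% of the problems.
```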

27 Heuristic evaluation
- Developed by Jakob Nielsen and colleagues
- An informal usability inspection technique
- Experts are guided by a set of usability principles, known as heuristics
- Experts evaluate whether user-interface elements conform to those principles

28 Heuristics (1)
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose, and recover from errors

29 Heuristics (2)
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation

30 Different heuristics for specific purposes
- The core heuristics are too general for some purposes
- The next slide shows an example set of heuristics for websites
- There are also heuristics for evaluating toys, WAP devices, online communities, wearable computers, etc.
- These heuristics are developed by tailoring Nielsen's heuristics, drawing on market research, etc.

31 HOMERUN: heuristics for commercial websites
- High-quality content
- Often updated
- Minimal download time
- Ease of use
- Relevant to users' needs
- Unique to the online medium
- Netcentric corporate culture

32 How to do heuristic evaluation
- Briefing session
- Evaluation period
- Debriefing session

33 How to do heuristic evaluation
- Briefing session: tell the experts what to do; use a prepared script as a guide
- Evaluation period
- Debriefing session

34 How to do heuristic evaluation
- Briefing session
- Evaluation period: two passes; in the first pass, gain a feel for the system; in the second pass, focus on specific interface elements
- Debriefing session

35 How to do heuristic evaluation
- Briefing session
- Evaluation period
- Debriefing session: experts come together to discuss their findings, prioritize the problems found, and suggest solutions (see the sketch below)
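One practical part of the debriefing is merging the problem lists from the individual evaluators and ranking them. The sketch below shows one way to do that; it is not from the slides, and the record fields, the 0-4 severity scale, and the example findings are all assumptions made for illustration.

```python
# Minimal sketch (not from the slides): merging problems logged by several
# evaluators and ranking them for the debriefing session. The fields, the
# severity scale (0-4), and the example findings are made up for illustration.

findings = [
    {"evaluator": "A", "problem": "link colors are non-standard",
     "heuristic": "Consistency and standards", "severity": 3},
    {"evaluator": "B", "problem": "link colors are non-standard",
     "heuristic": "Consistency and standards", "severity": 2},
    {"evaluator": "B", "problem": "no feedback after submitting the form",
     "heuristic": "Visibility of system status", "severity": 4},
]

# Merge duplicate reports of the same problem, keeping the highest severity.
merged = {}
for f in findings:
    key = f["problem"]
    if key not in merged or f["severity"] > merged[key]["severity"]:
        merged[key] = {"heuristic": f["heuristic"], "severity": f["severity"]}

# Discuss the most severe problems first.
for problem, info in sorted(merged.items(), key=lambda kv: -kv[1]["severity"]):
    print(f'[{info["severity"]}] {problem} ({info["heuristic"]})')
```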

36 Heuristic evaluation
- Selecting appropriate heuristics is very important
- Because users are not involved, there are fewer practical and ethical issues
- A week is often cited as the time needed to train experts to be evaluators

37 Dilemma: problems or false alarms?
- Different approaches often identify different problems, and heuristic evaluation sometimes misses severe problems.
- About 33% of the problems reported by heuristic evaluation are real usability problems; it misses about 21% of users' problems; and 43% of the reported problems are not problems at all.

38 How to reduce the number of false alarms and missed problems
- Use complementary user-testing techniques along with heuristic evaluation
- Check if experts really have the expertise that they claim
- Have several evaluators, to avoid one person's bias or poor performance

39 Heuristic evaluation of websites
- In 1999, usability consultant Keith Cogdill was commissioned by the NLM to evaluate MEDLINEplus.
- He identified seven heuristics, which were given to three experts who independently evaluated MEDLINEplus.

40 Heuristics used by Cogdill
- Internal consistency
- Simple dialog
- Shortcuts
- Minimizing memory load
- Preventing errors
- Feedback
- Internal locus of control
* What heuristics would we use to analyze the ICS website?

41 Results of the study
- Layout: uncomplicated vertical design; well suited for printing; conservative use of graphics
- Internal consistency: formatting of pages and the logo is consistent across the website
- Arrangement of health topics: topics should be arranged alphabetically as well as in categories
- Depth of navigation menu: increase the fan-out of the navigation menu in the left margin

42 Summary of heuristics for web design
- Navigation:
  – Avoid orphan pages
  – Avoid excessive white space that results in long pages
  – Provide navigation support
  – Avoid narrow, deep, hierarchical menus
  – Avoid non-standard link colors
  – Provide a consistent look and feel for navigation
- Access:
  – Avoid complex URLs
  – Avoid long download times
- Information design:
  – Support content comprehension and aesthetics

43 Heuristics for...
- For online communities:
  – Sociability
  – Usability
- For other devices (handhelds, computerized toys)

44 Another technique: walkthroughs
- Cognitive walkthroughs
- Pluralistic walkthroughs

45 Cognitive walkthroughs: definition
- Simulating the user's problem-solving process at each step in the human-computer dialog
- Checking to see whether the user's goals and memory for actions can be assumed to lead to the next correct action
- They focus on evaluating designs for ease of learning

46 Cognitive walkthroughs: steps (1-3)
1. Identify the characteristics of typical users; develop sample tasks; produce a prototype of the interface, or a description of it; and generate a clear sequence of the actions needed for users to complete each task
2. A designer and one or more experts begin the analysis
3. The evaluators walk through the action sequence for each task, trying to answer the following questions: Will users know what to do? Will they see how to do it? Will they understand from feedback whether the action was correct or not?

47 Cognitive walkthroughs: steps (4-5)
4. Record critical information, including:
  – Assumptions about what would cause problems, and why
  – Notes about side issues and design changes
  – A summary of the results
5. Revise the design according to the results (a recording sketch follows below).
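One way to keep this record is a simple per-step log of the three questions from step 3 plus notes. The sketch below uses the book-finding task from the next slide, but the record structure, field names, and example notes are made up, not part of the slides.

```python
# Minimal sketch (not from the slides): a made-up record structure for logging a
# cognitive walkthrough, one entry per action step, with answers to the three
# questions from step 3 and notes about likely problems.

walkthrough = {
    "task": "Find a book at Amazon.com",
    "user_profile": "students who use the web regularly",
    "steps": [
        {
            "action": "Type the book title into the search box",
            "will_know_what_to_do": True,
            "will_see_how_to_do_it": True,
            "will_understand_feedback": True,
            "notes": "",
        },
        {
            "action": "Pick the right edition from the results list",
            "will_know_what_to_do": True,
            "will_see_how_to_do_it": False,
            "will_understand_feedback": True,
            "notes": "Several near-identical entries; users may pick the wrong edition",
        },
    ],
}

# Summarize the steps where the walkthrough predicts a problem.
for step in walkthrough["steps"]:
    if not (step["will_know_what_to_do"]
            and step["will_see_how_to_do_it"]
            and step["will_understand_feedback"]):
        print(f'Likely problem at: {step["action"]} - {step["notes"]}')
```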

48 Example: finding a book at Amazon.com
- Let's walk through the process of finding a book at Amazon.com
- Task: find a book at Amazon.com
- Typical user: students who use the web regularly
- Work through the specific steps, questions, and answers

49 Pros and cons of cognitive walkthroughs
- They take longer than heuristic evaluation of the same part, because they examine each step of a task
- You may get much more detailed information from a cognitive walkthrough
- They are useful for examining a small part of a system, whereas heuristic evaluation is useful for examining a whole system

50 Pluralistic walkthroughs: definition
- Another type of walkthrough, in which users, developers, and usability experts work together to step through a scenario, discussing the usability issues associated with the dialog elements involved in the scenario steps

51 Pluralistic walkthroughs: steps
- Develop scenarios in the form of a series (usually one or two) of hard-copy screens representing a single path through the interface
- Each panelist is asked to write down the sequence of actions they would take to move from one screen to another
- Discuss the actions from that round of review (users, then experts, then designers)
- Move on to the next round of screens; the process continues until all the scenarios have been evaluated

52 Benefits and constraints of pluralistic walkthroughs
- Strong focus on users' tasks
- Performance data are produced, and many designers like the apparent clarity of quantitative data
- A multidisciplinary team is involved
- Constraint: only a limited number of screens can be covered each time, so it takes a relatively long time to complete

53 Thanks, that’s all!

