Questionnaires
Questionnaires At least two points in the development cycle:
Collecting demographic information when conceptualising a site.
Testing the usability of a live site.
Try to pilot a survey before use – there will be ambiguities you hadn't noticed.
Questionnaires Questionnaires (= surveys) are not as easy as they look. Any fool can string together a set of questions, but it is very difficult to construct an unambiguous survey whose results can be used in a meaningful way. For fewer than 10 people, do an interview instead – it is a better and more fruitful use of resources. A printed questionnaire can't be changed once it is sent out, so you've got to be certain what you're looking for.
Questionnaires There are ready-made, pre-tested professional questionnaires for the usability of a working web site:
QUIS = Questionnaire for User Interface Satisfaction (University of Maryland)
SUMI = Software Usability Measurement Inventory (University of Cork)
You can get a copy of SUMI for free, but you need to buy a licence to use either of them.
What to ask about Surveys are good for collecting basic, clear-cut information. Focus on questions that guide design decisions. For each question, ask:
Why do I need to know this?
How will I process the information?
What will I do with the information?
If you don't know the answer, delete the question.
What to ask about Easy questions: gender, age, profession, education, computer skill, type of computer, nationality. These demographic questions check that you have sampled your target population. Next, ask about skills, experience and lifestyle.
What to ask about Needs and preferences:
What kinds of product would you like to buy online?
What problems do people have with web sites? E.g. which of these issues is the worst aspect of browsing the web: download speed – browser incompatibility – getting lost?
What to ask about Don't depend on stereotypes:
If you think men prefer black backgrounds, find out by asking about both gender and preferred background colour.
Wake up!! You want to find out what Napier students think of the student portal. In threes, brainstorm ideas on what they might think about it and then convert these into topics. E.g. one might think that the layout is boring, so one of your topics might be the attractiveness of the page. (plenary)
Structuring Responses
Responses come in three flavours: Checkbox, multiple choice and free. It's a good idea to vary the type of question so that responders don't get stuck on clicking the middle option all the way down the survey.
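For illustration only, here is a minimal Python sketch of how these three response flavours might be represented in a home-grown survey tool; the class names, questions and options are invented and not taken from any particular library:

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class ResponseKind(Enum):
    """The three response flavours mentioned above."""
    CHECKBOX = auto()         # tick any that apply
    MULTIPLE_CHOICE = auto()  # pick exactly one option
    FREE = auto()             # open text answer


@dataclass
class Question:
    text: str
    kind: ResponseKind
    options: list[str] = field(default_factory=list)  # empty for FREE questions


# A toy survey that deliberately mixes question types:
survey = [
    Question("Which of these devices do you own?", ResponseKind.CHECKBOX,
             ["Desktop", "Laptop", "Tablet", "Phone"]),
    Question("How often do you shop online?", ResponseKind.MULTIPLE_CHOICE,
             ["Daily", "Weekly", "Monthly", "Less often"]),
    Question("Anything else you want to tell us?", ResponseKind.FREE),
]
```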
Free response Asks an open question and lets responders give any reply they want. Takes more effort from the responder, so these sections are often left blank. Answers are more difficult to collate because they come in different shapes and sizes. There is also a restricted form that asks for, e.g., quantities.
Free response Good to put at the end of the survey to collect information the responder thought important, but which didn't fit the survey questions. Free responses are a cheap way to get new ideas. These ideas can be checked later with other data-collection methods.
Checkboxes, Checklists
Keep writing to a minimum; allow the responder to answer many questions quickly. Can ask the responder to:
List products they own
List products they would like
List problems in doing their job
List features they'd like on the site
Checkboxes, Checklists
Problem: people will skip reading long lists or miss options. You can replace a long checklist with yes/no radio buttons, but this is more work to answer. NB: make the default radio-button position "nothing selected". If the "yes" button is selected by default and the responder skips the question, you will get the wrong answer.
Multiple Choice Allows you to restrict the responses to easily understood categories. You may not have listed all possibilities, so you could include an "other" box with a space to write in what that is. Multiple-choice questions and checkboxes are very easy to process.
Multiple Choice The user might miss the question by accident, so make sure the response defaults to "no response". Likert scales collect feelings about a single parameter, e.g. "I very much prefer the new branding": strongly disagree … strongly agree.
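As a rough sketch (Python, with made-up response data), Likert answers are usually coded as ordered integers and then summarised by their distribution and a central value:

```python
from collections import Counter
from statistics import median

# Invented answers to "I very much prefer the new branding",
# coded 1 = strongly disagree ... 5 = strongly agree.
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]

counts = Counter(responses)
print({score: counts.get(score, 0) for score in range(1, 6)})  # distribution
print(median(responses))                                        # central value
```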
Interpreting Responses
Document the most common response.
Document the breadth of responses to a question, not just the most popular answer.
Count the total number of responses to each checked item.
A low response to a question may mean that it is unclear.
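A hedged sketch of how such a tally might look in Python; the answers and the 25% skip threshold are invented for illustration:

```python
from collections import Counter

# Invented answers to one question; None marks a skipped question.
answers = ["Download speed", "Getting lost", "Download speed", None,
           "Browser incompatibility", "Download speed", None]

counts = Counter(a for a in answers if a is not None)
print("Most common response:", counts.most_common(1))
print("Breadth of responses:", dict(counts))

skipped = sum(1 for a in answers if a is None)
if skipped / len(answers) > 0.25:  # arbitrary threshold, for illustration only
    print("Many respondents skipped this question - it may be unclear.")
```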
Sampling A small number of responses is better than none at all
Fewer than ten returns won't tell you much.
Fifty or so returns can be relied on.
If you're doing scientific research, you may need as many as 500 if you want the results to be trustworthy.
Sampling Online surveys get a response rate of about 1 – 2%
and snail-mail surveys get about 5 – 10% returns. At a 5% return rate, you'd have to send out 1000 mailshots to get 50 returns. If you are working with small organisations through personal contact, you may get up to 100% returned.
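The arithmetic is simply the target number of returns divided by the expected response rate, rounded up; a tiny Python sketch (the helper name is made up):

```python
import math

def mailshots_needed(target_returns: int, response_rate: float) -> int:
    """How many surveys to send out to expect a given number of returns."""
    return math.ceil(target_returns / response_rate)

print(mailshots_needed(50, 0.05))  # postal survey at 5%  -> 1000
print(mailshots_needed(50, 0.01))  # online survey at 1%  -> 5000
```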
Sampling Improve your rate of return:
Offer a small gift to those who return the survey.
Include a gift with the survey.
Use unusual paper and envelopes.
Keep the survey short and say how long it takes to fill in.
Include an SAE (stamped addressed envelope).
Sampling Include a letter of introduction from someone important
Emphasise that responses will be kept confidential.
Emphasise that their responses will shape the web site.
Give them a deadline for responses.
Politely chase those who didn't respond, asking them to return their forms.
Sampling Selecting survey recipients:
If there is a small number of specific users (as in an intranet), everybody gets a copy and response rates are usually high. It is trickier if you're targeting a mass market:
Advertise the survey on your web site.
If there is a mailing list or newsgroup, use that.
Hand out surveys on a street corner.
Visit an industry convention.
Sampling Snowballing effect: ask each respondent to suggest another potential recipient. In surveys, ask the respondent to forward the survey to a friend. Try to send covering letters so it doesn't look like junk mail.
Sampling Self Selection:
Some will choose to reply to your survey; others will not. People who are motivated to provide feedback may have different user behaviour from those who aren't. E.g. dissatisfied people are less motivated to help you, so you won't find out why they are dissatisfied.
Sampling There will always be self-selection biasing your results, but:
It can be minimised.
It isn't a reason not to survey.
Simply state the problems in your report on the results.
Wake up!!! Let us, as a class, decide on a web site that we all use.
Let's do a questionnaire on it. Then, as a class, add up the answers.
Avoiding Bias Respondents will try to read behind the wording of the question to give you the answer they think you want, and these answers may not be truthful. The remedy is to pre-test (pilot) the questionnaire. This will find questions that are misleading, ambiguous, insulting, or inviting bias. It will also find questions that are always skipped or always given the same answer.
Avoiding Bias Question Skipping:
People skip questions because they are:
Difficult to understand
Not viewed as relevant
Difficult to answer
Part of a long, boring questionnaire
To counter this, keep the questionnaire short and ask respondents to answer every question.
Avoiding Bias Response order for checkboxes or radio buttons:
People often choose the first or the last alternative in the list. One solution is to use a different order for every respondent. Put response options in alphabetical order or deliberately scramble them - otherwise respondents might try to second-guess the “right” answer.
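In an electronic survey the per-respondent reordering can be automated; here is a minimal Python sketch (the option list and respondent ids are invented). Seeding from the respondent's id keeps each person's order stable if they reload the page:

```python
import random

options = ["Download speed", "Browser incompatibility", "Getting lost",
           "Too much advertising"]

def options_for(respondent_id: str) -> list[str]:
    """Shuffle the options differently, but reproducibly, per respondent."""
    rng = random.Random(respondent_id)  # seed from the respondent's id
    shuffled = options.copy()
    rng.shuffle(shuffled)
    return shuffled

print(options_for("respondent-001"))
print(options_for("respondent-002"))
```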
Avoiding Bias Rote answers:
Sometimes respondents fall into a pattern of ticking the middle option all the way down the line. To avoid this, keep switching question type from multiple choice to free response to checklist
Avoiding Bias Negative questions (e.g. which of these are you LEAST likely to buy on the web?): Respondents often miss negatives and respond as though they were positives. Try not to use negatives. If you must, emphasise the negating word (LEAST).
Avoiding Bias Leading Questions:
Are the Iraqi home team terrorists, resistance, insurgents or freedom fighters? Sometimes the language of the question tells the respondent your personal opinions. Can you give an illustrative example?
Avoiding Bias Range bias:
Q1: How often do you use the internet? 15 hours a day or more / 10–15 hours a day / 5–10 hours a day / less than 5 hours a day.
Q2: How often do you use the internet? At least once per day / … times per week / 1–5 times per month / less than once per month.
Q1 will elicit reports of more frequent use than Q2. Consider logarithmic range increases.
Avoiding Bias Range bias:
Pre-test to make sure you have an appropriate range of responses. Consider a blank for writing in the answer.
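One way to get logarithmically increasing ranges is to multiply the band edges (e.g. doubling) rather than adding a fixed step; a small Python sketch with invented band edges for "hours of internet use per week":

```python
# Invented band edges: each edge doubles the previous one,
# so the bands grow logarithmically rather than linearly.
edges = [1, 2, 4, 8, 16, 32]
bands = ([f"less than {edges[0]}"]
         + [f"{lo}-{hi}" for lo, hi in zip(edges, edges[1:])]
         + [f"more than {edges[-1]}"])
print(bands)
# ['less than 1', '1-2', '2-4', '4-8', '8-16', '16-32', 'more than 32']
```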
When to Use Surveys
Inexpensive
Large amounts of data
Good sample size
Reliable demographics
Best used:
Before a project starts, to identify demographics.
When the site is live, to get user feedback.
Steps in implementing a survey
Create the survey on paper.
Pre-test the survey questions.
Turn the paper survey into an electronic survey (e-mail/web).
Do usability testing on the electronic survey.
Inform the target population about the survey.
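As a rough illustration of the "paper survey to electronic survey" step, here is a minimal Python sketch that renders a question list as a plain HTML form; everything in it (the questions, field names and the /survey endpoint) is hypothetical. Note that no radio button is pre-selected, so a skipped question comes back as "no response", as recommended earlier:

```python
# Hypothetical questions; options of None means a free-response box.
questions = [
    ("How often do you use the site?", ["Daily", "Weekly", "Monthly", "Less often"]),
    ("Any other comments?", None),
]

def render_form(qs) -> str:
    parts = ["<form method='post' action='/survey'>"]
    for i, (text, options) in enumerate(qs):
        parts.append(f"<p>{text}</p>")
        if options is None:
            parts.append(f"<textarea name='q{i}'></textarea>")
        else:
            # No button is pre-selected, so a skipped question stays blank.
            for opt in options:
                parts.append(
                    f"<label><input type='radio' name='q{i}' value='{opt}'> {opt}</label>")
    parts.append("<input type='submit' value='Send'></form>")
    return "\n".join(parts)

print(render_form(questions))
```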
Bibliography
Brinck, T., Gergle, D., and Wood, S. T. (2002) Usability for the Web: Designing Web Sites that Work, Morgan Kaufmann, San Francisco, USA.
Lazar, J. (2001) User Centred Web Development, Jones and Bartlett, Sudbury, USA.
Benyon, D., Turner, P., and Turner, S. (2005) Designing Interactive Systems, Addison Wesley, Harlow, UK.