An evaluation framework

1 An evaluation framework

2 Evaluation paradigm Any kind of evaluation is guided explicitly or implicitly by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an ‘evaluation paradigm’.

3 User studies User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.

4 Four evaluation paradigms
- ‘quick and dirty’
- usability testing
- field studies
- predictive evaluation

5 Quick and dirty ‘Quick & dirty’ evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users’ needs and are liked. Quick & dirty evaluations are done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

6 Usability testing Usability testing involves recording typical users’ performance on typical tasks in controlled settings. Field observations may also be used. As the users perform these tasks they are watched & recorded on video & their key presses are logged. This data is used to calculate performance times, identify errors & help explain why the users did what they did. User satisfaction questionnaires & interviews are used to elicit users’ opinions.
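As a rough sketch of the kind of analysis such logging supports (the log format, event names and the compute_metrics helper are hypothetical illustrations, not part of the slides), task times and error counts could be derived from the event stream like this:

from collections import defaultdict

# Hypothetical log: (timestamp in seconds, participant id, event name) per entry.
# The event names "task_start", "task_end" and "error" are assumptions for this sketch.
log = [
    (0.0,  "P1", "task_start"),
    (4.2,  "P1", "error"),
    (31.5, "P1", "task_end"),
    (0.0,  "P2", "task_start"),
    (22.8, "P2", "task_end"),
]

def compute_metrics(events):
    """Return task completion time (seconds) and error count per participant."""
    start, end, errors = {}, {}, defaultdict(int)
    for t, participant, event in events:
        if event == "task_start":
            start[participant] = t
        elif event == "task_end":
            end[participant] = t
        elif event == "error":
            errors[participant] += 1
    return {p: {"time": end[p] - start[p], "errors": errors[p]}
            for p in start if p in end}

print(compute_metrics(log))  # {'P1': {'time': 31.5, 'errors': 1}, 'P2': {'time': 22.8, 'errors': 0}}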

7 Field studies Field studies are done in natural settings.
The aim is to understand what users do naturally and how technology impacts them. In product design, field studies can be used to: - identify opportunities for new technology - determine design requirements - decide how best to introduce new technology - evaluate technology in use.

8 Predictive evaluation
Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. Another approach involves theoretically based models. A key feature of predictive evaluation is that users need not be present. It is relatively quick & inexpensive.
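One well-known example of a theoretically based model is the Keystroke-Level Model (the slides do not name a specific model; this is an illustrative choice). A minimal Python sketch, using approximate textbook operator times and an invented task breakdown:

# Keystroke-Level Model sketch: predict expert task time by summing operator times.
# Operator times are approximate textbook values in seconds, not measured data.
OPERATORS = {
    "K": 0.2,   # press a key or button (average typist)
    "P": 1.1,   # point at a target with the mouse
    "H": 0.4,   # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def predict_time(sequence):
    """Sum the operator times for a task written as a list of operator codes."""
    return sum(OPERATORS[op] for op in sequence)

# Invented example: think, point at a field, click it, move to the keyboard, type 4 digits.
task = ["M", "P", "K", "H", "K", "K", "K", "K"]
print(f"Predicted task time: {predict_time(task):.2f} s")  # about 3.85 s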

9 Nielsen’s heuristics & Shneiderman’s eight golden rules
Nielsen’s heuristics:
- Visibility of system status
- Match between system and the real world
- User control and freedom
- Consistency and standards
- Help users recognize, diagnose and recover from errors
- Error prevention
- Recognition rather than recall
- Flexibility and efficiency of use
- Aesthetic and minimalist design
- Help and documentation
Shneiderman’s eight golden rules:
- Strive for consistency
- Enable frequent users to use shortcuts
- Offer informative feedback
- Design dialogs to yield closure
- Offer error prevention and simple error handling
- Permit easy reversal of errors
- Support internal locus of control
- Reduce short-term memory load

10 Overview of techniques
observing users, asking users their opinions, asking experts their opinions, testing users’ performance, modeling users’ task performance

11 DECIDE: A framework to guide evaluation
Determine the goals the evaluation addresses.
Explore the specific questions to be answered.
Choose the evaluation paradigm and techniques to answer the questions.
Identify the practical issues.
Decide how to deal with the ethical issues.
Evaluate, interpret and present the data.

12 Determine the goals What are the high-level goals of the evaluation?
Who wants it and why? The goals influence the paradigm for the study. Some examples of goals: Identify the best metaphor on which to base the design. Check to ensure that the final interface is consistent. Investigate how technology affects working practices. Improve the usability of an existing product.

13 Explore the questions All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: - What are customers’ attitudes to these new tickets? - Are they concerned about security? - Is the interface for obtaining them poor? What questions might you ask about the design of a cell phone?

14 Choose the evaluation paradigm & techniques
The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented. E.g., field studies do not involve testing or modeling.

15 Identify practical issues
For example, how to: select users (who, how many), stay on budget, stay on schedule, find evaluators, select equipment

16 Decide on ethical issues
Develop an informed consent form. Participants have a right to: - know the goals of the study - know what will happen to the findings - privacy of personal information - not be quoted without their agreement - leave when they wish - be treated politely

17 Consent form (Samtykkeerklæring) I hereby declare that I am over 18 years of age and wish to take part in a study led by Håkon Tolsby and his students at Høgskolen i Østfold. The purpose of the study is to evaluate the usability and usefulness of the college's home pages in order to improve and redesign them. The study involves observing me while I use the home pages to carry out specific tasks. I will also be asked open questions about the home pages and about my experiences using them. All information collected in the study is confidential, and my name or identity will not be identifiable. I understand that I am free to ask questions and to withdraw from participation at any time. Participant's signature Date

18 Evaluate, interpret & present data
How data is analyzed & presented depends on the paradigm and techniques used. The following also need to be considered: - Reliability: can the study be replicated? - Validity: is it measuring what you thought? - Biases: is the process creating biases? - Scope: can the findings be generalized? - Ecological validity: is the environment of the study influencing it? e.g., the Hawthorne effect
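As a small illustration of presenting such data (the numbers here are invented for the example), task times from a usability test could be summarized with a mean and a rough 95% confidence interval:

import statistics

# Invented task completion times in seconds, for illustration only.
times = [31.5, 22.8, 45.0, 28.3, 39.1, 33.7]

mean = statistics.mean(times)
sd = statistics.stdev(times)            # sample standard deviation
n = len(times)
# Rough 95% interval using the normal approximation (1.96); with this few
# participants a t-distribution critical value would be more appropriate.
half_width = 1.96 * sd / n ** 0.5

print(f"mean = {mean:.1f} s, 95% CI approx. [{mean - half_width:.1f}, {mean + half_width:.1f}] s")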

19 Hawthorne Effect 1920s study to evaluate the effects of lighting on assembly line productivity. Hypothesis: more light, better productivity. Identified performance measure: parts produced per day.

20 Hawthorne Effect Note: graphs show trends rather than accurate data from the studies! Tested baseline. Tested with more lighting. Found: more light, more productivity.

21 Hawthorne Effect Went back to baseline lighting condition

22 Hawthorne Effect Added a fourth condition: even more lighting

23 Hawthorne Effect Conclusion: Lighting was not affecting performance
Rather, the attention was causing the improved performance. Something very important to keep in mind when doing experiments / usability testing with human subjects!

24 Hawthorne Effect Ways to minimize problems with the Hawthorne Effect:
- Keep a low-key attitude: ”Want to see how it is to work with this system...”, not ”We are performing this highly scientific test to determine conclusively...”
- Be a bit informal, but not sloppy: find a balance between warm (friendly, open) and cool (distant) – not chummy, not cold.
- Treat all subjects the same.
- Appear / be professional – out of respect for the subject and so the subject will take the test seriously.
- However, by being low-key and informal, the subject feels less watched and the test seems less intrusive.

25 Pilot studies A small trial run of the main study.
The aim is to make sure your plan is viable. Pilot studies check: - that you can conduct the procedure - that interview scripts, questionnaires, experiments, etc. work appropriately. It’s worth doing several to iron out problems before doing the main study. Ask colleagues if you can’t spare real users.

