
1 w/ slides from Greenberg & Buxton’s CHI 2008 presentation “Usability Evaluation Considered Harmful (Sometimes)”

2

3

4 Usability evaluation, if wrongly applied, can:
– in early design: stifle innovation by quashing (valuable) ideas, or promote (poor) ideas for the wrong reasons
– in science: lead to weak science
– in cultural appropriation: ignore how a design would be used in everyday practice

5 The Solution – Methodology 101
The choice of evaluation methodology – if any – must arise from and be appropriate for the actual problem, research question or product under consideration.

6 Usability Evaluation
“assess our designs and test our systems to ensure that they actually behave as we expect and meet the requirements of the user” – Dix, Finlay, Abowd, and Beale, 1993

7 Usability Evaluation Methods
Most common (research): controlled user studies; laboratory-based user observations
Less common: inspection; contextual interviews; field studies / ethnographic; data mining; analytic / theory; …

8

9 CHI Trends (Barkhuus/Rode, Alt.CHI 2007) – chart through 2007

10 CHI Trends: user evaluation is now a prerequisite for CHI acceptance

11 CHI Trends (Call for Papers 2008)
Authors: “you will probably want to demonstrate ‘evaluation’ validity, by subjecting your design to tests that demonstrate its effectiveness”
Reviewers: “reviewers often cite problems with validity, rather than with the contribution per se, as the reason to reject a paper”

12 Dogma: usability evaluation = validation = CHI = HCI

13 Discovery vs Invention (Scott Hudson, UIST ’07)
Discovery: uncover facts, detailed evaluation – understand what is
Invention: create new things, refine invention – influence what will be

14 Stages of technology development (Brian Gaines), learning over time: Breakthrough, Replication, Empiricism, Theory, Automation, Maturity

15 The same stages grouped into phases: early design & invention (Breakthrough, Replication); science (Empiricism, Theory); cultural appropriation (Automation, Maturity)

16 Part 4. Early Design (Breakthrough, Replication)

17 The Memex (Vannevar Bush)

18 Unimplemented and untested design. Microfilm is impractical. The work is premature and untested. Resubmit after you build and evaluate this design.

19 We usually get it wrong

20 Early design as working sketches: sketches are innovations valuable to HCI

21 Early design: early usability evaluation can kill a promising idea – focus on negative ‘usability problems’

22 Early designs: iterative testing can promote a mediocre idea

23 Early design: generate and vary ideas, then reduce – usability evaluation of the better ideas

24 Early designs as working sketches: getting the design right vs. getting the right design

25 Early designs as working sketches – methods: idea generation, variation, argumentation, design critique, reflection, requirements analysis, personas, scenarios, contrast, prediction, refinement, …

26 Part 5. Science (Empiricism, Theory)

27 I need to do an evaluation

28 What’s the problem?

29 It won’t get accepted if I don’t. Duh!

30 Source: whatitslikeontheinside.com/2005/10/pop-quiz-whats-wrong-with-this-picture.html

31 Research process
Choose the method, then define a problem, or
Define a problem, then choose usability evaluation, or
Define a problem, then choose a method to solve it

32 Research process – typical usability tests: show the technique is better than existing ones (an existence proof: one example of success)

33 Research process – risky hypothesis testing: try to disprove the hypothesis; the more you can’t, the more likely it holds. What to do: test limitations / boundary conditions, incorporate the ecology of use, replication.
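
The contrast between a single existence proof (slide 32) and risky hypothesis testing (slide 33) can be made concrete with a small worked example. The sketch below is not from the slides: the study, the data, and the choice of a paired t-test plus an effect size are all illustrative assumptions for a hypothetical within-subjects comparison of task-completion times between a baseline technique and a new one.

```python
# Minimal sketch of a comparative usability test (hypothetical data throughout).
# Each of 10 participants completes the same task with a baseline technique
# and with a new technique (within-subjects design).
import numpy as np
from scipy import stats

baseline_times = np.array([12.1, 10.4, 14.0, 11.3, 13.2, 12.8, 10.9, 11.7, 13.5, 12.3])
new_tech_times = np.array([10.2,  9.8, 12.5, 10.1, 12.9, 11.0, 10.4, 10.8, 12.0, 11.1])

# Paired t-test: null hypothesis is "no difference in mean completion time".
result = stats.ttest_rel(baseline_times, new_tech_times)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")

# Effect size for the paired design (mean difference / SD of the differences),
# so the claim is not just "significant" but "how large".
diff = baseline_times - new_tech_times
cohens_dz = diff.mean() / diff.std(ddof=1)
print(f"Cohen's dz = {cohens_dz:.2f}")

# A significant result here is only the existence proof of slide 32. The
# "risky" testing of slide 33 would re-run this comparison under boundary
# conditions (expert users, noisy settings, longer tasks), replicate it with
# new participants, and look for conditions where the advantage disappears.
```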

34 Part 6. Cultural Appropriation (Automation, Maturity)

35 The Memex (Vannevar Bush)

36 Timeline: 1945 Bush, 1979 Ted Nelson, 1987 Notecards, 1989 ACM, 1990 HTTP, 1992 Sepia, 1993 Mosaic

37

38

39 Part 7. What to do

40 Evaluation

41 More Appropriate Evaluation

42 The choice of evaluation methodology – if any – must arise from and be appropriate for the actual problem or research question under consideration: argumentation, case studies, design critiques, field studies, design competitions, cultural probes, visions, extreme uses, inventions, requirements analysis, prediction, contextual inquiries, reflection, ethnographies, design rationales, eat your own dogfood, …

43 http://okcancel.com/

44 Discussion: Do you agree with Buxton & Greenberg’s opinions?

45 Discussion: Based on your experience with comparative evaluation in MP2, do you think that such studies provide an appropriate/accurate validation of the password techniques?

46 Discussion: If you had all the resources in the world (time, money) and were developing a novel technique, where would you position evaluation in the design process? What kind would you use?

47 Defining usability: usability of fruit

48 Users’ expectations
– Society’s expectations are reset every time a radically new technology is introduced
– Expectations then move up the pyramid as that technology matures (pyramid axis: degree of abundance)

49 Final Exam – 10am, Thursday, August 4th, 2011, here in 127
– Bring a pen and/or pencil
– I will provide the necessary context within the question
– Questions will evaluate your ability to apply the material, not your ability to recall it

50 What will be on it?
– As yet unwritten
– 2-3 questions
– Given this scenario, how would you evaluate X?
– Models
– Implications for design
– Validating through empirical studies
– Human in the Loop Framework
– Fitts’ Law
– Cognitive Models
– Study designs: within/between, hypothesis testing, …
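
Since Fitts’ Law appears in the topic list above, a small refresher sketch follows. It is not course-provided code: it assumes the Shannon formulation MT = a + b·log2(D/W + 1), and the constants a and b are illustrative placeholders that would normally be fit to pointing data.

```python
# Refresher sketch: predicted movement time under the Shannon formulation of
# Fitts' Law. The regression constants a and b are illustrative placeholders.
import math

def fitts_mt(distance: float, width: float, a: float = 0.2, b: float = 0.1) -> float:
    """Predicted movement time (seconds) for a target of the given distance and width."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# A farther, equally sized target has a higher index of difficulty and a
# longer predicted movement time.
print(round(fitts_mt(400, 20), 2))  # ~0.64 s with the placeholder constants
print(round(fitts_mt(100, 20), 2))  # ~0.46 s
```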

