With slides from Greenberg & Buxton’s CHI 2008 presentation “Usability Evaluation Considered Harmful (Sometimes)”


Usability evaluation, if wrongfully applied, can:
In early design:
– stifle innovation by quashing (valuable) ideas
– promote (poor) ideas for the wrong reason
In science:
– lead to weak science
In cultural appropriation:
– ignore how a design would be used in everyday practice

The Solution: Methodology 101
The choice of evaluation methodology, if any, must arise from and be appropriate for the actual problem, research question, or product under consideration.

Usability Evaluation
“Assess our designs and test our systems to ensure that they actually behave as we expect and meet the requirements of the user.”
– Dix, Finlay, Abowd, and Beale (1993)

Usability Evaluation Methods
Most common (research):
– controlled user studies
– laboratory-based user observations
Less common:
– inspection
– contextual interviews
– field studies / ethnography
– data mining
– analytic / theory
– …

CHI Trends (Barkhuus & Rode, alt.chi 2007)
[chart: the rising share of CHI papers that include user evaluation]

CHI Trends
User evaluation is now a prerequisite for CHI acceptance.

CHI Trends (Call for Papers, 2008)
Authors: “you will probably want to demonstrate ‘evaluation’ validity, by subjecting your design to tests that demonstrate its effectiveness”
Reviewers: “reviewers often cite problems with validity, rather than with the contribution per se, as the reason to reject a paper”

Dogma
Usability evaluation = validation = CHI = HCI

Discovery vs. Invention (Scott Hudson, UIST ’07)
Discovery:
– uncover facts
– detailed evaluation
– understand what is
Invention:
– create new things
– refine the invention
– influence what will be

[figure: Brian Gaines’ learning-curve model, plotting learning over time through the phases Breakthrough, Replication, Empiricism, Theory, Automation, Maturity]

[same figure, annotated: Breakthrough and Replication correspond to early design & invention; Empiricism and Theory to science; Automation and Maturity to cultural appropriation]

Part 4. Early Design (Breakthrough, Replication)

Memex (Vannevar Bush)
[image: Bush’s Memex concept]

A review Bush might have received: “Unimplemented and untested design. Microfilm is impractical. The work is premature and untested. Resubmit after you build and evaluate this design.”

We usually get it wrong

Early design as working sketches
Sketches are innovations valuable to HCI.

Early design
Early usability evaluation can kill a promising idea:
– focus on negative ‘usability problems’

Early designs
Iterative testing can promote a mediocre idea.

Early design
Generate and vary ideas, then reduce; apply usability evaluation to the better ideas.
[diagram: many candidate ideas winnowed to a few]

Early designs as working sketches
Getting the design right vs. getting the right design.

Early designs as working sketches
Methods: idea generation, variation, argumentation, design critique, reflection, requirements analysis, personas, scenarios, contrast, prediction, refinement, …

Part 5. Science (Empiricism, Theory)

I need to do an evaluation

What’s the problem?

It won’t get accepted if I don’t. Duh!

Source: whatitslikeontheinside.com/2005/10/pop-quiz-whats-wrong-with-this-picture.html

Research process
– Choose the method, then define a problem; or
– Define a problem, then choose usability evaluation; or
– Define a problem, then choose a method to solve it

Research process
Typical usability tests:
– show the technique is better than existing ones
Existence proof: one example of success

Research process
Risky hypothesis testing (see the sketch below):
– try to disprove the hypothesis
– the more you can’t, the more likely it holds
What to do:
– test limitations / boundary conditions
– incorporate ecology of use
– replication
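A minimal sketch of this idea in Python, using entirely hypothetical timing data (none of these numbers come from the talk): instead of comparing techniques under conditions chosen to flatter the new one, a risky test probes a hard boundary condition and tries to falsify the claimed advantage.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical task-completion times (seconds) for 20 participants per
# condition, measured under a deliberately hard boundary condition
# (e.g., very small targets) rather than a favorable lab setting.
new_technique = rng.normal(loc=5.2, scale=1.0, size=20)
baseline = rng.normal(loc=5.0, scale=1.0, size=20)

# Welch's t-test: does the claimed advantage survive the hard condition?
t_stat, p_value = stats.ttest_ind(new_technique, baseline, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# One favorable comparison is only an existence proof. A hypothesis that
# survives repeated falsification attempts (boundary conditions, realistic
# ecologies of use, replications) is much stronger evidence.
```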

Part 6. Cultural Appropriation (Automation, Maturity)

Memex (Vannevar Bush), revisited
[image: Bush’s Memex concept]

[timeline: 1945 Bush · 1979 Ted Nelson · 1987 NoteCards · 1989 ACM · 1990 HTTP · 1992 Sepia · 1993 Mosaic]

Part 7. What to do

Evaluation

More Appropriate Evaluation

The choice of evaluation methodology, if any, must arise from and be appropriate for the actual problem or research question under consideration:
– argumentation, case studies
– design critiques, field studies
– design competitions, cultural probes
– visions, extreme uses
– inventions, requirements analysis
– prediction, contextual inquiries
– reflection, ethnographies
– design rationales, eat your own dogfood
– …

Discussion  Do you agree with Buxton & Greenberg’s opinions?

Discussion  Based on your experience with comparative evaluation in MP2, do you think that such studies provide an appropriate/accurate validation of the password techniques?

Discussion  If you had all the resources in the world (time, money) and were developing a novel technique, where would you position evaluation in the design process? What kind would you use?

Defining usability
[image: the usability of fruit]

Users’ expectations
Society’s expectations are reset every time a radically new technology is introduced. Expectations then move up the pyramid as that technology matures.
[pyramid diagram; axis: degree of abundance]

Final Exam
10am, Thursday, August 4th, 2011, here in 127
– Bring a pen and/or pencil
– I will provide the necessary context within the question
– Questions will evaluate your ability to apply the material, not your ability to recall it

What will be on it?
– As yet unwritten
– 2-3 questions
– “Given this scenario, how would you evaluate X?”
– Models
– Implications for design
– Validating through empirical studies
– Human in the Loop framework
– Fitts’ Law
– Cognitive models
– Study designs: within/between, hypothesis testing, …
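As a refresher on the Fitts’ Law item, here is a small sketch of the Shannon formulation, MT = a + b · log2(D/W + 1); the constants a and b below are illustrative placeholders, not values from the course.

```python
import math

def fitts_movement_time(distance: float, width: float,
                        a: float = 0.1, b: float = 0.15) -> float:
    """Predicted pointing time in seconds (Shannon formulation):
    MT = a + b * log2(D/W + 1), with a, b fitted empirically."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# Halving target width (or doubling distance) adds roughly one bit of
# difficulty, and thus a constant increment b to the predicted time.
print(fitts_movement_time(distance=256, width=32))  # ID ≈ 3.17 bits
print(fitts_movement_time(distance=256, width=16))  # ID ≈ 4.09 bits
```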