Lecture 2: Discovering what people can't tell you: Contextual Inquiry and Analysis Methodology
Brad Myers
05-863 / 08-763 / 46-863: Introduction to Human Computer Interaction for Technology Executives
Fall, 2013, Mini 2
© Brad Myers

Happy Halloween! Take 2 candies!

Resolve Devices for Assignments
- On the GoogleDoc

Some Usability Methods
- Contextual Inquiry
- Contextual Analysis (Design)
- Paper prototypes
- Think-aloud protocols
- Heuristic Evaluation
- Affinity diagrams (WAAD)
- Personas
- Wizard of Oz
- Task analysis
- Cognitive Walkthrough
- KLM and GOMS (CogTool)
- Video prototyping
- Body storming
- Expert interviews
- A vs. B studies
- Questionnaires
- Surveys
- Interaction Relabeling
- Log analysis
- Focus groups
- Card sorting
- Diary studies
- Improvisation
- Use cases
- Scenarios
- Cognitive Dimensions
- "Speed Dating"
- …

Contextual Inquiry and Analysis/Design
- One method for organizing the development process
- We teach it to our MS and BS students
- Proven to be very successful
- Hartson-Pyla text: Chapters 3-6 (doing things in a different order than the text)
- Also described in this classic book: H. Beyer and K. Holtzblatt, Contextual Design: Defining Customer-Centered Systems. San Francisco, CA: Morgan Kaufmann Publishers, Inc.

Contextual Inquiry & Analysis/Design Contextual Inquiry A kind of “ethnographic” or “participatory design” method Combines aspects of other methods: Interviewing, think-aloud protocols, participant/observer in the context of the work Afterwards: Contextual Analysis (Hartson- Pyla term) Beyer-Holtzblatt call it “Contextual Design” Also includes diagrams (“models”) to describe results © Brad Myers6

"Contextual Inquiry"
- Interpretive field research method
- Depends on conversations with users in the context of their work
- Used to define requirements, plans, and designs
  - Discover the real requirements of the work
- Drives the creative process:
  - In original design
  - In considering new features or functionality

Context
- Definition: "The interrelated conditions within which something occurs or exists"
- Understand work in its natural environment
  - Go to the user
  - Observe real work
  - Use real examples and artifacts
    - "Artifact": an object created by human workmanship
  - Interview while she/he is working
    - More reliable than asking them
- Context exists even when not a "work" activity
  - Use "work" here just to mean "doing something"
  - Can be home, entertainment, etc.

Elements of User's Context: Pay Attention to All of These
- User's work space
- User's work
- User's workarounds
- User's work intentions
- User's words (language used)
- Tools used
- How people work together
- Business goals
- Organizational and cultural structure

Why Context?
- Design the complete work process
  - Fits into the "fabric" of entire operations
  - Not just "point solutions" to specific problems
  - Integration! Consistency, effectiveness, efficiency, coherence
- Design from data
  - Not just opinions and negotiation
  - Not just a list of features

Key Distinctions about CIs
- Interviews, surveys, focus groups:
  - Summary data & abstractions
  - What customers say
  - Subjective
  - Limited by reliability of human memory
  - What customers think they want
- Contextual Inquiry:
  - Ongoing experience & concrete data
  - What users do
  - Objective
  - Spontaneous, as it happens
  - What users actually need

Who?
- Users
  - Between 6 and 20
  - Representative of different roles
  - Note: may not be the people who will be doing the purchasing of the system
    - E.g., if for an enterprise; public kiosk
- Interviewers: "cross-functional" team
  - Designers
  - UI specialists
  - Product managers
  - Marketing
  - Technical people

Partnership
- Definition: a relationship characterized by close cooperation
- Build an equitable relationship with the user
- Suspend your assumptions and beliefs
- Invite the user into the inquiry process

Why is Partnership Important?
- Information is obtained through a dialog
- The user is the expert
- Not a conventional interview or consultant relationship
- Alternative way to view the relationship: master/apprentice
  - The user is the "master craftsman" at his/her work
  - You are the apprentice trying to learn

Establishing Partnership
- Share control
- Use open-ended questions that invite users to talk:
  - "What are you doing?"
  - "Is that what you expect?"
  - "Why are you doing...?"
- Let the user lead the conversation
- Listen!
- Pay attention to communication that is non-verbal

Some Alternative Contextual Inquiry Interview Methods
- For intermittent tasks:
  - In-context cued recall
  - Activity logs
- For uninterruptible tasks:
  - Post-observation inquiry
- For extremely long or multi-person tasks:
  - Artifact walkthrough
- New technology within current work:
  - Future scenario
- Prototype or prior version exists:
  - Prototype/test drive

Interview Recording and Note-Taking
- Do record the interview
  - Video recordings
  - Screen capture software with laptop microphone for the user
- When to take notes?
  - Note taking can help you pay closer attention
  - Notes lead to faster turn-around
  - Do not let it interfere with interviewing
    - Usually would use a second person
- How to record? (see the sample notes below)
  - What the user says – in quotes
  - What the user does – plain text
  - Your interpretation – in parentheses
- Write fast!
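For illustration, a hypothetical fragment of notes (not from the original slides) following this convention, taken while watching a user compare flights, might read:

  "I never trust the default sort order"
  Re-sorts the results by price, scrolls past the first three listings
  (Workaround suggests the default ranking doesn't match what she actually cares about)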

Analysis
- In the moment: simultaneous data collection and analysis during the interview
- Post-interview: using notes, tapes, and transcripts
- Analysis by a group:
  - Integrates multiple perspectives
  - Creates shared vision
  - Creates shared focus
  - Builds teams
  - Saves time

Defining the Tasks
- In a real Contextual Inquiry, the user decides the tasks
  - Investigate real-world tasks, needs, context
- But you still must decide the focus
  - What tasks you want to observe
  - That are relevant to your product plan
- But for Assignment 1, you will have to invent some tasks

Test Tasks
- Task design is a difficult part of usability testing
- Representative of "real" tasks
  - Sufficiently realistic and compelling so users are motivated to finish
  - Can let users create their own tasks if relevant
- Appropriate difficulty and coverage
  - Should last about 2 min. for an expert, less than 30 min. for a novice
  - Short enough to be finished, but not trivial
- Tasks not humorous, frivolous, or offensive
- Easy task first, progressively harder
  - But better if independent
- Remember: not asking their opinions

Initial Questions for the Users
- Find out the context through initial questions:
  - When would you normally do this kind of task?
  - Who would be involved in making the decisions?
  - What would influence any decisions?
  - How would you know what to do?
  - What information would you use to help decide?

Test Script
- Useful to have a script (a sample excerpt follows below)
  - Make sure you say everything you want
  - Make sure all users get the same instructions
- Should read instructions out loud
- Ask if users have any questions
- Make sure instructions provide goals only in a general way, and don't give away information
  - Describe the result and not the steps
  - Avoid product names and technical terms that appear on the web site
    - Don't give away the vocabulary
  - Example: "The clock should have the right time"; not: "Use the hours and minutes buttons to set the time"
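As a hypothetical illustration (not from the original slides), the opening of such a script might read:

  "Thank you for helping us today. We are testing the web site, not you, so nothing you do is a mistake. I'll read each task out loud, and you can ask questions at any time, although I may not be able to answer all of them until we're done. Here is your first task: you have just bought a wall clock and its display is blank. The clock should have the right time."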

Example of CI
- Video of a sample session with an eCommerce site
- Issues to observe:
  - Interview of work in progress, in "context"
    - Actual session of doing a task
    - Not an interview asking about possible tasks, etc.
  - Note the focus on expert behavior & breakdowns
  - Questions to clarify about routine, motivations
    - Why do certain actions: need the intent for actions
  - Notice problems ("breakdowns")
  - Notice what happens that causes users to do something ("triggers")
    - E.g., appearance of error messages, other feedback, external events (phone ringing), etc.

Screen shots of important points in the video