
Groups
Workout Buddy: Luke Lones, Stephanie Mertz, Allen Osgood, Micah Goodman
STL Parking: Carlos Gonzalez, Nathan Gitter, Jeanine Burke, Charles Ahrens Feldman
Better Cook: Jay Suo, Wint Hnin, Joyce Zhang, Wenchao Wang
Cheapest Ride: Daniel Mazour, Carter Durno, Elliott Battle, Claire Yuan

Gathering Requirements

Startup Failure 9/10 startups fail. The #1 reason (according to Fortune): they build a product that no one wants. The requirements gathering phase is intended to protect you from that form of failure.

What info do we need to build a new system?
Who are the users?
What are their tasks? How do they complete those tasks?
Why? What are the goals behind the tasks?
Where? In what context do these tasks occur?

Ideally We collect a mess of sometimes contradictory, detailed information about potential users and their needs, struggles, and so on. We then use techniques like affinity diagramming and card sorting to identify important themes. Finally, we re-evaluate our initial vision, decide whether it's the right one, and extract answers to who/what/how/why/where (revisiting the information-gathering steps as needed).

In-Class Design Challenge You are in the advanced development office at a software and consumer electronics company. Your boss comes to you saying he/she wants you to explore the potential for some kind of device that helps keep geographically separated families feeling more connected using photographs. Who should we interview?

What do we want to know? We have a hypothesis that photographs can play an important role in connectedness. Can they? Are there other kinds of opportunities for technologies that support connectedness? What are the circumstances under which a new technology needs to operate? How will it fit into users’ lives?

An interface design process (diagram):
Goals: articulate who the users are and what their key tasks are; brainstorm designs; refine designs; complete designs; evaluate.
Methods: interviewing, shadowing, contextual inquiry, card sorts, and affinity diagrams; psychology of everyday things and low-fidelity prototyping methods; visual design, graphical screen design, interface guidelines, style guides, and high-fidelity prototyping methods; user tests, task scenario walk-throughs, usability testing, cognitive walkthroughs, heuristic evaluation, remote evaluation, and field testing.
Products: user and task descriptions; throw-away paper prototypes; testable prototypes; alpha/beta systems or a complete specification.

Tools?

Core Tools:

Interviewing
Pre-introduction: introductions, a bit of getting to know each other.
Introduction: explain the goals of the interview, reassure the interviewee about ethical issues, ask permission to record, and present an informed consent form.
Warm-up: make the first questions easy and non-threatening.
Main body: present questions in a logical order.
Cool-off period: include a few easy questions to defuse tension at the end.
Closure: thank the interviewee and signal the end, e.g., by switching the recorder off.
(White and Kules)

Contextual Inquiry “Contextual Design makes data gathering from the customer the base criterion for deciding what the system should do…” “The core premise of Contextual Inquiry is very simple: go where the customer works, observe the customer as he or she works, and talk to the customer about the work. Do that, and you can’t help but gain a better understanding of your customer.” Source: Beyer and Holtzblatt, Contextual Design

Question Is contextual inquiry only possible when we have a functional product, or can it be used earlier in the process (for example, when we're figuring out the layout)? Usually, contextual inquiry is used to study an existing process that you intend to improve or replace. This question jumps ahead to a later stage in the design process: once we start to develop prototypes, we'll use user testing to evaluate how well a new design is working.

User Observation Observe the user doing work. Sometimes this will be in their own context; other times you may want to ask them to do something specific with an existing system. In general, you are going to try not to interrupt much, if at all. Best suited to situations where the "why" is evident through their natural interactions.

Question It seems like contextual inquiry combines elements of observation and interviewing, since it allows you to observe in context as well as to ask questions during the observation. Is it common to use contextual inquiry instead of observation and interviews? In what scenario should this technique not be used?
Is the work observable?
Yes: Will interviewing fundamentally alter the process? If yes, use observation plus a post-observation interview. If no, use contextual inquiry.
No: Use an interview (sometimes supplementary materials can be helpful here).
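
The decision flow above can be read as a small heuristic. As a rough illustration only (not from the slides, and with hypothetical names), here is a minimal Python sketch that encodes the same three outcomes; real studies usually combine techniques rather than picking exactly one:

    def choose_method(observable: bool, interview_alters_process: bool) -> str:
        """Pick a data-gathering technique using the decision flow above."""
        if not observable:
            # Can't watch the work directly: fall back to interviews,
            # possibly supplemented with materials (ads, manuals, logs).
            return "interview"
        if interview_alters_process:
            # Talking during the task would change it (e.g., time-critical work):
            # watch silently, then interview afterwards.
            return "observation + post-observation interview"
        # Observable and tolerant of interruption: ask questions in context.
        return "contextual inquiry"

    # Example: airline rebooking (Scenario 3) is observable but time-critical.
    print(choose_method(observable=True, interview_alters_process=True))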

Scenario 1 You've been asked to redesign the course registration system at Wash U to be more student-centered. Contextual inquiry. This is a situation that isn't particularly sensitive to interruption, time, etc., so watching what people are really doing and asking them about it gives you in-depth and accurate information.

Scenario 2 Metrolink has been getting comments from riders that their existing app makes it hard to get desired information. They want you to design a new app that will help riders use the train system more easily. To capture people's real use context, you'd need to follow them around all day, potentially for days on end, to capture opportunistic use of the Metro (the case that's more likely to be causing problems). So this is a situation where interviewing is probably your best bet.

Scenario 3 An airline wants to improve the interface used to help agents rebook passengers after a cancelled flight. Here it is probably realistic to observe the situation directly. Flights get cancelled with some regularity, so we could just hang out at a busy airport. But, we probably don’t want to use contextual inquiry because time is of the essence here. Passengers want to be rebooked as quickly as possible. So, here we’d want to use observation (be a fly on the wall) and then potentially follow up with an interview afterwards to get additional context about the decisions made.

Scenario 4 Ford Motor Company wants to investigate using Virtual Reality to support car shopping at home. Here our goal is to understand what VR might afford us. What role might VR have? It could potentially increase the probability that a customer comes into a showroom. To do this, we probably want to understand what is and isn't working about existing non-showroom approaches. We'd probably approach this as an interview, potentially supplemented with advertising materials (websites, commercials, and so on); the questions can then probe which aspects of those made the customer decide to seek further information or to disregard that vehicle.
Another possibility is that VR could bring some of the experience of a test drive, or of the interaction at the dealership, into the home. This is a scenario where we can get at the situation pretty easily: we just need to go to a dealership. And here we don't want to change the flow of what's happening, so we'd probably opt for observation again, probably supplemented with interviews with both the salesperson and the customer.

Avoid unless you have no choice:

Focus Groups Effectively group interviews, typically with 3-10 participants, providing a diverse range of opinions. They need to be managed to:
- ensure everyone contributes
- ensure the discussion isn't dominated by one person
- ensure the agenda of topics is covered
(White and Kules)

Question How much do focus groups alter what each person might have said if they were alone in an interview?

Asch Experiment 8 people: 7 paid confederates and one actual participant. A variety of answers was given, some deliberately incorrect, to look at the influence of peer pressure. At least 75% of participants gave the wrong answer to at least one question. http://www.experiment-resources.com/asch-experiment.html

Surveys Surveys can be really problematic in early design ("Which would you prefer: X, Y, or Z?"). Be careful that you aren't making assumptions in the survey that limit the design space. Surveys are most effective when you have a specific question and there is a concrete, known set of answers. They rarely give good general design constraints, but they can sometimes be useful to establish that a problem or need exists in a user community.

Questions to ask constantly Is this a representative set of tasks? Is this a representative set of users? Is there something about my specific methodology that could cause bias? Am I leading the witness? Am I asking my user to be a designer?

Pilot and Iterate You should be trying to eliminate as much bias as you can before you start collecting data from users. Realistically, you won’t get it all. So, when you start to run your early sessions, you need to ask all of these questions again and again until you believe that you are getting complete, representative data.