Usable Privacy and Security, Carnegie Mellon University, Spring 2008. Lorrie Cranor. Designing user studies, February 11, 2008.


Slide 1: Designing user studies (February 11, 2008)

Slide 2: Administrivia
- Observations due Wednesday; be prepared to discuss them with the class
- Project group formation on Wednesday
- How many of you have projects to propose?
- Feedback on lectures so far

Slide 3: How is HCISEC different from HCI?
- Is it different? If so, how?
- Are different user study methods needed?

Slide 4: Designing and conducting a user study
- Identify purpose and metrics
- Design tasks
- Develop experimental design
- Develop detailed plan, artifacts, protocol, and scripts
- Pilot test and revise
- Obtain IRB approval
- Recruit participants and run the study
- Analyze data

Slide 5: Purpose & metrics
- Identify the purpose of the study
  - What are you trying to learn? Human-in-the-loop questions may be relevant.
  - What are your hypotheses?
- Identify metrics
  - How will you know if it is better, faster, more useful, more usable, etc.?
  - What will you measure? What will you compare it to?
  - What is your target improvement, time, score, etc.?
  - What qualitative data are you looking for?

Slide 6: Tasks
- What tasks will you ask users to perform to allow you to take the needed measurements?
- What degree of user-interface fidelity do you need to take those measurements? Is a paper or other low-fidelity prototype preferable, or is a high-fidelity prototype needed?
- Where should the study be done? Lab study or field study?

Slide 7: Experimental design
- What kind of experimental design should you use? Within subjects, between subjects, or hybrid?
- How many participants should you have?
  - What will you need for statistical significance?
  - What are your constraints in terms of time, budget, etc.?
- What kind of subjects do you need, and how will you recruit them?
  - Special characteristics, knowledge, or skills? Sometimes we recruit a particular type of subject because it is more convenient, even if the results are less generalizable.
- What incentives will they have to participate?
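The "how many participants for statistical significance" question above can be roughed out before recruiting with a power calculation. The sketch below is not from the lecture: it uses the standard normal approximation for a two-sided, two-sample comparison of means, and the effect size, alpha, and power values are illustrative assumptions.

```python
import math
from statistics import NormalDist

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sided, two-sample t-test,
    using the normal approximation: n = 2 * ((z_a + z_b) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for power = 0.80
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return math.ceil(n)

# A "medium" effect (Cohen's d = 0.5) needs roughly 63 people per group;
# a larger expected effect (d = 0.8) needs far fewer.
print(sample_size_per_group(0.5), sample_size_per_group(0.8))
```

Even a back-of-the-envelope number like this helps reconcile the statistical requirement with the time and budget constraints on the slide: if the required n is infeasible, consider a within-subjects design or a study aimed at larger effects.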

Slide 8: Detailed plan
- Develop artifacts: prototypes, questionnaires, screening tools, measuring tools, etc.
- Protocol and scripts
  - Exactly what will participants do? Will you ask participants to think aloud?
  - What will the experimenter(s) do and say?
  - Do you need to train participants? Are warm-up or distracter tasks needed?
  - Will you make audio or video recordings or do screen captures?
  - Will the experimenter record specific information? Is there a form or template to facilitate this?
- Figure out how you will analyze your data
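The "form or template" point above can be made concrete with a small record structure that forces every session to capture the same fields. This is a hypothetical sketch, not an artifact from the course; all field names (`participant_id`, `task_times_sec`, and so on) are illustrative assumptions.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class SessionRecord:
    """One participant's session, filled in by the experimenter."""
    participant_id: str
    condition: str                                      # e.g. "control" vs. "new-warning"
    task_times_sec: list = field(default_factory=list)   # one entry per task
    task_successes: list = field(default_factory=list)   # True/False per task
    think_aloud_notes: list = field(default_factory=list)
    critical_incidents: list = field(default_factory=list)

# During a session the experimenter appends observations as they happen:
rec = SessionRecord(participant_id="P01", condition="control")
rec.task_times_sec.append(142.0)
rec.task_successes.append(True)
rec.think_aloud_notes.append("hesitated at the warning dialog")
print(asdict(rec))
```

Deciding these fields before the pilot test also answers the slide's last bullet: if you know the analysis you plan to run, you know exactly what each session must record.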

Slide 9: Pilot test and revise
- Run through the whole protocol with members of your team to work out all the details
- Run through it with friends or people you recruit to debug the protocol and find out how long it will take
- Do some preliminary data analysis
- Revise:
  - Make sure tasks and questions aren't confusing
  - Make sure the study can be done in a reasonable amount of time
  - Make sure the study measures what you are trying to measure
- Repeat

Slide 10: IRB approval
- All published research studies involving human subjects must have CMU IRB approval
- Surveys are exempt, but you must still fill out the form and ask the IRB to grant an exemption
- Exempt and low-risk IRB approval usually happens within two weeks; high-risk approval usually takes about a month, but may be longer if you have to iterate with the IRB
- Whenever possible, design the study so participants sign an informed consent form up front; you will have to convince the IRB that there is a good reason not to
- Submit your IRB form as early as possible, even if not all your study details are worked out; you can submit an amendment later
- Label all recruitment forms and questionnaires as "Example" for more flexibility

Slide 11: Recruit participants and run study
- Use posters, email, ads, etc. to recruit study participants
- Screen participants and sign them up
- Run the study
  - Make sure you have reserved a lab or other appropriate space, if needed
  - Make sure you have enough people there to run the study

Slide 12: Analyze data
- Determine scores, times, etc.
- Code audio or text for quantitative analysis
- Run appropriate statistical tests
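For a common case in security studies, comparing how many participants in each condition fell for an attack, an appropriate test is Pearson's chi-square on a 2x2 contingency table. The sketch below computes the statistic by hand (without Yates' continuity correction); the counts are made up for illustration and are not data from the course.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table:
    rows = condition, columns = outcome (e.g. clicked / did not click)."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical data: 18 of 20 control participants clicked the phishing
# link, versus 9 of 20 who saw the warning interface.
chi2 = chi_square_2x2(18, 2, 9, 11)
print(round(chi2, 2), chi2 > 3.841)  # 3.841 = critical value, alpha=0.05, df=1
```

In practice a library routine (e.g. `scipy.stats.chi2_contingency`) gives the p-value directly; the point here is only that the choice of test follows from the experimental design and metrics fixed earlier in the process.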

Slide 13: Group exercise: study design
- The AT&T web mail client identifies suspected phishing emails and warns a user who tries to open them. If a user opens the message anyway, they will see the warning symbol next to all suspicious links. If they click on a link, they will go to a page warning them that the link is suspicious and asking whether they are sure they want to proceed.
- Design a user study that will allow you to evaluate the effectiveness of this approach to protecting users from phishing and to come up with recommendations for improving the warning interface. Optionally, you can come up with some design improvements and test them in your user study as well.