Standard Metrics and Scenarios for Usable Authentication


Standard Metrics and Scenarios for Usable Authentication
Scott Ruoti and Kent Seamons
Brigham Young University

Our Experience
- Reviewed numerous authentication papers
- Empirically evaluated seven web-authentication proposals

S. Ruoti, B. Roberts, and K. Seamons. Authentication melee: A usability analysis of seven web authentication systems. In Proceedings of the 24th International Conference on World Wide Web (WWW 2015). International World Wide Web Conferences Steering Committee, 2015.

Problem
- Unclear whether any real progress is being made
- Steady stream of authentication proposals, but…
  - Most lack empirical analysis
  - Few are compared against existing password-based authentication
  - Fewer still are compared against competing proposals
- We need a more scientific approach to this research area

Head-to-head Comparisons

Direct Comparisons
- Evaluate whether new proposals are making tangible progress
  - As compared to passwords
  - As compared to alternative proposals
- Within-subject or between-subject design
  - Within-subject can be preferable
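
Within-subject comparisons introduce ordering effects (learning, fatigue), and counterbalancing the order in which participants see each system is a common mitigation. A minimal sketch of rotation-based counterbalancing (the function name `latin_square_orders` is illustrative, not from the talk):

```python
def latin_square_orders(systems):
    """Rotate the system list to produce n orderings in which each
    system appears exactly once in every position (a simple Latin
    square). Assigning participants to these orderings round-robin
    spreads learning and fatigue effects evenly across systems."""
    n = len(systems)
    return [[systems[(i + j) % n] for j in range(n)] for i in range(n)]

# Three systems yield three orderings, each starting with a
# different system.
orders = latin_square_orders(["passwords", "proposal A", "proposal B"])
```

For studies where carryover effects between specific pairs of systems are a concern, a balanced Latin square (which also controls for immediate-predecessor effects) is the stronger choice.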

Standard Metrics

System Usability Scale (SUS)
- Simple and effective evaluation of usability
  - 10 questions
  - Single usability score in [0, 100]
- Reliable
  - Across studies
  - Across populations
- More accurate than mean time-to-authenticate

Other Potential Metrics
- Post-study questionnaire: PSSUQ
- Post-task questionnaires: SMEQ, SEQ
- User Burden Scale (UBS)
  - New scale introduced at CHI 2016
- Mean time-to-authenticate
  - Less reliable than SUS

J. Sauro and J. Lewis. Quantifying the User Experience: Practical Statistics for User Research. Elsevier, 2012.

Standard Scenarios

Real-World Scenarios
- Allow for more rigorous comparisons
  - Systems are evaluated against the same criteria
  - Reduces confounding factors
- Real-world scenarios with relatable tasks more accurately evaluate a proposal's usability

Example Scenarios
- Banking
  - High-security environment
  - Occasional usage
- Discussion board
  - Lower-security environment
  - High usage
- Both scenarios are available for download

Recommendations

Recommendations
- Direct comparisons
  - Within-subject or between-subject design
  - Measure against a baseline
- Standard usability metrics
  - SUS
- Standard scenarios
  - Banking
  - Discussion board

Discussion

Discussion
- Standard usability metrics
  - Alternatives
  - Is SUS a good fit for usable security?
- Scenarios
  - Other compelling scenarios
  - Scenarios for emerging forms of authentication

SUS Questions
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
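
Given ten Likert responses (1 = strongly disagree … 5 = strongly agree) to the questions above, the standard SUS scoring procedure (Brooke, 1996) can be sketched as follows; the function name `sus_score` is illustrative:

```python
def sus_score(responses):
    """Compute a System Usability Scale score in [0, 100].

    Odd-numbered items are positively worded and contribute
    (response - 1); even-numbered items are negatively worded and
    contribute (5 - response). The sum of contributions (0-40) is
    scaled by 2.5 to yield the final score."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Maximally positive answers (5 on odd items, 1 on even items)
# produce a score of 100.0.
```

Note that a single participant's SUS score is not meaningful in isolation; the slides' recommendation is to compare mean SUS scores across systems evaluated under the same scenarios.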