PresQT Workshop, Tuesday, May 2, 2017

Assessment of Research Project & Tool Communication Vehicles: Testing the Heuristics PresQT Workshop, Tuesday, May 2, 2017 Nancy J. Hoebelheinrich, Knowledge Motifs LLC

Breakout Agenda
- Research Project Background Info
- Heuristics Checklists / Tools
- Examples of testing the checklist / tool from the Researcher POV
- Your turn! Feedback on the Checklists / Tools

NH Project Background Info
- Research result of an ESIP Products & Services (P&S) Testbed, Fast Track Proposal
- P&S / ESIP staff interest in developing criteria / questions for evaluators of AIST and other research projects
- Could be used to determine the overall "usefulness" of a research project / product / tool, but with emphasis upon the research project
- Find on: https://github.com/ESIPFed/Project_Assessment_Heuristics

Objectives
- Identify clear heuristics for evaluators to assess the overall usefulness of a research project
- Develop "rules of thumb" that can be applied by research project developers & evaluators
- Create or identify an easy-to-use mechanism for evaluators to apply the recommended rules of thumb

Method
- Review of literature on:
  - Usability of web site design
  - Technology "infusion" or "adoption potential"
- Identify use cases with relevant tasks associated with evaluation of a research project
- Develop heuristics for evaluating
- Create means of evaluating, scoring & commenting upon criteria
- Produce annotated bibliography from the literature search

Definition – based on lit review
"Usefulness" of a research web site means assessing both:
- the utility of the web site, i.e., that it can provide the functionality / features needed, and
- its usability, i.e., having features that are efficient, effective, engaging, error tolerant and easy to learn
Utility + Usability = Usefulness
Emphasis is upon providing feedback, not judgement.
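As a hypothetical illustration (not part of the original checklist), the "Utility + Usability = Usefulness" framing could be sketched in code: a site counts as useful only if it both provides the needed features and makes them easy to use. The feature names, rating labels, and threshold below are illustrative assumptions.

```python
# Hypothetical sketch of the "Utility + Usability = Usefulness" framing.
# Rating labels and the "acceptable" threshold are illustrative, not from the checklist.

def is_useful(provides_needed_features: bool, usability_ratings: dict) -> bool:
    """A site is 'useful' only if it has utility AND acceptable usability."""
    # "Usable" here means every dimension was rated at least "fair".
    acceptable = {"fair", "good", "excellent"}
    usable = all(rating in acceptable for rating in usability_ratings.values())
    return provides_needed_features and usable

# The five usability qualities named in the definition above.
ratings = {
    "efficient": "good",
    "effective": "good",
    "engaging": "fair",
    "error tolerant": "fair",
    "easy to learn": "good",
}
print(is_useful(True, ratings))   # utility present and all ratings acceptable
print(is_useful(False, ratings))  # usable, but lacks the needed features
```

The point of the sketch is the conjunction: a high usability score cannot compensate for missing utility, and vice versa.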

Approach
Process of Technology / Product Evaluation:
- Understanding
- Assessing
- Packaging

Ranking Effectiveness ("Usefulness") of Communication Vehicles by:
- Learnability
- Efficiency
- Memorability
- Errors
- Satisfaction

from Different Points Of View:
- Research Domain Expert (Use Case 1)
- Workflow Domain Expert (Use Case 2)
- Tool-builder (Use Case 3)
- AIST Evaluator (Use Case 4)

Bases: Given: Technology Maturation Lifecycle expressed as TRLs 1 - 9 (Zelkowitz, p. 16; TRL = Technology Readiness Level). Also note the importance of science communication (Jakob Nielsen): effectiveness depends upon the goals of the product / service / tool / research. Who is your audience?

Effective Communication key to the overall Approach Process of Evaluation Ranking Effectiveness of Communication Vehicles Potential for infusion from different points of view

1st -- Process of Technology / Product Evaluation
1. Understanding the technology / product
- Getting a reading on the baseline processes
- Figuring out the key characteristics, e.g. (from Comstock): Performance, Schedule, Cost, Risk?

Process of Technology / Product Evaluation
2. Assessing the technology / product
- How applicable is it to other systems?
- Would the key characteristics impact other systems, and how?
- What improvements could be made in either the product or the processes? (Comstock & Zelkowitz)

Process of Technology / Product Evaluation
3. Packaging the technology / product
- How divisible / modular is the technology / product for purposes of transfer / re-use / infusion into other systems?
- If improvements are suggested, how feasibly could they be incorporated? (Zelkowitz)

2nd -- Ranking Effectiveness ("Usefulness") of Communication Vehicles (CV) by:
- Learnability: How easy is it for users to accomplish basic tasks the first time they encounter the CVs?
- Efficiency: Once users have learned the design, how quickly can they perform tasks?
- Memorability: When users return to the design after a period of not using it, how easily can they reestablish proficiency?
- Errors: How many errors do users make, how severe are these errors, and how easily can users recover from the errors?
- Satisfaction: How pleasant is it to use the design?

Using Jakob Nielsen's definition of usability (https://www.nngroup.com/articles/usability-101-introduction-to-usability); discussion of "errors" found at: https://www.nngroup.com/articles/slips/
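The five rating dimensions could be captured per communication vehicle in a simple record. A hypothetical sketch (the record type and field names are illustrative, not from the project's checklist tool):

```python
from dataclasses import dataclass, field

# Hypothetical record for rating one communication vehicle (CV)
# on Nielsen's five usability dimensions.
@dataclass
class CVRating:
    vehicle: str                      # e.g. "project web site", "online help text"
    learnability: str = "not rated"   # first-time task success
    efficiency: str = "not rated"     # speed once the design is learned
    memorability: str = "not rated"   # regaining proficiency after absence
    errors: str = "not rated"         # frequency, severity, recoverability
    satisfaction: str = "not rated"   # pleasantness of use
    comments: list = field(default_factory=list)

r = CVRating(vehicle="project web site", learnability="good", errors="fair")
r.comments.append("Download link hard to find on first visit")
print(r.vehicle, r.learnability, r.errors)
```

Keeping a free-text comment list alongside the ratings matches the emphasis on providing feedback rather than judgement.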

3rd -- from Different Points Of View
- Research Domain Expert (Use Case 1)
- Workflow Domain Expert (Use Case 2)
- Tool-builder (Use Case 3)
- AIST Evaluator (Use Case 4)

from Different Points Of View Research Domain Expert (Use Case 1) I am a domain expert in tropical cyclones and am looking for web-based projects, tools or services to help my research group with certain types of analysis. I am interested in guidance on how to determine if a tool or service will be useful for addressing the research needs of my group.

from Different Points Of View Workflow Domain Expert (Use Case 2) I am a domain expert in tropical cyclones looking for web-based tools or services to help my research group perform certain types of analysis more efficiently and productively without requiring a lot of hands-on training or long ramp-up time.

from Different Points Of View Tool-builder (Use Case 3) I am a researcher building a tool to provide outcomes from model analyses for tropical cyclone research. I want this tool to be adopted by my research community. On what kinds of factors or points about my tool should I focus to help me meet that goal?

from Different Points Of View AIST Evaluator (Use Case 4) As an evaluator, I am looking for ways to identify issues in the public web-based interfaces for a tool, service or research project, to ascertain whether the stated goals have been (or cannot be) met. Likewise, I am interested in guidelines for identifying problems in an interface that might indicate issues with potential adoption.

Evaluator Checklists
- A combination of TRL Evaluation & Communication Vehicle (CV) Assessment
- Built for different evaluator perspectives: similar questions, but possibly different scores
- Scoring is subjective (by definition) and not quantitative
- Questions focus primarily upon the communication vehicles used, e.g., project web sites, documentation, online help text, lists of references, etc.
- Could be used at different points in the evaluation process, e.g., Understanding scores assigned early in the evaluation process
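Because scoring is subjective rather than quantitative, a checklist entry might use labels instead of numbers and tie each question to an evaluation stage. A hypothetical sketch (the questions, stages, and labels below are illustrative, not the project's actual checklist):

```python
# Hypothetical checklist sketch: subjective, non-quantitative scoring
# using labels rather than numbers. Questions and labels are illustrative.

SCORES = ("not assessed", "poor", "fair", "good")

checklist = [
    # (evaluation stage, question)
    ("Understanding", "Does the project web site state what the tool does?"),
    ("Assessing", "Is it clear how the tool could apply to other systems?"),
    ("Packaging", "Is documentation modular enough to support re-use?"),
]

def record(stage, question, score, comment=""):
    """Build one checklist entry, rejecting anything outside the label set."""
    if score not in SCORES:
        raise ValueError(f"score must be one of {SCORES}")
    return {"stage": stage, "question": question, "score": score, "comment": comment}

entry = record(*checklist[0], "good", "Front page states the goal clearly")
print(entry["stage"], entry["score"])
```

Grouping questions by stage means an evaluator could fill in the Understanding entries early and return to Assessing and Packaging later, as the slide suggests.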

Research Domain Expert Tool

Usability Guidelines

Recommendations / Next Steps
- Add questions for the 3 use cases besides the research domain expert
- Add a different "persona" approach to augment and clarify the Use Cases
- Test in a couple of venues with web sites relating to each of the use cases, e.g., a research project, a tool site, a project describing a workflow
- Test for feedback upon, for example:
  - The "Usefulness" approach -- melding of the two areas of research
  - The viability of the questions from each use case / persona POV
  - The utility of the rating mechanism (i.e., subjective but non-quantitative)
- Iterate the questions on the Checklists based on the testing
- Re-draft the guidelines based on the testing, using a specific research web site as an example
- Perhaps write a paper for publication after testing in conjunction with the AIST assessment, if these heuristics prove useful to those efforts

Testing the Researcher Checklist
Hurricane / Tropical Cyclone website -- NASA: https://www.nasa.gov/mission_pages/hurricanes/main/index.html
Goal: Use the data download service to bring data into his/her own research project.
Use this as an example in the session, as described in the Report.

Research Domain Expert Tool https://docs.google.com/spreadsheets/d/1SJv-SGWGfcoQOQCN5U6LxId2HA5_VYlt8nkbiR2XYI8/edit#gid=1956509091

https://www.nasa.gov/mission_pages/hurricanes/bios/braun_bio.html

Testing the Heuristics: Possible sites to test
- DMP Tool: https://dmptool.org/
- ASDC ODISEES Data Portal: https://odisees.larc.nasa.gov/

Thoughts? Comments?

TRLs chart from NASA POV https://dspace.mit.edu/bitstream/handle/1721.1/96307/MITSloanWP5127-15_Eppinger_PICMET.pdf?sequence=1