Evaluation of applications in the main control room

Evaluation at GSI
- Relatively small group of users
- Users can theoretically be contacted at any time, but availability is limited by beamtime requirements
- Sometimes negative experiences with giving feedback in the past
- Operators want to give feedback → it is placed on the notice board
- Feedback is given directly by users to the application developer
- Few usability tests conducted
- Feedback and evaluation results not publicly visible
- No structured interpretation and consolidation of findings
- No clear back channel to application development

Evaluation vs. feedback
- Evaluation results can differ from feedback
- Interaction data vs. design feedback
- Jakob Nielsen: “To design the best UX, pay attention to what users do, not what they say. Self-reported claims are unreliable, as are user speculations about future behavior. Users do not know what they want.”

User-centered design
- Identify the need for human-centered design
- Specify the context of use
- Specify requirements
- Produce design solutions
- Evaluate designs
- Iterate until the system satisfies the specified requirements

Evaluation methods
- Heuristic evaluation
- Usability testing
- Surveys & questionnaires (e.g. SUS, UEQ; see the scoring sketch below)
- Experience maps
- Contextual interviews / inquiry
- Focus groups & workshops
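Since SUS is mentioned above, here is a minimal sketch of the standard System Usability Scale scoring rule (odd-numbered items contribute rating − 1, even-numbered items contribute 5 − rating, and the raw 0–40 sum is scaled by 2.5 to a 0–100 score). The function name and the example ratings are illustrative assumptions, not data from the GSI evaluations.

```python
def sus_score(responses: list[int]) -> float:
    """Compute the SUS score (0-100) for one respondent.

    `responses` holds the ten item ratings in questionnaire order,
    each from 1 (strongly disagree) to 5 (strongly agree).
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS expects ten ratings between 1 and 5")

    # Odd-numbered items are positively worded: contribution = rating - 1.
    # Even-numbered items are negatively worded: contribution = 5 - rating.
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # i is 0-based, so even i = odd item
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5  # scale the 0-40 raw sum to 0-100


# Hypothetical ratings from one operator for a control-room application
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```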

Questions
- Which evaluation methods should be used?
- What should be the role of operators during the process?
- Should data be collected globally or per application?