Cognitive Walkthrough: Leader and Evaluator Responsibilities (Group 3)

Presentation transcript:

Cognitive Walkthrough: Leader Responsibilities
- Planning the evaluation method
- Developing two tasks to be performed by each group member (Evaluator)
- Meeting with Evaluators to discuss tasks, responsibilities, when and where to perform the walkthrough, and use of Nielsen's severity rating scale
- Performing a walkthrough for each task and creating a task analysis to use as benchmarks for comparison with Evaluator walkthroughs (a sketch of one such record follows this list). This includes:
  - time taken to complete each task
  - feelings/emotions experienced
  - problems and errors found, and the causes of these problems and errors
  - severity rating of each problem or error found
  - success and failure stories as applicable
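The slides leave the recording format open; as a minimal, hypothetical sketch (in Python, not part of the original project), a Leader's benchmark walkthrough could be captured in a small record like the one below, using Nielsen's 0–4 severity scale. All names and values shown are illustrative assumptions, not findings from the actual evaluation.

```python
from dataclasses import dataclass, field
from typing import List

# Nielsen's severity rating scale (0-4), as referenced in the slides:
# 0 = not a usability problem, 1 = cosmetic, 2 = minor, 3 = major, 4 = catastrophe.

@dataclass
class Problem:
    description: str  # what went wrong and where in the interface
    cause: str        # suspected cause of the problem or error
    severity: int     # Nielsen severity rating, 0-4

@dataclass
class WalkthroughRecord:
    task: str                    # one of the two tasks developed by the Leader
    time_to_complete_sec: float  # time taken to complete the task
    feelings: str                # feelings/emotions experienced
    problems: List[Problem] = field(default_factory=list)
    success_story: str = ""      # filled in as applicable
    failure_story: str = ""      # filled in as applicable

# Hypothetical benchmark record created by the Leader for later
# comparison with the Evaluators' walkthroughs of the same task:
benchmark = WalkthroughRecord(
    task="Add a book to a shelf in the interface under evaluation",
    time_to_complete_sec=95.0,
    feelings="Mild confusion locating the shelf menu",
    problems=[Problem("Shelf menu label is ambiguous", "Unfamiliar terminology", 2)],
)
```

Keeping the benchmark in the same structure the Evaluators use makes the later comparison of times, problems, and severity ratings straightforward.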

Cognitive Walkthrough: Evaluator Responsibilities
- Meet with the Leader and brainstorm broad tasks to perform in the given interface.
- Follow the procedures outlined by the Leader.
- Perform each task individually in the given interface.
- Identify potential problems and errors in the given interface.
- Use Nielsen's severity rating scale to rate each problem and error found, and briefly describe its causes.
- Propose solutions to the problems and errors found, based on the interface design and usability theories and principles learned in class.
- Document the feelings and emotions experienced during the process, considering how target users might feel.
- Write one success story and one failure story, as applicable.
- Meet with the Leader to discuss the findings and reach a consensus on the problems and errors found and their severity ratings, among other things (a sketch of one way to seed that consensus follows this list).
- Generate a usability report as a team.
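The slides do not prescribe how the individual ratings are reconciled; as a hypothetical illustration (again in Python, not part of the original project), the Evaluators' 0–4 ratings for each problem could be reduced to a single starting value for the consensus discussion, for example with the median:

```python
from statistics import median
from typing import Dict, List

def consensus_severity(ratings: Dict[str, List[int]]) -> Dict[str, int]:
    """For each problem, reduce the Evaluators' Nielsen severity ratings (0-4)
    to a single starting value for the group discussion; the median resists
    a single outlying rating better than the mean."""
    return {problem: round(median(scores)) for problem, scores in ratings.items()}

# Hypothetical ratings from four Evaluators for two problems found in the interface:
ratings = {
    "Shelf menu label is ambiguous": [2, 2, 3, 2],
    "No confirmation after adding a book": [3, 4, 3, 3],
}
print(consensus_severity(ratings))
# {'Shelf menu label is ambiguous': 2, 'No confirmation after adding a book': 3}
```

A computed value like this would only seed the discussion; the final severity still comes from the Leader and Evaluators agreeing in their consensus meeting.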

Group 3 Members
- Lauren Long, Evaluator: Cataloger, Transylvania Co. Library, Brevard, NC; ITRL Student
- Julie Forkner, Evaluator: Director, E.G. Fisher Public Library, Athens, TN; ITRL Student
- Angela Glowcheski, Evaluator: Information Specialist, Lumpkin Co. Public Library, Dahlonega, GA; ITRL Student
- Susan Macrellis, Leader: Director, East Ridge City Library, Chattanooga, TN; ITRL Student
- Marilyn Pontius, Evaluator: Branch Manager, Hancock War Memorial Branch Library, Washington Co., MD; ITRL Student