A Nested Model for Visualization Design and Validation
Tamara Munzner, University of British Columbia, Department of Computer Science

2 How do you show your system is good?
So many possible ways!
- algorithm complexity analysis
- field study with target user population
- implementation performance (speed, memory)
- informal usability study
- laboratory user study
- qualitative discussion of result pictures
- quantitative metrics
- requirements justification from task analysis
- user anecdotes (insights found)
- user community size (adoption)
- visual encoding justification from theoretical principles

3 Contribution
- nested model unifying design and validation
- guidance on when to use what validation method
- different threats to validity at each level of the model
- recommendations based on the model

4 Four kinds of threats to validity

5 Four kinds of threats to validity
domain problem characterization
  threat: wrong problem (they don't do that)

6 Four kinds of threats to validity
domain problem characterization
  threat: wrong problem (they don't do that)
data/operation abstraction design
  threat: wrong abstraction (you're showing them the wrong thing)

7 Four kinds of threats to validity
domain problem characterization
  threat: wrong problem (they don't do that)
data/operation abstraction design
  threat: wrong abstraction (you're showing them the wrong thing)
encoding/interaction technique design
  threat: wrong encoding/interaction technique (the way you show it doesn't work)

8 Four kinds of threats to validity
domain problem characterization
  threat: wrong problem (they don't do that)
data/operation abstraction design
  threat: wrong abstraction (you're showing them the wrong thing)
encoding/interaction technique design
  threat: wrong encoding/interaction technique (the way you show it doesn't work)
algorithm design
  threat: wrong algorithm (your code is too slow)

9 Match validation method to contributions
threat: wrong problem
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
threat: slow algorithm
each validation works for only one kind of threat to validity

10 Analysis examples
Interactive visualization of genealogical graphs. McGuffin and Balakrishnan. InfoVis
  justify encoding/interaction design; qualitative result image analysis; test on target users, get utility anecdotes
MatrixExplorer. Henry and Fekete. InfoVis
  observe and interview target users; justify encoding/interaction design; qualitative result image analysis
An energy model for visual graph clustering (LinLog). Noack. Graph Drawing 2003
  measure system time/memory; qualitative result image analysis
Flow map layout. Phan et al. InfoVis
  justify encoding/interaction design; computational complexity analysis; measure system time/memory; qualitative result image analysis
LiveRAC. McLachlan, Munzner, Koutsofios, and North. CHI
  observe and interview target users; justify encoding/interaction design; field study, document deployed usage
Effectiveness of animation in trend visualization. Robertson et al. InfoVis
  qualitative/quantitative image analysis; lab study, measure time/errors for operation

11 Nested levels in model
domain problem characterization
data/operation abstraction design
encoding/interaction technique design
algorithm design
output of upstream level is input to downstream level
challenge: upstream errors inevitably cascade
  if a poor abstraction choice is made, even perfect technique and algorithm design will not solve the intended problem

12 Characterizing domain problems
tasks, data, workflow of target users
problems:
  tasks described in domain terms
  requirements elicitation is notoriously hard

13 Designing data/operation abstraction
mapping from domain vocabulary/concerns to abstraction
  may require transformation!
data types: data described in abstract terms
  numeric tables, relational/network, spatial, ...
operations: tasks described in abstract terms
  generic: sorting, filtering, correlating, finding trends/outliers, ...
  datatype-specific: path following through network, ...
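
As a concrete illustration of such a transformation (not from the talk; the field names "gene" and "regulates" and the to_network helper are hypothetical), the sketch below maps domain-level records into an abstract node-link network so that generic operations like path following can apply:

    # Illustrative sketch: domain records -> abstract network datatype.
    # Field names and the helper are hypothetical.
    def to_network(records):
        """records: iterable of dicts with keys 'gene' and 'regulates'."""
        nodes, edges = set(), []
        for rec in records:
            nodes.add(rec["gene"])
            for target in rec.get("regulates", []):
                nodes.add(target)
                edges.append((rec["gene"], target))  # directed edge
        return {"nodes": sorted(nodes), "edges": edges}

    example = [{"gene": "A", "regulates": ["B", "C"]},
               {"gene": "B", "regulates": ["C"]}]
    print(to_network(example))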

14 Designing encoding/interaction techniques
visual encoding: marks, attributes, ...
  extensive foundational work exists
interaction: selecting, navigating, ordering, ...
  significant guidance exists
Semiology of Graphics. Jacques Bertin, Gauthier-Villars 1967, EHESS 1998
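
As a toy illustration of these choices (not from the talk; assumes matplotlib is installed, and all data values are made up), the sketch below uses point marks, encodes two quantitative attributes as position, and a categorical attribute as color:

    # Toy encoding sketch: point marks; position for two quantitative
    # attributes, color hue for one categorical attribute.
    import matplotlib.pyplot as plt

    xs = [1.0, 2.5, 3.1, 4.7, 5.2]
    ys = [2.1, 1.8, 3.9, 4.2, 2.7]
    category = ["a", "b", "a", "b", "a"]
    palette = {"a": "tab:blue", "b": "tab:orange"}

    plt.scatter(xs, ys, c=[palette[c] for c in category])
    plt.xlabel("quantitative attribute 1")
    plt.ylabel("quantitative attribute 2")
    plt.title("position + color encoding (toy example)")
    plt.show()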

15 Designing algorithms
well-studied computer science problem
  create efficient algorithm given clear specification
  no human-in-loop questions

16 Immediate vs. downstream validation
threat: wrong problem
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
threat: slow algorithm
implement system

17 Domain problem validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
threat: slow algorithm
implement system
immediate: ethnographic interviews/observations

18 Domain problem validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
threat: slow algorithm
implement system
  validate: observe adoption rates
downstream: adoption (weak but interesting signal)

19 Abstraction validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
threat: slow algorithm
implement system
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
downstream: can only test with target users doing real work

20 Encoding/interaction technique validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
implement system
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
immediate: justification useful, but not sufficient - tradeoffs

21 Encoding/interaction technique validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
implement system
  validate: qualitative/quantitative result image analysis
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
downstream: discussion of result images very common

22 Encoding/interaction technique validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
implement system
  validate: qualitative/quantitative result image analysis
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
downstream: studies add another level of rigor (and time)
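
A minimal sketch of how such a lab study might be summarized (the log format, with one record per trial holding completion time and a correctness flag, is hypothetical):

    # Hypothetical lab-study log: one record per trial.
    trials = [
        {"participant": "p1", "technique": "new", "seconds": 12.4, "correct": True},
        {"participant": "p1", "technique": "old", "seconds": 18.9, "correct": True},
        {"participant": "p2", "technique": "new", "seconds": 10.7, "correct": False},
        {"participant": "p2", "technique": "old", "seconds": 21.3, "correct": True},
    ]

    def summarize(trials, technique):
        """Mean completion time and error rate for one technique."""
        rows = [t for t in trials if t["technique"] == technique]
        mean_time = sum(t["seconds"] for t in rows) / len(rows)
        error_rate = sum(not t["correct"] for t in rows) / len(rows)
        return mean_time, error_rate

    for tech in ("new", "old"):
        t, e = summarize(trials, tech)
        print(f"{tech}: mean time {t:.1f}s, error rate {e:.0%}")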

23 Encoding/interaction technique validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
implement system
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
usability testing necessary for validity of downstream testing, not a validation method itself!

24 Algorithm validation
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
  validate: analyze computational complexity
implement system
  validate: measure system time/memory
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
immediate vs. downstream here clearly understood in CS
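
To make the "measure system time/memory" validation concrete, here is a minimal benchmarking sketch using only the Python standard library; layout_graph is a hypothetical stand-in for whatever algorithm is under test.

    # Minimal benchmark sketch: wall-clock time via time.perf_counter,
    # peak allocation via tracemalloc; layout_graph is a placeholder.
    import time
    import tracemalloc

    def layout_graph(n):
        # Stand-in for the algorithm under test.
        return [(i, (i * i) % 97) for i in range(n)]

    def benchmark(n):
        tracemalloc.start()
        start = time.perf_counter()
        layout_graph(n)
        elapsed = time.perf_counter() - start
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        return elapsed, peak

    for n in (10_000, 100_000, 1_000_000):
        elapsed, peak = benchmark(n)
        print(f"n={n:>9,}  time={elapsed:.3f}s  peak={peak / 1e6:.1f} MB")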

25 Avoid mismatches
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
  validate: analyze computational complexity
implement system
  validate: measure system time/memory
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
can't validate encoding with wallclock timings

26 Avoid mismatches
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
  validate: analyze computational complexity
implement system
  validate: measure system time/memory
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
can't validate abstraction with lab study
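
One way to read the mismatch rule is as a lookup from level to admissible validation methods; the sketch below encodes that reading (level and method names paraphrase the model; the code itself is illustrative, not part of the talk):

    # Illustrative lookup: validations that address each level's threat.
    # Anything not listed for a level is a mismatch.
    MODEL = {
        "domain problem": [
            "observe and interview target users",
            "observe adoption rates"],
        "data/operation abstraction": [
            "test on target users, collect utility anecdotes",
            "field study of deployed system"],
        "encoding/interaction technique": [
            "justify encoding/interaction design",
            "qualitative/quantitative result image analysis",
            "lab study, measure human time/errors"],
        "algorithm": [
            "analyze computational complexity",
            "measure system time/memory"],
    }

    def is_valid_pairing(level, validation):
        """True only if the method addresses this level's threat."""
        return validation in MODEL.get(level, [])

    print(is_valid_pairing("encoding/interaction technique",
                           "measure system time/memory"))   # False: mismatch
    print(is_valid_pairing("algorithm",
                           "measure system time/memory"))   # True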

27 Single paper would include only a subset
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
  validate: analyze computational complexity
implement system
  validate: measure system time/memory
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
can't do all for the same project: not enough space in paper or time to do the work

28 Single paper would include only a subset
threat: wrong problem
  validate: observe and interview target users
threat: bad data/operation abstraction
threat: ineffective encoding/interaction technique
  validate: justify encoding/interaction design
threat: slow algorithm
  validate: analyze computational complexity
implement system
  validate: measure system time/memory
  validate: qualitative/quantitative result image analysis
  [test on any users, informal usability study]
  validate: lab study, measure human time/errors for operation
  validate: test on target users, collect anecdotal evidence of utility
  validate: field study, document human usage of deployed system
  validate: observe adoption rates
pick validation method according to contribution claims

29 Real design process
iterative refinement
  levels don't need to be done in strict order
intellectual value of level separation
  exposition, analysis
shortcut across inner levels + implementation
  rapid prototyping, etc.
  low-fidelity stand-ins so downstream validation can happen sooner

30 Related work
influenced by many previous pipelines, but none were tied to validation
  [Card, Mackinlay, Shneiderman 99], ...
many previous papers on how to evaluate, but not when to use what validation methods
  [Carpendale 08], [Plaisant 04], [Tory and Möller 04]
exceptions:
  good first step, but no formal framework [Kosara, Healey, Interrante, Laidlaw, Ware 03]
  guidance for long-term case studies, but not other contexts [Shneiderman and Plaisant 06]
  only three levels, does not include algorithm [Ellis and Dix 06], [Andrews 08]

31 Recommendations: authors
explicitly state level of contribution claim(s)
explicitly state assumptions for levels upstream of paper focus
  just one sentence + citation may suffice
goal: literature with clearer interlock between papers
  better unify problem-driven and technique-driven work

32 Recommendation: publication venues
we need more problem characterization
  ethnography, requirements analysis
  as part of a paper, and as a full paper
now full papers relegated to CHI/CSCW
  does not allow focus on central vis concerns
legitimize ethnographic "orange-box" papers! (observe and interview target users)

33 Lab study as core now deemed legitimate
analysis examples from slide 10, revisited; the lab-study example:
Effectiveness of animation in trend visualization. Robertson et al. InfoVis
  qualitative/quantitative image analysis; lab study, measure time/errors for operation

34 Limitations
oversimplification
not all forms of user studies addressed
infovis-oriented worldview
are these levels the right division?

35 Conclusion
new model unifying design and validation
  guidance on when to use what validation method
  broad scope of validation, including algorithms
recommendations:
  be explicit about levels addressed and state upstream assumptions, so papers interlock more
  we need more problem characterization work
these slides posted at