What are your interactions doing for your visualization?
Remco Chang, UNC Charlotte, Charlotte Visualization Center

Outline: Three Areas of Proposed Research
What is the role of interaction in visual analytics?
– Is there a science to designing interactions and applying them to visualizations?
How do we know if an interaction is "good"?
– Can we evaluate (quantify?) the benefits (or costs) of interactions?
If analysts use interactions to perform analysis, can we store the knowledge in the interactions?
– Is it possible to create a knowledge base by extracting knowledge from interaction logs?

Introduction: Role of Interaction
Most people in the visual analytics community believe that interactivity is essential for analysis:
– "A [visual] analysis session is more of a dialog between the analyst and the data… the manifestation of this dialog is the analyst's interactions with the data representation" [Thomas & Cook 2005]
– "Without interaction, [a visualization] technique or system becomes a static image or autonomously animated images" [Yi et al. 2007]

Motivation: Role of Interaction
More explicitly [Pike et al. 2009]:
– "A central precept of visual analytics is that it is through the interactive manipulation of a visual interface – the analytic discourse – that knowledge is constructed, tested, refined, and shared."
– "These visual displays must be embedded in an interactive framework that scaffolds the human knowledge construction process with the right tools and methods to support the accumulation of evidence and observations into theories and beliefs."

VISUAL analytics or visual ANALYTICS?
Observation: current designs of visual analytics systems start with the visual representation and add appropriate interactions afterwards.
Visual analytics = visual representation + analytics
– If we assume that (interaction == analytic discourse), then:
– Visual analytics = visual representation + interaction
– VISUAL analytics = VISUAL REPRESENTATION + interaction
– visual ANALYTICS = visual representation + INTERACTION
Proposal: if we start the design of visual analytics systems with interactions (i.e., with how a user would perform a series of tasks or generate hypotheses), we could focus on the ANALYTICS aspect of the design.
This seems pretty hard to do, arguably because we don't really understand the nature of interactions for the purpose of analytics.

Case Study: Brushing and Linking
Brushing and linking is the linchpin of most visualizations that use multiple coordinated views.
– Spotfire, GeoVISTA, JIGSAW, etc.
However, when used in a collaborative environment, its purpose becomes slightly different even though the implementation is (mostly) the same. [Isenberg et al. 2009]
Hypothesis: the nature of brushing and linking is to coordinate between different perspectives of the same data elements, especially for data of high dimensionality.
It is now easier to consider a system design around this (a minimal sketch follows).
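To make the coordination hypothesis concrete, here is a minimal sketch (an assumption for illustration, not code from the talk) of brushing and linking as a shared selection over the same data elements, broadcast to coordinated views; all class and method names are hypothetical.

# Minimal sketch (assumption, not from the talk): brushing and linking modeled
# as a shared selection of data IDs that is broadcast to every linked view.

class SharedSelection:
    """Holds the brushed subset of data IDs and notifies linked views."""
    def __init__(self):
        self._selected = set()
        self._views = []

    def register(self, view):
        self._views.append(view)

    def brush(self, ids):
        # A brush gesture in any one view updates the shared selection...
        self._selected = set(ids)
        # ...and every linked view re-renders its own perspective of it.
        for view in self._views:
            view.highlight(self._selected)

class ScatterplotView:
    def highlight(self, ids):
        print(f"scatterplot: highlighting {sorted(ids)}")

class MapView:
    def highlight(self, ids):
        print(f"map: highlighting {sorted(ids)}")

selection = SharedSelection()
selection.register(ScatterplotView())
selection.register(MapView())
selection.brush([3, 7, 42])   # brushing once updates all coordinated views

The point of the sketch is that the views differ only in how they render the selection; the coordination itself is just a shared set of data elements, which is what makes the design question tractable.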

Evaluation: The Benefits of Interactions
Scientifically, how is interaction useful? With interaction,
– Does an analyst perform tasks faster?
– Does an analyst perform tasks more accurately?
Short answer: no. [Lim et al. 1996] [Jeong et al. 2009a] [Jeong et al. 2009b] [Lipford et al. 2009]

Evidence: Interaction Is Useful in Visualizations
Empirical evidence that interactivity is useful: (1) users don't "give up" as easily. [Jeong et al. 2009]
[Figure: the green bar denotes the number of participants who "gave up" during an analysis. iPCA is an interactive visualization; SAS/INSIGHT is a traditional text-based interface with limited interactivity.]

Evidence: Interactivity Is Useful in Visualizations
Empirical evidence that interactivity is useful: (2) users become proficient faster.
The longer a user uses an interactive visualization, the better (faster) they become, whereas when the same user uses a non-interactive visualization, the amount of time spent remains roughly the same.
[Figure: user's task completion time (slow to fast) plotted against time spent using the system, for an interactive versus a non-interactive system.]

Evidence: Interactivity Is Useful in Visualizations
Empirical evidence that interactivity is useful: (3) users prefer interactivity. [Jeong et al. 2009]
[Figure: users giving letter grades to the two tools after using them during an experiment. iPCA is an interactive visualization; SAS/INSIGHT is a traditional text-based interface with limited interactivity.]

Future Work: How Is Interactivity Useful?
We propose that:
(1) interactivity is indeed useful, and
(2) we've been measuring the wrong things.
Hypothesis:
– Interactivity helps keep a user "in a cognitive zone", which is why users don't give up.
– Interactivity allows the user to gather more "contextual information": users spend more time understanding the problem before attempting to solve it.
– We need new metrics and methods to measure the "benefits of interactivity".

Provenance: Capturing User Interactions
What is in a user's interactions?
If (interactions == analytic discourse), what can we learn from the user's interactions?
Is it possible to extract "analysis" from "interactions"?

Study: What Is in a User's Interactions?
Goal: determine if there really is "analysis" in a user's interactions.
[Study design: analysts worked in WireVis while their (semantic) interactions were logged; graduate students (coders) examined those logs through an interaction-log visualization and produced guesses of the analysts' thinking; these guesses were then manually compared against the analysts' actual strategies, methods, and findings.]
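As a minimal sketch of what "logged (semantic) interactions" might look like (an assumption for illustration; the actual WireVis instrumentation is not described here), each record captures what the user did to the data rather than raw mouse events. The field names, action names, and file name are hypothetical.

# Minimal sketch (assumption, not the WireVis implementation): recording
# semantic interactions as timestamped JSON-lines records for later analysis.
import json
import time

class InteractionLogger:
    def __init__(self, path):
        self.path = path

    def log(self, action, target, detail=None):
        record = {
            "time": time.time(),      # when the interaction happened
            "action": action,         # e.g., "filter", "drill_down", "brush"
            "target": target,         # the data element or view acted upon
            "detail": detail or {},   # parameters, e.g., the selected keyword
        }
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")

logger = InteractionLogger("session.jsonl")
logger.log("filter", "accounts", {"keyword": "wire transfer"})
logger.log("drill_down", "account 1042")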

Results: What’s in a User’s Interactions From this experiment, we find that interactions contains at least: – 60% of the (high level) strategies – 60% of the (mid level) methods – 79% of the (low level) findings R. Chang et al., Recovering Reasoning Process From User Interactions. IEEE Computer Graphics and Applications, R. Chang et al., Evaluating the Relationship Between User Interaction and Financial Visual Analysis. IEEE Symposium on VAST, 2009.

Provenance: Future Work
Using semantic interaction capturing, we might be able to collect the analysis processes of expert analysts and create a knowledge base that is useful for:
– Training: many domain-specific analytics tasks are difficult to teach
– Guidance: use existing knowledge to guide future analyses
– Verification and validation: check for accuracy and correctness
But our study was crude and made lots of assumptions:
– How do we extract analysis from interaction logs semi-automatically? (A toy example is sketched below.)
– Can these methods be generalized to all visualizations?
– What does a knowledge base of interactions look like, and how do we use it?
– A model of how and what to capture in a visualization for extracting an analytical process is necessary.
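As a toy illustration of the semi-automatic extraction question above (an assumption, far simpler than what a real knowledge base would require), one first step could be to mine frequent sequences of semantic actions from the log produced by the earlier logging sketch and surface them as candidate "methods" for a human coder to review.

# Toy illustration only (assumption): count frequent n-grams of consecutive
# semantic actions as *candidate* analysis methods for a human to review.
# Reads the hypothetical session.jsonl format from the logging sketch above.
import json
from collections import Counter

def candidate_patterns(log_path, n=2):
    actions = []
    with open(log_path) as f:
        for line in f:
            actions.append(json.loads(line)["action"])
    # Consecutive n-grams of actions, e.g., ("filter", "drill_down").
    grams = zip(*(actions[i:] for i in range(n)))
    return Counter(grams).most_common(5)

print(candidate_patterns("session.jsonl"))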

Conclusion
We do not yet have a complete foundation for the "science of interaction", but we are getting there.
The three areas that I propose would have the highest impact in interaction research are:
– A fundamental (functional) understanding of interaction and interaction techniques
– Evaluation methods and metrics for measuring the benefits (and costs) of interactions
– Capturing and re-using interactions to create a knowledge base of analysts' strategies and methods

Thank you!

Backup Slides

Results: What’s in a User’s Interactions Why are these so much lower than others? – (recovering “methods” at about 15%) Only capturing a user’s interaction in this case is insufficient.

Understanding Interaction as a Science
With deeper understanding of

Scenario Revisited
Task: design a visual analytics system to analyze IP logs, starting the design with interaction elements.
Solution: a system that displays different aspects of the IP data (e.g., destination IP, origin IP, time, port number, etc.), coordinated through brushing and linking.
– The visual display (integrated or coordinated multiple views) is a secondary consideration.
– What the visual representation of the IP logs should be can also be considered independently.
– (Number of users and the deployment environment?)

Taxonomy of Interaction Techniques [Yi et al. 2007]
– Select: mark something as interesting
– Explore: show me something else
– Reconfigure: show me a different arrangement
– Encode: show me a different representation
– Abstract/Elaborate: show me more or less detail
– Filter: show me something conditionally
– Connect: show me related items
All seven categories seem necessary, but by itself the taxonomy doesn't lead to a design of a system. (A small sketch of tagging logged interactions with these intents follows.)
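As a small assumed illustration connecting this taxonomy to the provenance work above (not from the talk), the seven intents could be encoded directly and used to tag each logged interaction with the intent it serves; only the category names and descriptions come from Yi et al., everything else is hypothetical.

# Assumed illustration: encoding the Yi et al. (2007) interaction intents so
# that each logged interaction can be tagged with the intent(s) it serves.
from enum import Enum

class Intent(Enum):
    SELECT = "mark something as interesting"
    EXPLORE = "show me something else"
    RECONFIGURE = "show me a different arrangement"
    ENCODE = "show me a different representation"
    ABSTRACT_ELABORATE = "show me more or less detail"
    FILTER = "show me something conditionally"
    CONNECT = "show me related items"

# e.g., a brushing action in a coordinated-view system serves two intents:
brush_intents = {Intent.SELECT, Intent.CONNECT}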

Is visual ANALYTICS possible?
We propose that indeed it is…