Ben Shneiderman & Catherine Plaisant

Presentation transcript:

Strategies for Evaluating Information Visualization Tools: Multi-dimensional In-depth Long-term Case Studies (MILCs)
Ben Shneiderman & Catherine Plaisant
University of Maryland Human-Computer Interaction Lab
BELIV 2006 (at the AVI Conference) http://dl.acm.org/citation.cfm?id=1168158
Test of Time Award, BELIV Workshop 2016 http://beliv.cs.univie.ac.at/

Multi-Dimensional In-Depth Long-term Case Studies (MILCs)
- Domain experts (+ our research team)
- Exploring their own data
- To produce their research results
- Using new interactive tools
- Time-limited structured evaluation (4-8 weeks)
- Observations, interviews, surveys, etc.
- Replicated case studies = hypothesis testing
(Shneiderman & Plaisant, BELIV 2006)

TimeSearcher: PhD 2003
2 case studies:
- Genome biology: DNA expression
- Nucleotide sequences

Hierarchical Clustering Explorer: TVCG 2006
3 case studies: biologist, statistician, meteorologist
Survey results from 57 users
94 citations

SocialAction: CHI 2008
4 case studies: business consultant, medical librarian, journalist, terrorism analyst
Pushback from reviewers, but we succeeded
162 citations

TreeVersity: PhD 2013
13 case studies

Multi-Dimensional In-Depth Long-term Case Studies (MILCs)
Other PhD students who conducted MILCs:
- David Wang
- Krist Wongsuphasawat
- Cody Dunne
- Megan Monroe
- Sana Malik
- Fan Du

The New ABCs of Research (Oxford, 2016)
- Guide for junior researchers
- Manifesto for senior researchers, academic administrators, business leaders, and funding agencies
www.cs.umd.edu/hcil/newabcs

Research Methods

Controlled Experiments
- Definition: defined tasks and hypotheses; alter independent variables (treatments), measure dependent variables, run statistics, test the hypotheses
- Pros: scientific, reproducible
- Cons: biases can undermine results

Usability Testing
- Definition: evaluate users' efficiency and subjective reactions while they use the tool for specific tasks; produce a list of changes
- Pros: fast; insight into user thought processes
- Cons: small user sample, short-term, limited scope

Case Studies (Design Study Methodology)
- Definition: longer term, with domain experts; design the tool, write up reflections and experiences
- Pros: insight into user thought processes; guidance to refine the tool
- Cons: takes a long time; questionable generalizability
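The controlled-experiments row above (alter independent variables, measure dependent variables, run statistics) can be illustrated with a minimal sketch. The data below are hypothetical task-completion times for two tool conditions, and `welch_t` is an illustrative helper, not part of any tool or study discussed in these slides:

```python
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)  # sample variance
    return (mean_a - mean_b) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

# Hypothetical task-completion times (seconds) under two treatments.
tool_a = [41, 38, 45, 40, 43, 39]
tool_b = [52, 49, 55, 50, 48, 53]

t = welch_t(tool_a, tool_b)
print(round(t, 2))  # -6.71: tool_a users were faster
```

A large-magnitude t statistic like this would then be converted to a p-value (e.g. with `scipy.stats.ttest_ind(tool_a, tool_b, equal_var=False)`) to decide whether to reject the null hypothesis, which is the "run stats, test the hypotheses" step in the table.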