How people visually explore geospatial data. Urška Demšar, Geoinformatics, Dept of Urban Planning and Environment, Royal Institute of Technology (KTH), Stockholm, Sweden


How people visually explore geospatial data. Urška Demšar, Geoinformatics, Dept of Urban Planning and Environment, Royal Institute of Technology (KTH), Stockholm, Sweden. ICA WS on Geospatial Analysis and Modeling, 8th July 2006, Vienna

Developing geovisualisation tools
Geovisualisation tools and systems support the visual exploration, analysis and presentation of geospatial data.
- For a long time: technology-driven development
- A recent shift in attitude: user-centred development
Developing a usable and useful information system calls for user-centred design and Human-Computer Interaction (HCI): knowledge about the users and how they use the system.

Usefulness of a computer system = utility + usability.
- Utility: can the functionality of the system do what is needed?
- Usability: how well can typical users use the system?
Usability of an information system is the extent to which the system supports users in achieving specific goals in a given context of use, and in doing so effectively, efficiently and in a satisfactory way (Nielsen 1993).
Usability evaluation, part of user-centred design, is the process of systematically collecting and analysing data on how users use the system for a particular task in a particular environment. Its aims:
- evaluate the system's functionality
- assess the users' experience
- identify specific problems

Usability testing: evaluation through user participation. Two complementary approaches:
- Formal evaluation (user testing): observing users and measuring the accuracy and efficiency of their performance on typical tasks; controlled measurements (errors, time); quantitative evaluation.
- Exploratory usability: assessing how users work with the system while performing predefined tasks; questionnaires, thinking-aloud methodology, observation, video; descriptive data (verbal protocols); qualitative evaluation.
The methods complement each other!

Exploratory usability experiment
- A GeoVISTA-based visual data mining system
- A dataset with clearly observable spatial and other patterns
Questions: How do people visually explore geospatial data? Which exploration strategies do they adopt? Which visualisations do they prefer to use?
Formal usability issues: Edsall 2003, Robinson et al. 2005.

Data
The Iris dataset (Fisher), famous from pattern recognition: 150 plants, 50 in each of the three classes Iris setosa, Iris versicolor and Iris virginica, with 4 attributes; the classes are linearly separable in attribute space.
The original dataset was put in a spatial context: new attributes (bedrock, soil, land use) were added to the plant measurements, so that the classes are also linearly separable in geographic space.
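As an illustration of what linear separability on one attribute means, the sketch below uses hypothetical petal-length values chosen within the ranges commonly reported for the Iris data (not measurements from this study); a single threshold already splits Iris setosa from Iris versicolor:

```python
# Illustrative petal lengths (cm), within the ranges commonly reported
# for the Iris data: setosa roughly 1.0-1.9, versicolor roughly 3.0-5.1.
setosa_petal = [1.4, 1.3, 1.5, 1.7, 1.9]
versicolor_petal = [3.3, 4.0, 4.5, 4.7, 5.1]

THRESHOLD = 2.5  # a single cut in attribute space


def classify(petal_length):
    """Threshold (linear) classifier on one attribute."""
    return "setosa" if petal_length < THRESHOLD else "versicolor"


# Every illustrative sample falls on the correct side of the threshold,
# i.e. the two classes are linearly separable on this attribute.
assert all(classify(x) == "setosa" for x in setosa_petal)
assert all(classify(x) == "versicolor" for x in versicolor_petal)
```

The spatialised version of the dataset plays the same trick in geographic space: class membership can be read off from where a plant is located.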

Data exploration by visual data mining
Data mining is a form of pattern recognition. The human brain is the best pattern-recognition apparatus; how can we use it in data mining? Computers communicate with humans visually, through computerised data visualisation.
Visual data mining: a data mining method which uses visualisation as a communication channel between the user and the computer to discover new patterns.

Visualisations
Exploration system: GeoVISTA Studio (Gahegan et al. 2002, Takatsuka and Gahegan 2002).
- geoMap
- multiform bivariate matrix
- Parallel Coordinates Plot (PCP)
- brushing & linking, plus interactive selection

Participants
Small number of participants: 6 (discount usability engineering, Nielsen 1994, Tobon 2002: the majority of usability issues are detected with 3-5 participants; cost & staff limitations).
Students of the International Master Programme in Geodesy and Geoinformatics at KTH; voluntary participation; engineering background; familiar with GIS; not colour-blind; gender 50/50; non-native English speakers, fluent in English; nationalities/mother tongues: Ghanaian, Russian, Slovenian, Spanish, Swedish.
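The 3-5 participant figure can be made concrete with Nielsen and Landauer's model: with n evaluators, the expected share of usability problems found is roughly 1 - (1 - λ)^n, where λ ≈ 0.31 is the average per-evaluator detection rate reported in their studies (λ comes from their data, not from this experiment). A minimal sketch:

```python
def problems_found(n, lam=0.31):
    """Expected share of usability problems found by n evaluators,
    after Nielsen & Landauer's model: 1 - (1 - lam)**n."""
    return 1 - (1 - lam) ** n


# Five participants already find about 84% of the problems,
# which is why small discount-usability studies pay off.
for n in (1, 3, 5, 6):
    print(f"{n} participants: {problems_found(n):.0%}")
```

Diminishing returns set in quickly, so running many more than five or six participants mostly rediscovers problems already found.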

Experiment design
Usability test in English, performed individually under observation (… h per participant), in 5 steps:
1. Introduction: what the test was about, consent for using the data, etc.
2. Background questionnaire: gathering information on gender, mother tongue, background, etc.
3. Training (unlimited time, ca. … min per participant): introduction to the data and the visual data mining system; independent work through a script; questions allowed.

4. Free exploration (limited time: 15 min per participant), the main part of the test: the participants explored whatever they wanted, in whatever way they wanted; no questions allowed. Verbal protocol analysis (thinking aloud) with cooperative evaluation: if the participant stops talking, the observer can ask questions ("What are you trying to do?", "What are you thinking now?").
5. Rating questionnaire: gathering the participants' opinions about the system; measuring perceived usefulness & learnability.

Results
1. Perceived usefulness & learnability: the bivariate matrix was the easiest to use; the map was the easiest to understand; the PCP was the most difficult to understand and use.
2. Exploratory usability, from analysis of the thinking-aloud protocols:
- hypothesis extraction and classification according to source (background knowledge; prompted by a visual pattern; refinement of a previous hypothesis)
- counting visualisations (total frequency, relative frequency)

Hypotheses classification, with examples:
- Background knowledge: "Higher flowers probably have longer leaves." "Are sepal length and sepal width correlated?" "Flowers of the same species probably grow in the same area."
- Prompted by a visual pattern: "There seem to be two clusters in each of these scatterplots."
- Refinement of a previous hypothesis: "Not only are there two clusters, but the big cluster consists of two subclusters according to petal length" (assign colour according to petal length).

Visualisation frequencies
For each visualisation, the number of hypotheses generated with it was counted. Relative frequency: fR(i,j) = fT(i,j) / Nj, where i is the visualisation, j is the participant, fT(i,j) is the total frequency, and Nj is the total number of hypotheses generated by participant j.
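The per-participant normalisation can be sketched as follows; the counts, visualisation names and participant labels below are hypothetical placeholders, not the study's data:

```python
# Hypothetical total frequencies f_T[i][j]: number of hypotheses that
# participant j generated while using visualisation i.
f_T = {
    "geoMap":           {"p1": 4, "p2": 2},
    "bivariate matrix": {"p1": 3, "p2": 5},
    "PCP":              {"p1": 3, "p2": 3},
}

participants = ["p1", "p2"]

# N_j: total number of hypotheses generated by participant j.
N = {j: sum(f_T[i][j] for i in f_T) for j in participants}

# Relative frequency f_R(i, j) = f_T(i, j) / N_j: each participant's
# values sum to 1, which makes participants directly comparable even
# when they generated different numbers of hypotheses overall.
f_R = {i: {j: f_T[i][j] / N[j] for j in participants} for i in f_T}

assert all(abs(sum(f_R[i][j] for i in f_R) - 1.0) < 1e-9 for j in participants)
```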

Exploration strategies
Model of the visual investigation of data (Tobon): a cycle in which the user browses and looks for content, adjusts browsing and decides where to look, forms ideas or hypotheses, manipulates graphics, interprets data, gathers evidence to get new or more information, evaluates the initial idea, and amends it according to the new information.
The participants fell into 3 groups; their strategies are mapped as paths through this model.

Strategy no. 1, confirming an a priori hypothesis: confirm or reject a hypothesis based on background knowledge, then discard it. Repeat from the start.

Strategy no. 2, confirming a hypothesis based on a visual pattern: form a hypothesis based on what you see, interpret and adapt it, confirm or reject it, and discard it. Repeat from the start.

Strategy of group no. 3, seamless exploration: form a hypothesis based on what you see, explore further and adapt or refine it according to what you see in other visualisations, then confirm the refined version or adapt it again, and continue.

Conclusions
- Small study size: conclusions cannot be too general; observations only.
- Training necessary: new concepts (visual data mining, unusual visualisations, the interactivity of GeoVISTA-based tools).
- Cooperative evaluation vs. strict thinking-aloud: cooperative evaluation worked better (compared to a previous experiment); no silent participants; easier to keep protocols.
- Discrepancy between perceived and actual learnability: the PCP was rated very difficult to understand, yet it was used most frequently of all visualisations; the spaceFills were almost never used.

- Exploration strategies: three different exploration strategies, not related to gender, academic background, nationality/mother tongue or GIS experience.
Investigating spatial data visually is not so simple! There are substantial interpersonal differences in how people form exploration strategies. Why? A question for the future.

Thank you!