Slide 1: Interactive Dynamic Aggregate Queries
Kenneth A. Ross, Junyan Ding
Columbia University
Slide 3: Scenario Outline
(Architecture diagram; labels: User, Graphical User Interface, Dynamic Query Engine, Dynamic Query Data Files (e.g., PUMS), Mediator, Data Request, Unified Results, Web, Traditional DBMS, ...)
Slide 4: Engine Decoupled from Interface
- Can use a variety of interfaces
- Multiple connections to one server
- Can “do one thing well”
- Client/Server parallelism
- Abstract interaction via API
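A minimal sketch of what “abstract interaction via API” can look like in practice: one engine object answers aggregate queries, and any front end (a slider GUI, a command line, a web client) drives it only through that narrow interface. All class and method names here (DynamicQueryEngine, SliderInterface, aggregate) are hypothetical illustrations, not the project's actual API.

```python
# Sketch only: a query engine exposed through a small API, so any user
# interface can drive it the same way. Names are invented for illustration.

from dataclasses import dataclass
from typing import Dict, List, Tuple


@dataclass
class Query:
    filters: Dict[str, Tuple[float, float]]   # column -> (low, high) range
    measure: str                               # column to aggregate


class DynamicQueryEngine:
    """Server side: owns the data, answers aggregate queries."""

    def __init__(self, rows: List[Dict[str, float]]):
        self.rows = rows

    def aggregate(self, q: Query) -> Tuple[int, float]:
        """Return (count, sum of measure) over rows passing all range filters."""
        count, total = 0, 0.0
        for row in self.rows:
            if all(lo <= row[col] <= hi for col, (lo, hi) in q.filters.items()):
                count += 1
                total += row[q.measure]
        return count, total


class SliderInterface:
    """Client side: one of many possible front ends; knows nothing about storage."""

    def __init__(self, engine: DynamicQueryEngine):
        self.engine = engine

    def adjust(self, column: str, low: float, high: float, measure: str) -> None:
        count, total = self.engine.aggregate(Query({column: (low, high)}, measure))
        mean = total / count if count else float("nan")
        print(f"{column} in [{low}, {high}]: n={count}, mean {measure}={mean:.2f}")


if __name__ == "__main__":
    data = [{"age": float(a), "income": 1000.0 * a} for a in range(18, 80)]
    ui = SliderInterface(DynamicQueryEngine(data))
    ui.adjust("age", 30, 40, "income")   # simulate moving a range slider
```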
Slide 5: Engine Performance Goals
- Interactive data exploration
- Millions of records
- Thousands of columns (but look at ten or so at a time)
- Aggregates and statistical measures
- Fine adjustments at 30 answers/second
Slide 6: Technical Details
- Main-memory implementation
- Multidimensional tree structures
- Cache consciousness
- Branch misprediction
- SIMD
- Asynchronous work
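As a concrete, if much simplified, illustration of the “multidimensional tree structures” bullet: the sketch below precomputes partial sums in a tree so that a range aggregate touches O(log n) nodes instead of scanning every record. It is one-dimensional and makes no attempt at the cache-conscious, branch-free, SIMD, or asynchronous techniques the slide lists; the class and method names are invented for the example.

```python
import bisect


class AggregateTree:
    """1-D sketch of an aggregate tree: keys kept sorted, plus a segment tree
    of precomputed sums, so a range-sum query costs O(log n), not a full scan."""

    def __init__(self, records):                     # records: (key, measure) pairs
        records = sorted(records)
        self.keys = [k for k, _ in records]
        self.n = len(records)
        self.tree = [0.0] * (2 * self.n)             # iterative segment tree of sums
        for i, (_, v) in enumerate(records):
            self.tree[self.n + i] = v
        for i in range(self.n - 1, 0, -1):
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def range_sum(self, key_lo, key_hi):
        """Sum of measures for records whose key lies in [key_lo, key_hi]."""
        lo = bisect.bisect_left(self.keys, key_lo) + self.n
        hi = bisect.bisect_right(self.keys, key_hi) + self.n
        total = 0.0
        while lo < hi:                               # walk up, adding whole subtrees
            if lo & 1:
                total += self.tree[lo]
                lo += 1
            if hi & 1:
                hi -= 1
                total += self.tree[hi]
            lo //= 2
            hi //= 2
        return total


if __name__ == "__main__":
    t = AggregateTree([(age, age * 1000.0) for age in range(18, 80)])
    print(t.range_sum(30, 40))   # income sum for ages 30..40 from O(log n) nodes
```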
Slide 7: GlossIT
(Pipeline diagram; labels: Internet, GetGloss, .gov URL, XML glossary info, ParseGloss, SENSUS)
Slide 8: Automatic Ontologies from Web Pages
Judith L. Klavans, Peter K. Davis, Samuel Popper
Columbia University
Slide 9: Where are Glossaries?
- The Internet
Slide 11: GetGLOSS: Web Crawling to Find Glossaries
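The slides do not show GetGLOSS's crawler itself, so the following is only a sketch of the general approach: a breadth-first crawl restricted to .gov hosts that flags pages which look like glossaries. The detection cues (the word “glossary” in the URL or page title, presence of an HTML definition list) and all function names are assumptions made for illustration.

```python
# Sketch of a GetGLOSS-style crawl; heuristics and names are illustrative only.

import urllib.parse
import urllib.request
from collections import deque
from html.parser import HTMLParser


class PageScanner(HTMLParser):
    """Collects links and simple glossary cues from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links, self.title = [], ""
        self.in_title = self.has_dl = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "title":
            self.in_title = True
        elif tag == "dl":                      # definition list: a glossary cue
            self.has_dl = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def crawl_for_glossaries(seeds, max_pages=50):
    """Breadth-first crawl over .gov pages; return URLs that look like glossaries."""
    queue, seen, found, fetched = deque(seeds), set(seeds), [], 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        fetched += 1
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except (OSError, ValueError):
            continue
        scanner = PageScanner()
        scanner.feed(html)
        if ("glossary" in url.lower() or "glossary" in scanner.title.lower()
                or scanner.has_dl):
            found.append(url)
        for href in scanner.links:             # stay within .gov hosts
            link = urllib.parse.urljoin(url, href)
            host = urllib.parse.urlparse(link).hostname
            if link not in seen and host and host.endswith(".gov"):
                seen.add(link)
                queue.append(link)
    return found
```

A production crawler would also honor robots.txt, rate-limit requests, and persist its frontier; those concerns are omitted from the sketch.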
Slide 12: ParseGLOSS: Building an Ontology
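Again as a sketch rather than ParseGLOSS's actual algorithm: once term/definition pairs have been extracted, one simple way to get ontology-like structure is to link each term to the other glossary terms mentioned in its definition. The linking heuristic, the sample entries, and the output format below are illustrative assumptions; the mapping into SENSUS (next slide) is not shown.

```python
# Sketch: derive crude ontology links from term/definition pairs.

import re
from typing import Dict, List


def build_ontology(glossary: Dict[str, str]) -> Dict[str, List[str]]:
    """Map each term to the other glossary terms its definition mentions."""
    terms = list(glossary)
    links = {}
    for term, definition in glossary.items():
        text = definition.lower()
        links[term] = [
            t for t in terms
            if t != term and re.search(r"\b" + re.escape(t.lower()) + r"\b", text)
        ]
    return links


if __name__ == "__main__":
    # Sample entries, paraphrased purely for illustration.
    gloss = {
        "household": "One or more persons occupying a housing unit.",
        "housing unit": "A house, apartment, or room intended as separate living quarters.",
        "family": "Two or more persons in a household related by birth or marriage.",
    }
    for term, related in build_ontology(gloss).items():
        print(term, "->", related)
```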
Slide 13: Output for the SENSUS Ontology
Slide 14: Data Users
- Social Science Research Data Component
- Electronic Data Service, Columbia University
- Librarians and Data Specialists
- Steady stream of different user groups
- Collect user logs and interview users
- Coordinated by Walter Bourne
Slide 15: DGRC User Interface Testbed
- Menu presented as grid of alternating rows and columns
  – Top-level items in left column
- Ontology entry shown in beam for selected item
  – Located as near as possible
Slide 16: DGRC User Interface Testbed
- Color coding shows parental and semantic relationships
Slide 17: DGRC User Interface Testbed
- Fisheye magnification of region of interest
- Magnified group laid out to avoid internal overlap
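The slides describe the behavior (a magnified region of interest, re-spaced to avoid overlap) but not the layout code, so this is a sketch using the standard Sarkar-Brown fisheye transform in one dimension: positions near the focus are spread apart and distant positions are compressed, which is what keeps magnified items from overlapping. The testbed's actual layout algorithm may differ.

```python
# Sketch: 1-D fisheye distortion of evenly spaced menu-item positions.

def fisheye(x: float, focus: float, distortion: float = 3.0) -> float:
    """Map a normalized position x in [0, 1] given a focus point in [0, 1].
    Items near the focus get more space; items far away are compressed."""
    if x >= focus:
        span = 1.0 - focus
        t = (x - focus) / span if span else 0.0
        return focus + span * ((distortion + 1) * t) / (distortion * t + 1)
    span = focus
    t = (focus - x) / span if span else 0.0
    return focus - span * ((distortion + 1) * t) / (distortion * t + 1)


if __name__ == "__main__":
    slots = [i / 10 for i in range(11)]                 # evenly spaced menu slots
    warped = [round(fisheye(x, focus=0.3), 3) for x in slots]
    print(warped)   # slots near 0.3 spread apart; distant slots bunch together
```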
Slide 18: Goals of Evaluation
- Optimize the effectiveness of the interface
- Identify usability problems
- Provide feedback on the overall functionality
- Anticipate changes in user needs that might drive future development
- Validate the design
- Indicate the extent to which the interface improves on previous interfaces
Slide 19: Methods of Evaluation
- Interviews with Experts
- Analysis of the DataGate Interface
- Design and Testing with Heuristics for Database Interfaces
- User and Task Analysis
Slide 20: Interview Findings
- User Type Identification
  – Novice and Power/Expert Users
- User Goals
- Kinds of Questions
- Types of Searches
- Related Terms for Searches
  – Difficulty of Using Alternative Terms
- Selecting the Database
- Learning to Use the Interface
  – Innovative Interface
  – Need Orientation and Time to Become Familiar with the Interface
Slide 21: Interview Findings
- Searching Styles
- Flexibility in Searching Styles
- Helping the User Define the Search
  – Help Users Visualize the Context and Structure of Information
  – Definition and Redefinition of the Search
- Standardization Problems
- Suggestions for the Design
Slide 22: Census Characteristics and Interface Possibilities
Census characteristics:
- Variables
  – Hierarchical Structure
  – Massive Amount of Terminology
  – Definitions Change
  – Obscure Terminology
  – Census Questions Change
- Geographical References
  – Boundaries Change
  – Unique Boundaries
  – Codes for Areas
  – Various Meanings for Same Names
Interface possibilities:
- Content Visualization
  – Display Information Organization
- Dynamic Menu
  – Magnification of Selected Items with Full Content
- Zoom In, Zoom Out
  – Manipulate the Level of Magnification
- Searchlight
  – Multiple Layers of Display
  – Alternative Terms
  – Definition of Terms
  – Alternative Pathways
- Create Dynamic Maps