1
OLAP Tool for Comparing Time-Based Data
14th May 2008
Proposed by Pimolmas Ponchaisakuldee (104915)
Advisor: Dr. Paul Janecek
2
Contents
1. Introduction
2. Implementation
3. Experimental Evaluation
4. Conclusion
3
What are a Data Warehouse and OLAP?
- Data warehouse: a very large database with special characteristics, holding data for analysis
- OLAP tool (On-Line Analytical Processing): a tool that sits on top of a data warehouse and is used for exploring and analyzing the data (an illustrative roll-up sketch follows this slide)
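As a purely illustrative aside (not part of the thesis), the short Python sketch below shows the kind of roll-up an OLAP tool performs over a data warehouse, using pandas as a stand-in for a cube engine and a made-up sales fact table.

```python
# Minimal sketch (not from the thesis): an OLAP-style roll-up over a
# hypothetical sales fact table, using pandas as a stand-in for a cube engine.
import pandas as pd

# Hypothetical fact table: one row per sale, with dimensions and a measure.
sales = pd.DataFrame({
    "year":    [2007, 2007, 2008, 2008, 2008],
    "region":  ["North", "South", "North", "South", "South"],
    "product": ["Food", "Drink", "Food", "Food", "Drink"],
    "amount":  [120.0, 80.0, 150.0, 90.0, 60.0],
})

# Roll the measure up along two dimensions, the kind of aggregation an
# OLAP tool performs when the user pivots or drills into the cube.
cube_view = sales.pivot_table(
    values="amount", index="region", columns="year", aggfunc="sum", fill_value=0
)
print(cube_view)
```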
4
OLAP Tool Problems
- Cube navigation
- Data presentation
5
OLAP Tool Problems: Cube Navigation
Problems of the Windows-Explorer-like tree browser:
- Users lose track of the tree topology
- Navigation needs several clicks
Requirements:
- Show the overall topology
- Support multiple focuses and the topology at the same time
Solution: DOITree browser (better topology representation, multiple focuses, and navigation of more than one hierarchy level per click); a DOI-scoring sketch follows this slide.
(Figure: DOITree vs. Windows-Explorer-like browser)
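The DOITree browser is built on degree-of-interest scoring. The sketch below is a minimal, hypothetical illustration of that idea (Furnas-style DOI: intrinsic interest minus distance to the nearest focus), not code from the modified JRubik; all node names and the visibility threshold are invented.

```python
# Minimal sketch (assumptions, not the modified JRubik code): Furnas-style
# degree-of-interest scoring that a DOITree browser can use to decide which
# cube-dimension nodes stay visible around one or more focus nodes.

# Hypothetical dimension hierarchy: child -> parent (None marks the root).
PARENT = {
    "All": None,
    "2007": "All", "2008": "All",
    "Q1 2008": "2008", "Q2 2008": "2008",
    "Jan 2008": "Q1 2008", "Feb 2008": "Q1 2008",
}

def depth(node):
    """Number of steps from the node up to the root."""
    d = 0
    while PARENT[node] is not None:
        node = PARENT[node]
        d += 1
    return d

def tree_distance(a, b):
    """Number of parent/child steps between two nodes."""
    ancestors_a = {}
    d, n = 0, a
    while n is not None:
        ancestors_a[n] = d
        n = PARENT[n]
        d += 1
    d, n = 0, b
    while n not in ancestors_a:
        n = PARENT[n]
        d += 1
    return d + ancestors_a[n]

def doi(node, foci):
    # Intrinsic interest falls with depth; interest also falls with
    # distance to the nearest focus (multiple foci are allowed).
    return -depth(node) - min(tree_distance(node, f) for f in foci)

foci = ["Q1 2008"]                                   # the node(s) the analyst clicked
visible = [n for n in PARENT if doi(n, foci) >= -3]  # hypothetical threshold
print(sorted(visible, key=lambda n: doi(n, foci), reverse=True))
```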
6
OLAP Tool Problems: Data Presentation
Tasks:
- Trend finding over time
- Trend comparison between two periods of time
Problems:
- Pivot table: hard to perform the analysis tasks
- Existing graphs: complex and hard to focus on
- Polaris table: does not support hierarchical exploration of the data; its composition is a problem for the comparison task
Solution: focus-on-demand graph = graph matrix + overlaid graph (an illustrative plotting sketch follows this slide).
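To make the "graph matrix + overlaid graph" composition concrete, here is an illustrative matplotlib sketch with made-up monthly data. It is not the thesis implementation, which was built in Java on JRubik.

```python
# Minimal sketch (illustrative only, not the thesis implementation): the two
# views that the focus-on-demand graph combines, drawn with matplotlib over
# hypothetical monthly measures for two product lines.
import matplotlib.pyplot as plt

months = list(range(1, 13))
sales_2007 = {"Food": [10, 12, 11, 13, 14, 15, 16, 15, 14, 13, 12, 18],
              "Drink": [5, 6, 6, 7, 8, 9, 9, 8, 7, 7, 6, 10]}
sales_2008 = {"Food": [11, 13, 12, 15, 16, 17, 18, 17, 15, 14, 13, 20],
              "Drink": [6, 7, 7, 8, 9, 10, 10, 9, 8, 8, 7, 12]}

# Graph matrix: one small graph per product line (rows) and year (columns),
# so each trend can be read in isolation.
fig, axes = plt.subplots(nrows=2, ncols=2, sharex=True, sharey=True)
for row, product in enumerate(["Food", "Drink"]):
    for col, (year, data) in enumerate([("2007", sales_2007), ("2008", sales_2008)]):
        axes[row][col].plot(months, data[product])
        axes[row][col].set_title(f"{product} {year}")

# Overlaid graph: two periods of one product on the same axes, which is what
# the comparison task needs once a cell is brought into focus.
fig2, ax = plt.subplots()
ax.plot(months, sales_2007["Food"], label="Food 2007")
ax.plot(months, sales_2008["Food"], label="Food 2008")
ax.legend()
plt.show()
```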
7
Objectives
- To build the OLAP tool
- To evaluate the effectiveness of the DOITree visualization for OLAP-specific navigation compared to the Windows-Explorer-like visualization
- To evaluate visual tools for the time-based comparison task: the graph matrix and overlaid graph that make up the focus-on-demand graph
8
Contents
1. Introduction
2. Implementation
3. Experimental Evaluation
4. Conclusion
9
Implementation
JRubik was modified in Java. Components:
- Cube navigator
- Pivot table presentation
- Graph matrix presentation
- Overlaid graph presentation
10
Design when the measure is placed on a column (figure)
11
Design when the measure is placed on a row (figure)
12
Contents
1. Introduction
2. Implementation
3. Experimental Evaluation
4. Conclusion
13
Evaluation 1: (WD) vs. (DOI)
- Windows-Explorer-like browser (WD) of JRubik
- DOITree browser (DOI) of the modified JRubik
14
Evaluation 2: (GE) vs. (FOCUS)
- General graph (GE)
- Focus-on-demand graph (FOCUS) = graph matrix + overlaid graph
15
Experiment Design
Participants are trained to use the two tools:
- The modified JRubik = (DOI) and (FOCUS)
- JRubik = (WD) and (GE)
Experiment design with counterbalanced order (a small assignment sketch follows this table):

Participants            | First Tool          | Second Tool         | Data
Student01 to Student10  | The modified JRubik | JRubik              | Foodmart
Student11 to Student20  | JRubik              | The modified JRubik | Foodmart
Staff 1 and 2           | The modified JRubik | JRubik              | AIT admission
Staff 3 and 4           | JRubik              | The modified JRubik | AIT admission
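As an aside, the counterbalanced assignment shown in the table can be generated programmatically. The sketch below is only an illustration of the idea; the participant labels simply mirror the table and are not taken from any thesis script.

```python
# Minimal sketch (assumption, not the thesis procedure): assign a
# counterbalanced tool order so that half of each participant group starts
# with each tool, cancelling simple learning/order effects.
TOOLS = ("The modified JRubik", "JRubik")

def counterbalance(participants, dataset):
    """First half of the group gets one tool order, second half the other."""
    half = len(participants) // 2
    plan = []
    for i, p in enumerate(participants):
        first, second = TOOLS if i < half else TOOLS[::-1]
        plan.append((p, first, second, dataset))
    return plan

students = [f"Student{i:02d}" for i in range(1, 21)]
staff = [f"Staff{i}" for i in range(1, 5)]

for row in counterbalance(students, "Foodmart") + counterbalance(staff, "AIT admission"):
    print(row)
```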
16
First Evaluation: (WD) vs. (DOI)
Tasks for the browser comparison:
Browsers only (one attribute), Appendix C, page 83:
- Task 1: First node finding
- Task 2: First-time node finding
- Task 3: Subtree revisiting
- Task 4: Node revisiting
Browsers as a query tool (many attributes), Appendix D, page 84:
- Task 1: First-time node finding
- Task 2: Subtree revisiting
- Task 3: Node revisiting
17
Second Evaluation: (GE) vs. (FOCUS)
Tasks for the graph comparison:
- Simple analysis: trend finding, trend comparing
- Complex analysis: trend comparing
18
Second Experiment: (GE) vs. (FOCUS)
Simple analysis and complex analysis (figures)
19
Variables studied using Foodmart

                      | Browsers comparison: (WD) vs. (DOI)                                             | Graphs comparison: (GE) vs. (FOCUS)
Independent variables | Task types, cube structure, browser type, screen area, participant demographics | Task types, presentation type, screen area, participant demographics
Dependent variables   | Speed to complete tasks, number of clicked nodes, satisfaction score            | Speed to complete tasks, number of correct answers, satisfaction score
20
Variables studied using AIT admission data
Same independent and dependent variables as for Foodmart (previous slide).
21
Experiment Result: Statistical Tests
Browser comparison:
- Browsers only: paired-sample t-test
- Browsers as a query tool: paired-sample t-test
- Participants' opinions: one-way ANOVA
Graph comparison:
- Simple and complex analysis: paired-sample t-test and one-way ANOVA
- Participants' opinions: one-way ANOVA
(A sketch of how these tests can be run follows this slide.)
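For readers unfamiliar with these tests, the sketch below shows how a paired-sample t-test and a one-way ANOVA can be run. The numbers are invented, and scipy is used only as a convenient stand-in for whatever statistics package was actually used in the thesis.

```python
# Minimal sketch (illustrative, made-up numbers): the two tests named on this
# slide, run with scipy. The time lists stand for per-participant task
# completion times with each tool; the groups stand for satisfaction scores
# from three participant groups.
from scipy import stats

# Paired-sample t-test: the same participants measured with both tools.
task_times_wd  = [42.0, 55.1, 38.4, 61.0, 47.2]   # hypothetical seconds with WD
task_times_doi = [35.5, 49.0, 40.2, 52.3, 41.8]   # hypothetical seconds with DOI
t_stat, p_value = stats.ttest_rel(task_times_wd, task_times_doi)
print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# One-way ANOVA: independent groups compared on one dependent variable,
# e.g. satisfaction scores from three groups of participants.
group1 = [4, 5, 4, 5]
group2 = [3, 4, 3, 4]
group3 = [5, 5, 4, 4]
f_stat, p_value = stats.f_oneway(group1, group2, group3)
print(f"one-way ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")
```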
22
Experiment Result: Browser Comparison, Browsers Only
(Charts of the results for first-time node finding, first node finding, subtree revisiting, and node revisiting, comparing WD and DOI, with significant differences marked.)
Note on subtree revisiting: users can leave subtrees open.
23
Experiment Result: Browser Comparison, Browsers Only
Findings for the DOITree:
- Advantage: more visible information helps users recover from mistakes
- Disadvantage: more visible information can distract users onto a wrong path
24
Experiment Result: Browser Comparison, Browsers as a Query Tool
(Charts of the results for first node finding, subtree revisiting, and node revisiting, comparing WD and DOI, with significant differences marked.)
25
Experiment Result: Browser Comparison, Browsers as a Query Tool
- (DOI) always takes fewer clicks
- Users go more than one level per click with (DOI)
- Users shrink a subtree before finding a new node with (WD)
(Charts of click counts for first node finding, subtree revisiting, and node revisiting, comparing WD and DOI, with significant differences marked.)
26
Experiment Result: Browser Comparison, Opinions
- Liked: automatic shrinking; going more than one level per click
- Disliked: feeling dizzy when many subtrees are open; cannot leave a subtree open
27
Experiment Result: Graph Comparison
(Charts of the results comparing GE and FOCUS, with significant differences marked.)
28
Experiment Result: Graph Comparison, Opinions
- Liked: the focus capability
- Disliked: graphs and labels are too small
29
Experiment Result: ANOVA, Three Groups of Student Participants
- Satisfaction score on tree browsers: no significant difference was found
- Satisfaction score on graphs: no significant difference was found
- Simple analysis: see the next slide
- Complex analysis: no significant difference was found
30
Experiment Result: ANOVA, Three Groups of Student Participants, Simple Analysis Task
(Chart comparing G1: OLAP usage and experience, G2: OLAP usage, and G3: non-OLAP experience, with significant differences marked.)
31
Experiment Result: ANOVA, Two Groups of Student Participants
(Charts comparing G1: the modified JRubik and G2: JRubik.)
32
Discussion: Personal Evaluation
Task T4 (analyzing the visualization presentation), rated with + marks across JPivot, FreeAnalysis, JRubik, and the modified JRubik:
- T4.1: How easy it is to read the text of each graph: +++++
- T4.2: How easy it is to read an exact value in a graph: +++++++++
- T4.3: How easy it is to find a trend in the data: ++++++++
- T4.4: How easy it is to compare trends in the data: +++ ++++
33
Contents
1. Introduction
2. Implementation
3. Experimental Evaluation
4. Conclusion
34
Problems: cube navigation, data presentation
Objectives:
- To evaluate the effectiveness of (DOI) compared to (WD)
- To evaluate the focus-on-demand graph for time-based data analysis tasks
Browser comparison result:

Task group             | Task                    | The better | Sig. | Note
Browsers only          | First-time node finding | (DOI)      |      | -
Browsers only          | First node finding      | =          |      | -
Browsers only          | Subtree revisiting      | (WD)       |      | Users can leave the subtree open
Browsers only          | Node revisiting         | (DOI)      |      | Some users shrink the subtree
Browsers as query tool | First node finding      | (DOI)      |      | At middle depth (very deep levels have animation)
Browsers as query tool | Subtree revisiting      | (DOI)      |      | When the subtree is expanded
Browsers as query tool | Node revisiting         | (DOI)      |      | When the subtree is expanded
35
Conclusion
Graph comparison result:

Criteria (pair-sample t-test) | The better | Sig. | Note
Simple analysis tasks         | (FOCUS)    |      | -
Complex analysis tasks        | (FOCUS)    |      | -

Implications:
- Browser comparison, browsers only: (DOI) is better for long-term usage; (WD) is good for revisiting tasks
- Browser comparison, query tool: the DOITree is better for unknown data and suits analysts who regularly analyze unknown data
- Graph comparison: the separated graphs in the graph matrix can eliminate complexity; users can focus on what they want; a line graph is a suitable presentation when time is on the x-axis