1
CS6604 Spring 2012
Notes on Algorithm Visualization
Clifford A. Shaffer
Department of Computer Science
Virginia Tech
2
“State of the Field”
Hundreds of visualizations are freely available on the Internet
Studies on the effectiveness of AVs
–Many studies show no significant difference
–But AVs have been shown to help in some implementations
–One conclusion is that creating/using effective AVs is possible, but not easy
Many faculty wish to use AVs, but actual classroom use is lower than this interest would suggest
3
What AVs are Available?
A collection of links available at http://algoviz.org
Links to over 500 visualizations
Nearly all AVs now written in Java
–Applets vs. applications
Stand-alone vs. collections
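Since the slide notes that nearly all AVs are written in Java and distinguishes applets from applications, here is a minimal sketch of how one class can be delivered both ways. It is not from the slides: the class name and setup are hypothetical, and it assumes a pre-Java-11 JDK where javax.swing.JApplet is still available.

```java
import javax.swing.JApplet;
import javax.swing.JFrame;
import javax.swing.JLabel;

// Hypothetical AV shell illustrating the two delivery styles the slide mentions.
public class SortingAV extends JApplet {

    // Applet delivery: the browser instantiates the class and calls init().
    @Override
    public void init() {
        add(new JLabel("Sorting visualization would be drawn here"));
    }

    // Stand-alone delivery: main() builds its own window and reuses the same setup.
    public static void main(String[] args) {
        JFrame frame = new JFrame("Sorting AV (stand-alone)");
        SortingAV av = new SortingAV();
        av.init();
        frame.add(av);
        frame.setSize(400, 300);
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setVisible(true);
    }
}
```

Running `java SortingAV` gives the stand-alone application, while embedding the same compiled class in an HTML applet tag gives the in-browser version.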
4
Who Makes Them?
Single authors, one-off implementations (1-5 AVs)
–30%
Small shops, sustained over a few years
–Typically a faculty member and a few students
–5-10 visualizations
–10%
Larger teams, longer-term investment
–Team built, maybe funded
–25%
Major Projects
–Integrated package or shared look-and-feel
–35%
5
Is There Adequate Coverage?
No
–Sorting, search trees, and linear structures overwhelmingly dominate
–Coverage for more advanced topics is spotty
6
What Is Their Quality?
A majority have no pedagogical value
–These give the user no understanding of how the data structure or algorithm works
–Will be of little use in the classroom
We would recommend less than one quarter of what we have seen for any purpose
Even the better visualizations usually have serious deficiencies
–Animation only: users are passive observers
–Tree structure visualizations tend to show what happens, but not how
–Limited interactivity
7
Is the Field Improving?
Pros:
–A growing body of literature on best practices to create effective AVs
–Community starting to organize (AlgoViz)
Cons:
–Recent projects are no more in tune with coverage gaps than old projects
–No apparent movement in creating repositories
8
Is the Field Active?
Appears to be a reduction in “one-off” development (drop in student projects)
–Fewer CS students
–Less interest in Java
–But these trends might reverse
But steady activity in the larger groups
9
AVs: The Problem
AVs have high faculty and student favorability ratings
But most faculty don’t use them much in courses
10
Informal Survey Results
Warning: self-selected respondents
Are AVs useful?
–Strongly Agree: 12
–Agree: 17
–Neutral: 1
A (bare) majority indicated that they used some sort of visualization in class
12
Survey: Impediments to Use
Lack of knowledge/time to find good AVs: 13
Time to make good AVs: 2
Difficulty integrating in class: 9
Lack of time within class constraints: 2
Uncertainty about quality outcomes: 1
Content not relevant to my classes: 1
13
Overcoming Impediments
Reassurance about which AVs are good
Ideas on how to use AVs
Reassurance about how a given AV can be used successfully in class
Ability to connect to developers
14
AVs: The Solution is Community
http://algoviz.org/
–Build a community of users/developers
–Better disseminate best-practices information
Project Support
–NSF CCLI grant
–NSF NSDL grant
–Connections to NSDL/Ensemble project
15
AlgoViz.org
A collection of links to over 500 AVs
Annotated bibliography of over 500 research papers
Forums, field reports
OpenAlgoViz
16
Are AVs of Pedagogical Value?
Instructors generally think so
Students usually say they “like” them
17
Metastudy: 2002
Reviewed 24 prior studies on pedagogical effectiveness related to AVs
–Generally of an individual system or AV
Results of the 24 studies:
–11 found significant (positive) results
–10 did not find a significant result
–2 entangled prediction with visualization
–1 study found a negative result!
18
Epistemic Fidelity Model
There is an “objective truth”
Experts carry a model of this truth in their heads
For data structures, graphics are especially helpful in representing this model
Therefore AVs should be especially helpful in transferring this model to students
19
Cognitive Constructivism
Individuals construct their own knowledge from subjective experiences
When they become engaged in learning, they actively construct new understandings from new experiences
Therefore, passively watching AVs won’t have much effect
–Students must become actively engaged
–The technology should be a tool for knowledge construction
20
Classification
The studies represented a wide range of activities and methods
Looking deeper, reclassify the independent variables:
–Epistemic Fidelity: 10
–Cognitive Constructivism: 14
–(others too few to measure)
CC has the highest percentage of positive studies
21
Results
CC: 71% statistically significant
EF: 30% statistically significant
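Assuming these percentages are the share of statistically significant studies within each category from the previous slide (the underlying counts are an inference, not stated on this slide), the arithmetic would be:

```latex
% Assumed reconstruction of the reported percentages
\[
\text{CC: } \frac{10}{14} \approx 71\%, \qquad \text{EF: } \frac{3}{10} = 30\%
\]
```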
22
CC Activities
Construct own input sets
Make predictions about future states
Program the algorithm
Answer questions about the algorithm
Construct own visualization
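As one concrete illustration of the “make predictions about future states” activity, the following console sketch pauses an insertion sort after each insertion and asks the learner to predict the resulting array. It is not from the slides; the class name and interaction are hypothetical.

```java
import java.util.Arrays;
import java.util.Scanner;

// Hypothetical "predict the next state" exercise for insertion sort.
public class PredictInsertionSort {
    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7};
        Scanner in = new Scanner(System.in);
        System.out.println("Start: " + Arrays.toString(a));
        for (int i = 1; i < a.length; i++) {
            System.out.print("Predict the array after inserting a[" + i
                             + "] = " + a[i] + ": ");
            String guess = in.nextLine();          // learner's prediction
            int key = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > key) {         // shift larger elements right
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
            String actual = Arrays.toString(a);
            boolean correct = guess.replaceAll("\\s", "")
                                   .equals(actual.replaceAll("\\s", ""));
            System.out.println("Actual: " + actual
                               + (correct ? "  (correct)" : "  (check your prediction)"));
        }
        in.close();
    }
}
```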
23
Level of Effort
Compared whether the two treatments required similar “cognitive effort” vs. different levels of effort
–Equivalent effort: 33% significant
–Not equivalent: 71% significant
Constructing the algorithm/visualization takes time
Note that just taking time need not correlate to learning
24
Procedural vs. Conceptual Knowledge
Procedural only: 67% [10/15]
Procedural and Conceptual: 67% [2/3]
Conceptual only: 38% [3/8]
25
Study Measures
Post-test only: 54%
Pre- to post-test difference: 78%
–But most of these studies came from one source
26
Study Conclusions
How students use an AV is more important than what they see
Pre-test/post-test experiments on procedural knowledge show the most improvement
Technology is effective when it is used for active engagement
27
Bloom’s Taxonomy
Knowledge (facts)
Comprehension (of the facts)
Application (mechanically use the facts)
Analysis (interpreting the facts)
Synthesis (using facts at a higher level)
Evaluation (ability to make judgments)
28
Engagement Taxonomy
Naps Working Group 2002
–No viewing
–Viewing
–Responding
–Changing
–Constructing
–Presenting
Relates to Bloom’s Taxonomy
29
Extended Engagement Taxonomy
Myller et al.
–No viewing* (textbook)
–Viewing* (video)
–Controlled viewing (slideshow)
–Entering input (define the input to execute)
–Responding* (answer questions)
–Changing* (direct manipulation)
–Modifying (modify an existing AV)
–Constructing* (create the AV)
–Presenting* (teach the material)
–Reviewing (give a review of an AV)
(* = level carried over from the original engagement taxonomy above)
30
2009 Evaluation
Urquiza-Fuentes/Velazquez-Iturbide
Analyzed 33 successful evaluations
Evaluation focus:
–Usability (half of the evaluations – often shallow)
–Learning outcomes (the other half)
Many studies compared Viewing, Changing, or Constructing vs. No Viewing
A few compared Changing or Constructing vs. Viewing
Learning improvements in 75% of the studies