Data Exploration Using Sonification
IC280, 5/28/02
Jeff Ridenour
Reasons for using senses other than vision (sound specifically):
- Capability for parallel processing as good as visual processing
- Specific sound clusters can be distinctly remembered
- Can be used peripherally or subconsciously (while attention is focused elsewhere)
- Can augment data visualization (observing more parameters simultaneously)
- May be used to easily give a global perspective of the data
Some Sonification Types:
- Alarm signals (simple signals or cues, with no additional information)
- Auralization (directly converting data into amplitude vs. time; see the sketch after this list)
- Earcons (codes learned by the user)
- Auditory icons (metaphorical sounds)
- Parameter mapping (each data point is an aural event, with attribute values mapped to aural parameters)
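As a rough illustration of auralization, the sketch below writes a data series directly to a WAV file as amplitude vs. time, using only the Python standard library. The data values, file name, sample rate, and sample-hold length are hypothetical choices for the example, not part of any particular sonification tool.

import wave, struct

def auralize(data, path="auralization.wav", rate=8000, hold=40):
    # Directly convert the data series into amplitude vs. time:
    # each value becomes one audio sample, held for `hold` samples
    # so that short series remain audible.
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1.0
    frames = bytearray()
    for value in data:
        # Scale the raw value into the signed 16-bit sample range.
        sample = int(((value - lo) / span * 2.0 - 1.0) * 32767)
        frames += struct.pack("<h", sample) * hold
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)    # mono
        wav.setsampwidth(2)    # 16-bit samples
        wav.setframerate(rate)
        wav.writeframes(bytes(frames))

auralize([0.1, 0.5, 0.9, 0.3, 0.7, 0.2])   # hypothetical data series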
Some parameters which may be used for data sonification (a mapping sketch follows this list):
- Temporal: tempo, rhythmic type, regularity, articulations, etc.
- Timbral: instrumental variation (harmonicity, envelope, etc.), monophonic/polyphonic, vowels/consonants for vocal sounds (speech)
- Volume: dynamics
- Frequency-based: register, registral width, melodic patterns, harmonic patterns, tonality
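A minimal parameter-mapping sketch, assuming synthetic records with hypothetical attributes x, y, and z already normalized to [0, 1]: each record becomes one sine-tone event whose pitch (register), duration (tempo), and loudness (dynamics) carry the three attribute values. Tone rendering uses only the Python standard library rather than any particular sonification toolkit.

import math, struct, wave

RATE = 22050

def tone(freq, dur, vol):
    # Render one aural event as a plain sine tone.
    n = int(RATE * dur)
    return b"".join(
        struct.pack("<h", int(vol * 32767 * math.sin(2 * math.pi * freq * i / RATE)))
        for i in range(n))

def sonify(records, path="mapping.wav"):
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(RATE)
        for r in records:
            freq = 220 + 660 * r["x"]   # attribute x -> register (pitch)
            dur = 0.1 + 0.3 * r["y"]    # attribute y -> event length (tempo)
            vol = 0.2 + 0.8 * r["z"]    # attribute z -> loudness (dynamics)
            wav.writeframes(tone(freq, dur, vol))

# Hypothetical records with attribute values already normalized to [0, 1].
sonify([{"x": 0.1, "y": 0.8, "z": 0.5},
        {"x": 0.9, "y": 0.2, "z": 1.0}])

The linear mappings above are only the simplest option; in practice, perceptually motivated scalings (for example, pitch stepped in semitones) are often preferred.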
"Frame" of reference:
- Sound represents a single data element (ex. granular, algorithmic fish)
- Sound represents the entire data field, with each event representing one data element (ex. Javoicer applet)