Grasping Graphs by Ear: Sonification of Interaction with Hidden Graphs Leena Vesterinen Department of Computer Sciences University of Tampere Finland.



Grasping Graphs by Ear: Sonification of Interaction with Hidden Graphs Leena Vesterinen Department of Computer Sciences University of Tampere Finland March, 2005 AAFG 2005

The goal of the game "Hidden Graphs" is blind inspection of hidden graphs: the player is to capture as many features of the graph as possible by following the guiding sound signals. Three different sounds are used to guide the player's actions in detecting the hidden graphs, and three major concepts are employed: the capture radius, directional-predictive sound signals (DPS), and basic behavioral patterns (BBP). Player task: choose the right capturing strategy and recognize the hidden graph. Researcher task: optimize the "dialogue" with the player through basic behavioral patterns coordinated with the directional-predictive sound signals, and facilitate the shaping of a personal behavioral strategy. L. Vesterinen p 02_ Grasping Graphs by Ear

For instance, by grasping virtual graphs, children would develop skills in cross-modal coordination: the use of special basic behavioral patterns for efficient inspection within the game field, and learning how and when to apply one gesture or another depending on the features discovered. Basic cognitive processes: learning of the sound feedback should progress through the game from the concrete level to a more abstract level. Gradual improvement in hearing the feedback (and/or haptics) should finally form the personal behavioral strategy.

Sonification: the use of non-speech audio to convey information. The sonification field is composed of three components: (1) psychological research in perception and cognition; (2) sonification tools for research and application; (3) sonification design and application. In particular, sonification is a potential solution for the communication and interpretation of data. Much research on sonification has been applied to numerous application domains: navigation cues, information visualization (charts and graphs), and non-visual drawing. Most relevant to our study is the work of B.N. Walker and J. Lindsay (2004), which builds on the work of Tran et al. (2000). [Walker and Lindsay, 2004] [Tran et al., 2000] [Walker et al., 2003] [Franklin et al., 2004] [Holland et al., 2002]
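As a concrete illustration of the definition above, a minimal data-to-pitch sonification sketch: each value in a series is mapped linearly onto a frequency range and rendered as a short pure sine tone, the same kind of signal the game's DPS sounds are built from. The function names and the chosen frequency range are illustrative assumptions, not part of the game.

```python
import math

def value_to_freq(v, v_min, v_max, f_min=220.0, f_max=880.0):
    """Map a data value linearly onto a pitch range in Hz.

    The 220-880 Hz range (two octaves above A3) is an assumption
    for the sketch, not a parameter of the actual game."""
    t = (v - v_min) / (v_max - v_min)
    return f_min + t * (f_max - f_min)

def sine_tone(freq, duration=0.2, rate=8000):
    """Render a pure sine tone as a list of samples in [-1, 1]."""
    n = int(duration * rate)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

# Sonify a small data series: each value becomes a short tone,
# higher values sounding at higher pitches.
data = [1.0, 3.0, 2.0, 5.0]
tones = [sine_tone(value_to_freq(v, min(data), max(data))) for v in data]
```

Writing the samples to an audio device or a WAV file is omitted here; the point is only the value-to-pitch mapping.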

The capture radius of an auditory *beacon is the range at which the system considers a user to have reached the *waypoint where the beacon is positioned. In practice, as a participant moves close enough to the waypoint, a sound signal is given to lead the player towards the goal: capturing the graph. Directional-predictive sounds (DPS) are used in relation to the capture radius for graph inspection. The three unique DPS sounds are combinations of pure sine-wave signals with variable tone, pitch and volume. Grasping Graphs by Ear * Waypoint: the coordinates of a specific location. * Beacon: the object at that specific location in the coordinate system.
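A minimal sketch of how a DPS sound could be chosen from the player's motion relative to the capture radius, using the three sound names (CS, BS, TS) defined on the behavioral-strategy slides. The function names and the exact decision rule are an interpretation for illustration, not the game's actual code.

```python
import math

def distance(p, q):
    """Euclidean distance between two (x, y) points, in pixels."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pick_dps(prev_pos, pos, waypoint, capture_radius=15.0):
    """Choose a directional-predictive sound from player motion:
    'CS' (crossing) while inside the capture radius,
    'BS' (backward) when moving away from it outside,
    'TS' (towards) when approaching it again.
    The 15-pixel default matches the game's level-2 radius."""
    d_prev = distance(prev_pos, waypoint)
    d = distance(pos, waypoint)
    if d <= capture_radius:
        return "CS"
    return "BS" if d > d_prev else "TS"
```

The same check, applied to the nearest point of the hidden graph rather than a single waypoint, would generalize to whole-graph capture.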

Pilot game testing: 4 subjects (intermediate/experienced), with normal vision and normal hearing. Software: 'Hidden Graphs' game, PC version. Hardware: AceCad AceCat Flair USB graphics tablet on a standard laptop with two speakers. Conditions: a silent, closed room. Testing procedure: during the test, subjects were blindfolded and heard the sounds from the two speakers. Testing was split into 3 separate sessions because of the high concentration required and the duration of a test (over 60 minutes), with 30 games in each session. Each game involved a preliminary inspection phase and a confirmation phase. In the preliminary inspection phase the player captured the graph while trying to memorise it; the player then entered the confirmation mode and captured the same graph again as accurately as possible. Grasping Graphs by Ear

Behavioral strategies. BBP1: 'spiral' and straight-line gestures were applied as the basic behavioral patterns for graph capturing. The player scales the gesture or changes its direction or speed during inspection, in response to the DPS signals. CS (crossing sound): the player is capturing the graph inside the capture radius; apply the straight-line gesture. BS (backward sound): the player is moving out of the capture radius; apply the spiral gesture (scale up). TS (towards sound): the player is returning towards the capture radius; apply the spiral gesture (scale down).

BBP2: 'S'-shape and straight-line gestures were applied as the basic behavioral patterns for graph capturing. The player scales the gesture or changes its direction or speed during inspection, in response to the DPS signals. CS (crossing sound): the player is capturing the graph inside the capture radius; apply the straight-line gesture. BS (backward sound): the player is moving out of the capture radius; apply the 'S'-shape gesture (scale up). TS (towards sound): the player is returning towards the capture radius; apply the 'S'-shape gesture (scale down). BBP3: combination of the BBP1 and BBP2 behavioral patterns following the same rule format. (Figure: movement vectors.)
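The sound-to-gesture rules above can be sketched as a small lookup table, with BBP3 falling back to either of the two base strategies. The table entries and function names are illustrative, not taken from the game's implementation.

```python
# Gesture rules for each behavioral strategy: sound -> (gesture, scaling).
# BBP1 uses spiral gestures, BBP2 uses 'S'-shape gestures; in both,
# the crossing sound (CS) calls for a straight line with no rescaling.
RULES = {
    "BBP1": {"CS": ("straight line", None),
             "BS": ("spiral", "scale up"),
             "TS": ("spiral", "scale down")},
    "BBP2": {"CS": ("straight line", None),
             "BS": ("S-shape", "scale up"),
             "TS": ("S-shape", "scale down")},
}

def next_gesture(strategy, sound, prefer="BBP1"):
    """Gesture a player following `strategy` should apply on hearing
    a DPS `sound`. BBP3 combines BBP1 and BBP2 under the same rule
    format; here it simply defers to `prefer` (an assumption, since
    the slides do not specify how a player alternates)."""
    if strategy == "BBP3":
        strategy = prefer
    return RULES[strategy][sound]
```

For example, `next_gesture("BBP2", "BS")` returns the 'S'-shape gesture scaled up, matching the BS rule above.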

(Figure: the three directional-predictive signals, TS, BS and CS.)

Graphs used in testing: 5 graphs of different shapes were used in the game testing.

Game levels: Rc = 20 pixels (1st level); Rc = 15 pixels (2nd level); Rc = 10 pixels (3rd level). (Charts: the average distance to the inspected graph and the time spent in the training phase as a function of game level; the relative frequency of DPS sounds used for graph inspection in the training phase (average number, on a log scale) as a function of game level.)

(Charts: the relative frequency (average number) of DPS sounds used for graph inspection in the confirmation phase as a function of game level; the percentage of inspected points; the average distance to the inspected graph and the time spent in the confirmation phase as a function of game level, with the standard deviation of the average distance at each level.)

(Table: comparison ratios for the correlations between BBP1, BBP2 and BBP3 in the preliminary inspection phase, with typical correlation coefficients.)

Level 1 (capture radius 20 pixels) was the starting level. Frequency of the levels within the 30-game sets: over 90% of the games were played at level 2 or 3 (capture radius 10 or 15 pixels); 4% of the games were played at level 4. The starting position of the capture was free and made no difference to performance. Separating the mouse from the tablet would confuse the player about the position. The smaller the capture radius, the longer the confirmation phase and the more pixels captured in the target graph. The average distance to the graph did not vary much between levels, although the standard deviation shows that the average distances at different levels are variable; the standard deviation of the average distance within levels is largest for BBP2. The different sounds (CS-BS, BS-TS, CS-TS) within each behavioral pattern were strongly and positively associated, resulting in high correlations.
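The strong positive association between sound pairs can be quantified with a plain Pearson correlation over per-game sound counts. A minimal sketch, with made-up counts standing in for the study's real data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-game counts of CS and BS sounds; in the study such
# counts were strongly and positively correlated within each pattern.
cs_counts = [12, 18, 9, 22, 15]
bs_counts = [10, 17, 8, 20, 13]
r = pearson(cs_counts, bs_counts)
```

An `r` close to +1, as with these example counts, is what the slide describes as a high positive correlation between the sound pairs.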

Overall, there was little correlation between the 3 different behavioral strategies for the CS/BS sounds. The strongest positive correlation was found at capture radii 15 and 20, between BBP1 and BBP3. The statistics indicated that graph 2 got the fewest points inspected at every level, under all 3 behavioral strategies. We can conclude from the calculated statistics that BBP3 appears to be the most efficient behavioral strategy for the smallest capture radius (Rc = 10). Overall, BBP1, 2 and 3 are very close to each other in the performance measures, so no single clearly best BBP strategy could be stated. For future development, more testing could be done with a limited test time, additional behavioral strategies, more test players and a larger data set.

References: Brewster, S.A. Using non-speech sounds to provide navigation cues. ACM Transactions on Computer-Human Interaction, 5(3), 1998. Brown, L.M. and Brewster, S.A. Drawing by ear: interpreting sonified line graphs. Proceedings of the 2003 International Conference on Auditory Display, 6-9 July. Brown, L.M. et al. Design guidelines for audio presentation of graphs and tables. Proceedings of the 2003 International Conference on Auditory Display, 6-9 July. Evreinov, G. 'Hidden Graphs' game, University of Tampere. Franklin, K.M. and Roberts, J.C. Pie chart sonification. Proceedings Information Visualization (IV03), pages 4-9, IEEE Computer Society, July 2003. Franklin, K.M. and Roberts, J.C. A path based model for sonification. Information Visualization, IEEE Computer Society, July. Holland, S., Morse, D.R. and Gedenryd, H. AudioGPS: spatial audio navigation with a minimal attention interface. ACM Personal and Ubiquitous Computing, 6(4), pages 253-259. Jacobson, R. Representing spatial information through multimodal interfaces. IEEE 6th International Conference on Information Visualization (IV'02), July 2002.

Kramer, G. (ed.). Auditory Display: Sonification, Audification, and Auditory Interfaces. Proceedings Volume XVIII, Reading, MA: Addison-Wesley. Walker, B.N. and Lindsay, J. Auditory navigation performance is affected by waypoint capture radius. Proceedings of ICAD 04, Sydney, Australia, July 6-9, 2004. Walker, B.N. and Lindsay, J. Effect of beacon sounds on navigation performance in a virtual reality environment. Proceedings of the 2003 International Conference on Auditory Display, Boston, MA, USA, 6-9 July. Tran, T.V., Letowski, T. and Abouchacra, K.S. Evaluation of acoustic beacon characteristics for navigation tasks. Ergonomics, 43(6), 1 June 2000. Web site (2005). Web site (2004). Web site (2003).