1
THE INFLUENCE OF SCREEN READERS ON WEB COGNITION
Tony Stockman & Oussama Metatla
Queen Mary, University of London
Interaction Media & Communication, Department of Computer Science
http://www.dcs.qmul.ac.uk/research/imc
2
OVERVIEW
- Examine implications of screen-reader (SR) technology for web cognition
- Report findings from a survey and a study of collaborative web use
- Propose a draft taxonomy of errors in collaborative web interaction
- Examine the potential role of non-speech sound in addressing the identified problems
3
FEATURES OF SR WEB INTERACTION
- Default linear model of page presentation
- No ambient representation of location on the page
- No representation of spatial layout
- No immediate indication of information density
- Little to assist formulation of a mental model of the page
4
CURRENT FUNCTIONALITY
- Focus: JAWS, Window-Eyes (WE) and VoiceOver (VO)
- Cursor key navigation: both the strength and the weakness of the analogy with other applications
- Other mechanisms: listing of, or navigating forward/back between, links, headers, frames, tables, forms, text elements, markers etc. (a sketch of this element-list idea follows below)
- Forms mode
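The element-listing mechanisms above can be illustrated in plain DOM terms. The following TypeScript sketch is purely illustrative and not how JAWS, Window-Eyes or VoiceOver are actually implemented; the `NavigableElement` type and the `listNavigableElements` function are invented for this example.

```typescript
// Illustrative only: approximates the flat element lists (headings, links,
// form fields) that screen readers expose via hot keys and list dialogs.
interface NavigableElement {
  kind: "heading" | "link" | "form-field";
  label: string;     // text a screen reader would announce
  element: Element;  // the underlying DOM node
}

function listNavigableElements(doc: Document): NavigableElement[] {
  const items: NavigableElement[] = [];

  // Headings (e.g. the H hot key), collected in document order
  doc.querySelectorAll("h1, h2, h3, h4, h5, h6").forEach(el =>
    items.push({ kind: "heading", label: el.textContent?.trim() ?? "", element: el })
  );

  // Links (links-list dialog)
  doc.querySelectorAll("a[href]").forEach(el =>
    items.push({ kind: "link", label: el.textContent?.trim() ?? "", element: el })
  );

  // Form fields (forms-mode targets)
  doc.querySelectorAll("input, select, textarea, button").forEach(el =>
    items.push({
      kind: "form-field",
      label: el.getAttribute("aria-label") ?? el.getAttribute("name") ?? "",
      element: el,
    })
  );

  return items;
}
```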
5
OVERVIEWS
- Important but often under-supported or neglected
- Typically list the number of links, frames, headers and forms, with a reminder of the related hot keys (a rough sketch follows below)
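As a rough illustration of what such an overview typically contains, the sketch below makes the same plain-DOM assumptions as the previous example; the `pageOverview` name is invented. It counts the element types a screen reader might report when a page loads.

```typescript
// Illustrative sketch of the summary a screen reader announces on page load,
// e.g. "Page has 42 links, 3 frames, 7 headings and 1 form".
function pageOverview(doc: Document): string {
  const links = doc.querySelectorAll("a[href]").length;
  const frames = doc.querySelectorAll("frame, iframe").length;
  const headings = doc.querySelectorAll("h1, h2, h3, h4, h5, h6").length;
  const forms = doc.querySelectorAll("form").length;
  return `Page has ${links} links, ${frames} frames, ${headings} headings and ${forms} forms`;
}
```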
6
CURRENT NON-SPEECH SOUND
- JAWS schemes
- Window-Eyes events
- VoiceOver defaults
7
VoiceOver
- Embedded in the OS
- Factors in switching
- Group vs. DOM mode navigation
- Overview followed by interaction model
- Non-speech sound more “visible”
8
IMPLICATIONS FOR COGNITION AND INTERACTION
- Hindered by linear navigation, e.g. paging through search results
- Hot keys and markers help, but ignore density of information and spatial layout, with consequences for collaboration
- Overviews neglect spatial layout, ordering and aesthetics
- Tables are navigable but lack overviews
9
DESCRIBING WEB PAGES
- Employed widely known pages
- Characteristics of sighted descriptions:
  - Relatively short but covering the main features
  - Column layouts, colours, mood, style, pictures, emotional response to the message
- Characteristics of visually impaired (VI) descriptions:
  - Longer, more factual and granular, with more focus on function and usability
- Conclusion: the two groups approach web tasks from widely differing contexts
10
COLLABORATIVE TASKS
- Involved pairs of sighted and visually impaired users
- Both could read the web pages used
- One gave instructions while the other performed the task
- The tasks involved:
  - Simple information searches
  - Comparisons of data values
  - Navigating pages and filling in forms
11
VISUALLY IMPAIRED INSTRUCTOR
- These tasks were generally performed quite straightforwardly
- The VI user was generally familiar with the sites
- The sighted user's perspective generally compensated for the difference in the two users' views
- Sources of problems:
  - The screen reader's focus is unavailable to the sighted user
  - The sighted user referring to spatial layout unavailable to the screen reader
12
SIGHTED INSTRUCTOR
- Substantially more problems:
  - Point in the task unclear because the screen reader's focus is unavailable to the sighted user
  - Sighted references to spatial layout
  - CAPTCHAs
  - Non-standard form controls
  - Column headers not spoken on forms
  - Dynamic updating of form fields (see the sketch below)
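As a hedged aside that is not part of the study: the last bullet arises when scripts rewrite form-related content without any accessible notification, so the screen-reader user gets no indication that anything changed. The standard web remedy is an ARIA live region, shown below as an illustrative TypeScript sketch; the element id `city-status` and both function names are invented for this example.

```typescript
// Illustrative only: mark a status area near the form as an ARIA live region
// so screen readers announce changes without the user having to move focus.

// Set the attribute once, at page setup.
// (Live regions should exist before their content changes.)
function setUpLiveRegion(doc: Document): void {
  doc.getElementById("city-status")?.setAttribute("aria-live", "polite");
}

// Later, when the form updates itself, write into the live region;
// a screen reader will announce the new text when it is next idle.
function announceFieldUpdate(doc: Document, newValue: string): void {
  const status = doc.getElementById("city-status");
  if (status) status.textContent = `City list updated: ${newValue}`;
}
```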
13
TOWARDS A TAXONOMY OF COLLABORATIVE ERROR 1
- Location disconnects
- Layout disconnects
- Missing objects
- Navigation disconnects
- Contextual disconnects
14
TOWARDS A TAXONOMY 2
- Affordance disconnects
- Modal disconnects
- Holistic disconnects
- Multi-focus disconnects
- Aesthetics disconnects
15
NON-SPEECH AUDIO
- Audio is inexpensive and widely used
- Screen readers are only gradually adopting limited non-speech sound, e.g. for forms and progress bars
- Growing body of knowledge on how to design and use it (www.ICAD.org)
- Range of techniques that could be examined as partial or whole solutions to the problems described
16
NON-SPEECH AUDIO 1
- Monitoring for dynamic changes: the structure of earcons could, for example, reflect object type and the nature of the update (see the sketch below)
- Ambient sound might convey aesthetics and/or interaction mode
- Auditory icons might signal affordances such as open/closed
- Spatial sound might convey overall layout, density and the locations of users
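A minimal sketch of the earcon idea in the first bullet, assuming a browser with the Web Audio API; the pitch mapping and the `playEarcon` and `watchForUpdates` names are invented for illustration rather than taken from the talk.

```typescript
// Illustrative sketch: watch a page region for dynamic changes and play a
// short earcon whose pitch depends on the kind of element that changed.
// (Browsers may require a user gesture before audio can start.)
const audioCtx = new AudioContext();

function playEarcon(frequencyHz: number, durationMs = 120): void {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = frequencyHz;
  gain.gain.value = 0.2; // keep the cue quiet and brief
  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + durationMs / 1000);
}

function watchForUpdates(region: Element): MutationObserver {
  const observer = new MutationObserver(mutations => {
    for (const m of mutations) {
      const node = m.target instanceof Element ? m.target : m.target.parentElement;
      if (!node) { playEarcon(220); continue; }
      // Arbitrary mapping of object type to pitch: form fields high,
      // links in the middle, anything else low.
      if (node.closest("input, select, textarea")) playEarcon(880);
      else if (node.closest("a")) playEarcon(440);
      else playEarcon(220);
    }
  });
  observer.observe(region, { childList: true, characterData: true, subtree: true });
  return observer;
}
```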
17
NON-SPEECH AUDIO 2
- Example of non-speech auditory overviews compared with speech: more like a glance
- Spearcons for typical radio button options (an approximate sketch follows below)
- Crucial to avoid auditory overload, masking etc.
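The spearcon bullet can be approximated in a browser, with the caveat that real spearcons are produced by time-compressing recorded speech until it is no longer heard as words; the Web Speech API's rate parameter, used below, is only a crude stand-in, and the option labels are invented.

```typescript
// Rough, illustrative approximation of spearcons for radio-button options.
function playSpearcon(label: string): void {
  const utterance = new SpeechSynthesisUtterance(label);
  utterance.rate = 4;   // much faster than normal speech (rate 1.0)
  utterance.volume = 0.8;
  window.speechSynthesis.speak(utterance); // utterances queue in order
}

// Example: cue the options of a typical radio-button group.
["Yes", "No", "Not sure"].forEach(option => playSpearcon(option));
```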
18
CONCLUSIONS
- The speech-only model struggles to convey rich web content and to support increasingly complex interactions
- This is highlighted by the gap in first impressions of common web pages
- Cross-modal web collaboration is subject to a range of disconnects due to differences in presentation and interaction
- Non-speech audio is an under-used mechanism that, with careful design, could help to address some of the issues cited