Mercator: Mapping GUIs to Auditory Interfaces
Goals
- Provide transparent access to X Window System applications for computer users who are blind or severely visually impaired.
- Build a framework to monitor, model, and translate the graphical interfaces of X Window applications without modifying the applications.
- Develop a methodology for translating graphical interfaces into nonvisual interfaces.
Mercator Design Considerations
- Modality for the nonvisual interface: auditory or tactile.
- Substantial research exists in auditory interfaces (Gaver; Bly; Blattner et al.).
- Users' ability to monitor multiple auditory signals: the "cocktail party effect" (Cherry).
- Active versus passive interaction.
- Low-cost, standard audio devices.
- Users' possible lack of tactile sensitivity due to diabetes.
Modeling Visual Interfaces
Auditory Widgets
- Choose the correct level of abstraction.
- Convey objects … and their attributes.
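The mapping above can be sketched as a small lookup from widget class to a base auditory cue, with attributes layered on as modifiers. This is an illustrative sketch only: the sound names, attribute names, and the `auditory_cue` function are assumptions, not Mercator's actual mappings.

```python
# Hypothetical mapping of widget classes to auditory cues; file names
# and attribute names are illustrative, not from Mercator.
BASE_CUE = {
    "button": "touch.wav",
    "text-field": "typewriter.wav",
    "window": "door.wav",
}

def auditory_cue(widget_class, attrs):
    """Return a base sound for the object plus modifiers for its attributes."""
    cue = BASE_CUE.get(widget_class, "generic.wav")
    # Attributes (e.g. selected, unavailable) modify the base sound
    # rather than being announced as separate objects.
    modifiers = [name for name in ("selected", "unavailable", "has-focus")
                 if attrs.get(name)]
    return cue, modifiers

print(auditory_cue("button", {"selected": True}))
# → ('touch.wav', ['selected'])
```

Keeping the cue at the widget level (rather than pixel level) is what gives the interface the "correct level of abstraction": the user hears a button, not a rectangle.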
Navigation
- Support two activities:
  - Allow users to "scan" the interface.
  - Allow users to operate the interface.
- Mouse navigation is unsuitable.
- Map the interface structure into a hierarchical tree based on the widget hierarchy.
- Users walk the tree structure to navigate.
- Works with existing keyboard shortcuts.
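The tree walk above can be sketched as follows. The `Widget` and `TreeNavigator` classes are hypothetical stand-ins for the widget hierarchy, not Mercator's actual data structures; the point is that keyboard-style moves (down to child, up to parent, across to sibling) replace mouse pointing.

```python
# Minimal sketch of tree-structured navigation over a widget hierarchy.
class Widget:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []
        self.parent = None
        for child in self.children:
            child.parent = self

class TreeNavigator:
    """Walks the widget tree with keyboard-style moves instead of a mouse."""
    def __init__(self, root):
        self.current = root

    def down(self):
        # Descend into the first child, if any.
        if self.current.children:
            self.current = self.current.children[0]
        return self.current.name

    def up(self):
        # Return to the parent widget.
        if self.current.parent:
            self.current = self.current.parent
        return self.current.name

    def next(self):
        # Move to the next sibling, wrapping around.
        if self.current.parent:
            sibs = self.current.parent.children
            i = sibs.index(self.current)
            self.current = sibs[(i + 1) % len(sibs)]
        return self.current.name

root = Widget("top-shell", [
    Widget("menu-bar", [Widget("file-menu"), Widget("edit-menu")]),
    Widget("text-area"),
])
nav = TreeNavigator(root)
print(nav.down())   # → menu-bar
print(nav.down())   # → file-menu
print(nav.next())   # → edit-menu
```

Because every reachable widget is a tree node, "scanning" is just walking without activating, and operating is activating the current node, which composes naturally with existing keyboard shortcuts.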
Mercator Architecture
Three main goals:
- Capture: high-level, semantically meaningful information from X applications.
- Store: a semantic, off-screen model of the graphical interface.
- Translate: the stored information to present a consistent alternative interface.
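The capture/store/translate pipeline can be sketched in a few lines. This is a hedged illustration under assumed names: the event format, the `offscreen_model` dictionary, and the `speak` stub are inventions for this sketch, not Mercator's actual protocol or renderer.

```python
# Illustrative capture -> store -> translate pipeline.
offscreen_model = {}   # widget id -> attributes (the "store" step)

def speak(text):
    """Stand-in for a speech or non-speech audio renderer."""
    print(text)
    return text

def translate(wid):
    """Present the stored state of one widget through a nonvisual channel."""
    attrs = offscreen_model[wid]
    return speak(f"{attrs.get('class', 'widget')} {attrs.get('label', wid)}")

def capture(event):
    """Record a high-level interface change reported by the application."""
    wid, attrs = event["widget"], event["attrs"]
    offscreen_model.setdefault(wid, {}).update(attrs)
    return translate(wid)

capture({"widget": "btn1", "attrs": {"class": "button", "label": "OK"}})
# → speaks "button OK"
```

The key design point is the middle step: because the model is semantic (widget classes and labels, not pixels), the translator can present a consistent auditory interface regardless of how the application draws itself.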
Spectrum of Solutions
- Modify the applications.
- Modify the toolkit.
- Interpose between the application and the window server.
First Architecture
Second Architecture
How This Worked
- Xt hooks, RAP, and an Xlib safety net.
- Control flow.
- Interface modeling.
- Interpreted interfaces.
- Simulating input.
How to do this now...