A Vision for a Universally Accessible Web
Slides by Yevgen Borodin (adapted for Psych 384, 3/3/09)
Department of Computer Science, Stony Brook University
The web is designed for those who can filter out irrelevant information
Non-Visual Web Browsing
- Screen readers: JAWS, Window-Eyes, HAL
- Serial audio interface
- Shortcut-driven navigation in the HTML DOM tree (sketched below)
- Navigation between links, headers, lines, etc.
- Inaccessible images, links, multimedia, etc.
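To make the shortcut-driven navigation above concrete, here is a minimal Python sketch that collects headings and links from a page so a user can jump between them. The NavigablePage class and the sample markup are illustrative only, not part of JAWS, Window-Eyes, HAL, or HearSay.

```python
# A minimal sketch of shortcut-driven DOM navigation, assuming the page is
# already available as an HTML string. Illustrative names, not a real API.
from html.parser import HTMLParser

class NavigablePage(HTMLParser):
    """Collects headings and links in document order for shortcut jumps."""
    def __init__(self):
        super().__init__()
        self.landmarks = []      # (kind, text) pairs in document order
        self._current = None     # kind of the element currently being read

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = "heading"
        elif tag == "a":
            self._current = "link"

    def handle_data(self, data):
        if self._current and data.strip():
            self.landmarks.append((self._current, data.strip()))
            self._current = None

    def handle_endtag(self, tag):
        self._current = None

page = NavigablePage()
page.feed("<h1>News</h1><a href='/sports'>Sports</a><p>story...</p>")
# Jump to the next heading, as an 'H' shortcut key would:
next_heading = next(t for k, t in page.landmarks if k == "heading")
print(next_heading)   # -> "News"
```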
How Blind People Browse the Web [Bigham et al., ASSETS '07]
- Landmarks
- Searches
- List of links
- Headings
- Structure
- Speed of speech
- Static sites
What is HearSay?
- Started: motivated by work in AI
  - Information extraction from web pages
  - Process modeling
  - Added an audio interface
- Now: working with HKSB and HKNC
  - Collaborating with IBM and UW
  - Several faculty members, Ph.D., M.S., and undergraduate students
HearSay 3
- Free!
- Multi-platform
- Focused on Web browsing
- Flexible multimodal interface
- Supports text-to-speech engines
- Supports voice-recognition engines
Improving navigation would make web browsing more efficient:
- Segment pages (a rough sketch follows below)
- Identify patterns
- Add 2D navigation
- Summarize content
All of this helps, but...
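As a rough illustration of the page-segmentation step, the sketch below groups page text into blocks found under common container tags. It assumes lxml is installed and is only a heuristic stand-in, not HearSay's actual segmentation algorithm.

```python
# A rough sketch of one segmentation heuristic, assuming lxml is installed.
from lxml import html

def segment(page_source, min_chars=30):
    """Return text blocks found under common container elements."""
    tree = html.fromstring(page_source)
    blocks = []
    for node in tree.xpath("//div | //section | //td | //li"):
        text = " ".join(node.text_content().split())
        if len(text) >= min_chars:
            blocks.append(text)
    return blocks   # note: nested containers are not deduplicated in this sketch

print(segment("<div><p>The main story of the day, long enough to count as a block.</p></div>"))
```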
Project Goals
- Filter out irrelevant information
- Discover relevant information
- Provide quick access to relevant content
- Evaluate the usability of HearSay
- Compare HearSay to other screen readers
- Distribute a stable version of the program for free
Scenarios
- Relevancy in ad-hoc Web browsing
- Relevancy when Web content changes
- Relevancy in online transactions (e.g., shopping, paying bills)
But what is relevant?
Manual Annotations of Content (the user could tell us what is relevant)
- Example labels: "Beginning of Main Content", "Search Button"
Needed for Manual Labeling:
- Provide an interface for creating annotations
- Store annotations in a database
- Query the database when the page loads
- Apply the metadata to the page (see the sketch below)
- Provide an interface for reviewing the annotations
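A minimal sketch of this pipeline, assuming annotations are keyed by page URL plus an XPath to the labeled element. The table layout and function names are illustrative, not HearSay's or Accessibility Commons' actual schema.

```python
# Minimal annotation store: save labels from the annotation interface,
# query them back when a page loads, then apply them to the DOM.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE annotations (url TEXT, xpath TEXT, label TEXT)")

def save_annotation(url, xpath, label):
    """Called from the annotation interface when the user labels an element."""
    db.execute("INSERT INTO annotations VALUES (?, ?, ?)", (url, xpath, label))

def annotations_for(url):
    """Queried when the page loads; the labels are then applied to the page."""
    return db.execute("SELECT xpath, label FROM annotations WHERE url = ?",
                      (url,)).fetchall()

save_annotation("http://example.com", "//div[@id='content']", "Beginning of Main Content")
save_annotation("http://example.com", "//form//input[@type='submit']", "Search Button")
print(annotations_for("http://example.com"))
```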
Collaborative Authoring of Accessibility Metadata [Takagi et al., 2008]
- Social network connecting end users and volunteers
- Accessibility Commons (AC) DB to store metadata [Kawanaka, Borodin, et al., 2008]
- Web-based infrastructure for sharing metadata
Benefits of Social Accessibility
- Shortens the time for accessibility renovations
- Supported: headings, ALT tags, and titles
- Workshop at UW: formed a consortium
  - Defined the Accessibility Commons DB schema
  - Identified the object-addressing methods: XPath, MD5, URI (illustrated below)
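To illustrate the addressing idea, the hedged sketch below combines a URI, an XPath, and an MD5 hash of the element's serialized content as a fallback lookup key. The field names are illustrative, not the Accessibility Commons schema.

```python
# A sketch of addressing an annotated element three ways: by page (URI),
# by location in the page (XPath), and by content (MD5 of its markup).
import hashlib

def md5_address(element_html):
    """Content-based address: stable as long as the element itself is unchanged."""
    return hashlib.md5(element_html.encode("utf-8")).hexdigest()

address = {
    "uri":   "http://example.com/",                           # which page
    "xpath": "//form[@id='search']//input[@type='submit']",   # where in the page
    "md5":   md5_address('<input type="submit" value="Go">'), # fallback if XPath breaks
}
print(address["md5"])
```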
Automatic labeling of content to support web transactions
Labeling Content
- Non-visual web transactions are difficult
  - Consider all the problems of non-visual browsing
  - Need to locate relevant concepts (buttons and links)
- Relevant concepts are similar across websites
  - Some variations, e.g., "add to cart", "add to bag"
  - Different labels, e.g., "Search", "Go", "Find" (see the matching sketch below)
- Evolution of relevance and form over time
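One way to picture the cross-site similarity is a simple synonym table that maps site-specific button text to a shared concept. The vocabulary and function below are illustrative only, not HearSay's actual labeling model.

```python
# A rough sketch of matching site-specific button text to a shared concept,
# assuming a hand-built synonym table.
CONCEPTS = {
    "add-to-cart": {"add to cart", "add to bag", "add to basket"},
    "search":      {"search", "go", "find"},
    "checkout":    {"checkout", "proceed to checkout", "place order"},
}

def label_for(button_text):
    """Map a button or link label on any site to a common transaction concept."""
    text = button_text.strip().lower()
    for concept, variants in CONCEPTS.items():
        if text in variants:
            return concept
    return None

print(label_for("Add to Bag"))   # -> "add-to-cart"
print(label_for("Go"))           # -> "search"
```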
Or, instead of annotating elements, the whole process could be automated. Example: AT&T Log-in Page
AT&T: Account Overview Page
AT&T: Make a Payment Page
AT&T: Confirm Payment Details
Macro Recording Interface
- Create a recording (non-)visually
- Save the recording with a description
- Voice interface to replay the macro recording
- Page (in)dependence
- Customizing what is read
- Specifying variables (see the data-structure sketch below)
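A minimal sketch of what a recorded macro might look like as data, assuming each step stores an XPath target, an action, and an optional value with {variable} placeholders filled in at replay time. The structure and the browser.perform call are hypothetical, not HearSay's recording format.

```python
# A sketch of a recorded bill-payment macro with user-supplied variables.
BILL_PAY_MACRO = {
    "description": "Pay the AT&T bill",
    "steps": [
        {"xpath": "//input[@id='userid']",        "action": "type",  "value": "{username}"},
        {"xpath": "//input[@id='password']",      "action": "type",  "value": "{password}"},
        {"xpath": "//button[@id='login']",        "action": "click"},
        {"xpath": "//a[text()='Make a Payment']", "action": "click"},
        {"xpath": "//input[@id='amount']",        "action": "type",  "value": "{amount}"},
        {"xpath": "//button[@id='submit']",       "action": "click"},
    ],
}

def replay(macro, browser, variables):
    """Replay each recorded step, substituting user-supplied variables."""
    for step in macro["steps"]:
        value = step.get("value", "").format(**variables)
        browser.perform(step["xpath"], step["action"], value)  # hypothetical driver call

# Usage (browser is whatever drives the page):
# replay(BILL_PAY_MACRO, browser, {"username": "me", "password": "...", "amount": "42.00"})
```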
Context-Directed Browsing
Something has Just Changed…?
Dynamic Web Content
- Dynamic content:
  - Our actions often cause change
  - We pay attention to changes in content
  - New information is often in the changes
  - Affects the relevancy of information
- Types of updates: page refresh, redirect, JavaScript and AJAX updates
- Sources of updates: user-invoked and timer-based
Another example:
Dynamic Content Paradigm
- Treat any content changes as "updates":
  - AJAX, JavaScript, refresh, redirect
  - Navigation by following links
  - Using the back and forward buttons
- Analyze and diff the updated Web content (sketched below)
- Provide an interface for reviewing the changes
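A hedged sketch of the "diff the update" idea, using Python's difflib on the page text before and after an update. difflib here is only a stand-in for whatever DOM-level comparison HearSay actually performs.

```python
# Surface only what changed in an update so it can be reviewed first.
import difflib

before = ["Inbox (3)", "Compose", "Settings"]
after  = ["Inbox (4)", "Compose", "Settings", "1 new message from Alice"]

# Lines prefixed with "+ " by ndiff are new or changed in the updated page.
changes = [line[2:] for line in difflib.ndiff(before, after) if line.startswith("+ ")]
for line in changes:
    print("Updated:", line)
# -> Updated: Inbox (4)
# -> Updated: 1 new message from Alice
```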
Page Refresh
Filtering Repeated Content
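As an illustration of filtering repeated template content, the sketch below hashes the text blocks of the previously read page and skips any block that reappears unchanged on the next page. This is an assumed approach for illustration, not HearSay's implementation.

```python
# Skip blocks that were already read on the previous page; present only new content.
import hashlib

def fingerprints(blocks):
    return {hashlib.md5(b.encode("utf-8")).hexdigest() for b in blocks}

previous_page = ["Site navigation: Home News Sports", "Copyright 2009", "Yesterday's story"]
current_page  = ["Site navigation: Home News Sports", "Copyright 2009", "Today's story"]

seen = fingerprints(previous_page)
fresh = [b for b in current_page
         if hashlib.md5(b.encode("utf-8")).hexdigest() not in seen]
print(fresh)   # -> ["Today's story"]  (only new content is read aloud)
```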
User-Centric Goals
- Discover and present relevant information first
- Minimize access time to relevant information
- Keep users focused on tasks and information
- Facilitate multi-tasking and refocusing
- Enable automation of repetitive tasks
- Keep the context of user actions
- Minimize system distractions
HearSay for the Sighted
- Browsing on handhelds
- Browsing over the phone
- Browsing on the go
- Other services
Web Accessibility on Handhelds
Mobile Browsing Problems
- Data transfer cost is high
- Connection is slow
- Small screens
- Lots of scrolling
Context-driven Browsing
External Collaborators
- Accessibility Group at IBM Japan
- Accessibility Group at Google
- HKSB and HKNC
- Arizona State University
- Conferences: ASSETS, W4A, CSUN
Conclusion
- Web accessibility is an important problem
- A glimpse of interesting approaches
- Much remains to be done, e.g.:
  - Integration, robustness (specification and verification)
  - Sonification
  - Other modalities: touch, pen, etc.
  - Extensive end-user studies to probe mental models and drive technology development (feedback)
  - Other disabilities: cognitive, motor impairments, etc.
Questions? Comments? Concerns? Suggestions?