Exploiting SenseCam for Helping the Blind in Business Negotiations
Shuaib Karim, Amin Andjomshoaa, A Min Tjoa
Institute of Software Technology & Interactive Systems
SemanticLIFE Project, Vienna University of Technology
Motivation
The availability of automatic data capture devices, combined with Semantic Web technology, can greatly enhance knowledge accessibility for people with special needs.
Meeting Room
A meeting consists of the meeting place, the meeting time, the participants, the meeting room objects, the agenda, AND the interplay between all of these.
The meaningful gestures and movements of the participants carry valuable information.
Some screenshots of business meetings (images omitted)
Disadvantage for visually impaired participants
– Unable to capture the gestures made by other participants
– Unable to see the meaningful movements
If this information is made available to blind participants, they can better plan their future meetings.
Suggested solution
Capture the meeting proceedings using devices like SenseCam.
Make associations between meeting constituents:
– Associations can be static (e.g. meeting ↔ meeting place), or dynamic, i.e. changing over time or based upon some event (e.g. meeting ↔ participant's presence, meeting ↔ discussed issue, etc.)
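To make the distinction between static and dynamic associations concrete, the minimal sketch below models both as RDF triples with Python's rdflib. The namespace URI and all class and property names (Meeting, heldIn, PresenceEvent, and so on) are illustrative assumptions, not the actual SemanticLIFE vocabulary.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

# Hypothetical namespace standing in for the project's real vocabulary.
SLIFE = Namespace("http://example.org/semanticlife/meeting#")

g = Graph()
g.bind("slife", SLIFE)

meeting = SLIFE["ProjectMeeting_2006_03_14"]
room = SLIFE["MeetingRoom_A"]
alice = SLIFE["Participant_Alice"]

# Static association: the meeting is held in a fixed place.
g.add((meeting, RDF.type, SLIFE.Meeting))
g.add((meeting, SLIFE.heldIn, room))

# Dynamic association: a participant's presence changes over time, so it is
# reified as an event resource to which a timestamp can be attached.
presence = SLIFE["Presence_Alice_1015"]
g.add((presence, RDF.type, SLIFE.PresenceEvent))
g.add((presence, SLIFE.participant, alice))
g.add((presence, SLIFE.duringMeeting, meeting))
g.add((presence, SLIFE.observedAt,
       Literal("2006-03-14T10:15:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```

Reifying the dynamic association as its own resource is what allows it to change over time or to be tied to an event, in contrast to the plain triple used for the static case.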
Introduction to SenseCam – 1/2 (Ref: Microsoft Research)
– Size: badge-sized wearable camera
– Storage: 128 MB FLASH memory
– Recording mechanism: sensors trigger a new recording automatically. Triggers are of various types, such as time, sudden movement, a person nearby, a light transition, a change caused by another person entering the room, the opening/closing of a door, or a change of posture (sitting down, standing up, running, etc.)
Introduction to SenseCam – 2/2
– Capture rate: 2000 VGA images per day; sensor data such as movement, light level and temperature is recorded every second. Manual capture is also possible using a hand gesture.
– Configurable sensors: via XML configuration files
– GPS and continuous audio recording capability
– Ability to detect other SenseCams in the vicinity
– The data is stored in SQL Server and is accessible using the API provided with the device
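As a rough illustration of how the day's images might be pulled out of the SenseCam's SQL Server store, the sketch below uses pyodbc. The connection string and the table/column names (SensorImages, CaptureTime, ImagePath) are purely hypothetical, since the actual schema exposed by the device's API is not described here.

```python
import datetime
import pyodbc  # assumes an ODBC driver for SQL Server is installed

# Hypothetical connection string; the real SenseCam store and credentials will differ.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=SenseCam;Trusted_Connection=yes;"
)
cursor = conn.cursor()

today = datetime.date.today()
# Hypothetical table and column names standing in for the device's actual schema.
cursor.execute(
    "SELECT CaptureTime, ImagePath FROM SensorImages "
    "WHERE CaptureTime >= ? AND CaptureTime < ?",
    today, today + datetime.timedelta(days=1),
)

for capture_time, image_path in cursor.fetchall():
    # Each picture would then be handed to the SemanticLIFE upload datafeed.
    print(capture_time, image_path)
```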
Possible movements capturable by SenseCam
– leaving or entering the room
– sitting down, standing up
– whispering with someone while leaning over
– relaxing in the chair
– sitting alert
– hand gestures by the participants, etc.
Our approach: SemanticLIFE
The SemanticLIFE research project is an attempt to come a step closer to Vannevar Bush's vision of the Memex, a device in which an individual stores their lifetime information.
SemanticLIFE is a Personal Information Management system that captures user activities such as:
– Browsed web pages
– E-mails
– Chat sessions
– Local processes
– Telephone logs
– Appointments
SemanticLIFE uses Semantic Web technology to glue together the events and domain concepts.
SemanticLIFE Architecture
[Architecture diagram showing: Message Bus, Google Explorer Plug-in, Repository Plug-in, Personal Repository, Ontologies, Pipeline Plug-in, Pipelines, Style sheets, User Profile Plug-in, Annotation Plug-in, Web Service Plug-in, other data feeds, Analysis Plug-in, Visualization Plug-in]
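The diagram lists a Message Bus alongside the various plug-ins; a common way to wire such components is publish/subscribe, sketched minimally below. The class names, method names, and topic strings are illustrative assumptions, not SemanticLIFE's actual plug-in API.

```python
from collections import defaultdict
from typing import Callable, Dict, List

# Minimal publish/subscribe bus; names are illustrative, not SemanticLIFE's API.
class MessageBus:
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(message)


bus = MessageBus()

# A datafeed plug-in publishes newly captured items ...
def sensecam_feed(bus: MessageBus) -> None:
    bus.publish("item.captured", {"type": "picture", "path": "img_0001.jpg"})

# ... and the annotation plug-in reacts to them.
bus.subscribe("item.captured", lambda msg: print("annotate", msg["path"]))
sensecam_feed(bus)
```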
Workflow
– The blind person wears the SenseCam during the meeting
– Pictures are uploaded into the SemanticLIFE repository via our file upload datafeed module
– Retrieval of the day's pictures and identification of participants, either manually by the caregiver or automatically using multimedia analysis plug-ins
– Annotation of pictures: structural enhancement of information items, manual associations with other information items, dynamic associations
– Enrichment of associations (updating the concerned contact profile)
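Putting the workflow steps end to end, a minimal sketch might look like the following. The function names and the Picture structure are hypothetical stand-ins for the actual SemanticLIFE modules.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Picture:
    path: str
    participants: List[str] = field(default_factory=list)
    annotations: List[str] = field(default_factory=list)


def upload_to_repository(paths: List[str]) -> List[Picture]:
    """Step 2: the file upload datafeed hands pictures to the repository."""
    return [Picture(path=p) for p in paths]


def identify_participants(picture: Picture) -> None:
    """Step 3: manual (caregiver) or automatic (multimedia analysis) identification."""
    picture.participants = ["<filled in by caregiver or analysis plug-in>"]


def annotate(picture: Picture) -> None:
    """Step 4: structural enhancement plus manual/dynamic associations."""
    picture.annotations.append("associated-with: project meeting")


def enrich_contact_profiles(pictures: List[Picture]) -> None:
    """Step 5: propagate new associations into the concerned contact profiles."""
    for pic in pictures:
        for person in pic.participants:
            print(f"update profile of {person} with {pic.annotations}")


pictures = upload_to_repository(["img_0001.jpg", "img_0002.jpg"])
for pic in pictures:
    identify_participants(pic)
    annotate(pic)
enrich_contact_profiles(pictures)
```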
Workflow of the annotation sub-system (diagram omitted)
The problem faced: the number of pictures to be annotated is very large (about 3600 pictures per hour). The proposed solution is to categorize the pictures based upon different criteria. This makes it possible to interact with this huge amount of information in the manner of OLAP cubes.
OLAP (OnLine Analytical Processing)
– Multidimensional conceptual view / arrangement of data to allow fast analysis (E.F. Codd & Associates, 1994)
– The multidimensional structure (the OLAP cube) provides data views based upon different criteria
– Each dimension is a category
– There can be a hierarchy within each category
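To illustrate how the captured pictures could be arranged along several dimensions and then sliced like an OLAP cube, the sketch below builds a small pivot table with pandas. The dimension names (hour, participant, event) and the sample data are assumptions for illustration only.

```python
import pandas as pd

# Each row is one captured picture described along several (assumed) dimensions.
pictures = pd.DataFrame(
    [
        {"hour": "10:00", "participant": "Alice", "event": "entered room", "picture": "img_0001.jpg"},
        {"hour": "10:00", "participant": "Bob", "event": "hand gesture", "picture": "img_0002.jpg"},
        {"hour": "11:00", "participant": "Alice", "event": "hand gesture", "picture": "img_0003.jpg"},
        {"hour": "11:00", "participant": "Alice", "event": "left room", "picture": "img_0004.jpg"},
    ]
)

# "Cube" view: count of pictures per (hour, participant) cell, sliceable per dimension.
cube = pd.pivot_table(
    pictures, index="hour", columns="participant",
    values="picture", aggfunc="count", fill_value=0,
)
print(cube)

# Drill down along the event dimension for one participant.
print(pictures[pictures["participant"] == "Alice"].groupby("event")["picture"].count())
```

The same idea carries over to the annotated pictures in the repository: instead of scrolling through thousands of images, a blind user (or a caregiver) can ask for counts or slices per participant, per time span, or per detected event.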
Information distribution over multiple axes (diagram omitted)
The proposed "Project Meeting" ontology (diagram omitted)
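The ontology itself is shown only as a diagram here; the rdflib sketch below indicates what a few of its classes and properties might look like. All names are guesses derived from the meeting constituents listed earlier (place, time, participants, agenda, gestures), not the actual "Project Meeting" ontology.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

PM = Namespace("http://example.org/project-meeting#")  # hypothetical namespace
g = Graph()
g.bind("pm", PM)

# A few plausible classes, derived from the meeting constituents named earlier.
for cls in ("Meeting", "Participant", "MeetingPlace", "AgendaItem", "Gesture"):
    g.add((PM[cls], RDF.type, OWL.Class))

# A few plausible properties linking the constituents.
for prop, domain, rng in [
    ("hasParticipant", "Meeting", "Participant"),
    ("heldIn", "Meeting", "MeetingPlace"),
    ("hasAgendaItem", "Meeting", "AgendaItem"),
    ("madeGesture", "Participant", "Gesture"),
]:
    g.add((PM[prop], RDF.type, OWL.ObjectProperty))
    g.add((PM[prop], RDFS.domain, PM[domain]))
    g.add((PM[prop], RDFS.range, PM[rng]))

print(g.serialize(format="turtle"))
```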
SemanticLIFE UI (screenshot omitted)
Conclusions
– Automatic data capture devices capable of capturing a user's activities are available
– A project meeting ontology based on Semantic Web technology can be used to build associations between meeting constituents, which is very useful for everyone, and especially for visually impaired people
– Accessibility criteria are equally useful when making associations as well as when presenting information
– More information distribution axes to be incorporated in the future
– Privacy issues to be investigated
Thanks!