1
Advisor: Hsin-Hsi Chen Reporter: Chi-Hsin Yu Date: 2009.02.11 From AAAI 2008 William Pentney, Department of Computer Science & Engineering, University of Washington; Matthai Philipose, Intel Research Seattle; Jeff Bilmes, Department of Electrical Engineering, University of Washington
2
Common Sense Data Acquisition for Indoor Mobile Robots ◦ AAAI 2004 ◦ Rakesh Gupta and Mykel J. Kochenderfer Sensor-Based Understanding of Daily Life via Large-Scale Use of Common Sense ◦ AAAI 2006 ◦ William Pentney, et al., Matthai Philipose (Intel) Learning Large Scale Common Sense Models of Everyday Life ◦ AAAI 2007 ◦ William Pentney, et al., Matthai Philipose (Intel)
3
Introduction Data Acquisition and Representation Inference Evaluation Methodology and Results Conclusion
4
Common sense ◦ is critical to the automated understanding of the world ◦ OMICS (Open Mind Indoor Common Sense) project This paper ◦ enables correspondingly large-scale sensor-based understanding of the world (RFID)
6
Challenges ◦ semantic gaps between facts in the DB and phenomena detected by sensors ◦ fragility of reasoning in the face of noise ◦ incompleteness of the repositories (DB) ◦ slowness of reasoning with these large repositories Adapting the data for sensor-based use is also challenging because it is unclear … ◦ how to represent the models ◦ whether term-occurrence statistics are a practical means of acquiring arbitrary common sense information from the web
7
Collecting common sense data through the Open Mind Indoor Common Sense (OMICS) website Restricting the domain to indoor home and office environments (AAAI 2004)
9
Hand proximity to objects implies object use. Three users performed various daily activities. A total of 5-7 minutes of each activity was collected, for a total of 70-75 minutes of data. These traces were divided into time slices of 2.5 s.
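As a rough sketch of how such a trace might be cut into 2.5 s slices (the event format and object names below are assumptions for illustration, not the paper's data format):

```python
from collections import defaultdict

SLICE_SECONDS = 2.5  # time-slice length used in the paper

def segment_trace(events, slice_seconds=SLICE_SECONDS):
    """Group (timestamp, object_id) RFID readings into fixed-length time slices.

    Returns {slice_index: set of objects whose use was detected in that slice}.
    """
    slices = defaultdict(set)
    for timestamp, object_id in events:
        slices[int(timestamp // slice_seconds)].add(object_id)
    return dict(slices)

# Hypothetical trace: (seconds since start, object near the instrumented hand)
trace = [(0.4, "kettle"), (1.9, "kettle"), (3.1, "mug"), (6.2, "faucet")]
print(segment_trace(trace))  # {0: {'kettle'}, 1: {'mug'}, 2: {'faucet'}}
```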
10
SRCS = State Recognition using Common Sense
11
Template to relation ◦ Template: “You <action> when you are <state>.” ◦ Relation: people(<action>, <state>) ◦ Ex: Relation People: people(’eat’, ’hungry’), people(’drink water’, ’are thirsty’) Relation ContextAction: contextactions(’full garbage bag’, ’put the garbage in’, ’trash’), contextactions(’making toasted bread’, ’slice’, ’bread’) Relation ActionGeneralization: actiongeneralization(’investigate cause of’, ’alarm’, ’smoke alarm’), actiongeneralization(’wipe off’, ’floorcover’, ’carpet’), actiongeneralization(’clean’, ’floorcover’, ’carpet’) SRCS ◦ 50,000+ instances, 15 relations KnowItAll ◦ weighting the facts in the OMICS DB Weighted Relations
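A minimal sketch of how the collected facts could be stored as weighted relation tuples; the dictionary layout and the numeric weights below are illustrative assumptions, not values from the OMICS database or from KnowItAll:

```python
# Illustrative weighted relations; the weights are made-up placeholders
# standing in for confidences assigned via KnowItAll-style weighting.
weighted_relations = {
    # people(<action>, <state>): "You <action> when you are <state>."
    "people": [
        (("eat", "hungry"), 0.9),
        (("drink water", "are thirsty"), 0.8),
    ],
    # contextactions(<context>, <action>, <object>)
    "contextactions": [
        (("full garbage bag", "put the garbage in", "trash"), 0.7),
        (("making toasted bread", "slice", "bread"), 0.6),
    ],
    # actiongeneralization(<action>, <general term>, <specific term>)
    "actiongeneralization": [
        (("wipe off", "floorcover", "carpet"), 0.5),
    ],
}
```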
12
Using 20 fixed rewrite rules ◦ People(S, A) ⇝ (actionObserved(A) ⇒ personIn(S)) ◦ People(angry, yell) ⇝ (actionObserved(yell) ⇒ personIn(angry)) Horn clause ◦ p1 ∧ p2 ∧ … ∧ pN ⇒ pN+1 ◦ p1 … pN: constant/atom, pN+1: atom ◦ Constants: object, action, location, context, state ◦ 8 types of atoms, e.g., useInferred(O), stateOf(O, S), locationInferred(L), personIn(S), actionObserved(A) Weighted Relations → Weighted Horn Clauses
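The sketch below shows how one rewrite rule (the People rule from this slide) could turn a weighted relation instance into a weighted Horn clause; the dataclass layout and the example weight are assumptions for illustration, not the paper's code.

```python
from dataclasses import dataclass

@dataclass
class HornClause:
    """w: p1 ∧ … ∧ pN ⇒ pN+1, stored as (body atoms, head atom, weight)."""
    body: tuple
    head: str
    weight: float

def rewrite_people(state, action, weight):
    # People(S, A) ⇝ (actionObserved(A) ⇒ personIn(S))
    return HornClause(body=(f"actionObserved({action})",),
                      head=f"personIn({state})",
                      weight=weight)

print(rewrite_people("angry", "yell", 0.9))
# HornClause(body=('actionObserved(yell)',), head='personIn(angry)', weight=0.9)
```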
13
MRF ◦ consists of a graph whose set of vertices V is connected by a set of cliques c_i ⊂ V; V contains the atoms f_{i,t} and objects o_{i,t} for all i in time slice t ◦ each c_i has a potential function φ_i: c_i → R+ ◦ used to calculate p(f_t, o_t) Markov logic networks (Richardson & Domingos 2006) ◦ are used for this conversion Weighted Horn Clauses → Markov Random Field (MRF)
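As a rough sketch of the Markov-logic style conversion, each weighted Horn clause can contribute a clique potential that rewards assignments satisfying the clause; the function below is an assumed illustration of that construction, not the SRCS implementation.

```python
import math

def horn_potential(body_values, head_value, weight):
    """Potential for a grounded clause p1 ∧ … ∧ pN ⇒ pN+1.

    Markov-logic style: a satisfying assignment gets exp(weight),
    a violating assignment gets exp(0) = 1.
    """
    satisfied = (not all(body_values)) or head_value
    return math.exp(weight) if satisfied else 1.0

# actionObserved(yell) ⇒ personIn(angry), weight 0.9:
print(horn_potential([True], True, 0.9))   # exp(0.9): clause satisfied
print(horn_potential([True], False, 0.9))  # 1.0: clause violated
```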
14
Markov Random Field Chain Graph
15
At time slice t ◦ set useInferred(O) = true if the use of object O is detected at time slice t ◦ infer the other unknown variables using loopy belief propagation (Pearl 1988) ◦ calculate marginals for the propositions in time slice t-1 Inference takes about 30 minutes per time slice ◦ query-directed pruning is used to speed it up Output thresholds ◦ set a variable to true if p(variable) > its threshold ◦ a decision stump is used to learn each threshold
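A sketch of the per-slice inference loop described here, under the assumption of a generic belief-propagation routine; run_loopy_bp is a hypothetical stand-in, and the default threshold of 0.5 is an assumption rather than a value from the paper.

```python
def infer_slice(mrf, detected_objects, thresholds, run_loopy_bp):
    """One time slice of inference (sketch, not the SRCS implementation).

    detected_objects: objects whose use was observed via RFID in this slice.
    run_loopy_bp: hypothetical routine returning marginal probabilities
                  for the unobserved atoms (personIn, stateOf, ...).
    thresholds: per-variable thresholds learned by decision stumps.
    """
    # Clamp the evidence: useInferred(O) = true for each detected object.
    evidence = {f"useInferred({obj})": True for obj in detected_objects}
    marginals = run_loopy_bp(mrf, evidence)
    # Report a proposition as true when its marginal exceeds its threshold.
    return {atom: prob > thresholds.get(atom, 0.5)
            for atom, prob in marginals.items()}
```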
16
Monitoring 24 Boolean variables to identify individual activities Hand labeling of the traces serves as the ground truth
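One simple way to score the output against the hand labels is per-variable accuracy over time slices, sketched below; the exact metrics reported in the paper may differ, and the variable names in the example are hypothetical.

```python
def per_variable_accuracy(predictions, ground_truth):
    """Fraction of time slices in which each Boolean variable matches its label.

    predictions, ground_truth: lists over time slices of {variable: bool}.
    """
    variables = ground_truth[0].keys()
    return {
        v: sum(p[v] == g[v] for p, g in zip(predictions, ground_truth)) / len(ground_truth)
        for v in variables
    }

# Hypothetical two-slice example for a single variable:
print(per_variable_accuracy([{"personIn(hungry)": True}, {"personIn(hungry)": False}],
                            [{"personIn(hungry)": True}, {"personIn(hungry)": True}]))
# {'personIn(hungry)': 0.5}
```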
17
Train decision stumps on a sample of data from each activity (20 minutes)
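A minimal sketch of learning one such threshold with a decision stump: choose the cutoff on the inferred marginal that best matches the training labels. The candidate grid and the toy numbers are assumptions for illustration.

```python
def learn_threshold(marginals, labels, candidates=None):
    """Decision stump for a single Boolean variable: choose the probability
    threshold with the highest training accuracy (sketch)."""
    if candidates is None:
        candidates = [i / 20 for i in range(1, 20)]  # 0.05 .. 0.95

    def accuracy(t):
        return sum((p > t) == y for p, y in zip(marginals, labels)) / len(labels)

    return max(candidates, key=accuracy)

# Toy example: marginals from the 20-minute training sample vs. hand labels.
print(learn_threshold([0.2, 0.7, 0.9, 0.4], [False, True, True, False]))  # 0.4
```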
19
Results figures: AAAI 2008 vs. AAAI 2006 — the AAAI 2006 performance is inconsistent; the AAAI 2008 results are improved.
20
Future work ◦ The cost of labeling traces is high; a semi-supervised algorithm may help ◦ Integrating other sources of sensory input is being explored ◦ Selection of a variable subset is important for scalable inference Conclusions ◦ Densely deployable wireless sensors made this work possible