1
Extracting Places and Activities from GPS Traces Using Hierarchical Conditional Random Fields
Lin Liao, Dieter Fox, and Henry Kautz, International Journal of Robotics Research (IJRR), 26(1), 2007
Yong-Joong Kim (2012311529), Dept. of Computer Science, Yonsei University
2
Contents
Motivation
Hierarchical Activity Model
Preliminaries: Conditional Random Fields
– Overview
– Inference
– Parameter Learning
Conditional Random Fields for Activity Recognition
– GPS to street map association
– Inferring activities and types of significant places
– Place detection and labeling algorithm
Experimental Results
– Experimental environment
– Example analysis
– Extracting significant places
– Labeling places and activities using models learned from others
Conclusions
3
Motivation (cont’)
Application areas of learning patterns of human behavior from sensor data
– Intelligent environments
– Surveillance
– Human–robot interaction
Using GPS location data to learn to recognize high-level activities
Difficulties in previous approaches
– Restricted activity models
– Inaccurate place detection
4
Motivation
A novel, unified approach to automated activity and place labeling
– High accuracy in detecting significant places by taking a user’s context into account
– Simultaneously, within a single CRF (Conditional Random Field):
  Estimating a person’s activities
  Identifying places
  Labeling places by their type
Research goal
– To segment a user’s day into everyday activities
– To recognize and label significant places
5
Hierarchical activity model (cont’)
GPS readings
– The input to the proposed model
– The GPS trace is segmented spatially to generate a discrete sequence of activity nodes
Activities
– Estimated for each node in the spatially segmented GPS trace
– Distinguishing between navigation activities and significant activities
Significant places
– Places that play a significant role in a person’s activities
6
Hierarchical activity model
Two key problems for probabilistic inference
– Complexity of the model
  Solved by an approximate inference algorithm
– It is not clear how to construct the model deterministically from a GPS trace
  Solved by constructing the model as part of the inference
7
Preliminaries: Conditional Random Fields
8
Overview (cont’)
Definition of CRFs
– Undirected graphical models developed for labeling sequence data
– Properties
  Directly represent the conditional distribution over hidden states
  Make no assumptions about the dependency structure between observations
Nodes in CRFs
– Observations: x
– Hidden states: y
– The CRF defines the conditional distribution over the hidden states y
Cliques
– Fully connected sub-graphs of the CRF
– Play a key role in the definition of the conditional distribution
9
Overview
Conditional distribution over the hidden states (see the equation below)
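In the standard log-linear form used for CRFs, the conditional distribution factorizes over the cliques of the graph:

\[ p(\mathbf{y} \mid \mathbf{x}) = \frac{1}{Z(\mathbf{x})} \prod_{c \in C} \phi_c(\mathbf{x}_c, \mathbf{y}_c), \qquad Z(\mathbf{x}) = \sum_{\mathbf{y}'} \prod_{c \in C} \phi_c(\mathbf{x}_c, \mathbf{y}'_c), \qquad \phi_c(\mathbf{x}_c, \mathbf{y}_c) = \exp\!\big(\mathbf{w}_c^{\top} \mathbf{f}_c(\mathbf{x}_c, \mathbf{y}_c)\big) \]

Here C is the set of cliques, Z(x) is the partition function, f_c are the feature functions of clique c, and w_c are their weights.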
10
Inference (cont’)
Inference in a CRF involves two tasks (written out below):
– Estimating the marginal distribution of each hidden variable
– Estimating the most likely configuration of the hidden variables (the maximum a posteriori, or MAP, estimate)
– Both tasks are solved using belief propagation (BP)
Two types of BP algorithms:
– Sum-product for marginal estimation
– Max-product for MAP estimation
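The two tasks, in the notation of the previous slide:

\[ \text{marginals: } p(y_i \mid \mathbf{x}) = \sum_{\mathbf{y}':\, y'_i = y_i} p(\mathbf{y}' \mid \mathbf{x}), \qquad \text{MAP: } \mathbf{y}^{*} = \arg\max_{\mathbf{y}} p(\mathbf{y} \mid \mathbf{x}) \]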
11
Inference (cont’)
Sum-product for marginal estimation
– Message initialization: all messages are initialized as uniform distributions over the hidden states
– Message update rule (see below)
– Message update order: the update rule is iterated until the messages (possibly) converge
– Convergence condition: the messages no longer change between iterations (not guaranteed on graphs with loops)
– After convergence, the marginals are computed from the incoming messages
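A sketch of the update rule, written for the pairwise case (the paper runs BP on more general cliques, but the structure is the same): the message from node i to a neighbor j, and the marginal belief of node i after convergence, are

\[ m_{ij}(y_j) = \sum_{y_i} \phi_i(\mathbf{x}, y_i)\, \phi_{ij}(y_i, y_j) \prod_{k \in n(i) \setminus \{j\}} m_{ki}(y_i), \qquad b_i(y_i) \propto \phi_i(\mathbf{x}, y_i) \prod_{k \in n(i)} m_{ki}(y_i) \]

where n(i) denotes the neighbors of node i. On graphs with loops (loopy BP) the updates are simply iterated, without a convergence guarantee.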
12
Inference
Max-product for MAP estimation
– Very similar to the sum-product algorithm
– The summation in the message update rule is replaced by a maximization
– After convergence, the MAP belief of each node is computed
– Each component of the MAP configuration is then read off from its node’s belief (see below)
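In the same pairwise form as above, the max-product message and the resulting MAP assignment of each variable are

\[ m_{ij}(y_j) = \max_{y_i} \phi_i(\mathbf{x}, y_i)\, \phi_{ij}(y_i, y_j) \prod_{k \in n(i) \setminus \{j\}} m_{ki}(y_i), \qquad y_i^{*} = \arg\max_{y_i} \phi_i(\mathbf{x}, y_i) \prod_{k \in n(i)} m_{ki}(y_i) \]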
13
Parameter learning (cont’)
Goal of parameter learning
– To determine the weights of the feature functions
– The weights are learned discriminatively
Two methods
– Maximum likelihood (ML) estimation
– Maximum pseudo-likelihood (MPL) estimation
Parameter sharing
– The learning algorithm learns the same parameter values (weights) for different cliques in the CRF
14
Parameter learning (cont’)
Maximum likelihood (ML) estimation
– Objective function and its gradient (see below)
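The objective is the standard (regularized) conditional log-likelihood, and its gradient is the empirical minus the expected feature counts:

\[ L(\mathbf{w}) = \log p(\mathbf{y} \mid \mathbf{x}, \mathbf{w}) - \frac{\mathbf{w}^{\top}\mathbf{w}}{2\sigma^2} = \sum_{c} \mathbf{w}^{\top} \mathbf{f}_c(\mathbf{x}_c, \mathbf{y}_c) - \log Z(\mathbf{x}, \mathbf{w}) - \frac{\mathbf{w}^{\top}\mathbf{w}}{2\sigma^2} \]

\[ \nabla L(\mathbf{w}) = \sum_{c} \mathbf{f}_c(\mathbf{x}_c, \mathbf{y}_c) - \mathbb{E}_{p(\mathbf{y}' \mid \mathbf{x}, \mathbf{w})}\Big[\sum_{c} \mathbf{f}_c(\mathbf{x}_c, \mathbf{y}'_c)\Big] - \frac{\mathbf{w}}{\sigma^2} \]

Evaluating the expectation requires inference over the full CRF, which is what makes ML learning expensive.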
15
Parameter learning (cont’)
Maximum pseudo-likelihood (MPL) estimation
– Based on the local feature counts involving each variable
– Objective function and its gradient (see below)
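The pseudo-likelihood replaces the full likelihood by a product of local conditionals, each conditioned on the Markov blanket MB(y_i) of variable y_i; its gradient involves only the local feature counts f_i mentioned on the slide:

\[ L_{PL}(\mathbf{w}) = \sum_{i} \log p\big(y_i \mid \mathrm{MB}(y_i), \mathbf{x}, \mathbf{w}\big) - \frac{\mathbf{w}^{\top}\mathbf{w}}{2\sigma^2} \]

\[ \nabla L_{PL}(\mathbf{w}) = \sum_{i} \Big( \mathbf{f}_i\big(\mathbf{x}, y_i, \mathrm{MB}(y_i)\big) - \mathbb{E}_{p(y'_i \mid \mathrm{MB}(y_i), \mathbf{x}, \mathbf{w})}\big[\mathbf{f}_i\big(\mathbf{x}, y'_i, \mathrm{MB}(y_i)\big)\big] \Big) - \frac{\mathbf{w}}{\sigma^2} \]

Each expectation is over a single variable, so MPL learning avoids running inference over the whole model.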
16
Parameter learning
Parameter sharing
– Learn a generic model that can take any GPS trace and classify the locations in that trace
– Achieved by making sure that all weights belonging to a certain type of feature are identical
– The gradient for a shared weight is the sum of the gradients computed for the individual cliques (see below)
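Written out, the gradient of a shared weight vector w_k is the sum over the cliques that use it:

\[ \frac{\partial L}{\partial \mathbf{w}_k} = \sum_{c \,:\, \mathbf{w}_c = \mathbf{w}_k} \frac{\partial L}{\partial \mathbf{w}_c} \]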
17
Conditional Random Fields for Activity Recognition
18
GPS to street map association (cont’)
It is desirable to associate GPS traces with a street map
– e.g., to relate locations to addresses in the map
A CRF is constructed that
– Takes the spatial relationship between GPS readings into account
– Generates a consistent association
19
GPS to street map association (cont’)
Three types of cliques are distinguished (a sketch of a measurement feature follows below):
– Measurement cliques (dark grey in the slide figure)
– Consistency cliques (light grey)
– Smoothness cliques (medium grey)
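As a concrete illustration of a measurement feature, the following minimal Python sketch computes the distance from a GPS reading to a candidate street segment. The function name and the planar-coordinate assumption are illustrative, not taken from the paper.

import math

def point_to_segment_distance(px, py, ax, ay, bx, by):
    # Distance from a GPS reading (px, py) to the street segment with
    # endpoints (ax, ay) and (bx, by); usable as a measurement feature
    # between a reading and its candidate street association.
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:                      # degenerate segment
        return math.hypot(px - ax, py - ay)
    # Projection of the reading onto the segment, clamped to [0, 1]
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy          # closest point on the segment
    return math.hypot(px - cx, py - cy)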
20
GPS to street map association
21
Inferring activities and types of significant places (cont’)
A new CRF is generated to estimate
– The activity performed at each segment
– A person’s significant places
22
Inferring activities and types of significant places
Activity node features (see the sketch below)
– Temporal information such as time of day, day of week, and duration of the stay
– Average speed through the segment
– Information extracted from geographic databases
– Connections to neighboring activity nodes
Place node features
– The activities that occur at the place, taking their weekly frequency into account
– A preference for a limited number of different home and work places
Very large cliques can be generated
– Resolved by converting the model to tree-structured CRFs
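A minimal sketch of how the temporal and speed features of an activity node could be computed; the function signature and field names are hypothetical, and the geographic-database features are omitted.

def activity_node_features(timestamps, distances_m):
    # timestamps: datetime objects of the GPS readings in one segment
    # distances_m: distances in meters between consecutive readings
    duration_s = (timestamps[-1] - timestamps[0]).total_seconds()
    avg_speed = sum(distances_m) / duration_s if duration_s > 0 else 0.0
    return {
        "hour_of_day": timestamps[0].hour,       # time of day
        "day_of_week": timestamps[0].weekday(),  # day of week
        "duration_min": duration_s / 60.0,       # duration of the stay / segment
        "avg_speed_mps": avg_speed,              # average speed through the segment
    }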
23
Place detection and labeling algorithm
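The loop below is a minimal sketch of the iterative structure described in the paper, which alternates between CRF inference over activities and regeneration of the place set until it stabilizes. The helper callables (build_crf, map_inference, extract_places) are hypothetical placeholders, not functions from the paper.

def detect_and_label_places(gps_trace, build_crf, map_inference, extract_places, max_iters=10):
    # build_crf(trace, places): instantiate the activity CRF given the current place set
    # map_inference(crf): MAP activity labels, e.g. via max-product loopy BP
    # extract_places(activities): cluster significant activities into candidate places
    places = frozenset()                              # start with no known places
    activities = []
    for _ in range(max_iters):
        crf = build_crf(gps_trace, places)
        activities = map_inference(crf)
        new_places = frozenset(extract_places(activities))
        if new_places == places:                      # stop once the place set is stable
            break
        places = new_places
    return activities, places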
24
Experimental Results
25
Experimental environment
GPS data was collected from four different persons
– Seven days of data per person
– Roughly 40,000 GPS measurements (10,000 segments)
– All activities and significant places were labeled manually
Leave-one-out cross-validation for evaluation (see the sketch below)
– Training data: the three other persons (MPL estimation for learning)
– Test data: the held-out person, repeated for each of the four persons
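A minimal sketch of the leave-one-subject-out protocol; train_fn and eval_fn stand in for the MPL training and evaluation routines and are hypothetical.

def leave_one_out(person_data, train_fn, eval_fn):
    # person_data: dict mapping person id -> labeled GPS trace
    accuracies = []
    for held_out in person_data:
        train = [trace for pid, trace in person_data.items() if pid != held_out]
        model = train_fn(train)                              # e.g., MPL parameter learning
        accuracies.append(eval_fn(model, person_data[held_out]))
    return sum(accuracies) / len(accuracies)                 # mean accuracy over the four folds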
26
Example analysis
27
Extracting significant places
Comparison experiment
– The proposed system
– A widely used baseline based on a time threshold (see the sketch below)
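For context, a minimal sketch of the kind of time-threshold baseline referred to above: a location counts as significant if the user stays within a small radius of it for longer than a threshold. The radius and duration values are illustrative, not the ones used in the experiments.

def threshold_places(readings, radius_m=30.0, min_stay_s=600.0):
    # readings: list of (timestamp_s, x_m, y_m) tuples in temporal order
    def dist(a, b):
        return ((a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2) ** 0.5

    places, i = [], 0
    while i < len(readings):
        j = i
        # extend the stay while readings remain within radius_m of the anchor reading
        while j + 1 < len(readings) and dist(readings[j + 1], readings[i]) <= radius_m:
            j += 1
        if readings[j][0] - readings[i][0] >= min_stay_s:
            places.append((readings[i][1], readings[i][2]))  # record the anchor as a place
            i = j + 1
        else:
            i += 1
    return places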
28
Labeling places and activities using models learned from others (cont’)
29
Labeling places and activities using models learned from others
30
Conclusions
A novel approach to location-based activity recognition
– One consistent framework
– Iteratively constructing a hierarchical CRF
– Discriminative learning using pseudo-likelihood
– Efficient inference using loopy BP
Achieves virtually identical accuracy with and without a street map