Localization in Wireless Networks


Localization in Wireless Networks David Madigan Columbia University

The Problem: Estimate the physical location of a wireless terminal/user in an enterprise radio wireless communication network, specifically an 802.11-based one.

Example Applications
- Use the closest resource, e.g., printing to the closest printer
- Security: in/out of a building
- Emergency 911 services
- Privileges based on security regions (e.g., in a manufacturing plant)
- Equipment location (e.g., in a hospital)
- Mobile robotics
- Museum information systems

Physical Features Available for Use
- Received Signal Strength (RSS) from multiple access points
- Angles of arrival
- Time deltas of arrival
- Which access point (AP) you are associated with
We use RSS and AP association; RSS is the only reasonable estimate with current commercial hardware.

Known Properties of Signal Strength
Signal strength at a location is known to vary as a log-normal distribution with some environment-dependent spread; the variation is caused by people, appliances, climate, etc. [Slide figure: histogram of signal strength (dB), frequency out of 1000 readings.]
The physics: signal strength (SS, in dB) is known to decay with distance (d) as SS = k1 + k2 log d.
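The log-distance decay plus log-normal shadowing model above can be sketched as a small simulation. All constants here (k1, k2, sigma) are illustrative assumptions, not values from the talk:

```python
import math
import random

# Hypothetical constants: K1 is the signal strength at d = 1 (dB),
# K2 the path-loss slope, SIGMA the environment-dependent shadowing
# spread (dB). These values are for illustration only.
K1, K2, SIGMA = -30.0, -20.0, 4.0

def mean_rss(d):
    """Deterministic part of the model: SS = k1 + k2 * log10(d)."""
    return K1 + K2 * math.log10(d)

def sample_rss(d, rng):
    """One RSS reading: log-normal shadowing = Gaussian noise in dB."""
    return mean_rss(d) + rng.gauss(0.0, SIGMA)

rng = random.Random(0)
readings = [sample_rss(10.0, rng) for _ in range(1000)]
```

Averaging many readings at a fixed distance recovers the deterministic decay curve, which is what makes the statistical modeling later in the talk feasible.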

Location Determination via Statistical Modeling
- Data collection is slow and expensive ("profiling")
- "Productization": either the access points or the wireless devices can gather the data
- Focus on predictive accuracy

Prior Work
- Take signal strength measures at many points in the site and do a closest match to these points in signal-strength vector space (discrete, 3-D, etc.) [e.g., Microsoft's RADAR system]
- Take signal strength measures at many points in the site and build a multivariate regression model to predict location (e.g., Tirri's group in Finland)
- Some work has utilized wall thickness and materials
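The closest-match idea can be sketched as follows. The fingerprint map, access-point count, and signal values below are made up for illustration; this is the general technique, not the actual RADAR data:

```python
import math

# Hypothetical offline "profiling" map: (x, y) -> RSS vector from 3 APs (dB).
fingerprints = {
    (0.0, 0.0): [-40.0, -62.0, -75.0],
    (5.0, 0.0): [-55.0, -48.0, -70.0],
    (0.0, 5.0): [-58.0, -66.0, -50.0],
    (5.0, 5.0): [-63.0, -52.0, -47.0],
}

def locate(rss):
    """Return the profiled location closest in signal-strength space."""
    return min(fingerprints,
               key=lambda loc: math.dist(fingerprints[loc], rss))
```

For example, `locate([-41.0, -60.0, -74.0])` returns the fingerprint nearest that RSS vector. The match happens in signal-strength space, not physical space, which is why dense profiling is needed.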

Krishnan et al. Results (Infocom 2004): smoothed signal map per access point + nearest neighbor

Tree Models

[Slide figure: toy patient data table with columns Age, MaxCharlson, Y1Hospital, NumClaims, Y2Hospital, plus candidate single-split stumps such as "Age > 70?" and "Age > 30?" with class counts (e.g., 1/5, 2/2) at the leaves.] What do you predict for the 56-year-old?

[Slide figure: candidate splits on the same toy table: "Age > 70?" (1/5 vs. 2/2), "NumClaims > 10?" (1/6 vs. 1/1), and "Y1Hospital = yes?" (2/4 vs. 1/3), each with class counts at the leaves.]

[Slide figure: a full decision tree over the toy table, splitting first on Y1Hospital = "yes", then on MaxCharlson > 3 and NumClaims > 10, with pure class counts (2/2, 1/1, 3/3, 1/1) at the leaves.]

[Slide figure: scatter plot of the toy data in the (Age, NumClaims) plane.]

[Slide figure: the (Age, NumClaims) plane partitioned into axis-parallel regions by tree splits such as NumClaims < 4, Age > 30, Age > 60, Age < 42, NumClaims < 5, NumClaims < 2, and Age < 38, with each region labeled red or blue.]

Regression Trees
Idea: pick splits to minimize the squared error Σᵢ (yᵢ − ȳ_R)² within each region R, where ȳ_R is the mean response of the training points falling in region R.
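A minimal sketch of the split-selection step for a single numeric feature, assuming made-up toy data (the helper names and values are not from the talk):

```python
def sse(ys):
    """Sum of squared errors around the mean of ys (0 if empty)."""
    if not ys:
        return 0.0
    m = sum(ys) / len(ys)
    return sum((y - m) ** 2 for y in ys)

def best_split(xs, ys):
    """Exhaustively pick the threshold t minimizing total SSE of the
    two resulting regions {x <= t} and {x > t}."""
    best = (float("inf"), None)
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        best = min(best, (sse(left) + sse(right), t))
    return best[1]

# Toy data: responses cluster into a low group (young) and a high group.
xs = [20, 25, 30, 60, 65, 70]
ys = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
print(best_split(xs, ys))  # splits the low group from the high group
```

A real regression tree applies this greedily and recursively over all features; this shows only the criterion itself.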

Nearest Neighbor Methods

Nearest Neighbor Methods
k-NN assigns an unknown object to the most common class of its k nearest neighbors.
Choice of k? Choice of distance metric?

[Slide figure: k-nearest-neighbor decision regions in the (age, numClaims) plane.]

library(kknn)
myModel <- kknn(yesno ~ dollar + bang + money, spam.train, spam.test, k = 3)
> names(myModel)
[1] "fitted.values" "CL" "W" "D" "prob" "response" "distance" "call" "terms"
> myModel$fitted.values
[1] n y y n y n y y y y n n n n n y n n y n n n y n n n y n y n n y n y n n n n n n y y n n n n n n y n n n n y n n n n y y y n n n n y n y n y n n n n n y n

Bayesian Graphical Model Approach
[Slide figure: graphical model in which the location (X, Y) determines the distances D1–D5 to five access points, which in turn determine the signal strengths S1–S5; "average" marks the aggregation of repeated readings.]

[Slide figure: the same model replicated for n training locations (Xi, Yi) with distances Dij and signals Sij, and per-access-point regression coefficients (intercepts b10–b50, slopes b11–b51) shared across locations.]

Plate Notation
[Slide figure: the same model in plate notation: a plate over locations i = 1,…,n containing Xi, Yi, Dij, Sij, nested with a plate over access points j = 1,…,5 containing the coefficients b0j, b1j.]

Hierarchical Model
[Slide figure: the graphical model with per-access-point coefficients b1–b5 tied together by a shared parent b.]

Hierarchical Model
[Slide figure: the plate-notation model with hyperparameters m0, t0 over the intercepts b0j and m1, t1 over the slopes b1j.]

Full joint: [m0][t0][m1][t1][b0|m0,t0][b1|m1,t1][S|b0,b1,D][D|X,Y][X][Y]
"Gibbs sampling": Metropolis moves update one variable at a time from its full conditional, e.g. for b1:
[b1 | b0, m1, t1, m0, t0, S, D, X, Y] ∝ [b1 | m1, t1][S | b0, b1, D]
Alternatives for drawing from the conditionals: sampling importance resampling, rejection sampling, slice sampling.
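A toy illustration of Metropolis-within-Gibbs, the one-variable-at-a-time scheme described above. The target here is a made-up two-parameter posterior (independent Gaussians), not the talk's localization model; it only shows the mechanics of coordinate-wise Metropolis updates:

```python
import math
import random

def log_target(b0, b1):
    # Stand-in log posterior: b0 ~ N(1, 1), b1 ~ N(-2, 1), independent.
    return -0.5 * (b0 - 1.0) ** 2 - 0.5 * (b1 + 2.0) ** 2

def mcmc(n_iter, rng, step=1.0):
    b0, b1 = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        # Metropolis move for b0 given the current b1.
        prop = b0 + rng.gauss(0.0, step)
        delta = log_target(prop, b1) - log_target(b0, b1)
        if delta >= 0 or rng.random() < math.exp(delta):
            b0 = prop
        # Metropolis move for b1 given the (possibly updated) b0.
        prop = b1 + rng.gauss(0.0, step)
        delta = log_target(b0, prop) - log_target(b0, b1)
        if delta >= 0 or rng.random() < math.exp(delta):
            b1 = prop
        samples.append((b0, b1))
    return samples

rng = random.Random(0)
draws = mcmc(20000, rng)
```

After burn-in the sample means approach the target means; in the real model each conditional mixes prior and likelihood terms exactly as in the factorization above, which is what WinBUGS automates.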

Pros and Cons
- Bayesian model produces a predictive distribution for location
- MCMC can be slow
- Difficult to automate MCMC (convergence issues)
- Perl-WinBUGS (perl selects training and test data, writes the WinBUGS code, calls WinBUGS, parses the output file)

What if we had no locations in the training data?

Zero Profiling?
- Simple sniffing devices can gather signal strength vectors from available WiFi devices
- Can do this repeatedly
- Known locations of the access points

Why does this work?
- Prior knowledge about the distance–signal strength relationship
- Prior knowledge that access points behave similarly
- Estimating several locations simultaneously

Corridor Effects
[Slide figure: the graphical model extended with corridor indicators C1–C5 between the distances D1–D5 and the signal strengths S1–S5.]

Results for N=20, no locations

corridor main effect | corridor–distance interaction | average error
0 | 0 | 20.8
0 | 1 | 16.7
1 | 0 | 17.8
1 | 1 | 17.3

With a mildly informative prior on the distance main effect:

corridor main effect | corridor–distance interaction | average error
0 | 0 | 16.3
0 | 1 | 14.7
1 | 0 | 15.8
1 | 1 | 15.9

Discussion
- Informative priors
- Convenience and flexibility of the graphical modeling framework
- Censoring (30% of the signal strength measurements)
- Repeated measurements & normal error model
- Tracking
- Machine learning-style experimentation is clumsy with perl-WinBUGS

Prior Work
Use physical characteristics of signal strength propagation and build a model augmented with a wall attenuation factor. Needs a detailed (wall) map of the building; model portability needs to be determined. [RADAR; INFOCOM 2000], based on [Rappaport 1992]