Sven Koenig, Joseph S. B. Mitchell, Apurva Mudgal, Craig Tovey. A Near-Tight Approximation Algorithm for the Robot Localization Problem. Presented by: Mike Paton

Introduction: Mobile Robot Localization

Robot localization methods
Localization with a known map:
- Kalman filters: a local localization method used to account for errors in the robot's odometry and sensing.
- Markov localization: global localization using Markov models; requires a discrete representation of the world.
- Particle filters: global localization using a swarm of particles (candidate solutions).
SLAM (Simultaneous Localization and Mapping):
- EKF: Extended Kalman Filter
- FastSLAM

When the path to localizing is important
The previous methods do not care about where the robot travels while it is localizing, only that it moves around and gathers data. Computing the optimal localization path is an NP-hard problem. This paper presents an O(log^2(n) log(k))-approximation algorithm, where the optimum is the length of the shortest path the robot needs to take to localize.

Previous work: Kleinberg's competitive-ratio localization
Compares the distance traveled by the robot to that of an omniscient verifier: a robot with a priori knowledge of its position. The environment is a geometric tree G(V,E). The algorithm has an O(n^(2/3)) competitive ratio.

The ½ Group Steiner problem
Group Steiner problem: given a weighted graph G(V,E) with k groups of vertices g1, g2, ..., gk in V, find a minimum-weight tree that contains at least one vertex from each group.
½ Group Steiner problem: find a minimum-weight tree that contains vertices from at least half of the groups.
There exists an O(log^3(n))-approximation for the Group Steiner problem and an O(log^2(n))-approximation for the ½ Group Steiner problem.
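For reference, the two objectives can be stated side by side; this is a hedged formalization in notation not taken from the slides (rooted variant, since the reduction later roots the graph at the origin):

```latex
% Given G = (V, E) with edge weights w, root r, and groups g_1, ..., g_k \subseteq V,
% find a cheap tree T containing r that hits enough groups:
\min_{T \ni r} \sum_{e \in T} w(e)
\quad\text{s.t.}\quad
\bigl|\{\, i : g_i \cap V(T) \neq \emptyset \,\}\bigr| \;\ge\;
\begin{cases}
  k                 & \text{(Group Steiner)}\\
  \lceil k/2 \rceil & \text{(1/2 Group Steiner)}
\end{cases}
```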

Representation: the grid graph model
Vertices are the discretized squares (cells) of the map.
An edge is added between traversable cells that are adjacent to each other.
The robot can travel north, south, east, or west.
Each cell of the map is either blocked or unblocked.
An observation is the set of blocked/unblocked values of the cells adjacent to the robot.

Terminology
G(V,E) = the grid graph map.
G'(V,E) = the local graph: the robot makes observations from different positions in G to build a larger and larger G'.
C0 = the initial position of the robot with respect to G'.
C(x,y) = a cell in G' lying x units east and y units north of C0.
H = the set of hypotheses; h = a potential starting point for the robot in the grid graph G.
H' = the set of active hypotheses.
O(h, C) = the opinion of hypothesis h about position C.
H(C) = the partition of the hypotheses according to the relation h1 ~ h2 iff O(h1, C) = O(h2, C).
Maj(C) = the largest class in H(C).
G(C, o) = the class of hypotheses with opinion o at coordinate C (e.g. bbbb, btbt, ...).

Majority rule map
The majority rule map Gmaj is a local map in which each cell is blocked or traversable according to what the majority of hypotheses say about it: a cell C is blocked iff |Maj(C)| > ½|H| and the shared opinion of Maj(C) at C is "blocked".
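A minimal Python sketch of this construction, assuming each hypothesis can report a blocked/traversable opinion for any local coordinate; the `opinion` callback, the cell encoding, and all names here are illustrative, not taken from the paper:

```python
from collections import defaultdict

BLOCKED, FREE = 0, 1  # assumed encoding: 0 = blocked, 1 = traversable

def partition_by_opinion(hypotheses, cell, opinion):
    """H(C): group hypotheses so that h1 ~ h2 iff O(h1, C) == O(h2, C)."""
    classes = defaultdict(list)
    for h in hypotheses:
        classes[opinion(h, cell)].append(h)
    return classes

def majority_rule_map(hypotheses, cells, opinion):
    """Gmaj: a cell is blocked iff a strict majority of the hypotheses
    agree on it and that shared opinion is 'blocked'."""
    gmaj = {}
    for cell in cells:
        classes = partition_by_opinion(hypotheses, cell, opinion)
        maj_opinion, maj_class = max(classes.items(), key=lambda kv: len(kv[1]))
        if len(maj_class) > len(hypotheses) / 2 and maj_opinion == BLOCKED:
            gmaj[cell] = BLOCKED
        else:
            gmaj[cell] = FREE
    return gmaj
```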

Half Localization
A strategy by which the robot can correctly eliminate at least half of the hypotheses in H. To achieve half localization, a halving path is computed: a path the robot follows that eliminates at least half of the hypotheses in H (half-localizes). Let the halving path be P = {C0, C1, ..., Cm}, where Ci+1 is a neighbor of Ci in the majority rule map. The robot starts at its initial position and follows the coordinates in P, taking an observation oi at each step. Once the robot reaches coordinate Cm, at least half of the hypotheses in H have been eliminated.

Half-Localize algorithm
Input: G, H, halving path S(C) = {C0, C1, ..., Cm}
H' = H
for i = 0 to m:
    move the robot to Ci
    make observation oi at Ci
    H' = H' ∩ G(Ci, oi)
    stop if |H'| <= ½|H|
end
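The same loop as runnable Python, reusing the sketch above; `move_to`, `observe`, and `consistent` are hypothetical robot/matching primitives (the last one plays the role of membership in G(Ci, oi)):

```python
def half_localize(hypotheses, halving_path, move_to, observe, consistent):
    """Follow the halving path until at least half of the hypotheses in H
    have been eliminated; returns the surviving (active) set H'."""
    active = set(hypotheses)                  # H' = H
    for cell in halving_path:                 # C0, C1, ..., Cm
        move_to(cell)
        obs = observe()                       # observation oi taken at Ci
        # H' = H' ∩ G(Ci, oi): keep only hypotheses whose opinion matches oi
        active = {h for h in active if consistent(h, cell, obs)}
        if len(active) <= len(hypotheses) / 2:
            break
    return active
```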

Computing halving paths
The computation is approximated by reducing it to an instance of the ½ Group Steiner problem:
Convert the majority rule map into a weighted graph, rooted at the origin.
V = the set of traversable coordinates in the majority-rule map.
The weight of edge (v1, v2) = the length of the shortest path joining cells v1 and v2 in the majority rule map.
k groups are created, one for each hypothesis hi in H: group gi = the set of all coordinates c in V at which hi does not share the majority opinion.
Once the weighted graph is built, the ½ Group Steiner approximation is run on it and a path is returned (a sketch of the construction follows).
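A sketch of the reduction in Python, under the same assumptions as the earlier snippets; it only builds the weighted graph and the groups, and leaves solving the ½ Group Steiner instance to an external approximation routine:

```python
from collections import deque

def grid_shortest_paths(source, free_cells):
    """BFS distances from `source` over 4-connected traversable cells of Gmaj."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        x, y = queue.popleft()
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in free_cells and nxt not in dist:
                dist[nxt] = dist[(x, y)] + 1
                queue.append(nxt)
    return dist

def build_half_group_steiner_instance(free_cells, hypotheses, opinion, majority_opinion):
    """Vertices: traversable cells of Gmaj. Edge weight (v1, v2): shortest-path
    distance in Gmaj. Group gi: cells where hi disagrees with the majority."""
    weights = {}
    for v in free_cells:
        for u, d in grid_shortest_paths(v, free_cells).items():
            weights[(v, u)] = d
    groups = [
        {c for c in free_cells if opinion(h, c) != majority_opinion(c)}
        for h in hypotheses
    ]
    return weights, groups
```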

Repeated Half-Localization
Input: grid graph G, set of hypotheses H
Output: the robot localizes to a hypothesis h in H
Algorithm:
while |H| > 1:
    compute the majority rule map Gmaj
    convert Gmaj into the weighted graph I(G,H)
    solve I(G,H) with the ½ Group Steiner approximation to compute a halving path C
    half-localize using strategy S(C)
    move back to the origin
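How the phases compose, as a Python sketch built from the earlier snippets; `solve_half_group_steiner` stands in for the rooted ½ Group Steiner approximation and is hypothetical, as are the robot primitives:

```python
def localize(cells, hypotheses, opinion, move_to, observe, consistent,
             solve_half_group_steiner):
    """Repeated half-localization: each phase eliminates at least half of the
    active hypotheses, so at most ceil(log2 k) phases are needed."""
    active = set(hypotheses)
    origin = (0, 0)                           # C0 in local coordinates
    while len(active) > 1:
        gmaj = majority_rule_map(active, cells, opinion)
        free = {c for c, v in gmaj.items() if v == FREE}

        def maj_opinion(c):                   # opinion of the largest class at c
            classes = partition_by_opinion(active, c, opinion)
            return max(classes.items(), key=lambda kv: len(kv[1]))[0]

        weights, groups = build_half_group_steiner_instance(
            free, list(active), opinion, maj_opinion)
        halving_path = solve_half_group_steiner(weights, groups, root=origin)
        active = half_localize(active, halving_path, move_to, observe, consistent)
        move_to(origin)                       # return to the start before the next phase
    return active.pop()                       # the single surviving hypothesis
```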

Repeated half-localization: analysis
Approximation ratio: an O(log^2(n) log(k))-approximation algorithm, where k = size of the hypothesis set and n = size of the map.
Proof sketch:
The number of active hypotheses is reduced by at least half in each phase, so the robot localizes in m <= log(|H|) = log(k) phases.
Theorem 1.1: there exists an O(log^2(n))-approximation algorithm A for the rooted ½ Group Steiner problem.
Let C be the computed halving path and C* be the optimal halving path; then |C| <= O(log^2(n)) * |C*|.
Also |C*| <= HALF-OPT(G, Hi) <= OPT(G, H).
Since there are at most log(k) phases, the total worst-case travel distance is O(log^2(n) log(k)) * OPT(G, H).
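The arithmetic behind the ratio, written out as a sketch (the factor 2 for the return trip to the origin is an assumption absorbed into the O-notation):

```latex
% Phase i: the computed halving path C_i vs. the optimal halving path C_i^*
|C_i| \;\le\; O(\log^2 n)\,|C_i^*| \;\le\; O(\log^2 n)\cdot \mathrm{OPT}(G,H)
% At most \lceil \log_2 k \rceil phases, each at most doubled by the return trip:
\text{total distance} \;\le\; \sum_{i=1}^{\lceil \log_2 k \rceil} 2\,|C_i|
  \;=\; O(\log^2 n \,\log k)\cdot \mathrm{OPT}(G,H)
```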

Future Work
Robot without a compass
Limited range
Polygons with holes
Geometric trees
3D grid graphs

References
Sven Koenig, Joseph S. B. Mitchell, Apurva Mudgal, and Craig Tovey. A Near-Tight Approximation Algorithm for the Robot Localization Problem. SIAM J. Comput. 39(2), June 2009.