Presentation on theme: "Robotic Search Engines for the Physical World"— Presentation transcript:

1 Robotic Search Engines for the Physical World
Aisha Walcott, CSAIL Robotics Workshop, Feb. 3, 2006

2 Problem (Grand Central Terminal)
Robots with a long-term presence
Capture information about physical spaces
Pose and process queries
What I will say: "How do we build autonomous robots with a persistent, long-term presence in physical spaces? The robots will continuously collect information about the environment. How do we collect this information in such a way that it can be queried? For example, in terms of security, robots may be tasked with keeping a current model of an environment. They will need to be able to detect the presence of an 'unattended package'. Through an interface, security personnel can explore this data."
For a robot to live in and maintain a model of an environment, it must be able to navigate safely, cope with sensor errors, and detect changes in the environment. To answer physical queries, questions about features or events in the environment, the robot must strategically search the environment.

3 Motivation
[Diagram: the Web and the physical World side by side; the web crawling loop: remove a URL from the queue, find the IP of the host name, download the document, extract its links back into the URL queue; the results feed a search engine]
What I will say: "To approach this problem of capturing information about an environment, we draw inspiration from the web, in particular from search engines. The web is a vast, evolving collection of documents. Search engines serve as a gateway to information on the web, and web crawlers go out and fetch documents. We seek to develop the physical counterpart for the vast, dynamic world. How would a robot be able to do this?"
There are a number of similarities between the digital world, the web in particular, and the physical world. The web is an uncertain, large, decentralized collection of information that is constantly evolving (Baldi, 2003). This is also true of the physical world: new buildings are constructed, old buildings are leveled, humans and weather modify landscapes, transportation vehicles and routes are added or changed, and so on. We draw inspiration from search engines in seeking to develop a physical counterpart. More specifically, we propose a framework in which mobile robots, world crawlers, live for great lengths of time in a large-scale physical space and explore and maintain up-to-date information about the space. We refer to this framework as Robot Google: a search engine that creates an internal, queryable model of the physical world.
To ensure that a model of an environment is current, mobile robots must use information in the model to determine the regions of the environment from which to gather data. The data is, in turn, used to update the model. We refer to the process of continuously updating large-scale dynamic models of the physical world as world crawling, a term adapted from its digital counterpart, web crawling. Web crawling is the process in which programs autonomously navigate the web and download information stored in documents (Baldi, 2003). Search engines are the primary gateway to information stored on the web, and they serve two main purposes. First, search engines use web crawlers to navigate to unexplored portions of the web and to refresh stale information stored in their repositories. Second, search engines enable users to enter queries; the search engine then processes a query to retrieve a set of related documents (Baldi, 2003).
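
To make the crawl loop in the diagram concrete, here is a minimal sketch of a web crawler in Python. It is illustrative only: the seed URLs, page limit, and link filtering are assumptions, not part of the original presentation.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, max_pages=10):
    """Basic web-crawl loop: pop a URL, download the document,
    extract its links, and push unseen links back onto the queue."""
    frontier = deque(seed_urls)
    seen = set(seed_urls)
    repository = {}                      # url -> raw document text
    while frontier and len(repository) < max_pages:
        url = frontier.popleft()         # "remove URL"
        try:                             # "find IP of host name" happens inside urlopen
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue                     # skip unreachable hosts
        repository[url] = html           # "download document"
        parser = LinkExtractor()
        parser.feed(html)                # "extract links"
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return repository
```

By analogy, a world crawler would replace the URL queue with a queue of physical regions and the download step with navigating to a region and sensing it.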

4 Approach: World Crawling
Model a physical space
Catalog features within the space
Fetch specific information
[Diagram: system components: Camera, Laser Scanner, Feature Recognition, Change Detector, Data Manager, Motion Planner, Robot Behaviors]
What I will say: "We develop a notion called world crawling. World crawling is the process in which autonomous mobile robots go out and collect information about physical spaces. We want this information to be cataloged. Here is a high-level description of world crawling..."
Each database entry answers: 1) What is the object? 2) Where is the object relative to the robot ("me")? 3) Where am I relative to home (or the local origin)? 4) What time is it?
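
As a rough illustration of the data manager's catalog, here is a minimal sketch of one indexed entry built directly from the four questions above, plus a toy query function for the "fetch specific information" step. The class name, field names, and types are assumptions for illustration, not the presentation's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Tuple

@dataclass
class Observation:
    """One cataloged entry in the world-crawling database.
    The fields mirror the four questions on the slide."""
    label: str                                   # 1) what is the object?
    object_in_robot_frame: Tuple[float, float]   # 2) where is it relative to the robot ("me")?
    robot_in_home_frame: Tuple[float, float]     # 3) where am I relative to home (local origin)?
    timestamp: datetime                          # 4) what time is it?

def query(catalog: List[Observation], label: str, since: datetime) -> List[Observation]:
    """Fetch specific information: entries with a given label observed since a given time."""
    return [obs for obs in catalog if obs.label == label and obs.timestamp >= since]

# Toy usage: one entry, then a query for recently seen unattended packages.
catalog = [
    Observation(
        label="unattended package",
        object_in_robot_frame=(2.0, 0.5),
        robot_in_home_frame=(14.5, -3.2),
        timestamp=datetime(2006, 2, 3, 10, 15),
    )
]
print(query(catalog, "unattended package", datetime(2006, 2, 1)))
```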

5 Challenges
Change detection: a feature is there or not there
Indexing environment information
Navigation strategy: determine which regions to visit; exploration
Freshness (currentness): the percentage of data that is current to within a certain time, e.g. a week
What I will say: "The challenges we focus on in this research are change detection, devising a navigation strategy, and attempting to show guarantees of freshness."
Change detection: predetermined feature type(s) to observe; break the surroundings down into regions (for the purpose of indexing); rank regions based on the amount of change (and the indexed history).
Freshness: define a navigation strategy that maintains some level of (alpha, beta)-currentness.
To ensure that a model of an environment is current, mobile robots must use information in the model to determine the regions of the environment from which to gather data.
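
Below is a hedged sketch of the freshness measure and region ranking described on this slide, assuming each region records when it was last visited and how much change has been observed there. The additive change-plus-staleness score is an illustrative choice, not the presentation's actual navigation strategy.

```python
from datetime import datetime, timedelta

def freshness(last_visited, now, window=timedelta(days=7)):
    """Fraction of regions whose data is current to within `window`
    (e.g. one week), per the slide's definition of freshness."""
    if not last_visited:
        return 0.0
    fresh = sum(1 for t in last_visited.values() if now - t <= window)
    return fresh / len(last_visited)

def rank_regions(last_visited, change_counts, now):
    """Order regions for the next crawl: prefer regions that have changed
    the most and have gone the longest without observation."""
    def score(region):
        staleness_days = (now - last_visited[region]).total_seconds() / 86400.0
        return change_counts.get(region, 0) + staleness_days
    return sorted(last_visited, key=score, reverse=True)

# Toy example with three indexed regions.
now = datetime(2006, 2, 3)
last_visited = {
    "concourse": datetime(2006, 2, 1),
    "platform_12": datetime(2006, 1, 20),
    "waiting_room": datetime(2006, 2, 2),
}
change_counts = {"concourse": 5, "platform_12": 1, "waiting_room": 0}

print(freshness(last_visited, now))                 # 2 of 3 regions within a week -> ~0.67
print(rank_regions(last_visited, change_counts, now))
```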

6 Potential Applications
Transportation security
Natural disasters
Construction sites
Industrial complexes
Ship hull inspection

7 Thank you

