1
USARsim & HRI Research Michael Lewis
2
Background
USARsim was developed as a research tool for an NSF project to study Robot, Agent, Person Teams in Urban Search & Rescue
Katia Sycara, CMU - multi-agent systems
Illah Nourbakhsh, CMU/NASA - field robotics
Mike Lewis, Pitt - human control of robotic teams
3
USARSim Design Objective
Leverage technology developed by the $30 billion/year game industry to focus on building and validating high-fidelity models of robots
Piggyback on rapidly evolving technology for game engines:
- Photorealistic graphics to allow study of human-robot interaction and machine vision
- Best available physics engines to replicate the control/mobility challenges of real environments
- Availability of modeling tools to create realistically complex environments in a reasonable time
- Compatibility with robotics software such as Player or Pyro and game content such as America's Army
4
Graphics and Vision
The Unreal Engine supports the rapidly evolving graphics acceleration on new video cards
USARSim's image server captures in-memory video so that images can be made available for:
- Machine vision algorithms
- Addition of realistic noise and distortion (see the sketch below)
We are engaged in an ongoing validation effort to identify which aspects and algorithms are, and are not, accurately modeled by the simulation
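As a rough illustration of the "realistic noise and distortion" post-processing mentioned above, the sketch below degrades a simulated frame with Gaussian sensor noise and a mild blur. It assumes the frame has already been retrieved as an 8-bit OpenCV/numpy array; the file names, noise level, and kernel size are illustrative and are not part of USARSim's image server.

```python
import numpy as np
import cv2

def degrade_frame(frame, noise_sigma=8.0, blur_ksize=3):
    """Add zero-mean Gaussian sensor noise and a mild blur to an 8-bit frame."""
    noisy = frame.astype(np.float32)
    noisy += np.random.normal(0.0, noise_sigma, frame.shape)    # sensor noise
    noisy = np.clip(noisy, 0, 255).astype(np.uint8)
    return cv2.GaussianBlur(noisy, (blur_ksize, blur_ksize), 0) # mild defocus

# Usage (file names are illustrative only):
frame = cv2.imread("sim_frame.png")
if frame is not None:
    cv2.imwrite("sim_frame_degraded.png", degrade_frame(frame))
```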
5
The PER, one of our first robots
6
Black Arena (Nike Silo): fixed reference environment
7
Red Arena: real & simulated
8
Brief history
2002-2003: Developed the USARsim simulation
- Limited to our own robots
- Limited to our own (RETSINA) control architecture
2003-2004: Extended the simulator for general access
- Added robots widely used in RoboCup USAR
- Added APIs for Player & Pyro
2004-2005: Began cooperative development
- Involved NIST in maintenance
- Demo competition at RoboCup in Osaka
2005-2006: Simulation matured
- Virtual Robots USAR competition in Bremen
- Rationalization of units, modularization of classes, etc.
9
www.sourceforge.net/projects/usarsim
Used for the Virtual Robots Competition in the USAR League
Maintained by NIST
10
Fixed Camera Illusion
Can a gravity-referenced view (GRV) help us maintain awareness of attitude?
- Less time
- Less backtracking
11
Camera Control Experiment
The video feed is the strongest perceptual link to the remote environment
Problems of concern:
- Disorientation
- Failure to take precautions against hazards
- Non-detection of mission-critical information
12
Camera Control
Conditions: fixed camera, PTZ camera, dual PTZ cameras
Results:
- More targets found with PTZ & dual cameras
- Dual-camera views twice as likely to be "disjoint"
13
Multi-Robot Control
Conditions:
- Fully autonomous cooperation (Machinetta)
- Manual
- Cooperative
14
Multi-robot results
- More complete searches in the autonomous & cooperative conditions
- More victims found in the cooperative condition (followed by manual)
- Cooperative participants switched more frequently between robots, and frequency of switching correlated with finding victims
15
Validation
All robots in USARsim model real robots and so are candidates for validation. Validation:
- Gives an indication of the degree to which experimental results might be generalizable
- Provides a common reference for comparing experimental results
- Provides a mechanism for sharing advances in control code and interfaces among researchers
- Provides reassurance that software developed using simulation can be ported to hardware
21
Sensor validation for vision
Conventional wisdom holds that synthetic images are NOT useful for work in machine vision because of intrinsic regularities, etc.
- Similar to arguments about congruential random number generators
- The question should be settled empirically, not theoretically
22
Feature Extraction Algorithms
- Edge detection
- Template matching
- Snakes
- OCR
Tested in:
- Camera, well lit
- Camera, dimly lit
- Simulation, well lit
23
Canny Edge Detection
- Original image
- Gaussian filter to remove noise
- Sobel operator separates regions of high horizontal/vertical gradient
- Canny operator with thresholding
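A minimal sketch of the pipeline named on this slide, using OpenCV; the image path, kernel sizes, and Canny thresholds are illustrative and not taken from the original experiments.

```python
import cv2

img = cv2.imread("arena.png", cv2.IMREAD_GRAYSCALE)  # path is illustrative

# 1. Gaussian filter to remove noise.
smoothed = cv2.GaussianBlur(img, (5, 5), 1.4)

# 2. Sobel operator: responses in regions of high horizontal/vertical gradient.
grad_x = cv2.Sobel(smoothed, cv2.CV_64F, 1, 0, ksize=3)
grad_y = cv2.Sobel(smoothed, cv2.CV_64F, 0, 1, ksize=3)

# 3. Canny operator with hysteresis thresholding (it performs its own smoothing
#    and gradient steps internally, so it is run on the original image here).
edges = cv2.Canny(img, 50, 150)

cv2.imwrite("arena_edges.png", edges)
```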
24
Edge Detection Performance
25
Template Matching
Correlation of the template with the image feature, computed as the inverse Fourier transform of (Fourier transform of image × Fourier transform of template)
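The frequency-domain matching described here can be sketched as follows, assuming grayscale numpy arrays. Taking the conjugate of the template's transform gives cross-correlation; multiplying the transforms directly gives convolution, matching the "template convolution" wording on the following slides. The function names and peak-picking step are illustrative.

```python
import numpy as np

def correlate_fft(image, template):
    """Cross-correlate a template with an image via the FFT."""
    F_image = np.fft.fft2(image)
    F_templ = np.fft.fft2(template, s=image.shape)        # zero-pad template
    return np.fft.ifft2(F_image * np.conj(F_templ)).real  # inverse transform

def best_match(image, template):
    """Return (row, col) of the correlation peak: the estimated feature location."""
    corr = correlate_fft(image, template)
    return np.unravel_index(np.argmax(corr), corr.shape)
```

The pixel distance between the detected peak and the known feature location is one natural error measure, and appears to be what the later "distance in pixels" slide reports.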
26
Template convolution: simulation vs. real camera images
27
Template convolution: distance in pixels between estimate & target feature
28
Snake algorithm extraction on simulated image
29
Snake algorithm extraction on real camera image
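For the snake (active contour) extractions shown on the two preceding slides, a hedged sketch using scikit-image is given below; the image path, initial circular contour, and energy parameters are illustrative, not those used in the study.

```python
import numpy as np
from skimage import io, color
from skimage.filters import gaussian
from skimage.segmentation import active_contour

img = color.rgb2gray(io.imread("sim_frame.png"))   # image path is illustrative

# Initialize the snake as a circle around the region of interest
# (center and radius are placeholder values).
theta = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([120 + 80 * np.sin(theta),   # rows
                        160 + 80 * np.cos(theta)])  # columns

# Fit the active contour to a smoothed version of the image.
snake = active_contour(gaussian(img, sigma=3), init,
                       alpha=0.015, beta=10, gamma=0.001)
```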
30
Pitt/CMU Validation
- Participants controlled either the robot or the simulation from a lab at Pitt
- Robot testing was conducted in a replica of NIST's Orange Arena at CMU
- Control was either direct teleoperation or command mode, in which the operator specified waypoints
- Two robot types were tested: the experimental Personal Exploration Rover (PER) and the Pioneer P3-AT (simulated as P2-AT)
31
Simple & Complex Navigation Tasks (course diagram; segment dimensions of 3 meters, 1 meter, and 3 meters)
32
All Conditions: Task Completion Times
33
PER Direct vs. Command: Number of Turns
34
Direct Control PER & P3-AT: Average Forward Sequence
35
& Now: Accelerated Physics
- The next engine release will support Ageia PhysX
- Continually improving simulation quality (e.g., three orders of magnitude improvement in physics with hardware acceleration) & validation
- Lets us do tracked robots, collapsing buildings, etc.