Vision-Guided Humanoid Footstep Planning for Dynamic Environments P. Michel, J. Chestnutt, J. Kuffner, T. Kanade Carnegie Mellon University – Robotics Institute.


Vision-Guided Humanoid Footstep Planning for Dynamic Environments. P. Michel, J. Chestnutt, J. Kuffner, T. Kanade. Carnegie Mellon University, Robotics Institute. Humanoids 2005.

Objective The paper presents a vision-based footstep planning system that computes the best partial footstep path within its time-limited search horizon, according to problem-specific cost metrics and heuristics.

Related Work Most prior work targets reliable, stable gait generation and feedback – Emphasis on pre-generating walking trajectories – Online trajectory generation – Dynamic balance – No accounting for obstacles! Little work has focused on developing global navigation autonomy for biped robots

Related Work Obstacle avoidance and local planning based on visual feedback have been studied in humans Several reactive perception-based obstacle avoidance techniques for bipeds have been developed – Environment mapping; obstacle detection; color-based segmentation

CMU’s Honda ASIMO Humanoid

Sensing and the Environment ASIMO robot with global sensing – Overhead camera: computes the position of the robot, the desired goal location, and obstacles – All processing is done in real time

Sensing and the Environment: Color Segmentation Colored markers – Bright pink: planar obstacles on the floor – Light blue: desired goal location – Yellow and green: identify the robot's location and orientation – Dark blue: four square delimiters defining the rectangular area within which the robot operates Color segmentation is performed directly on the YUV stream generated by the camera – Avoids processing overhead

Sensing and the Environment: Color Segmentation Color thresholds are obtained by sampling pixel values offline for each marker – Thresholding produces a series of binary masks indicating the presence or absence of each marker at every pixel – Noise is eliminated by erosion/dilation – Connected-components labeling is applied to group the pixels into blobs Moments are then calculated for each color blob – Centroid, area, major/minor axes, orientation
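The threshold → morphology → connected-components → moments pipeline on this slide can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names, the plus-shaped 3x3 structuring element, and the sampled threshold ranges are all our own assumptions.

```python
import numpy as np

def threshold_mask(yuv, lo, hi):
    """Binary mask: pixels whose (Y, U, V) values fall in the sampled range."""
    return np.all((yuv >= lo) & (yuv <= hi), axis=-1)

def erode(mask):
    """3x3 plus-shaped erosion: a pixel survives only if its 4-neighbours are set."""
    m = np.pad(mask, 1)
    return (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
            & m[1:-1, :-2] & m[1:-1, 2:])

def dilate(mask):
    """3x3 plus-shaped dilation, the dual of erode()."""
    m = np.pad(mask, 1)
    return (m[1:-1, 1:-1] | m[:-2, 1:-1] | m[2:, 1:-1]
            | m[1:-1, :-2] | m[1:-1, 2:])

def connected_components(mask):
    """Label 4-connected blobs; returns a list of (row, col) pixel arrays."""
    labels = -np.ones(mask.shape, dtype=int)
    blobs = []
    for r, c in zip(*np.nonzero(mask)):
        if labels[r, c] >= 0:
            continue
        stack, pixels = [(r, c)], []
        labels[r, c] = len(blobs)
        while stack:
            y, x = stack.pop()
            pixels.append((y, x))
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and labels[ny, nx] < 0):
                    labels[ny, nx] = len(blobs)
                    stack.append((ny, nx))
        blobs.append(np.array(pixels))
    return blobs

def blob_moments(pixels):
    """Centroid, area and orientation from first/second image moments."""
    ys, xs = pixels[:, 0].astype(float), pixels[:, 1].astype(float)
    cy, cx = ys.mean(), xs.mean()
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)  # major-axis angle
    return (cx, cy), len(pixels), theta
```

In practice one mask is computed per marker color; erosion followed by dilation (morphological opening) discards isolated noise pixels before labeling.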

Sensing and the Environment: Converting to World Coordinates The physical distances between the 4 delimiters that outline the robot's walking area are assumed known Scaling is used to convert between the pixel coordinates of each blob's centroid and the corresponding real-world distances The robot's orientation is determined from the angle that the line connecting the backpack markers forms with the horizontal Footstep planning requires the precise location of the robot's feet
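The scaling and orientation steps above reduce to a few lines of arithmetic. A minimal sketch, under our own simplifying assumption (not necessarily the paper's) that the overhead camera views the delimiters as an axis-aligned rectangle, so one scale factor per axis suffices:

```python
import math

def make_pixel_to_world(delimiters_px, width_m, height_m):
    """Build a pixel->world mapping from the four corner delimiters, given the
    known physical side lengths of the walking area (width_m x height_m)."""
    xs = [p[0] for p in delimiters_px]
    ys = [p[1] for p in delimiters_px]
    x0, y0 = min(xs), min(ys)
    sx = width_m / (max(xs) - x0)    # metres per pixel, horizontal
    sy = height_m / (max(ys) - y0)   # metres per pixel, vertical
    return lambda px, py: ((px - x0) * sx, (py - y0) * sy)

def robot_orientation(marker_a_px, marker_b_px):
    """Heading = angle of the line through the two backpack markers
    with respect to the image horizontal."""
    return math.atan2(marker_b_px[1] - marker_a_px[1],
                      marker_b_px[0] - marker_a_px[0])
```

A non-overhead or lens-distorted camera would instead require a full homography; the per-axis scaling is the simplest case consistent with the slide.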

Sensing and the Environment: Converting to World Coordinates

Sensing and the Environment: Building the Environment Map The environment is represented as a 2D grid of binary-valued cells – Each cell's value indicates whether that patch of terrain is obstacle-free or partially/totally occupied by an obstacle – This yields a bitmap representation of free space and obstacles
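Rasterizing detected obstacles into such a bitmap is straightforward. A sketch with illustrative names (the input here is world-frame samples of obstacle pixels, which is our assumption about how the segmented blobs would be fed in):

```python
def build_grid(obstacle_points_m, cell_m, width_m, height_m):
    """Bitmap environment map: grid[r][c] == 1 iff that cell is partially or
    totally occupied by an obstacle, 0 if it is free."""
    rows = int(round(height_m / cell_m))
    cols = int(round(width_m / cell_m))
    grid = [[0] * cols for _ in range(rows)]
    for x, y in obstacle_points_m:
        r, c = int(y / cell_m), int(x / cell_m)
        if 0 <= r < rows and 0 <= c < cols:
            grid[r][c] = 1   # conservatively mark the whole cell as occupied
    return grid
```

Marking a cell occupied if any obstacle sample falls inside it matches the slide's "partially/totally occupied" convention.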

Footstep Planning Goal: find a sequence of actions, as close to optimal as possible, that takes the robot to the goal location while avoiding obstacles in the environment

Footstep Planning: Basic Algorithm Planner algorithm – Input: environment map E, initial and goal robot states, the set of possible actions that may be taken in each state, and an action-effect mapping – Returns: a sequence of footstep actions once a path to the goal is found – The planner computes the cost of each candidate footstep location using 3 metrics: a location cost determining whether the candidate location is "safe" in the environment; a step cost that prefers 'easy' stepping actions; an estimated cost-to-go providing an approximation of the candidate's proximity to the goal, computed using a standard mobile-robot planner
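The three metrics can be combined into a single scalar per candidate placement. The concrete formulas below are stand-ins, not the paper's exact cost terms: occupancy checking for the location cost, step length for the step cost, and straight-line distance in place of the mobile-robot planner's estimate.

```python
import math

def footstep_cost(grid, cell_m, prev_foot, foot, goal):
    """Cost of one candidate footstep = location cost + step cost + cost-to-go.
    grid is the binary environment map; prev_foot, foot, goal are world (x, y)."""
    r, c = int(foot[1] / cell_m), int(foot[0] / cell_m)
    # 1. Location cost: an occupied or out-of-bounds cell is unsafe.
    if not (0 <= r < len(grid) and 0 <= c < len(grid[0])) or grid[r][c]:
        return math.inf
    # 2. Step cost: prefer short, 'easy' stepping actions.
    step_cost = math.hypot(foot[0] - prev_foot[0], foot[1] - prev_foot[1])
    # 3. Estimated cost-to-go: straight-line distance to the goal.
    cost_to_go = math.hypot(goal[0] - foot[0], goal[1] - foot[1])
    return step_cost + cost_to_go
```

Returning infinity for unsafe placements lets the search prune them without a separate feasibility check.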

Footstep Planning: Basic Algorithm An A* search is performed over possible sequences of walking actions – The search continues until a path is found OR – The specified computation time limit is exceeded

Footstep Planning: Plan Reuse At each step, a path towards the goal is planned – ASIMO takes the first step and then replans for the next one – Computations from the previous plan are reused via a forward search
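The sense/plan/step loop can be sketched as below. All callables here are hypothetical stand-ins; in particular, passing the previous path into the planner only gestures at the forward-search reuse the slide mentions, it does not reproduce the paper's mechanism.

```python
def walk_to_goal(sense, plan, execute_step, at_goal):
    """Closed-loop execution: sense the world, replan, take the first step of
    the new plan, and repeat until the goal is reached."""
    path = []
    while True:
        state, goal, grid = sense()            # fresh world state each cycle
        if at_goal(state, goal):
            return
        path = plan(state, goal, grid, previous=path)  # seed with old plan
        if not path:
            return                             # no progress possible
        execute_step(path[0])                  # commit only the first step
        path = path[1:]
```

Committing only one step per cycle is what lets the robot react to obstacles that move while it walks.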

Evaluation: Vision-Planner Integration

Evaluation: Obstacle Avoidance – Unpredictably Moving Obstacles

Discussion An approach to autonomous humanoid walking in the presence of dynamically moving obstacles – Combines sensing, planning and execution in a closed loop Ongoing work: – A more realistic estimate of the floor directly surrounding the robot's feet – On-body vision that satisfies the real-time constraints of the sensing loop