Vision-Guided Humanoid Footstep Planning for Dynamic Environments


Vision-Guided Humanoid Footstep Planning for Dynamic Environments P. Michel, J. Chestnutt, J. Kuffner, T. Kanade Carnegie Mellon University – Robotics Institute Humanoids 2005

Objective The paper presents a vision-based footstep planning system that computes the best partial footstep path within a time-limited search horizon, according to problem-specific cost metrics and heuristics.

Related Work Most prior work emphasizes reliable, stable gait generation and feedback: pre-generated walking trajectories, online trajectory generation, and dynamic balance, with no accounting for obstacles. Little work has focused on developing global navigation autonomy for biped robots.

Related Work Obstacle avoidance and local planning based on visual feedback have been studied in humans. Several reactive perception-based obstacle avoidance techniques for bipeds have been developed, using environment mapping, obstacle detection, and color-based segmentation.

CMU’s Honda ASIMO Humanoid https://www.youtube.com/watch?v=0dv0wJDwtuw

Sensing and the Environment ASIMO robot with global sensing: an overhead camera computes the positions of the robot, the desired goal location, and the obstacles. All processing is done in real time.

Sensing and the Environment: Color Segmentation Colored markers are used: bright pink for planar obstacles on the floor; light blue for the desired goal location; yellow and green to identify the robot's location and orientation; dark blue for the four square delimiters defining the rectangular area within which the robot operates. Color segmentation is performed directly on the YUV stream generated by the camera, which avoids extra processing overhead.

Sensing and the Environment: Color Segmentation Color thresholds are obtained by sampling pixel values offline for each marker. Thresholding produces a series of binary masks indicating the presence or absence of each marker's pixels. Noise is eliminated by erosion/dilation. Connected-components labeling groups the pixels into blobs. Moments are then calculated for each color blob: centroid, area, major/minor axes, and orientation.
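
The thresholding-and-moments pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration only: the function names, thresholds, and toy frame are invented, and the erosion/dilation and connected-components steps are omitted for brevity.

```python
import numpy as np

def segment_marker(yuv, lo, hi):
    """Binary mask of pixels whose (Y, U, V) values fall inside the
    per-marker thresholds sampled offline."""
    lo, hi = np.asarray(lo), np.asarray(hi)
    return np.all((yuv >= lo) & (yuv <= hi), axis=-1)

def blob_moments(mask):
    """Zeroth/first image moments of a binary mask: area and centroid."""
    ys, xs = np.nonzero(mask)
    area = xs.size
    if area == 0:
        return 0, None
    return area, (xs.mean(), ys.mean())

# Toy 4x4 "YUV" frame with a 2x2 marker patch in the top-left corner
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:2, :2] = (180, 120, 200)
mask = segment_marker(frame, lo=(170, 110, 190), hi=(190, 130, 210))
area, (cx, cy) = blob_moments(mask)
```

Second-order moments would additionally give each blob's major/minor axes and orientation, as the slide notes.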

Sensing and the Environment: Converting to World Coordinates The physical distances between the four delimiters outlining the robot's walking area are assumed known. Scaling converts between the pixel coordinates of each blob's centroid and the corresponding real-world distances. The robot's orientation is determined from the angle that the line connecting the backpack markers forms with the horizontal. Footstep planning requires the precise location of the robot's feet.
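
The pixel-to-world conversion amounts to a linear scaling once the delimiter rectangle's physical size is known. A minimal sketch follows; the function names and the assumption of an axis-aligned, rectified rectangle are illustrative, not the paper's exact calibration.

```python
import math

def pixel_to_world(px, py, top_left, bottom_right, width_m, height_m):
    """Scale a pixel coordinate into metric floor coordinates, assuming the
    four delimiter markers bound an axis-aligned rectangle of known size."""
    sx = width_m / (bottom_right[0] - top_left[0])
    sy = height_m / (bottom_right[1] - top_left[1])
    return ((px - top_left[0]) * sx, (py - top_left[1]) * sy)

def robot_heading(marker_a, marker_b):
    """Orientation: angle of the line through the two backpack markers
    with the horizontal image axis."""
    return math.atan2(marker_b[1] - marker_a[1], marker_b[0] - marker_a[0])

# A 200x100-pixel workspace image mapping onto a 2 m x 1 m floor area
x_m, y_m = pixel_to_world(100, 50, (0, 0), (200, 100), 2.0, 1.0)
theta = robot_heading((10, 10), (20, 20))
```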


Sensing and the Environment: Building the Environment Map The environment is represented as a 2D grid of binary-valued cells; each cell's value indicates whether that patch of terrain is obstacle-free or partially/totally occupied by an obstacle. This yields a bitmap representation of freespace and obstacles.
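
A bitmap occupancy grid of this kind can be built directly from the detected obstacle blobs. The sketch below rasterises axis-aligned obstacle boxes conservatively (any cell an obstacle touches is marked occupied); the function name and box representation are assumptions for illustration.

```python
import numpy as np

def build_environment_map(obstacles, shape, cell_m):
    """Binary occupancy grid over the walking area: 1 = cell partially or
    totally occupied, 0 = obstacle-free. Obstacles are axis-aligned boxes
    (x0, y0, x1, y1) in metres."""
    grid = np.zeros(shape, dtype=np.uint8)
    for x0, y0, x1, y1 in obstacles:
        r0, r1 = int(y0 // cell_m), int(np.ceil(y1 / cell_m))
        c0, c1 = int(x0 // cell_m), int(np.ceil(x1 / cell_m))
        grid[r0:r1, c0:c1] = 1   # conservative: touch a cell, occupy it
    return grid

# one 1.0 m x 0.5 m obstacle on an 8x8 grid of 0.5 m cells
env = build_environment_map([(1.0, 1.0, 2.0, 1.5)], (8, 8), 0.5)
```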

Footstep Planning Goal: to find a sequence of actions, as close to optimal as possible, that brings the robot to the goal location while avoiding obstacles in the environment.

Footstep Planning: Basic Algorithm Planner input: the environment map E, the initial and goal robot states, the set of possible actions that may be taken in each state, and an action-effect mapping. Output: a sequence of footstep actions, once a path to the goal is found. The planner computes the cost of each candidate footstep location using three metrics: a location cost, which determines whether the candidate location is "safe" in the environment; a step cost, which prefers 'easy' stepping actions; and an estimated cost-to-go, which approximates the candidate's proximity to the goal using a standard mobile-robot planner.
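
The three cost terms can be sketched as follows. This is a toy version with invented names: the real planner evaluates full footstep placements against ASIMO's action set, not grid cells.

```python
import math

def location_cost(env, cell):
    """Whether the candidate location is 'safe': infinite cost if the
    environment-map cell is occupied (or off-map), zero otherwise."""
    return math.inf if env.get(cell, 1) else 0.0

def action_cost(action):
    """Prefer 'easy' stepping actions: here, simply shorter displacements."""
    return math.hypot(action[0], action[1])

def cost_to_go(cell, goal):
    """Estimated proximity to the goal; stands in for the heuristic computed
    by a standard mobile-robot planner."""
    return math.hypot(goal[0] - cell[0], goal[1] - cell[1])

def footstep_cost(env, cell, action, goal):
    return location_cost(env, cell) + action_cost(action) + cost_to_go(cell, goal)

env = {(0, 0): 0, (1, 0): 0, (2, 0): 1}   # cell (2, 0) is occupied
safe = footstep_cost(env, (1, 0), (1, 0), (3, 0))
blocked = footstep_cost(env, (2, 0), (1, 0), (3, 0))
```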

Footstep Planning: Basic Algorithm An A* search is performed over possible sequences of walking actions, until either a path is found or a specified computation time limit is exceeded.
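
A time-bounded A* of this shape can be sketched on a grid world. This is an illustrative stand-in (Manhattan heuristic, invented names), not ASIMO's footstep action set; when the budget runs out it returns the best partial path found, matching the paper's time-limited search horizon.

```python
import heapq
import math
import time

def astar_footsteps(env, start, goal, actions, time_limit_s):
    """Time-bounded A* over stepping actions. env maps cells to 0 (free)
    or 1 (occupied); returns the full path to the goal if found in time,
    otherwise the partial path ending closest to the goal."""
    t0 = time.monotonic()
    open_heap = [(0.0, 0.0, start, [start])]   # (f, g, node, path)
    closed = set()
    best_h, best_path = math.inf, [start]
    while open_heap and time.monotonic() - t0 < time_limit_s:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        if node in closed:
            continue
        closed.add(node)
        h = abs(goal[0] - node[0]) + abs(goal[1] - node[1])
        if h < best_h:                          # track best partial path
            best_h, best_path = h, path
        for dx, dy in actions:
            nxt = (node[0] + dx, node[1] + dy)
            if env.get(nxt, 1):                 # occupied or off-map
                continue
            ng = g + math.hypot(dx, dy)
            nh = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])
            heapq.heappush(open_heap, (ng + nh, ng, nxt, path + [nxt]))
    return best_path                            # time budget exhausted

env = {(x, y): 0 for x in range(3) for y in range(3)}
path = astar_footsteps(env, (0, 0), (2, 0),
                       [(1, 0), (-1, 0), (0, 1), (0, -1)], 1.0)
```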

Footstep Planning: Plan Reuse At each step, the planner computes a path towards the goal; ASIMO takes the first step and then replans for the next one. Computations from the previous cycle are reused by the forward search.
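
The closed sense-plan-act loop can be sketched as below: replan from the current stance every cycle, execute only the first planned footstep, then re-sense the possibly changed world. Function names are illustrative, and the computation-reuse optimisation is not shown.

```python
def walk_to_goal(sense_env, plan, start, goal, max_steps=50):
    """Closed-loop execution sketch: replan each cycle, take one step."""
    pose, trace = start, [start]
    for _ in range(max_steps):
        if pose == goal:
            break
        path = plan(sense_env(), pose, goal)
        if len(path) < 2:
            break                       # planner made no progress
        pose = path[1]                  # take only the first step, then replan
        trace.append(pose)
    return trace

# 1-D stand-in: the "planner" returns the straight-line step sequence
trace = walk_to_goal(lambda: None,
                     lambda env, p, g: list(range(p, g + 1)),
                     start=0, goal=3)
```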

Evaluation: Vision-Planner Integration

Evaluation: Obstacle Avoidance – Unpredictably Moving Obstacles

Discussion An approach to autonomous humanoid walking in the presence of dynamically moving obstacles, combining sensing, planning, and execution in a closed loop. Ongoing work: a more realistic estimate of the floor directly surrounding the robot's feet, and on-body vision that satisfies the real-time constraints of the sensing loop.