1
Vision-Guided Humanoid Footstep Planning for Dynamic Environments
P. Michel, J. Chestnutt, J. Kuffner, T. Kanade
Carnegie Mellon University – Robotics Institute
Humanoids 2005
2
Objective The paper presents a vision-based footstep planning system that computes the best partial footstep path within a time-limited search horizon, according to problem-specific cost metrics and heuristics.
3
Related Work: reliable, stable gait generation and feedback
Emphasis on pre-generating walking trajectories
Online trajectory generation
Dynamic balance
No accounting for obstacles!
Little work has focused on developing global navigation autonomy for biped robots
4
Related Work
Obstacle avoidance and local planning based on visual feedback have been studied in humans
Several reactive perception-based obstacle avoidance techniques for bipeds have been developed
Environment mapping; obstacle detection; color-based segmentation
5
CMU’s Honda ASIMO Humanoid
6
Sensing and the Environment
ASIMO robot with global sensing
Overhead camera: computes the position of the robot, the desired goal location, and the obstacles
All processing is done in real time
7
Sensing and the Environment: Color Segmentation
Colored markers:
Bright pink: planar obstacles on the floor
Light blue: desired goal location
Yellow and green: identify the robot's location and orientation
Dark blue: 4 square delimiters that define a rectangular area within which the robot operates
Color segmentation is performed directly on the YUV stream generated by the camera, which avoids processing overhead
8
Sensing and the Environment: Color Segmentation
Color thresholds: pixel values sampled offline for each marker
Thresholding produces a series of binary masks indicating the presence or absence of each marker's pixels
Noise eliminated by erosion/dilation
Connected-components labeling groups the remaining pixels into blobs
Moments calculated for each color blob: centroid, area, major/minor axes, orientation on the floor
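A minimal sketch of this per-marker segmentation pipeline, assuming OpenCV is used; the function name, threshold bounds, and minimum-area filter are illustrative and not taken from the paper.

```python
# Sketch: threshold one marker color, clean up noise, label blobs, take moments.
# Threshold values and the min-area cutoff are placeholders, not the real system's.
import cv2
import numpy as np

def find_marker_blobs(frame_yuv, lower, upper, min_area=50):
    """Return centroid, area, and orientation of each blob of one marker color."""
    # Binary mask marking the presence/absence of this marker's pixels.
    mask = cv2.inRange(frame_yuv, np.array(lower), np.array(upper))

    # Erosion followed by dilation removes isolated noise pixels.
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.erode(mask, kernel, iterations=1)
    mask = cv2.dilate(mask, kernel, iterations=1)

    # Connected-components labeling groups the remaining pixels into blobs.
    num_labels, labels = cv2.connectedComponents(mask)

    blobs = []
    for label in range(1, num_labels):          # label 0 is the background
        blob_mask = np.uint8(labels == label) * 255
        m = cv2.moments(blob_mask, binaryImage=True)
        if m["m00"] < min_area:                  # drop tiny noise blobs
            continue
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        # Major-axis orientation from the second-order central moments.
        angle = 0.5 * np.arctan2(2 * m["mu11"], m["mu20"] - m["mu02"])
        blobs.append({"centroid": (cx, cy), "area": m["m00"], "orientation": angle})
    return blobs

# Example use with illustrative bounds for one marker color:
# frame = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YUV)
# obstacles = find_marker_blobs(frame, lower=(80, 100, 160), upper=(255, 140, 255))
```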
9
Sensing and the Environment: Converting to World Coordinates
The physical distances between the 4 delimiters that outline the robot's walking area are assumed known
Scaling converts the pixel coordinates of each blob's centroid into the corresponding real-world distances
The robot's orientation is determined from the angle that the line connecting the backpack markers forms with the horizontal
Footstep planning requires the precise location of the robot's feet
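A hedged sketch of the scaling and orientation computation, assuming the delimiter rectangle has already been located in the image; the function names and all dimensions below are placeholders for illustration.

```python
import math

def pixel_to_world(px, py, area_px, area_m):
    """Scale a pixel coordinate into world coordinates.

    area_px: (origin_x, origin_y, width_px, height_px) of the rectangle
             spanned by the four delimiter markers in the image.
    area_m:  (width_m, height_m) physical size of the walking area.
    """
    ox, oy, w_px, h_px = area_px
    w_m, h_m = area_m
    return ((px - ox) * w_m / w_px, (py - oy) * h_m / h_px)

def robot_orientation(back_marker_a, back_marker_b):
    """Heading angle (radians) of the line connecting the two backpack
    markers, measured against the image horizontal."""
    (ax, ay), (bx, by) = back_marker_a, back_marker_b
    return math.atan2(by - ay, bx - ax)

# Illustrative values: a 640x480 image region mapped onto a 3 m x 2 m floor area.
x_m, y_m = pixel_to_world(320, 240, area_px=(0, 0, 640, 480), area_m=(3.0, 2.0))
theta = robot_orientation((300, 240), (340, 250))
```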
10
Sensing and the Environment: Converting to World Coordinates
11
Sensing and the Environment: Building the Environment Map
Environment represented as a 2D grid of binary-valued cells
Each cell's value indicates whether the terrain is obstacle-free or partially/totally occupied by an obstacle
Bitmap representation of free space and obstacles
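A minimal occupancy-grid sketch under assumed conventions (cell size, circular obstacle footprints); the paper only specifies a binary bitmap of free space and obstacles.

```python
import numpy as np

def build_environment_map(area_m, cell_size, obstacles):
    """Return a 2D binary grid: 0 = free terrain, 1 = (partially) occupied.

    area_m:    (width_m, height_m) of the walking area.
    obstacles: list of (x_m, y_m, radius_m) circles approximating floor obstacles.
    """
    w_cells = int(np.ceil(area_m[0] / cell_size))
    h_cells = int(np.ceil(area_m[1] / cell_size))
    grid = np.zeros((h_cells, w_cells), dtype=np.uint8)

    # Cell centers in world coordinates.
    xs = (np.arange(w_cells) + 0.5) * cell_size
    ys = (np.arange(h_cells) + 0.5) * cell_size
    cx, cy = np.meshgrid(xs, ys)

    for ox, oy, r in obstacles:
        # Mark every cell whose center lies within the obstacle radius.
        grid[(cx - ox) ** 2 + (cy - oy) ** 2 <= r ** 2] = 1
    return grid

# Example: one obstacle of radius 20 cm on a 3 m x 2 m floor, 5 cm cells.
env = build_environment_map((3.0, 2.0), 0.05, [(1.5, 1.0, 0.2)])
```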
12
Footstep Planning Goal: find a sequence of actions, as close to optimal as possible, that causes the robot to reach the goal location while avoiding obstacles in the environment
13
Footstep Planning: Basic Algorithm
Planner algorithm
Input: environment map E, initial and goal robot states, the set of possible actions that may be taken in each state, and an action-effect mapping
Returns: a sequence of footstep actions once a path to the goal is found
The planner computes the cost of each candidate footstep location using 3 metrics (see the sketch below):
Location cost: determines whether the candidate location is "safe" in the environment
Step cost: prefers "easy" stepping actions
Estimated cost-to-go: approximates the candidate's proximity to the goal using a standard mobile-robot planner
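A hedged sketch of how the three cost terms could be combined for one candidate footstep; the weights, the infinite-cost safety test, and the use of a precomputed grid distance map as the cost-to-go are assumptions made for illustration.

```python
import numpy as np

def location_cost(env, cell):
    """'Safe' test: infinite cost if the cell under the foot is occupied."""
    r, c = cell
    return np.inf if env[r][c] else 0.0

def step_cost(action):
    """Prefer 'easy' stepping actions, e.g. penalize long or turning steps."""
    length, turn = action
    return abs(length) + 2.0 * abs(turn)

def cost_to_go(dist_map, cell):
    """Approximate proximity to the goal, e.g. a grid distance transform
    precomputed over free space by a mobile-robot style planner."""
    return dist_map[cell]

def footstep_cost(env, dist_map, cell, action, w=(1.0, 1.0, 1.0)):
    """Weighted sum of the three metrics for one candidate footstep."""
    return (w[0] * location_cost(env, cell)
            + w[1] * step_cost(action)
            + w[2] * cost_to_go(dist_map, cell))
```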
14
Footstep Planning: Basic Algorithm
A* search is performed over possible sequences of walking actions
The search runs until a path is found OR the specified computation time limit is exceeded (see the sketch below)
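A simplified, self-contained sketch of the idea: A* over a discrete action set, stopping when the goal is reached or a time budget expires, and returning the best partial path found so far. The grid actions, heuristic, and budget are illustrative; the real planner searches footstep placements using the cost metrics above.

```python
import heapq
import math
import time

def plan_footsteps(env, start, goal, time_limit_s=0.1):
    """Time-limited A* over a small set of stepping actions on a binary grid.

    Returns the cell sequence to the goal, or the best partial path
    (closest approach to the goal) if the time budget runs out first.
    """
    actions = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    h = lambda c: math.hypot(c[0] - goal[0], c[1] - goal[1])  # cost-to-go estimate

    open_set = [(h(start), 0.0, start)]
    came_from, g = {}, {start: 0.0}
    best, deadline = start, time.monotonic() + time_limit_s

    while open_set and time.monotonic() < deadline:
        _, cost, cur = heapq.heappop(open_set)
        if h(cur) < h(best):
            best = cur                          # best partial result so far
        if cur == goal:
            best = cur
            break
        for dr, dc in actions:
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < len(env) and 0 <= nxt[1] < len(env[0])):
                continue
            if env[nxt[0]][nxt[1]]:             # location cost: occupied cell is unsafe
                continue
            new_g = cost + math.hypot(dr, dc)   # step cost
            if new_g < g.get(nxt, float("inf")):
                g[nxt] = new_g
                came_from[nxt] = cur
                heapq.heappush(open_set, (new_g + h(nxt), new_g, nxt))

    # Reconstruct the (possibly partial) path back from `best`.
    path = [best]
    while path[-1] in came_from:
        path.append(came_from[path[-1]])
    return list(reversed(path))

# Example on a tiny map: 0 = free, 1 = obstacle.
grid = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
print(plan_footsteps(grid, start=(0, 0), goal=(2, 3)))
```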
15
Footstep Planning: Plan Reuse
At each step, plan a path towards the goal; ASIMO takes the first step and then replans for the next step
Computations from the previous search are reused via a forward search (see the loop sketched below)
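A schematic of the resulting sense-plan-execute loop; the callable names are placeholders, and the reuse of previous search computations mentioned above is omitted from this sketch.

```python
def walk_to_goal(sense_environment, plan_footsteps, execute_step, at_goal,
                 max_steps=100):
    """Closed-loop footstep execution: plan, take the first step, replan."""
    for _ in range(max_steps):
        env, robot_state, goal = sense_environment()    # updated obstacle map + pose
        if at_goal(robot_state, goal):
            return True
        path = plan_footsteps(env, robot_state, goal)   # time-limited search
        if len(path) < 2:
            continue                                    # no step found; re-sense
        execute_step(path[1])                           # commit only the first step
    return False
```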
16
Evaluation: Vision-Planner Integration
17
Evaluation: Obstacle Avoidance – Unpredictably Moving Obstacles
18
Discussion
An approach to autonomous humanoid walking in the presence of dynamically moving obstacles
Combines sensing, planning and execution in a closed loop
Current work: a more realistic estimate of the floor directly surrounding the robot's feet, and on-body vision to satisfy real-time constraints for the sensing loop