Electrical and Computer Systems Engineering Postgraduate Student Research Forum 2001

Perception and Control in Humanoid Robotics using Vision
Geoffrey Taylor
Supervisors: A/Prof Lindsay Kleeman, A/Prof R Andrew Russell

Imagine you had a domestic humanoid robot servant, and consider what you would like it to do. It quickly becomes clear that a practical domestic robot must possess a basic ability to find and grasp objects in a dynamic, cluttered environment (i.e. your house!). To address this issue, we have developed a self-calibrating, position-based visual servoing framework. Metalman, the Monash upper-torso humanoid robot, provides a platform for this and other exciting humanoid robot experiments.

It’s a visual thing …
Visual servoing is a feedback control technique that uses visual measurements to robustly regulate the motion of a robot. Metalman uses stereo cameras to estimate the 3D pose (position and orientation) of its hands by observing bright LEDs attached in a known pattern and feeding the measurements into a Kalman tracking filter. Other objects are localised in the same way via attached coloured markers. Depending on the desired action (e.g. grasping an object), Metalman uses this pose information to generate actuating signals that drive the arm to the required pose (a sketch of one such control cycle follows the poster text). Because Metalman continuously estimates the pose of its hands, the system is completely self-calibrating.

[Figure: the actual stereo view seen by Metalman while tracking its hand, using the Biclops active head. LED markers on the hand facilitate pose tracking; the 3D hand pose measurement gives the relative position and orientation between hand and head; the final hand pose depends on the desired action; Metalman uses the pose information to drive the hand in the desired direction.]

Using position-based visual servoing, Metalman has the ability to perform simple manipulation tasks; the sequence below shows Metalman autonomously locating and stacking three randomly placed blocks. Future work will include servoing two arms cooperatively to perform even more complex tasks!

[Figure: block-stacking sequence, with the progress time (0 s, 20 s, 35 s, 80 s, 100 s, 160 s) indicated at the top right of each frame.]

Even robots get lonely!
Metalman must interact with humans to be truly useful. The experiment below demonstrates simple interaction using motion cues: the user taps on a random block, and Metalman places a finger above the selected object.

Where has all the data gone?
In a complex system such as Metalman, the interaction of various components can generate unwanted dynamics such as dead-time delays. For instance, the graph below plots the position of the head during a sinusoidal motion: the red line indicates joint encoder data, and the blue line shows data from the cameras. The apparent 30 ms delay between these devices can degrade Metalman’s dynamic performance. In this work, we develop simple matching and prediction techniques that allow Metalman to autonomously estimate and reduce these effects (a delay-estimation sketch follows the poster text).

For more information, check the IRRC web page at www.ecse.monash.edu.au/centres/IRRC
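The control cycle described under “It’s a visual thing …” can be pictured with a short sketch. This is only an illustration of position-based visual servoing, not Metalman’s actual controller: the 4x4 homogeneous-transform pose representation, the gain value, and the function names are assumptions made for the example.

```python
# Illustrative sketch only -- not Metalman's actual controller. It shows the
# essence of position-based visual servoing: compare the measured hand pose
# with the desired pose and command a velocity proportional to the error.
import numpy as np

GAIN = 0.5   # proportional gain on the pose error (assumed value)

def pose_error(T_hand, T_goal):
    """6-vector (translation, small-angle rotation) taking the hand to the goal."""
    dT = np.linalg.inv(T_hand) @ T_goal              # relative transform hand -> goal
    dp = dT[:3, 3]                                   # translation error, hand frame
    R = dT[:3, :3]
    # small-angle rotation vector from the skew-symmetric part of R
    dtheta = 0.5 * np.array([R[2, 1] - R[1, 2],
                             R[0, 2] - R[2, 0],
                             R[1, 0] - R[0, 1]])
    return np.concatenate([dp, dtheta])

def servo_step(T_hand_measured, T_goal):
    """One visual-servoing cycle: return a commanded hand velocity (v, omega)."""
    return GAIN * pose_error(T_hand_measured, T_goal)

# Example: hand 5 cm short of the goal along x, orientations already aligned.
T_hand = np.eye(4)
T_goal = np.eye(4)
T_goal[0, 3] = 0.05
print(servo_step(T_hand, T_goal))   # -> [0.025, 0, 0, 0, 0, 0]
```

In a real system the commanded Cartesian velocity would then be mapped through the arm’s kinematics to joint commands, with the measured hand pose supplied by the stereo/Kalman-filter tracking described above.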
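The delay estimation described under “Where has all the data gone?” can likewise be sketched. Since the test motion is sinusoidal, one simple matching approach is to fit a sinusoid of the known test frequency to each signal and convert the phase difference into a time delay. The code below illustrates that idea only and is not necessarily the technique used on Metalman; the function names, the 1 Hz test frequency, and the 1 kHz sample rate are assumptions.

```python
# Illustrative sketch only -- estimating a dead-time delay between two sensors
# from the phase difference of a sinusoidal test motion of known frequency.
import numpy as np

def estimate_delay(reference, delayed, t, freq):
    """Delay (seconds) of `delayed` behind `reference` for a sinusoid at `freq` Hz."""
    w = 2.0 * np.pi * freq

    def phase(signal):
        # Least-squares fit: signal ~ a*sin(w t) + b*cos(w t) + offset
        basis = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
        a, b, _ = np.linalg.lstsq(basis, signal, rcond=None)[0]
        return np.arctan2(b, a)                      # phase of the fitted sinusoid

    dphi = phase(reference) - phase(delayed)         # positive if `delayed` lags
    dphi = (dphi + np.pi) % (2.0 * np.pi) - np.pi    # wrap to (-pi, pi]
    return dphi / w

# Synthetic check: 1 Hz head motion sampled at 1 kHz, camera data delayed by 30 ms.
dt = 0.001
t = np.arange(0.0, 5.0, dt)
encoder = np.sin(2.0 * np.pi * 1.0 * t)              # joint encoder measurement
camera = np.sin(2.0 * np.pi * 1.0 * (t - 0.030))     # camera measurement, 30 ms late
print(estimate_delay(encoder, camera, t, freq=1.0))  # -> ~0.030
```

Once such a delay is known, the late measurement can be predicted forward by that amount, for example inside the Kalman tracking filter, so that encoder and camera data refer to the same instant.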