Cooperative Visual Servo Control for Robotic Manipulators

Introduction
Typical DOE environmental-management robots require a highly experienced human operator to remotely guide and control every joint movement: the operator moves a master manipulator, and the resulting feedback is used by the robotic controller. In this project, the performance of a recently developed cooperative visual servo controller was examined. The controller enables a robot manipulator to track objects moving through the task-space along an unknown trajectory using visual feedback. The goal of the project is to demonstrate that a fixed camera and a camera-in-hand, used cooperatively, can deliver this tracking performance even though neither camera is calibrated.

Motivation
The project is motivated by the desire to automate robotic motion in the unstructured environments encountered in DOE decommissioning and decontamination operations. As a means to integrate the current research into an existing DOE robotic system, the visual servo controller was implemented on a robotic manipulator testbed that has played a vital role in several DOE teleoperation-based applications. Because the existing testbed is controlled at the force/torque level by a closed (black-box) system, the complete robust torque controller could not be implemented without replacing the existing system outright.

Proof-of-principle testbed:
- Fixed camera (for wide view)
- In-hand camera (for close-up view)
- 6-DOF hydraulic manipulator
- Pan-tilt unit with laser pointer

Results
The performance of the cooperative visual servoing controller was experimentally demonstrated: the manipulator was able to track a laser dot moving in an unknown manner using feedback from both the fixed and in-hand cameras.
[Figures: fixed-camera data; camera-in-hand tracking error]
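To make the uncalibrated-tracking idea concrete, here is a minimal Python sketch, not the controller used in the project: it drives a toy linear "camera" toward a target pixel while refining an image-Jacobian estimate online with a Broyden update, so no camera calibration is ever supplied. The camera model, gains, and every name here are illustrative assumptions.

```python
import numpy as np

def visual_servo_step(q, target_px, current_px, J_est, gain=0.2):
    """One image-based visual servoing step: map the pixel-space
    tracking error through the pseudo-inverse of the *estimated*
    image Jacobian to a joint-space update."""
    error = target_px - current_px
    return q + gain * np.linalg.pinv(J_est) @ error

# Toy stand-in for the camera: a linear map from joints to pixels.
J_true = np.array([[2.0, 0.0], [0.0, 3.0]])
pixel = lambda q: J_true @ q            # hypothetical camera model

q = np.zeros(2)
target = np.array([40.0, 60.0])
J_est = np.eye(2)                       # deliberately wrong initial estimate

for _ in range(200):
    px = pixel(q)
    q_new = visual_servo_step(q, target, px, J_est)
    dq = q_new - q
    if np.linalg.norm(dq) > 1e-9:
        # Broyden update: correct the Jacobian estimate using only the
        # observed pixel motion -- no calibration parameters needed.
        dpx = pixel(q_new) - px
        J_est += np.outer(dpx - J_est @ dq, dq) / (dq @ dq)
    q = q_new

print(np.round(pixel(q), 3))            # pixel position approaches the target
```

The point of the sketch is the structure, not the numbers: the controller never sees `J_true`, yet the tracking error still converges because the estimate is corrected from observed image motion.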
Methodology
Rather than replacing the existing controller, a plug-and-play retrofit was implemented. To reduce the need for a human operator to control every movement of the robot, the project targets new methods that give the robot improved perception: sensor information from a camera system is used to control the robot directly, moving from teleoperation to visual servoing.
[Figure: system layout showing the moving laser dot, the in-hand and fixed cameras, and the Vision 1 and Vision 2 computers]

To facilitate communication between the computers, the cameras, and the robot, a dedicated client/server architecture was developed. Uncalibrated image-space positions of the laser dot were obtained from the in-hand and fixed cameras. Vision 1 determined the pixel information for the in-hand camera, and Vision 2 determined the pixel information from the fixed camera. Vision 1 was also responsible for obtaining joint measurements from the robot, receiving pixel information from Vision 2, computing the desired robot trajectory, and delivering the computed desired joint positions to the robot controller.

Future Work
Motivated by the desire to lift the restriction that the manipulator be confined to planar motion, current efforts target image homography-based approaches to achieve 6-degree-of-freedom object motion tracking. Subsequent efforts will address other robotic systems, such as mobile robots, where visual servoing will present new challenges due to differences in camera motion and possible constraints imposed on the robot's motion.
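The division of labor between the two vision computers can be sketched as a small producer/consumer exchange. This is a hypothetical in-process stand-in (queues instead of the project's actual network protocol, and a placeholder trajectory law), meant only to show the data flow: Vision 2 publishes fixed-camera pixels, and Vision 1 combines them with its own in-hand pixels and the joint measurements to produce desired joint positions.

```python
import numpy as np
from queue import Queue

class Vision2Server:
    """Stand-in for Vision 2: tracks the laser dot in the fixed-camera
    image and publishes its pixel coordinates."""
    def __init__(self, channel: Queue):
        self.channel = channel

    def publish(self, fixed_px):
        self.channel.put(np.asarray(fixed_px, dtype=float))

class Vision1Client:
    """Stand-in for Vision 1: receives fixed-camera pixels, reads the
    in-hand camera and joint encoders, and computes the desired joint
    positions delivered to the robot controller."""
    def __init__(self, channel: Queue, gain=0.1):
        self.channel = channel
        self.gain = gain

    def step(self, inhand_px, joints):
        fixed_px = self.channel.get()   # pixel info received from Vision 2
        # Placeholder trajectory law (illustrative only): nudge the joints
        # so the in-hand view of the dot moves toward the fixed-camera fix.
        error = fixed_px - np.asarray(inhand_px, dtype=float)
        return np.asarray(joints, dtype=float) + self.gain * error

channel = Queue()
Vision2Server(channel).publish([320, 240])       # fixed-camera dot position
q_des = Vision1Client(channel).step(inhand_px=[300, 250], joints=[0.0, 0.0])
print(q_des)    # desired joint positions handed to the robot controller
```

In the real system each `step` would run once per control cycle, with the queue replaced by the dedicated client/server link described above.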