Presentation transcript:

[Slide diagram: System Architecture for Human-Robot Interaction — the Intelligent Machine Architecture (IMA) software system links primitive agents to a hardware interface; a Human Agent and a Robot Agent mediate between the human, the robot, and the human-interaction hardware.]

Welcome to the Center for Intelligent Systems at the Vanderbilt University School of Engineering. CIS conducts research on intelligent robotics and on intelligent manufacturing.

Research Activities and CIS Links: Intelligent Robotics Lab | Intelligent Manufacturing | Recent Publications | CIS-Affiliated Faculty, Students, and Alumni | The CIS Newsletter | Employment Opportunities

Contact Information:
Center for Intelligent Systems, Vanderbilt University
Dr. Kazuhiko Kawamura, Director
Dr. Alan Peters, Assistant Director
Dr. Mitch Wilkes, Assistant Director
Florence (Flo) Fottrell, Administrator
Box 131 Station B, Nashville, TN
Phone: (615) (Lab), (615) (Office); Fax: (615)

Other Links: U.S.-Japan Center Home Page | Vanderbilt University School of Engineering | IEEE RAS Service Robot Technical Committee

Site Contents: Research — Projects under IRL | Publications — Papers Online | People — Faculty, Students, and Alumni | Robot Links — Links to Interesting Robot Sites

Welcome to the Intelligent Robotics Laboratory (IRL) at Vanderbilt University's School of Engineering! The IRL is part of the Center for Intelligent Systems and conducts research on service robots and human/robot symbiosis.

Contact Information:
Director: Dr. Kazuhiko Kawamura
Assistant Directors: Dr. M. Wilkes, Dr. R.A. Peters II
Research Faculty: Dr. G. Biswas, Dr. D. Gaines, Dr. D. Fisher, Dr. P.K. Basu
Administrator: Flo Fottrell
Lab Manager: Mark Cambron
Web Manager: Tamara Rogers

Intelligent Robotics Lab, Vanderbilt University
Box 131 Station B, Nashville, TN
Phone: (615) Fax: (615)

Research areas:
 Humanoids — ISAC
 Mobile Robots — HelpMate
 Bio-Mimetic Control Systems — Associative Memory, Attention System, Sensory Systems, Spreading Activation/Learning, High-Level Agent Structure
 McKibben Artificial Muscles — Basics, ISAC Arms
 Rehabilitation Robotics — Rehab Robotics
 Industrial Automation — Intelligent Planners, Industrial Pick and Place Robot, Remote Manufacturing Systems
 Intelligent Machine Architecture (IMA) — IMA, IMA II
 Anthropomorphic Manipulators — PneuHand, PneuHand II
 Robots & the Arts — Theremin Playing
 Climber Robot — Robin

ISAC is a dual-arm humanoid robot that was designed and built in the IRL as a research platform for service robotics. The system contains:
 Two pneumatic 6-DOF SoftArms actuated by McKibben artificial muscles.
 An air compressor and compressed-air delivery system.
 A Greifer gripper.
 A four-fingered, anthropomorphic dexterous manipulator, the PneuHand, designed and built by the IRL.
 Two force-torque sensors mounted at the arms' wrist joints.
 A Directed Perception pan-tilt platform, modified in house for independent verge control of two color cameras.
 Two 200 MHz dual-processor Pentium Pros: one handles grayscale image processing; the other controls the two SoftArms through two arm-controller boards (built in house) and a multi-channel audio signal processor.
 One 266 MHz Pentium II with two Imagenation color frame grabbers.
 One 200 MHz Pentium Pro.
The dual-arm system provides a test bed for developing new technologies for user-to-robot and robot-to-user communication, including audio, visual, and gestural methods.

The Intelligent Robotics Lab is currently working to incorporate a mobile robot into the ISAC system. The HelpMate mobile robot was donated by Yaskawa Electric of Japan and has been upgraded with the following new features:
 A 400 MHz Pentium II motherboard.
 A 5-DOF rubbertuator-actuated SoftArm.
 A lidar sensor for navigation.
 A vision system, including CATCH and a PCI color frame grabber.
 New control software, based on IMA.
 A connection to the Internet via wireless Ethernet.
HelpMate will soon become an integral part of the ISAC system. A new software architecture (see the related pages for IMA) will allow a combination of local autonomy and user direction, enabling HelpMate to navigate hallways and rooms to accomplish tasks. We are also using HelpMate as a test bed for IMA2, a revised version of IMA.

What HelpMate looked like before we got hold of it.

A side/front view, showing the sonar arrays, and the arm just hanging there.

This is a rear view, showing the DC-to-AC converter (the black box on the "tailgate"), the air compressor (that red, pumpkin-looking thing), the servo valve tree (in the middle), and the manipulator.

A previous SoftArm in a feeding task.

ISAC, our dual-arm humanoid, in its original configuration (with the Greifer gripper, the FMA gripper, and the original CATCH pan/tilt/verge head).

A previous version of ISAC, with some of his tools.

Recent Publications 1999
 D.M. Wilkes, W.A. Alford, R.T. Pack, T.E. Rogers, E.E. Brown, Jr., R.A. Peters II, and K. Kawamura, "Service Robots for Rehabilitation and Assistance," Chapter 2 in Teodorescu and Jain (eds.), Intelligent Systems and Techniques in Rehabilitation, CRC Press, 1999.
 W.A. Alford, T. Rogers, D.M. Wilkes, and K. Kawamura, "Multi-Agent System for a Human-Friendly Robot," Proceedings of the 1999 IEEE International Conference on Systems, Man, and Cybernetics (SMC '99), October 12-15, 1999, Tokyo, Japan.
 K. Kawamura, "Human-Robot Interaction for a Human-Friendly Robot: A Working Paper," Proceedings of the Second International Symposium on Humanoid Robotics (HURO '99), October 8-9, 1999, Tokyo, Japan.
 A. Alford, S. Northrup, K. Kawamura, and K-W. Chan, "Music Playing Robot," Proceedings of the International Conference on Field and Service Robotics (FSR '99), August 29-31, 1999, Pittsburgh, PA.

Recent Publications 1998
 S. Charoenseang, A. Srikaew, D.M. Wilkes, and K. Kawamura, "3-D Collision Avoidance for the Dual-Arm Humanoid Robot," IASTED International Conference on Robotics and Manufacturing, Banff, Canada, July 1998.
 D.M. Wilkes, A. Alford, R.T. Pack, T. Rogers, R.A. Peters II, and K. Kawamura, "Toward Socially Intelligent Service Robots," Applied Artificial Intelligence, An International Journal, vol. 12, 1998.
 A. Srikaew, M.E. Cambron, S. Northrup, R.A. Peters II, D.M. Wilkes, and K. Kawamura, "Humanoid Drawing Robot," IASTED International Conference on Robotics and Manufacturing, Banff, Canada, July 1998.
 S. Charoenseang, A. Srikaew, D.M. Wilkes, and K. Kawamura, "Integrating Visual Feedback and Force Feedback in 3-D Collision Avoidance for a Dual-Arm Humanoid Robot," Proceedings of the 1998 International Conference on Systems, Man and Cybernetics, California, USA, October 1998.

Recent Publications 1997
 D.M. Wilkes, R.T. Pack, W.A. Alford, and K. Kawamura, "HuDL, A Design Philosophy for Socially Intelligent Service Robots," working notes of the AAAI Symposium on Socially Intelligent Agents, November 1997.
 R.T. Pack, D.M. Wilkes, and K. Kawamura, "A Software Architecture for Integrated Service Robot Development," 1997 IEEE Conf. on Systems, Man, and Cybernetics, Orlando, September 1997.
 A. Alford, D.M. Wilkes, K. Kawamura, and R.T. Pack, "Flexible Human Integration for Holonic Manufacturing Systems," Proceedings of the World Manufacturing Congress, New Zealand, November 1997.
 R.T. Pack, D.M. Wilkes, G. Biswas, and K. Kawamura, "Intelligent Machine Architecture for Object-Based System Integration," Proceedings of the 1997 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Waseda University, Japan, June 1997.

Human Agent
 Motivated by the desire for natural human-robot interaction
 Encapsulates what the robot knows about the human:
 Identity
 Location
 Intentions

Human Agent Internal Model
 Model of the current human: a description of the human currently interacting with the robot
 Human activity: a description of what the user is doing
 User's request: the nature of the interaction and the task the user requests of the robot

Model of the Human
 Name: Stan
 Emotion: Happy
 Command: Watch me
 Face Location: (x, y, z) = (122, 34, 205)
 Hand Locations: (x, y, z) = (85, -10, 175); (x, y, z) = (175, 56, 186)

Model of the Human
 Name: Stan
 Emotion: Sad
 Command: Watch me
 Face Location: (x, y, z) = (122, 34, 205)
 Hand Locations: (x, y, z) = (85, -10, 175); (x, y, z) = (175, 56, 186)
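The model on these two slides can be captured as a simple data structure. This is only a sketch: the field names are taken from the slide labels, and the class itself is an illustration, not the IMA implementation.

```python
from dataclasses import dataclass

@dataclass
class HumanModel:
    """Snapshot of what the robot currently knows about one human."""
    name: str             # e.g. "Stan"
    emotion: str          # e.g. "Happy" or "Sad"
    command: str          # most recent request, e.g. "Watch me"
    face_location: tuple  # (x, y, z) in the robot's workspace frame
    hand_locations: list  # one (x, y, z) tuple per tracked hand

# The "Happy" slide as an instance:
stan = HumanModel(
    name="Stan",
    emotion="Happy",
    command="Watch me",
    face_location=(122, 34, 205),
    hand_locations=[(85, -10, 175), (175, 56, 186)],
)
```

A structure like this lets the Human Agent hand one coherent object to other agents instead of scattering per-sensor readings.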

Human Agent Modules
 Detection module
 Monitoring module
 Identification module
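One plausible way to compose the three modules is as a detect-then-track-then-identify pipeline. The interfaces below are assumptions for illustration; the actual IMA agent interfaces are not described on these slides.

```python
class HumanAgent:
    """Hypothetical pipeline over the three Human Agent modules."""

    def __init__(self, detector, monitor, identifier):
        self.detector = detector      # detection module
        self.monitor = monitor        # monitoring module
        self.identifier = identifier  # identification module

    def update(self, observation):
        """Run one perception cycle: detect, then track, then identify."""
        if not self.detector(observation):
            return None                    # no human detected
        track = self.monitor(observation)  # keep the human localized
        return self.identifier(track)      # a name, or None if unknown

# Toy stand-ins for the real modules:
agent = HumanAgent(
    detector=lambda obs: "face" in obs,
    monitor=lambda obs: obs["face"],
    identifier=lambda face: "Stan",
)
who = agent.update({"face": (122, 34, 205)})
```

The point of the sketch is the ordering: identification only runs once detection and monitoring have produced something to identify.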

Detection Module  Allows the robot to detect human presence  Uses multiple sensor modalities  IR motion sensor array  Speech recognition  Skin-color segmentation  Face detection
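Combining several modalities makes detection robust to any single sensor failing. A minimal fusion sketch, assuming per-modality confidence scores and illustrative weights (the slides do not specify how the IRL actually fuses these cues):

```python
def detect_human(readings, weights=None, threshold=0.5):
    """Fuse per-modality confidences into a single presence decision.

    readings maps a modality name to a confidence in [0, 1].
    The weights and threshold here are illustrative assumptions.
    """
    if weights is None:
        weights = {"ir_motion": 0.2, "speech": 0.3,
                   "skin_color": 0.2, "face": 0.3}
    score = sum(w * readings.get(m, 0.0) for m, w in weights.items())
    return score >= threshold

# Strong face and speech evidence alone can cross the threshold:
present = detect_human({"face": 0.9, "speech": 0.8})
```

A weighted sum is the simplest choice; a real system might instead require agreement between at least two modalities before declaring a detection.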

Monitoring Module  Keeps track of the detected human  Localization and tracking algorithms  Face tracking  Finger pointing gesture  Basic speech interface
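Once a human is detected, the monitoring module must keep a smooth estimate of their position despite noisy per-frame measurements. The slides do not name the IRL's tracking algorithm, so the following is just an illustrative exponential smoother over a 3-D face position:

```python
def make_tracker(alpha=0.6):
    """Return an update function that smooths (x, y, z) measurements.

    alpha is the weight on the newest measurement (an assumed
    parameter); higher values follow the sensor more closely.
    """
    state = {"pos": None}

    def update(measurement):
        if state["pos"] is None:
            state["pos"] = measurement  # initialize on first sighting
        else:
            state["pos"] = tuple(
                alpha * m + (1 - alpha) * p
                for m, p in zip(measurement, state["pos"]))
        return state["pos"]

    return update

track = make_tracker()
track((100.0, 30.0, 200.0))          # first sighting seeds the estimate
pos = track((110.0, 34.0, 204.0))    # later readings are smoothed in
```

Real face trackers typically add prediction (e.g. a Kalman filter) so the estimate survives brief occlusions.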

Identification Module (under development)
 Attempts to identify the detected human by comparing the stored model with the current model:
 Voice pattern comparison
 Name
 Height
 Clothing color
 Detects changes in the dynamic model:
 Clothing color
 Height
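One way to compare a stored model against the current one is a weighted attribute match. The attribute names, weights, and threshold below are assumptions for illustration; note that clothing color gets a low weight precisely because, as the slide says, it can change between encounters.

```python
def identify(stored, current, weights=None, threshold=0.7):
    """Score how well the current observation matches a stored model.

    Returns (is_match, score). All weights are illustrative.
    """
    if weights is None:
        weights = {"voice": 0.4, "name": 0.3,
                   "height": 0.15, "clothing_color": 0.15}
    score = sum(w for attr, w in weights.items()
                if stored.get(attr) == current.get(attr))
    return score >= threshold, score

stored  = {"voice": "v-stan", "name": "Stan",
           "height": 180, "clothing_color": "blue"}
current = {"voice": "v-stan", "name": "Stan",
           "height": 180, "clothing_color": "red"}  # changed shirt
match, score = identify(stored, current)
```

Here the identity still matches despite the clothing change; a mismatch on a dynamic attribute could additionally trigger an update of the stored model.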