Humanoid-Human Interaction
Presented by KMR ANIK

Humanoid-Human Interaction  The study of human factors related to the tasking and control of humanoid robots  The goal is to make communication with humans efficient, accurate, and convenient  Since humanoids are usually heavy and large, safety must be ensured during interaction  Humanoids should be made suitable for learning and adaptive behavior

Modes of Communication  Sound recognition and response: identifying speech; identifying the direction a sound comes from; detecting emotion in sounds; detecting specific commands in sounds  Visual communication: detecting human/animal faces; acting according to image/video detection; finding emotions; showing an impression according to the input received  Gesture identification: identifying and showing different signs and emotions; head-pose estimation; pointing-gesture detection; performing arts such as dancing and special body movements
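As a toy illustration of "detecting specific commands from sounds", the transcript produced by a speech recognizer can be matched against a small phrase table and mapped to robot actions. This is only a sketch; the phrases and action names below are invented, not any real humanoid's API.

```python
# Hypothetical command spotting on a speech recognizer's transcript.
# Phrase table and action names are invented for illustration.
from typing import Optional

COMMANDS = {
    "come here": "approach_speaker",
    "stop": "halt_motion",
    "look at me": "orient_head_to_speaker",
}

def detect_command(transcript: str) -> Optional[str]:
    """Return the action name for the first known command phrase found
    in the recognized transcript, or None if no command matches."""
    text = transcript.lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action
    return None
```

A real system would work on recognizer lattices with confidence scores rather than plain substring matching, but the idea of mapping detected phrases to discrete robot actions is the same.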

Karlsruhe Humanoid: Analysis of Multimodal Communication (1) Perceptual components: A. Speech recognition B. Sound event classification C. Person localization and tracking D. Face identification E. Pointing-gesture detection F. Head-pose estimation G. Dialogue processing and fusion H. Tight coupling of speech and dialogue processing [Figure: the Karlsruhe humanoid robot] source: Ref [1]
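Components C, E, and F feed a fusion step that must decide, among other things, whether the user is addressing the robot at all. A minimal sketch of such a fusion rule, combining head-pose estimation with speech activity (the threshold below is invented, not taken from Ref [1]):

```python
# Illustrative fusion of two perceptual cues: a user is assumed to
# address the robot when speaking while roughly facing it.
# The 20-degree pan tolerance is an invented example value.

def robot_is_addressed(head_pan_deg: float, speech_active: bool,
                       pan_tolerance: float = 20.0) -> bool:
    """Combine head-pose and speech-activity cues into a single
    addressed/not-addressed decision."""
    facing_robot = abs(head_pan_deg) <= pan_tolerance
    return speech_active and facing_robot
```

Real fusion components weigh noisy confidence scores from each modality instead of hard thresholds, but the principle of gating dialogue processing on combined visual and acoustic evidence is the same.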

Karlsruhe Humanoid: Analysis of Multimodal Communication (2) The figure on the left shows an overview of the perceptual components of the Karlsruhe humanoid. source: Ref [1]

Challenges in Humanoid-Human Interaction  Determining which incoming visual/audio signals are relevant to the current task  Mapping between bodies  Recognizing success and identifying inadequate actions  Identifying chaining between actions  Generalizing tasks source: Ref [2]
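The first challenge can be pictured as a simple gating problem: the robot's current task determines which percept streams are worth processing. A toy sketch, with all task and percept names invented for illustration:

```python
# Toy relevance filter: only percepts listed for the current task are
# passed on for further processing. Task/percept names are invented.

TASK_RELEVANT = {
    "fetch_object": {"pointing_gesture", "object_detection", "speech"},
    "conversation": {"speech", "face_id", "head_pose"},
}

def relevant_percepts(task: str, percepts: list) -> list:
    """Keep only the incoming percepts relevant to the current task."""
    allowed = TASK_RELEVANT.get(task, set())
    return [p for p in percepts if p in allowed]
```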

Technology: Visual and Audio Visual:  From high-resolution cameras to the Kinect  New computer-vision techniques  360-degree movable eyes and head Sound:  Signal-processing software  Better hardware implementations
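On the audio side, a classic signal-processing building block is locating a speaker from the time difference of arrival (TDOA) between two microphones. A minimal sketch under a far-field assumption, where sin(theta) = c * tdoa / d; the microphone spacing and delays below are example values:

```python
# Far-field sound-source localization from the time-difference-of-
# arrival (TDOA) between two microphones: sin(theta) = c * tdoa / d.
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def sound_direction(tdoa_s: float, mic_spacing_m: float) -> float:
    """Estimate the azimuth in degrees (0 = straight ahead) of a sound
    source from the inter-microphone delay, clamping the sine to the
    valid range to absorb measurement noise."""
    s = max(-1.0, min(1.0, SPEED_OF_SOUND * tdoa_s / mic_spacing_m))
    return math.degrees(math.asin(s))
```

In practice the TDOA itself is estimated by cross-correlating the two microphone signals; this sketch only covers the geometry step.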

Why Advancements in Humanoid-Human Interaction?  Social robotics  Humanoid-human collaboration  Breaking the communication barrier between humans and machines  More situational awareness for humanoids  Greater task-handling capability

Human-Humanoid Interaction: ASIMO  Identifies specific people and talks to them accordingly  Identifies speech, tone, and commands  No facial expressions  Shows different signs  Can identify and evaluate tasks  Can greet and shake hands Photo courtesy:

Human-Robot Interaction: Jibo (Social Robotics)  Identifies specific people and talks to them accordingly  Recognizes speech, tone, and commands  Shows facial expressions  Can identify visual objects  Can identify and evaluate tasks  Shows human-like expressions Photo courtesy:

Conclusion  A humanoid robot can be fully utilized only when its interaction with humans is accurate and smooth

References 1. Rainer Stiefelhagen, Hazım Kemal Ekenel, Christian Fügen, Petra Gieselmann, Hartwig Holzapfel, Florian Kraft, Kai Nickel, Michael Voit, and Alex Waibel, "Enabling Multimodal Human-Robot Interaction for the Karlsruhe Humanoid Robot," IEEE Transactions on Robotics, Vol. 23, No. 5, October 2007. 2. Rodney A. Brooks, Cynthia Breazeal, Brian Scassellati, and Una-May O'Reilly, "Technologies for Human/Humanoid Natural Interaction," MIT Artificial Intelligence Laboratory. 3. Cynthia Breazeal, Andrew Brooks, David Chilongo, Jesse Gray, Guy Hoffman, Cory Kidd, Hans Lee, Jeff Lieberman, and Andrea Lockerd, "Working Collaboratively with Humanoid Robots," MIT Media Lab, Robotic Life Group. 4. Hyun S. Yang, Yong-Ho Seo, Yeong-Nam Chae, Il-Woong Jeong, Won-Hyung Kang, and Ju-Ho Lee, "Design and Development of Biped Humanoid Robot, AMI2, for Social Interaction with Humans," AIM Lab, EECS Dept., Korea Advanced Institute of Science and Technology.

Thank you

Questions?