Nicole looks at faces
Development of a visual robot interface to interpret facial expressions

NICOLE: Future robots will share their environment with humans. They will need two basic skills:
1. To interact with people
2. To navigate in the world
To study this, we started the development of the Guide Robot Nicole. Her task will be to guide visitors through the ISR, so her means of interaction will be:
A. Giving information to the visitors
B. Reacting to gestures performed by known people

Project: Past work implemented the detection, recognition, and tracking of hands and faces. The face tracking module should now be extended using a "Morphable Model for the Synthesis of 3D Faces". The approach will also involve the 3D animation software Maya. Illustrative sketches of both building blocks follow at the end of this transcript.

Contact: Supervisor of the project: Jorge Dias (e-mail: jorge@isr.uc.pt, tel. internal 1117 or external 239-796219), or e-mail jrett@isr.uc.pt (tel. internal 1007 or external 239-796204).
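The transcript does not say how the existing detection and tracking modules are implemented. As a minimal, hypothetical sketch (the cascade file and camera index are assumptions, not project details), a per-frame face detector of the kind such a module could build on looks like this in Python with OpenCV:

    import cv2

    # Load a pretrained Haar cascade for frontal faces (bundled with OpenCV).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture(0)  # camera index 0 is an assumption
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Each detection is an (x, y, w, h) bounding box in pixel coordinates.
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("face tracking sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

A real tracking module would associate detections across frames (e.g. with a Kalman filter) rather than re-detect each frame independently, but per-frame detection is the common starting point.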
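The cited "Morphable Model for the Synthesis of 3D Faces" (Blanz and Vetter, 1999) represents a new face as the average of a database of registered 3D face scans plus a linear combination of principal shape and texture components; fitting the model to images amounts to estimating the coefficients:

    S = \bar{S} + \sum_{i=1}^{m} \alpha_i \, s_i,
    \qquad
    T = \bar{T} + \sum_{i=1}^{m} \beta_i \, t_i

Here \bar{S} and \bar{T} are the mean shape and texture vectors, s_i and t_i the principal components, and \alpha_i, \beta_i the face-specific coefficients that the extended face tracking module would estimate for each observed face.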