Integration of Machine Learning, Quantum Networks and software-hardware methodology in humanoid robots. Projects 2005: Interactive Robot Theatre as a future toy.


Integration of Machine Learning, Quantum Networks and software-hardware methodology in humanoid robots. Projects 2005: Interactive Robot Theatre as a future toy

Toys are a very serious business

Talking Robots. Many talking toys exist, but they are still very primitive. Actors for robot theatre; agents for advertisement, education and entertainment. Designing inexpensive natural-size humanoid caricature and realistic robot heads. We concentrate on Machine Learning techniques used to teach robots behaviors, natural language dialogs and facial gestures. Dog.com from Japan. Work in progress.

Robot with a Personality? Future robots will interact closely with non-sophisticated users, children and the elderly, so the question arises: what should they look like? If a robot has a human face, what kind of face? Handsome or average, realistic or simplified, normal size or enlarged? The most famous example of a robot head is Kismet from MIT. Why is Kismet so successful? We believe that a robot that interacts with humans should have some kind of “personality”, and Kismet so far is the only robot with “personality”.

The robot face should be friendly and funny. The Muppets of Jim Henson are hard-to-match examples of puppet artistry and animation perfection. We are interested in the robot’s personality as expressed by its: –behavior, –facial gestures, –emotions, –learned speech patterns.

Behavior, Dialog and Learning. Robot activity is a mapping of the sensed environment and internal states to behaviors and new internal states (emotions, energy levels, etc.). Our goal is to uniformly integrate verbal and non-verbal robot behaviors. Words communicate only about 35% of the information transmitted from a sender to a receiver in human-to-human communication; the remaining information is carried by para-language. Emotions, thoughts, decisions and intentions of a speaker can be recognized earlier than they are verbalized.
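
A minimal Python sketch of this mapping (the project's own software is in Visual C++, Visual Basic, Lisp and Prolog); all names, states and the example rule are illustrative assumptions, not the actual system:

```python
# Sketch of robot activity as a mapping
# (sensed environment, internal state) -> (behavior, new internal state).
from dataclasses import dataclass

@dataclass
class InternalState:
    emotion: str = "neutral"   # e.g. "happy", "ironic", "unhappy"
    energy: float = 1.0        # decays as the robot acts

@dataclass
class Behavior:
    speech: str                # verbal channel
    gesture: str               # non-verbal channel (facial/neck gesture)

def behave(sensed: dict, state: InternalState) -> tuple[Behavior, InternalState]:
    """One step of the robot's activity loop."""
    if sensed.get("face_detected") and state.energy > 0.2:
        behavior = Behavior(speech="Hello, what can I do for you?",
                            gesture="smile")
        new_state = InternalState(emotion="happy", energy=state.energy - 0.1)
    else:
        behavior = Behavior(speech="", gesture="blink")   # idle behavior
        new_state = InternalState(emotion=state.emotion,
                                  energy=min(1.0, state.energy + 0.05))
    return behavior, new_state
```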

Morita’s Theory

Robot Head Construction, 1999. Furby head with new control. Jonas. We animate various kinds of humanoid heads with 4 to 20 DOF, looking for comical and entertaining value.

Mister Butcher: 4-degree-of-freedom neck, latex skin from Hollywood.

Robot Head Construction, 2000. Skeleton. Alien. We use inexpensive servos from Hitec and Futaba, plastic, plywood and aluminum. The robots are either PC-interfaced, use simple micro-controllers such as the Basic Stamp, or are radio controlled from a PC or by the user.

Adam. Marvin the Crazy Robot. Technical construction, 2001. Details.

Virginia Woolf: heads equipped with microphones, USB cameras, sonars and CdS light sensors, 2001.

Max. Image processing and pattern recognition use software developed at PSU, CMU and Intel (public-domain software available on the WWW). Software is in Visual C++, Visual Basic, Lisp and Prolog. BUG (Big Ugly Robot), 2002.

Visual Feedback and Learning based on Constructive Induction 2002

Professor Perky. One-dollar latex skin from China. We compared several commercial speech systems from Microsoft, Sensory and Fonix. Based on experience in highly noisy environments and with a variety of speakers, we selected Fonix for both ASR and TTS for the Professor Perky and Maria robots. We use a microphone array from Andrea Electronics. Professor Perky with automatic speech recognition (ASR) and text-to-speech (TTS) capabilities, 2002, Japan.

Maria, 2002/ DOF

Construction details of Maria: location of controlling rods, location of head servos, location of remote servos, custom-designed skin, skull.

Animation of eyes and eyelids

Software/Hardware Architecture. Network: 10 processors, ultimately 100 processors. Robotics processors. ACS 16 speech cards on Intel grant. More cameras. Tracking in all robots. Robotic languages – Alice and Cyc-like technologies.

Cynthia, 2004, June

Currently the hands are not moveable. We have a separate hand design project.

HAHOE KAIST ROBOT THEATRE, KOREA, SUMMER 2004. Sonbi, the Confucian Scholar; Paekchong, the bad butcher.

Yangban the Aristocrat and Pune, his concubine. The Narrator.

We base all our robots on inexpensive radio-controlled servo technology.

We are familiar with latex and polyester technologies for faces

New Silicone Skins

What to emphasize in our cooperation? We want to develop a general methodology for prototyping software/hardware systems for interactive robots that work in human environments. Safety, not hitting humans. Image processing, voice recognition, speech synthesis, expressing emotions, recognizing human emotions. Machine Learning technologies.

Probabilistic State Machines to describe emotions. States: Happy, Ironic, Unhappy. Example transitions (input / response): “you are beautiful” / ”Thanks for a compliment” (P=1); “you are blonde!” / ”I am not an idiot” (P=0.3); “you are blonde!” / ”Do you suggest I am an idiot?” (P=0.7).
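
A hedged Python sketch of such a probabilistic state machine, reconstructed from the transitions on this slide; the source and target states attached to each transition are assumptions, since the original diagram does not survive:

```python
# Probabilistic emotional state machine: a stimulus triggers one of several
# responses according to the transition probabilities and may change state.
import random

# (state, stimulus) -> list of (probability, response, next_state)
TRANSITIONS = {
    ("happy", "you are beautiful"): [
        (1.0, "Thanks for a compliment", "happy"),
    ],
    ("happy", "you are blonde!"): [
        (0.3, "I am not an idiot", "unhappy"),
        (0.7, "Do you suggest I am an idiot?", "ironic"),
    ],
}

def react(state: str, stimulus: str) -> tuple[str, str]:
    """Pick a response at random according to the transition probabilities."""
    options = TRANSITIONS.get((state, stimulus))
    if options is None:
        return "...", state                      # no rule: stay silent
    r, cumulative = random.random(), 0.0
    for prob, response, next_state in options:
        cumulative += prob
        if r <= cumulative:
            return response, next_state
    return options[-1][1], options[-1][2]

print(react("happy", "you are blonde!"))
```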

Facial Behaviors of Maria. Maria asks: “Do I look younger than twenty-three?” If the response is “yes”, Maria smiles; if “no”, Maria frowns.

Probabilistic Grammars for performances. A performance sentence is expanded through the slots Who?, What? and Where?; each slot offers probabilistic alternatives that combine speech with gesture, for example: Speak ”Professor Perky” and blink eyes twice, or Speak ”Doctor Lee”; Speak “In the classroom” and shake head, or Speak “in some location” and smile broadly; Speak “Was drinking wine”, or Speak “Was singing and dancing”. Alternatives carry probabilities such as P=0.1, P=0.3 and P=0.5.
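
A short Python sketch of how such a probabilistic grammar could generate a performance line; the probabilities and the pairing of gestures with phrases are illustrative assumptions, since the original diagram does not survive:

```python
# Probabilistic performance grammar: each slot (Who/Where/What) is expanded
# into a (speech, gesture) pair drawn according to the slot's probabilities.
import random

GRAMMAR = {
    "who":   [(0.5, ("Professor Perky", "blink eyes twice")),
              (0.5, ("Doctor Lee", "nod"))],
    "where": [(0.3, ("in the classroom", "shake head")),
              (0.7, ("in some location", "smile broadly"))],
    "what":  [(0.5, ("was drinking wine", "raise eyebrows")),
              (0.5, ("was singing and dancing", "tilt head"))],
}

def expand(slot: str) -> tuple[str, str]:
    """Choose one (speech, gesture) alternative for a slot."""
    alternatives = GRAMMAR[slot]
    weights = [p for p, _ in alternatives]
    return random.choices([a for _, a in alternatives], weights=weights)[0]

def perform() -> list[tuple[str, str]]:
    """Generate one performance line: Who? Where? What?"""
    return [expand(slot) for slot in ("who", "where", "what")]

for speech, gesture in perform():
    print(f'Speak "{speech}" while performing: {gesture}')
```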

Human-controlled modes of dialog/interaction. Modes: Robot asks, Human teaches, Human commands, Human asks, Robot performs. Mode switches are triggered by phrases such as “Hello Maria”, “Thanks, I have a question”, “Thanks, I have a lesson”, “Thanks, I have a command”, “Question”, “Lesson finished”, “Questioning finished”, “Command finished”, “Stop performance”.
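
A minimal Python sketch of this mode switching; the exact transition structure is an assumption reconstructed from the control phrases listed above:

```python
# Key phrases move the dialog between human-controlled modes; any other
# utterance is handled within the current mode.
MODE_TRIGGERS = {
    "hello maria":               "robot_asks",
    "thanks, i have a question": "human_asks",
    "thanks, i have a lesson":   "human_teaches",
    "thanks, i have a command":  "human_commands",
    "questioning finished":      "robot_asks",
    "lesson finished":           "robot_asks",
    "command finished":          "robot_asks",
    "stop performance":          "robot_asks",
}

def next_mode(current_mode: str, utterance: str) -> str:
    """Switch mode if the utterance is a control phrase, else stay."""
    return MODE_TRIGGERS.get(utterance.strip().lower(), current_mode)

mode = "robot_asks"
for said in ["Hello Maria", "Thanks, I have a lesson", "Lesson finished"]:
    mode = next_mode(mode, said)
    print(said, "->", mode)
```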

Robot-Receptionist Initiated Conversation Robot What can I do for you? Human Robot asks This represents operation mode

Robot-Receptionist Initiated Conversation Robot What can I do for you? Human I would like to order a table for two Robot asks

Robot-Receptionist Initiated Conversation Robot Smoking or non- smoking? Human Robot asks

Robot-Receptionist Initiated Conversation Robot Smoking or non- smoking? Human I do not understand Robot asks

Robot-Receptionist Initiated Conversation Robot Do you want a table in a smoking or non-smoking section of the restaurant? Non-smoking section is near the terrace. Human Robot asks

Robot-Receptionist Initiated Conversation Robot Do you want a table in a smoking or non-smoking section of the restaurant? Non-smoking section is near the terrace. Human A table near the terrace, please Robot asks

Human-Initiated Conversation Robot Human Hello Maria Robot asks initialization

Human-Initiated Conversation Robot Human Hello Maria What can I do for you? Robot asks

Human-Asking Robot Human Question Human asks Question Robot asks

Human-Asking Robot Human Question Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks Lee wrote book “Flowers”.

Human-Asking Robot Human This is not true. Human asks Lee wrote book “Flowers”.

Human-Teaching Robot Human Questioning finished Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Human ends questioning

Human-Teaching Robot Human Questioning finished Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Robot enters asking mode What can I do for you?

Human-Teaching Robot Human Thanks, I have a lesson Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Human starts teaching What can I do for you?

Human-Teaching Robot Human Thanks, I have a lesson Yes Human teaches

Human-Teaching Robot Human I give you question- answer pattern Yes Human teaches

Human-Teaching Robot Human Question pattern: What book Smith wrote? Yes Human teaches

Robot Human Answer pattern: Smith wrote book “Automata Theory” Yes Human teaches Human-Teaching

Human-Teaching Robot Human Checking question: What book wrote Smith? Yes Human teaches

Human-Teaching Robot Human Checking question: What book wrote Smith? Smith wrote book “Automata Theory” Human teaches

Human-Teaching Robot Human I give you question- answer pattern Yes Human teaches

Human-Teaching Robot Human Question pattern: Where is room of Lee? Yes Human teaches

Human-Teaching Robot Human Answer pattern: Lee is in room 332 Yes Human teaches

Human-Checking what robot learned Robot Human Lesson finished Human asks Question Robot asks Human teaches “Lesson finished”

Human-Checking what robot learned Robot Human Lesson finished Human asks Question Robot asks Human teaches “Lesson finished” What can I do for you?

Human-Checking what robot learned Robot Human Question Human asks Question Robot asks Human teaches “Lesson finished” What can I do for you?

Human-Asking Robot Human Question Human asks Question Robot asks Human teaches “Lesson finished” Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks Lee wrote book “Automata Theory” Observe that the robot found a similarity between Smith and Lee and generalized (incorrectly).
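
A hedged Python sketch of the teach-then-guess behavior shown in the dialog above: taught question-answer patterns are stored, and when no exact match exists the robot may answer by analogy with the most similar stored question (which is how “Lee” receives Smith's, incorrect, answer). The similarity measure is an illustrative assumption:

```python
# Question-answer pattern learning with guessing by analogy.
from difflib import SequenceMatcher

qa_patterns: dict[str, str] = {}

def teach(question: str, answer_pattern: str) -> None:
    qa_patterns[question.lower()] = answer_pattern

def answer(question: str, guess: bool = False) -> str:
    q = question.lower()
    if q in qa_patterns:
        return qa_patterns[q]
    if guess and qa_patterns:
        # Analogy: reuse the answer of the most similar known question.
        best = max(qa_patterns, key=lambda known:
                   SequenceMatcher(None, q, known).ratio())
        return qa_patterns[best]
    return "I have no sure information."

teach("What book wrote Smith?", 'Smith wrote book "Automata Theory"')
print(answer("What book wrote Lee?"))              # no sure information
print(answer("What book wrote Lee?", guess=True))  # generalizes (incorrectly)
```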

Behavior, Dialog and Learning. The dialog/behavior has the following components: –(1) Eliza-like natural language dialogs based on pattern matching and limited parsing. Commercial products like Memoni, Dog.Com, Heart, Alice, and Doctor all use this technology, very successfully – for instance, the Alice program won the 2001 Loebner Prize (Turing-test) competition. –This is the “conversational” part of the robot brain, based on pattern-matching, parsing and blackboard principles. –It is also a kind of “operating system” of the robot, which supervises other subroutines.

Behavior, Dialog and Learning. (2) Subroutines with a logical database and natural language parsing (CHAT). –This is the logical part of the brain, used to find connections between places, timings and all kinds of logical and relational reasoning, such as answering questions about Japanese geography. (3) Use of generalization and analogy in dialog on many levels. –Random and intentional linking of spoken language, sound effects and facial gestures. –Use of the Constructive Induction approach to help generalization, analogical reasoning and probabilistic generation in verbal and non-verbal dialog, such as learning when to smile or turn the head away from the partner.

Behavior, Dialog and Learning. (4) Model of the robot, model of the user, scenario of the situation and history of the dialog, all used in the conversation. (5) Use of word spotting in speech recognition rather than single-word or continuous speech recognition. (6) Avoidance of “I do not know” and “I do not understand” answers from the robot. –Our robot will always have something to say; in the worst case it will be over-generalized, based on invalid analogies, or even nonsensical and random.
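
A hedged Python sketch combining components (1), (5) and (6): Eliza-like pattern matching, word spotting on the recognized text, and a fallback so the robot never answers “I do not know”. The patterns and replies are illustrative assumptions, not the project's actual rule base:

```python
# Eliza-style word-spotting dialog with a never-give-up fallback.
import random
import re

PATTERNS = [
    (re.compile(r"\bhello\b", re.I),       "Hello, what can I do for you?"),
    (re.compile(r"\byou are (\w+)", re.I), "Why do you say I am {0}?"),
    (re.compile(r"\btable\b", re.I),       "Smoking or non-smoking?"),
]

FALLBACKS = ["Tell me more about that.",
             "That is interesting. Why do you think so?",
             "Let us talk about robots instead."]

def reply(utterance: str) -> str:
    """Word-spot the utterance against the patterns; never give up."""
    for pattern, template in PATTERNS:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return random.choice(FALLBACKS)   # avoidance of "I do not know"

print(reply("you are blonde"))
print(reply("something completely unexpected"))
```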

Recent Work. Multi-brain: sub-brains communicate through natural language: –Devil, angel and myself. –Egoist and moralist. CAM – Content-Addressable Memory: Cypress-funded project in 2005.

Generalization of the Ashenhurst-Curtis decomposition model

Tables of this kind are known from Rough Sets, Decision Trees and other Data Mining methods.

Decomposition is hierarchical. At every step many decompositions exist.

Constructive Induction: Technical Details. U. Wong and M. Perkowski, “A New Approach to Robot’s Imitation of Behaviors by Decomposition of Multiple-Valued Relations,” Proc. 5th Intern. Workshop on Boolean Problems, Freiberg, Germany, Sept. 2002. A. Mishchenko, B. Steinbach and M. Perkowski, “An Algorithm for Bi-Decomposition of Logic Functions,” Proc. DAC 2001, Las Vegas, June 18-22, 2001. A. Mishchenko, B. Steinbach and M. Perkowski, “Bi-Decomposition of Multi-Valued Relations,” Proc. 10th IWLS, Granlibakken, CA, June 12-15, 2001, IEEE Computer Society and ACM SIGDA.

Constructive Induction. Decision Trees, Ashenhurst/Curtis hierarchical decomposition and Bi-Decomposition algorithms are used in our software. These methods form our subset of the MVSIS system developed under Prof. Robert Brayton at the University of California at Berkeley [2]. –The entire MVSIS system can also be used. The system generates the robot’s behaviors (C program code) from examples given by the users. This method is normally used for embedded system design, but we use it specifically for robot interaction.
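
An illustrative Python sketch of the idea of “generating robot behaviors (C code) from user-given examples”: a toy learner simply memorizes example mappings from binary features to a behavior and emits a C-style if/else chain. The real system uses decision trees and Ashenhurst-Curtis/bi-decomposition (MVSIS); this only shows the interface, and all names are assumptions:

```python
# Turn (feature vector -> behavior) examples into a C-like behavior function.
Examples = list[tuple[dict[str, int], str]]

def generate_c_behavior(examples: Examples, fname: str = "behavior") -> str:
    lines = [f"const char* {fname}(int face, int loud_voice) {{"]
    for features, behavior in examples:
        condition = " && ".join(f"{k} == {v}" for k, v in features.items())
        lines.append(f'    if ({condition}) return "{behavior}";')
    lines.append('    return "idle";   /* default behavior */')
    lines.append("}")
    return "\n".join(lines)

training = [
    ({"face": 1, "loud_voice": 0}, "smile"),
    ({"face": 1, "loud_voice": 1}, "turn_head_away"),
    ({"face": 0, "loud_voice": 1}, "look_around"),
]
print(generate_c_behavior(training))
```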

Braitenberg Vehicles

Example 1: Simulation of Quantum Circuits (circuit diagram: gates V and V†, with V·V = U, applied to the basis states |0⟩, |1⟩ and a general state |x⟩; one of the outputs shown is V|x⟩).
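
A minimal Python/NumPy sketch of the gate identity suggested by this slide: with V chosen as a “square root” of a unitary U, applying V twice reproduces U, and V followed by V† is the identity. The choice of U as the NOT (Pauli-X) gate is an assumption for illustration only:

```python
# Verify V.V = U and V.V-dagger = I for V = sqrt(NOT), and apply V to |0>.
import numpy as np

U = np.array([[0, 1],
              [1, 0]], dtype=complex)               # Pauli-X (NOT) gate
V = 0.5 * np.array([[1 + 1j, 1 - 1j],
                    [1 - 1j, 1 + 1j]])              # square root of NOT

ket0 = np.array([1, 0], dtype=complex)              # basis state |0>

print(np.allclose(V @ V, U))                        # V.V = U
print(np.allclose(V @ V.conj().T, np.eye(2)))       # V.V-dagger = I
print(np.round(V @ ket0, 3))                        # the state V|0>
```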

Quantum Portland Faces

Conclusion. What did we learn? (1) The more degrees of freedom, the better the animation realism. (2) Synchronization of spoken text and head (especially jaw) movements is important but difficult. (3) Gestures and speech intonation of the head should be slightly exaggerated.

Conclusion. What did we learn? (cont.) (4) The sound should be loud, to cover noises coming from motors and gears and for a better theatrical effect. (5) The noise of servos can also be reduced by appropriate animation and synchronization. (6) The best available ASR and TTS packages should be applied, especially those that use word spotting. (7) Use puppet theatre experience.

Conclusion. What did we learn? (cont.) (8) Because learning is too slow, improved parameterized learning methods, also based on constructive induction, will be developed. (9) Open question: funny versus beautiful. (10) Either high-quality voice recognition from a headset or low quality in a noisy room – YOU CANNOT HAVE BOTH WITH CURRENT ASR TOOLS. The bi-decomposer of relations and other useful software used in this project can be downloaded from cad.eecs.berkeley.edu/mvsis/.

This is the most advanced humanoid robot theatre project outside of Japan. Open to international collaboration.

Can we do this in Poland?

Yes – engineers from the Technical University of Gliwice already produce a commercially available hexapod. International Intel Science Talent Competition and PDXBOT 2004.

Additional Slides with Background

Robot Toy Market - Robosapiens toy, poses in front of

Globalization. Globalization implies that images, technologies and messages are everywhere, but at the same time disconnected from a particular social structure or context. (Alain Touraine) The need of a constantly expanding market for its products chases the bourgeoisie over the whole surface of the globe. It must nestle everywhere, settle everywhere, establish connections everywhere. (Marx & Engels, 1848)

India and China – what’s different? They started at the same level of wealth and exports in 1980. China today exports $184 Bn vs $34 Bn for India. China’s export industry today employs over 50 million people (vs 2 m s/w in 2008, and 20 m in the entire organized sector in India today!). China’s export industry consists of toys (> 60% of the world market), bicycles (10 m to the US alone last year), and textiles (a vision of having a share of > 50% of the world market by 2008).

Learning from Korea and Singapore. The importance of Learning: –To manufacture efficiently. –To open the door to foreign technology and investment. –To have sufficient pride in one’s own ability to open the door and go out and build one’s own proprietary identity. To invest in fundamentals like Education, to have the right cultural prerequisites for catching up. To have pragmatism rule, not ideology.

Samsung: 1979 Started making microwaves. 1980 First export order (foreign brand). 1983 OEM contracts with General Electric. 1985 All GE microwaves made by Samsung. 1987 All GE microwaves designed by Samsung. 1990 The world’s largest microwave manufacturer – without its own brand. 1990 Launch of own brand outside Korea. 2000 Samsung microwaves #1 worldwide, twelve factories in twelve countries (including India, China and the US). 2003 The largest electronics company in the world.

How did Samsung do it? By learning from GE and other buyers. By working very hard – 70-hour weeks, 10 days of holiday (no CLs in Korea). By being very productive – 9 microwaves per person per day vs 4 at GE. By meeting every delivery on time, even if it meant working 7-day weeks for six months. By developing new models so well that it got GE to stop developing their own.

Ashenhurst Functional Decomposition. Evaluates the data function and attempts to decompose it into simpler functions: F(X) = H(G(B), A), where X = A ∪ B, B is the bound set and A is the free set. If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.

A Standard Map of function ‘z’, with the bound set labelling the columns (c) and the free set labelling the rows (a, b). Columns 0 and 1 and columns 0 and 2 are compatible; column compatibility = 2. Explain the concept of generalized don’t-cares.

NEW: Decomposition of Multi-Valued Relations. F(X) = H(G(B), A), X = A ∪ B, where F is now a relation. If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.

Forming a CCG from a K-Map. In the map of z (bound set on the columns, free set a, b on the rows), columns 0 and 1 and columns 0 and 2 are compatible; column compatibility index = 2. The Column Compatibility Graph has nodes C0, C1, C2 with edges between compatible columns.

Forming a CIG from a K-Map. In the same map of z, columns 1 and 2 are incompatible; the chromatic number of the Column Incompatibility Graph (nodes C0, C1, C2) is 2.
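
A hedged Python sketch of the column-compatibility step of Ashenhurst-Curtis decomposition for completely specified functions: group the truth table into columns indexed by the bound set and count the distinct column patterns (the column multiplicity). The example function is an assumption, not the slide’s ‘z’, whose map does not survive:

```python
# Column multiplicity for a candidate bound/free-set split.
from itertools import product

def column_multiplicity(f, free_vars: int, bound_vars: int) -> int:
    """Number of distinct column patterns over the bound set."""
    columns = set()
    for b in product((0, 1), repeat=bound_vars):     # one column per bound-set value
        pattern = tuple(f(*a, *b) for a in product((0, 1), repeat=free_vars))
        columns.add(pattern)
    return len(columns)

# Example: f(a, b, c) with free set {a, b} and bound set {c}.
f = lambda a, b, c: (a & b) ^ c
mu = column_multiplicity(f, free_vars=2, bound_vars=1)
print("column multiplicity:", mu)
# A disjoint decomposition F = H(G(bound), free) needs ceil(log2(mu))
# intermediate signals G; mu <= 2 means a single-output G suffices.
```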

Constructive Induction. A unified internal language is used to describe behaviors, in which text generation and facial gestures are unified. This language is for learned behaviors. Expressions (programs) in this language are either created by humans or induced automatically from examples given by trainers.

Is it worthwhile to build humanoid robots? Man’s design versus robot’s design. The humanoid robot is versatile and adaptive; it takes its form from a human, a design well verified by Nature. Complete isomorphism of a humanoid robot with a human is very difficult to achieve (walking) and not even entirely desired. All we need is to adapt the robot maximally to the needs of humans – the elderly, the disabled, children, entertainment. Replicating human motor or sensor functionality is based on mechanistic methodologies, but adaptations and upgrades are possible – for instance brain-wave control or wheels. Is this cheating?

Is it worthwhile to build humanoid robots? Can building a mechanistic, digital, synthetic version of man be anything less than a cheat when man is neither mechanistic, digital nor synthetic? If the reference for the “ultimate” robot is man, then there is little confusion about one’s aim to replace man with a machine.

Man & Machine. The main reason to build machines in our likeness is to facilitate their integration into our social space: –SOCIAL ROBOTICS. A robot should do many things that we do, like climbing stairs, but not necessarily in the way we do them – the airplane and bird analogy. Humanoid robots/social robots should make our lives easier.

The Social Robot “developing a brain”: –Cognitive abilities as developed from classical AI to modern cognitive ideas (neural networks, multi-agent systems, genetic algorithms…) “giving the brain a body”: –Physical embodiment, as indicated by Brooks [Bro86], Steels [Ste94], etc. “a world of bodies”: –Social embodiment A Social Robot is: –A physical entity embodied in a complex, dynamic, and social environment sufficiently empowered to behave in a manner conducive to its own goals and those of its community.

Anthropomorphism. Social interaction involves an adaptation on both sides to rationalise each other’s actions, and the interpretation of the other’s actions based on one’s own references. Projective Intelligence: the observer ascribes a degree of “intelligence” to the system through their rationalisation of its actions.

Anthropomorphism & The Social Robot Objectives –Augment human-robot sociality –Understand and rationalize robot behavior Embrace anthropomorphism BUT - How does the robot not become trapped by behavioral expectations? REQUIRED: A balance between anthropomorphic features and behaviors leading to the robot’s own identity

Finding the Balance Movement –Behaviour (afraid of the light) –Facial Action Coding System (iconic vs. synthetic) –Perlin Noise (alive) Form –Physical construction –Degrees of freedom Interaction –Communication (robot-like vs. human voice) –Social cues/timing Autonomy Function & role –machine vs. human capabilities

Emotion Robots Experiments: autonomous mobile robots; emotion through motion; “projective emotion”; anthropomorphism; social behaviors; qualitative and quantitative analysis offered to a wide audience through online web-based experiments.

What is proposed? The Portland State University group has experience in: 1. Physically building stationary humanoid robots. 2. Physically building mobile, autonomous and radio-controlled robots. 3. Image processing, speech recognition, speech synthesis and interactive natural language dialog for robots. 4. Integration of a complete four-robot theatre using a Visual Studio based, multi-processor, network-based system. 5. Machine Learning based on Decomposition of Multiple-Valued Functions and Relations. Proposal: integrate the experiences, theories and software of both groups into a new mobile humanoid robot. 1. The robot will be on a mobile wheeled base and will be controlled by radio from a network of computers. 2. The robot will be humanoid. It will have hands, torso, neck and head, with about 30 degrees of freedom. 3. The computer system will be based on the combination of the PSU Visual Studio multi-processor system and the University of Freiberg UML-based FPGA/processor system. 4. The Machine Learning system developed in the former grant by Christian Lang will be used (standard software), together with the MVSIS and Decision Tree based learning that are used now.

More details related to point 4. The robot learns the following: 1. Perceptions. 2. Behaviors. 3. Input-output mappings. Each of them may require the same approach: 1. Feature vectors are created using other software, such as robot vision or speech recognition. 2. A learning engine such as a neural network, a decision tree or decomposition is used to create the behavior, in the form of software or of hardware downloaded to an FPGA.
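
A hedged Python sketch of this two-stage approach: a feature-vector front end (vision or speech, represented abstractly) feeding a pluggable learning engine whose output is a learned input-to-behavior mapping. All class and method names are illustrative assumptions:

```python
# Stage 1: feature extraction; Stage 2: a pluggable learning engine.
from typing import Callable, Protocol, Sequence

FeatureVector = Sequence[float]

class LearningEngine(Protocol):
    def fit(self, X: list, y: list) -> None: ...
    def predict(self, x: FeatureVector) -> str: ...

class NearestExampleEngine:
    """Stand-in learner; a NN, decision tree or decomposition would go here."""
    def fit(self, X, y):
        self.examples = list(zip(X, y))
    def predict(self, x):
        dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return min(self.examples, key=lambda ex: dist(ex[0], x))[1]

def train_behavior(extract: Callable[[object], FeatureVector],
                   engine: LearningEngine,
                   raw_examples: list):
    """Learn a deployable mapping from raw percepts to behaviors."""
    X = [extract(raw) for raw, _ in raw_examples]
    y = [label for _, label in raw_examples]
    engine.fit(X, y)
    return lambda raw: engine.predict(extract(raw))   # deployable behavior
```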

The perception learning tasks. Robot Vision: 1. Where is a face? (Face detection.) 2. Who is this person? (Face recognition; supervised learning, the person’s name is given in the process.) 3. Age and gender of the person. 4. Hand gestures. 5. Emotions expressed as facial gestures (smile, eye movements, etc.). 6. Objects held by the person. 7. Lip reading for speech recognition. 8. Body language.

The perception learning tasks. Speech recognition: 1. Who is this person? (Voice-based speaker recognition; supervised learning, the person’s name is given in the process.) 2. Isolated-word recognition for word spotting. 3. Sentence recognition. Sensors: 1. Temperature. 2. Touch. 3. Movement.

The behavior learning tasks. Facial and upper-body gestures: 1. Face/neck gesticulation for interactive dialog. 2. Face/neck gesticulation for theatre plays. 3. Face/neck gesticulation for singing/dancing. Hand gestures and manipulation: 1. Hand gesticulation for interactive dialog. 2. Hand gesticulation for theatre plays. 3. Hand gesticulation for singing/dancing.

Learning the perception/behavior mappings. 1. Tracking the human. 2. Full gesticulation as a response to human behavior in dialogs and dancing/singing. 3. Modification of semi-autonomous behaviors such as breathing, eye blinking, mechanical hand withdrawals, and speech acts as a response to the person’s behaviors. 4. Playing games with humans. 5. Body contact with humans, such as safe gesticulation close to a human and hand shaking.