Towards Robot Theatre Marek Perkowski Department of Electrical and Computer Engineering, Portland State University, Portland, Oregon, 97207-0751.

Week 2 Lectures 3 and 4

Humanoid Robots and Robot Toys

Talking Robots. Many talking robots exist, but they are still very primitive. Work with the elderly and disabled; actors for robot theatre; agents for advertisement, education and entertainment. We design inexpensive, natural-size humanoid caricature and realistic robot heads, and concentrate on Machine Learning techniques used to teach robots behaviors, natural language dialogs and facial gestures. (Dog.com from Japan; work in progress.)

Robot with a Personality? Future robots will interact closely with non-sophisticated users, children and the elderly, so the question arises: how should they look? If a robot has a human face, what kind of face should it be? Handsome or average, realistic or simplified, normal size or enlarged? The famous example of a robot head is Kismet from MIT. Why is Kismet so successful? We believe that a robot that will interact with humans should have some kind of “personality”, and Kismet so far is the only robot with a “personality”.

A robot face should be friendly and funny. The Muppets of Jim Henson are hard-to-match examples of puppet artistry and animation perfection. We are interested in a robot’s personality as expressed by its: –behavior, –facial gestures, –emotions, –learned speech patterns.

Behavior, Dialog and Learning. Robot activity is a mapping of the sensed environment and internal states to behaviors and new internal states (emotions, energy levels, etc.). Our goal is to uniformly integrate verbal and non-verbal robot behaviors. Words communicate only about 35% of the information transmitted from a sender to a receiver in human-to-human communication; the remaining information is carried in para-language. Emotions, thoughts, decisions and intentions of a speaker can be recognized earlier than they are verbalized.
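A minimal sketch of this mapping in Python; the percept and emotion names (face_smiling, mood, energy) are illustrative assumptions, not the robot's actual interface. One control cycle maps (sensed environment, internal state) to (behavior, new internal state).

def robot_step(percepts, state):
    """One control cycle: map the sensed environment and the internal
    state to a behavior and an updated internal state."""
    mood = state.get("mood", "neutral")
    energy = state.get("energy", 1.0)

    # Non-verbal cues matter as much as words: react to a detected smile first.
    if percepts.get("face_smiling"):
        behavior, mood = "smile_and_greet", "happy"
    elif percepts.get("speech") == "you are blonde!":
        behavior, mood = "frown", "ironic"
    else:
        behavior = "idle_gaze"

    energy = max(0.0, energy - 0.01)   # acting consumes a little energy
    return behavior, {"mood": mood, "energy": energy}

behavior, state = robot_step({"face_smiling": True}, {"mood": "neutral", "energy": 1.0})
print(behavior, state)   # -> smile_and_greet {'mood': 'happy', 'energy': 0.99}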

Morita’s Theory

Robot Metaphors and Models

Animatronic “Robot” or device: brain → effectors.

Perceiving “Robot”: sensors → brain.

Reactive Robot is the simplest behavioral robot: sensors → brain → effectors, where the brain is a mapping. This is the simplest robot that satisfies the definition of a robot.

Reactive Robot in its environment: sensors → brain → effectors, with the ENVIRONMENT closing the feedback loop from effectors back to sensors.

Braitenberg Vehicles and Quantum Automata Robots

Another Example: Braitenberg Vehicles and Quantum BV

Braitenberg Vehicles
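As a sketch of the idea, the whole controller of a classic “fear”/“aggression” vehicle is one reactive mapping from two light sensors to two motors; the gains and wiring below are illustrative assumptions.

def vehicle_step(left_light, right_light, crossed=True, gain=1.0):
    """Braitenberg-style reactive control: return (left_motor, right_motor)
    speeds directly from the two light-sensor readings."""
    if crossed:
        # Crossed excitatory wiring ("aggression"): the vehicle speeds up
        # and turns toward the light source.
        return gain * right_light, gain * left_light
    # Straight excitatory wiring ("fear"): it speeds up and turns away.
    return gain * left_light, gain * right_light

print(vehicle_step(0.2, 0.9))   # bright light on the right -> left motor faster -> turn toward the light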

Emotional Robot has a simple form of memory or state: sensors → brain → effectors, where the brain is a Finite State Machine.

Behavior as an interpretation of a string: a string of symbols such as “Newton, Einstein and Bohr.”, “Hello Professor”, “Hello Sir”, “Turn Left.”, “Turn right.” is mapped to a behavior.

Behavior as an interpretation of a tree: the same symbols (“Newton, Einstein and Bohr.”, “Hello Professor”, “Hello Sir”, “Turn Left.”, “Turn right.”) are organized by a grammar over an alphabet, and a derivation of the grammar is mapped to a behavior.
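A small sketch of such an interpreter, assuming a made-up alphabet of motion and speech primitives (the real symbol set is not reproduced here): a string over the alphabet is executed symbol by symbol, and a grammar can generate such strings as derivations.

PRIMITIVES = {
    "G": 'say "Hello Professor"',
    "L": "turn head left",
    "R": "turn head right",
    "B": "blink eyes",
}

def interpret(behavior_string):
    """Interpret a string of symbols as a sequence of robot actions."""
    return [PRIMITIVES[symbol] for symbol in behavior_string if symbol in PRIMITIVES]

print(interpret("GLRB"))   # greeting, look left, look right, blink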

Our Base Model and Designs

Neck and upper body movement generation

Robot Head Construction, 1999. Furby head with new control (“Jonas”). We built and animated various kinds of humanoid heads with 4 to 20 DOF, looking for comical and entertaining values. Built with high-school summer camps, hobby roboticists and undergraduates.

Mister Butcher: 4-degree-of-freedom neck, latex skin from Hollywood.

Robot Head Construction, 2000: Skeleton, Alien. We use inexpensive servos from Hitec and Futaba, plastic, plywood and aluminum. The robots are either PC-interfaced, use simple micro-controllers such as the Basic Stamp, or are radio controlled from a PC or by the user.

Adam Marvin the Crazy Robot Technical Construction, 2001 Details

Virginia Woolf heads equipped with microphones, USB cameras, sonars and CdS light sensors (2001).

Max Image processing and pattern recognition uses software developed at PSU, CMU and Intel (public domain software available on WWW). Software is in Visual C++, Visual Basic, Lisp and Prolog. BUG (Big Ugly Robot) 2002

Visual Feedback and Learning based on Constructive Induction 2002 Uland Wong, 17 years old

Professor Perky, with automated speech recognition (ASR) and text-to-speech (TTS) capabilities (2002, Japan); one-dollar latex skin from China. We compared several commercial speech systems from Microsoft, Sensory and Fonix. Based on experiences in highly noisy environments and with a variety of speakers, we selected Fonix for both ASR and TTS for the Professor Perky and Maria robots. We use a microphone array from Andrea Electronics.

Maria, 2002/ DOF

Construction details of Maria: location of controlling rods, location of head servos, location of remote servos, custom-designed skin, skull.

Animation of eyes and eyelids

Cynthia, 2004, June

Currently the hands are not moveable. We have a separate hand design project.

Software/Hardware Architecture. Network: 10 processors, ultimately 100 processors. Robotics processors. ACS 16 speech cards on an Intel grant. More cameras. Tracking in all robots. Robotic languages – Alice- and Cyc-like technologies.

Face detection localizes the person and is the first step for feature and face recognition. Acquiring information about the human: face detection and recognition, speech recognition and sensors.

Facial feature recognition and visualization.

Use of Multiple-Valued (five-valued) variables Smile, Mouth_Open and Eye_Brow_Raise for facial feature and face recognition.

HAHOE KAIST ROBOT THEATRE, KOREA, SUMMER 2004. Sonbi, the Confucian Scholar; Paekchong, the bad butcher. Do you know a good play for a robot theatre?

Editing movements

Yangban the Aristocrat and Pune his concubine The Narrator

We base all our robots on inexpensive radio-controlled servo technology.

We are familiar with latex and polyester technologies for faces. Martin Lukac and Jeff Allen are waiting for your help, whether you want to program, design behaviors, add muscles, improve vision, etc.

New Silicone Skins

A simplified diagram of software explaining the principle of using machine learning based on constructive induction to create new interaction modes of a human and a robot.

Probabilistic and Finite State Machines

Probabilistic State Machines to describe emotions. States: Happy, Ironic, Unhappy. Example transitions: on “you are beautiful”, respond “Thanks for a compliment” (P=1); on “you are blonde!”, respond “I am not an idiot” (P=0.3) or “Do you suggest I am an idiot?” (P=0.7).
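A sketch of such a machine in Python, following the transitions above; which response leads to which next state is an assumption, since the slide's diagram does not survive in the transcript.

import random

TRANSITIONS = {
    # (state, heard phrase) -> list of (probability, response, next state)
    ("Happy", "you are beautiful"): [
        (1.0, "Thanks for a compliment", "Happy"),
    ],
    ("Happy", "you are blonde!"): [
        (0.3, "I am not an idiot", "Unhappy"),
        (0.7, "Do you suggest I am an idiot?", "Ironic"),
    ],
}

def react(state, heard):
    """Pick a response and a next state according to the transition probabilities."""
    choices = TRANSITIONS.get((state, heard), [(1.0, "...", state)])
    r, acc = random.random(), 0.0
    for p, response, next_state in choices:
        acc += p
        if r <= acc:
            return response, next_state
    return choices[-1][1], choices[-1][2]

print(react("Happy", "you are blonde!"))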

Facial Behaviors of Maria. Maria asks: “Do I look younger than twenty-three?” Response: if “yes”, Maria smiles; if “no”, Maria frowns.

Probabilistic Grammars for performances. Nonterminals Who?, What?, Where? expand with probabilities to speech acts paired with gestures, e.g.: Who? → Speak “Professor Perky” and blink eyes twice, or Speak “Doctor Lee”; Where? → Speak “In the classroom” and shake head, or Speak “in some location” and smile broadly; What? → Speak “Was drinking wine” or Speak “Was singing and dancing”. Branch probabilities such as P=0.5, P=0.3 and P=0.1 label the alternatives.
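A sketch of how such a grammar can drive a performance: each nonterminal is expanded by sampling one alternative. The probability values and speech/gesture pairings below are only illustrative assumptions, since the diagram's exact assignments are not fully recoverable.

import random

GRAMMAR = {
    "Who?": [
        (0.5, ('Speak "Professor Perky"', "blink eyes twice")),
        (0.5, ('Speak "Doctor Lee"', None)),
    ],
    "Where?": [
        (0.5, ('Speak "In the classroom"', "shake head")),
        (0.5, ('Speak "in some location"', "smile broadly")),
    ],
    "What?": [
        (0.5, ('Speak "Was drinking wine"', None)),
        (0.5, ('Speak "Was singing and dancing"', None)),
    ],
}

def expand(nonterminal):
    """Sample one (speech, gesture) alternative for a nonterminal."""
    r, acc = random.random(), 0.0
    for p, production in GRAMMAR[nonterminal]:
        acc += p
        if r <= acc:
            return production
    return GRAMMAR[nonterminal][-1][1]

performance = [expand(s) for s in ("Who?", "Where?", "What?")]
print(performance)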

Human-controlled modes of dialog/interaction: Robot asks, Human teaches, Human commands, Human asks, Robot performs. Mode-switching phrases: “Hello Maria”, “Thanks, I have a question”, “Thanks, I have a lesson”, “Thanks, I have a command”, “Question”, “Lesson finished”, “Questioning finished”, “Command finished”, “Stop performance”.

Dialog and Robot’s Knowledge

Robot-Receptionist Initiated Conversation. Robot: “What can I do for you?” (The label “Robot asks” represents the operation mode.)

Robot-Receptionist Initiated Conversation Robot What can I do for you? Human I would like to order a table for two Robot asks

Robot-Receptionist Initiated Conversation Robot Smoking or non- smoking? Human Robot asks

Robot-Receptionist Initiated Conversation Robot Smoking or non- smoking? Human I do not understand Robot asks

Robot-Receptionist Initiated Conversation Robot Do you want a table in a smoking or non-smoking section of the restaurant? Non-smoking section is near the terrace. Human Robot asks

Robot-Receptionist Initiated Conversation Robot Do you want a table in a smoking or non-smoking section of the restaurant? Non-smoking section is near the terrace. Human A table near the terrace, please Robot asks

Human-Initiated Conversation Robot Human Hello Maria Robot asks initialization

Human-Initiated Conversation Robot Human Hello Maria What can I do for you? Robot asks

Human-Asking Robot Human Question Human asks Question Robot asks

Human-Asking Robot Human Question Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks Lee wrote book “Flowers”.

Human-Asking Robot Human This is not true. Human asks Lee wrote book “Flowers”.

Human-Teaching Robot Human Questioning finished Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Human ends questioning

Human-Teaching Robot Human Questioning finished Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Robot enters asking mode What can I do for you?

Human-Teaching Robot Human Thanks, I have a lesson Human teaches “Questioning finished” Robot asks Human asks Thanks, I have a lesson Human starts teaching What can I do for you?

Human-Teaching Robot Human Thanks, I have a lesson Yes Human teaches

Human-Teaching Robot Human I give you question- answer pattern Yes Human teaches

Human-Teaching Robot Human Question pattern: What book Smith wrote? Yes Human teaches

Human-Teaching Robot Human Answer pattern: Smith wrote book “Automata Theory” Yes Human teaches

Human-Teaching Robot Human Checking question: What book wrote Smith? Yes Human teaches

Human-Teaching Robot Human Checking question: What book wrote Smith? Smith wrote book “Automata Theory” Human teaches

Human-Teaching Robot Human I give you question- answer pattern Yes Human teaches

Human-Teaching Robot Human Question pattern: Where is room of Lee? Yes Human teaches

Human-Teaching Robot Human Answer pattern: Lee is in room 332 Yes Human teaches

Human-Checking what robot learned Robot Human Lesson finished Human asks Question Robot asks Human teaches “Lesson finished”

Human-Checking what robot learned Robot Human Lesson finished Human asks Question Robot asks Human teaches “Lesson finished” What can I do for you?

Human-Checking what robot learned Robot Human Question Human asks Question Robot asks Human teaches “Lesson finished” What can I do for you?

Human-Asking Robot Human Question Human asks Question Robot asks Human teaches “Lesson finished” Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks Yes, you ask a question.

Human-Asking Robot Human What book wrote Lee? Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks I have no sure information.

Human-Asking Robot Human Try to guess. Human asks Lee wrote book “Automata Theory”. Observe that the robot found a similarity between Smith and Lee and generalized (incorrectly).

Behavior, Dialog and Learning. The dialog/behavior has the following components: –(1) Eliza-like natural language dialogs based on pattern matching and limited parsing. Commercial products like Memoni, Dog.Com, Heart, Alice, and Doctor all use this technology, very successfully – for instance, the Alice program won the 2001 Loebner Prize (a Turing Test competition). –This is the “conversational” part of the robot brain, based on pattern matching, parsing and blackboard principles. –It is also a kind of “operating system” of the robot, which supervises the other subroutines.
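A minimal sketch of this pattern-matching layer; the rules below are invented for illustration, while the actual system combines many more rules with limited parsing and blackboard arbitration.

import random
import re

RULES = [
    # (compiled pattern, list of response templates)
    (re.compile(r"\bI am (.*)", re.I), ["Why do you say you are {0}?",
                                        "How long have you been {0}?"]),
    (re.compile(r"\bhello\b", re.I),   ["Hello! What can I do for you?"]),
    (re.compile(r"(.*)", re.S),        ["Tell me more.",
                                        "Why do you think so?"]),
]

def reply(utterance):
    """Return a response from the first matching rule, reusing captured text."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(*match.groups())
    return "Tell me more."

print(reply("I am tired of robots"))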

(2) Subroutines with a logical database and natural language parsing (CHAT). –This is the logical part of the brain, used to find connections between places, timings and all kinds of logical and relational reasoning, such as answering questions about Japanese geography.

(3) Use of generalization and analogy in dialog on many levels. –Random and intentional linking of spoken language, sound effects and facial gestures. –Use of the Constructive Induction approach to help generalization, analogy reasoning and probabilistic generation in verbal and non-verbal dialog, like learning when to smile or when to turn the head away from the partner.

(4) Model of the robot, model of the user, scenario of the situation and history of the dialog are all used in the conversation. (5) Use of word spotting in speech recognition rather than single-word or continuous speech recognition (sketched below). (6) Continuous speech recognition (Microsoft). (7) Avoidance of “I do not know” and “I do not understand” answers from the robot. –Our robot will always have something to say; in the worst case it is over-generalized, based on invalid analogies, or even nonsensical and random.
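A sketch of the word-spotting idea from point (5): instead of recognizing whole sentences, the robot scans the recognizer's output for a few keywords and maps them to dialog modes. The keyword-to-mode table is an assumption, loosely following the mode-switching phrases shown earlier.

KEYWORDS = {
    "question": "enter questioning mode",
    "lesson": "enter teaching mode",
    "command": "enter command mode",
    "finished": "return to the main mode",
}

def spot(recognized_text):
    """Return the modes triggered by keywords found anywhere in the input."""
    words = recognized_text.lower().replace(",", " ").split()
    return [mode for keyword, mode in KEYWORDS.items() if keyword in words]

print(spot("Thanks, I have a lesson"))   # -> ['enter teaching mode']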

Constructive Induction

What is constructive induction? Constructive induction is a logic-based method of teaching a robot new knowledge. It can be compared to neural networks. Teaching is constructing some structure of a logic function: –Decision tree –Sum of Products –Decomposed structure

Example “Age Recognition”: examples of data for learning, four people, given to the system.

Name  | Smile a | Height b | Hair Color c | Age (output d)
Joan  | a(3)    | b(0)     | c(0)         | Kid (0)
Mike  | a(2)    | b(1)     | c(1)         | Teenager (1)
Peter | a(1)    | b(2)     | c(2)         | Mid-age (2)
Frank | a(0)    | b(3)     | c(3)         | Old (3)

Example “Age Recognition”: encoding of features, values of the multiple-valued variables.

Smile (a):      very often = 3, often = 2, moderately = 1, rarely = 0
Height (b):     very tall = 3, tall = 2, middle = 1, short = 0
Hair Color (c): grey = 3, black = 2, brown = 1, blonde = 0

Multi-valued Map for the Data: d = F(a, b, c), drawn as a map with rows ab and columns c. Groups on the map show a simple induction from the data.

Groups on the map show a simple induction from the data: children smile very often, teenagers smile often, middle-age people smile moderately, old people smile rarely; hair color ranges from blonde to grey.
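As a sketch of what this induction produces: with the encoding above, the four examples are consistent with a very small hypothesis, e.g. age = 3 - smile. This only illustrates the kind of structure being searched for, not the decomposition the actual tool returns.

EXAMPLES = [  # (smile a, height b, hair color c) -> age d
    ((3, 0, 0), 0),   # Joan: smiles very often -> kid
    ((2, 1, 1), 1),   # Mike -> teenager
    ((1, 2, 2), 2),   # Peter -> mid-age
    ((0, 3, 3), 3),   # Frank -> old
]

def predict_age(a, b, c):
    """Induced hypothesis: age depends only on how often the person smiles."""
    return 3 - a

assert all(predict_age(*x) == d for x, d in EXAMPLES)
print(predict_age(2, 0, 3))   # generalizes to an unseen combination -> 1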

Another example: teaching movements Input variables Output variables

Generalization of the Ashenhurst-Curtis decomposition model

This kind of table is known from Rough Sets, Decision Trees and other Data Mining methods.

Decomposition is hierarchical. At every step many decompositions exist. Which decomposition is better? (Figure: the original table, a first variant of decomposition, and a second variant.)

Constructive Induction: Technical Details.
U. Wong and M. Perkowski, “A New Approach to Robot’s Imitation of Behaviors by Decomposition of Multiple-Valued Relations,” Proc. 5th Intern. Workshop on Boolean Problems, Freiberg, Germany, Sept. 2002.
A. Mishchenko, B. Steinbach and M. Perkowski, “An Algorithm for Bi-Decomposition of Logic Functions,” Proc. DAC 2001, June 18-22, Las Vegas.
A. Mishchenko, B. Steinbach and M. Perkowski, “Bi-Decomposition of Multi-Valued Relations,” Proc. 10th IWLS, Granlibakken, CA, June 12-15, IEEE Computer Society and ACM SIGDA.

Decision Trees, Ashenhurst/Curtis hierarchical decomposition and Bi-Decomposition algorithms are used in our software. These methods form our subset of the MVSIS system developed under Prof. Robert Brayton at the University of California at Berkeley [2]. –The entire MVSIS system can also be used. The system generates robot behaviors (C program codes) from examples given by the users. This method is normally used for embedded system design, but we use it specifically for robot interaction.

Ashenhurst Functional Decomposition evaluates the data function and attempts to decompose it into simpler functions: F(X) = H(G(B), A), with X = A ∪ B, where B is the bound set and A is the free set. If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.

A standard map of function ‘z’: rows are indexed by the free set (a, b), columns by the bound set (c). Columns 0 and 1, and columns 0 and 2, are compatible; column compatibility = 2. (Explain the concept of generalized don’t cares.)

NEW: Decomposition of Multi-Valued Relations. F(X) = H(G(B), A), X = A ∪ B, where F is now a relation. If A ∩ B = ∅, it is a disjoint decomposition; if A ∩ B ≠ ∅, it is a non-disjoint decomposition.

Forming a CCG from a K-Map: for function ‘z’ with free set (a, b) and bound set (c), columns 0 and 1 and columns 0 and 2 are compatible (column compatibility index = 2). The nodes C0, C1, C2 with the compatibility edges form the Column Compatibility Graph.

Forming a CIG from a K-Map: columns 1 and 2 are incompatible; the chromatic number of the Column Incompatibility Graph over the nodes C0, C1, C2 is 2.
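A sketch of the column-compatibility step behind these graphs: tabulate F as a map with rows over the free set and columns over the bound set, then call two columns compatible when they agree wherever both are specified (None marks a don't care). The tiny function below is invented, chosen so that, as on the slide, columns 0 and 1 and columns 0 and 2 are compatible while columns 1 and 2 are not.

def columns(F, free_vals, bound_vals):
    """Return one column (a tuple over the free set) per bound-set value."""
    return {b: tuple(F.get((a, b)) for a in free_vals) for b in bound_vals}

def compatible(col1, col2):
    """Columns are compatible if they never disagree on a specified entry."""
    return all(x is None or y is None or x == y for x, y in zip(col1, col2))

# F(a, c): a is the free set, c is the bound set; None marks a don't care.
F = {(0, 0): 0, (0, 1): 0, (0, 2): 0,
     (1, 0): None, (1, 1): 1, (1, 2): 0}

cols = columns(F, free_vals=[0, 1], bound_vals=[0, 1, 2])
for c1, c2 in [(0, 1), (0, 2), (1, 2)]:
    print("columns", c1, c2, "compatible:", compatible(cols[c1], cols[c2]))
# Two compatibility classes suffice, so the CIG's chromatic number is 2 and
# the intermediate signal G(B) needs only two values.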

A unified internal language is used to describe behaviors, in which text generation and facial gestures are unified. This language is for learned behaviors. Expressions (programs) in this language are either created by humans or induced automatically from examples given by trainers.

Conclusion. What did we learn? (1) The more degrees of freedom, the better the animation realism; art and interesting behavior appear above a certain threshold of complexity. (2) Synchronization of spoken text and head (especially jaw) movements is important but difficult; each robot is very different. (3) Gestures and speech intonation of the head should be slightly exaggerated – superrealism, not realism.

Conclusion. What did we learn? (cont.) (4) Noise of servos: –the sound should be loud, both to cover noises coming from motors and gears and for a better theatrical effect; –noise of servos can also be reduced by appropriate animation and synchronization. (5) TTS should be enhanced with some new sound-generating system. What? (6) The best available ASR and TTS packages should be applied. (7) OpenCV from Intel is excellent. (8) Use puppet theatre experiences. We need artists. The weakness of the technology can become the strength of the art in the hands of an artist.

(9) Because learning is too slow, improved parameterized learning methods should be developed, still based on constructive induction. (10) Open question: funny versus beautiful. (11) Either high-quality voice recognition from a headset or low quality in a noisy room: YOU CANNOT HAVE BOTH WITH CURRENT ASR TOOLS. (12) The low reliability of the latex skins, and of this entire technology, is an issue.

We won an award at PDXBOT. We showed our robots to several audiences: the International Intel Science Talent Competition and PDXBOT 2004 and 2005. Robot shows are exciting. Our goal is to build toys for the 21st century and, in the process, change the way engineers are educated.

What to remember? Robot as a mapping from inputs to outputs; Braitenberg Vehicles; state machines, grammars and probabilistic state machines; natural language conversation with a robot; image processing for an interactive robot; constructive induction for behavior and language acquisition.

Projects: Project 1 –Lego NXT (2 people): editor for state-machine and probabilistic-state-machine based behavior of mobile robots with sensors. Project 2 –Vision for the KHR-1 robot; imitation (2 people); Matthias Sunardi – group leader. Project 3 –Head design for a humanoid robot.

Projects: Project 4 –Leg design for a humanoid robot Project 5 –Hand design for a humanoid robot Project 6 –EyeSim simulator – no robot needed. Project 7 –Conversation with a humanoid robot (dialog and speech).

Projects: Project 8 –Editor for an animatronic robot theatre Project 9 – Quantum-Computer Controlled Robot Project 10 – Project 11 –