Embodied Speech and Facial Expression Avatar
Dan Harbin - Evan Zoss - Jaclyn Tech - Brent Sicking
May 10, 2004

Problem Background/Needs Statement
Messages conveyed by the face help illustrate verbal communication by revealing what the expresser is feeling or trying to convey. The ability to generate animated facial expressions together with speech is important to many diverse application areas.
– A deaf person could use an animated face as a lip-reading aid.
– An autistic child could benefit from a robotic face in terms of social interaction, language development, and learning through structure and repetition.

Goals and Objectives
The overall goal of this project is to create a robotic face capable of displaying human emotion accompanied by speech.

Goals and Objectives
– Reverse engineer Yano's motors and sensors so we are able to move them to any desired position.
– Develop a GUI that allows the user to move each motor in both directions to a desired position.
– Research the psychology behind the use of facial expressions to convey emotion, and mimic these facial expressions with the Yano face.
– Develop a GUI that allows the user to select and display real human facial expressions.
– Develop software to mimic speech based on a measure of the intensity of various pre-recorded wave files.

Yano Control System

Part 1: The Computer
1. Allows the user to directly control the movement of Yano's eyes, cheeks, and mouth motors.
2. Provides parameterized control of Yano's facial expressions by allowing the user both to select from a predefined set of expressions and to control his expression in terms of valence, arousal, and stance.
3. Allows the user to load a pre-recorded wave file and play it back as Yano mimics human speech based on the intensity of the wave file.

User Interface: The Main Menu

User Interface: Manual Motor Control

User Interface: Facial Expressions

– Arousal – to stir up, excite; provoke; to awaken.
– Valence – the degree of attraction or aversion that an individual feels toward a specific object or event.
– Stance – the attitude or position of a standing person; mental posture; point of view; a station, a position.
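
One way to read these definitions in the context of the parameterized expression control mentioned earlier: each predefined expression can be treated as a point in (valence, arousal, stance) space, and the user's slider settings snapped to the nearest one. The sketch below illustrates the idea; the expression names and coordinates are invented for illustration and are not from the slides.

```python
# Illustration only: snap (valence, arousal, stance) settings to the
# nearest predefined expression. Names and coordinates are invented
# examples, not values from the project.
EXPRESSIONS = {
    "happy":   (0.8, 0.6, 0.5),
    "sad":     (-0.7, -0.4, -0.3),
    "angry":   (-0.6, 0.8, 0.7),
    "neutral": (0.0, 0.0, 0.0),
}

def nearest_expression(valence, arousal, stance):
    def dist2(point):
        v, a, s = point
        return (v - valence) ** 2 + (a - arousal) ** 2 + (s - stance) ** 2
    return min(EXPRESSIONS, key=lambda name: dist2(EXPRESSIONS[name]))

# nearest_expression(0.7, 0.5, 0.4) -> "happy"
```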

User Interface: Sound Processing
(Screenshot labels: progress bar; file name (.wav); intensity meter, based on the power waveform.)

User Interface: Sound Processing
(Screenshot panels: original waveform; power waveform.)
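
The slides do not spell out how the power waveform is computed, but a standard approach is a windowed RMS envelope. Below is a minimal sketch using only the Python standard library, assuming a mono 16-bit wave file; the 20 ms window size and the normalization are assumptions, not details from the project.

```python
# Sketch: a power (RMS) envelope of the kind the intensity meter could
# display. Assumes a mono 16-bit wave file; the 20 ms window is arbitrary.
import array
import math
import wave

def power_envelope(path, window_ms=20):
    with wave.open(path, "rb") as wf:
        rate = wf.getframerate()
        samples = array.array("h", wf.readframes(wf.getnframes()))
    win = max(1, rate * window_ms // 1000)
    env = []
    for i in range(0, len(samples), win):
        chunk = samples[i:i + win]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        env.append(rms / 32768.0)  # normalize 16-bit samples to 0..1
    return env

# Each envelope value could drive how far the mouth motor opens.
```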

Part 2: SV203 Microcontroller
(Diagram labels: input port AD1–AD5; power: Gnd, Vcc; serial port; motor control port SV1–SV6.)

SV203 Functional Description
– Receives commands through the serial port.
– Sets or clears the appropriate motor control pin(s).
– Reads an analog voltage off the desired input pin(s).
– Transmits a value representing the voltage back up the serial line.

SV203 Interface Description
– Serial Port – ASCII text commands are sent to the board via the serial port to tell it what to do. Values from the input pins are also sent back to the computer via the serial port. The commands we use:
  – SVxM0 – initialize pin x to use digital logic
  – PSx – set pin x high
  – PCx – clear pin x to low
  – Dn – delay for n milliseconds before the next command
  – PC1PC3PC5D300PS1PS3PS5 – a typical motor control command
  – ADy – read the voltage of input pin y and transmit it up the serial port
– Motor Control Port – sends the logic controls for the motors to the Yano I/O Board. When a pin is set high with PSx, it is set to 6V; PCx sets it to 0V. We use six pins, SV1 through SV6.
– A/D Input Port – receives the status of Yano's switches from the Yano I/O Board. We use five pins, AD1 through AD5. Each pin carries 6V if its switch is open and near 0V if it is closed. The SV203 converts these voltages to the numbers 0–255 for 0V–6V.
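
As a concrete illustration of this protocol, here is a minimal Python sketch assuming the pyserial package; the port name, the 9600 baud rate, and the carriage-return command terminator are assumptions about a typical SV203 setup, not details taken from the slides.

```python
# Minimal sketch of the SV203 serial protocol described above.
# Assumes pyserial; the port name, 9600 baud, and the carriage-return
# terminator are assumptions, not details from the slides.
import serial

ser = serial.Serial("COM1", 9600, timeout=1)  # hypothetical port name

def send(cmd):
    """Send one ASCII command, e.g. 'PS1' or 'PC1PC3PC5D300PS1PS3PS5'."""
    ser.write((cmd + "\r").encode("ascii"))

def read_switch(pin):
    """Read A/D pin 1-5; the board replies with 0-255 for 0V-6V."""
    send(f"AD{pin}")
    return int(ser.readline().strip())

# Initialize pins 1-6 for digital logic, then pulse the mouth motor:
for pin in range(1, 7):
    send(f"SV{pin}M0")
send("PC5D300PS5")     # run the mouth motor one way for 300 ms, then stop
print(read_switch(3))  # mouth limit switch: ~255 while open, ~0 when closed
```

In this arrangement, the GUI would call send() for motor moves and poll read_switch() to track the limit switches.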

Part 3: Yano I/O Board
(Diagram: the SV203 microcontroller connected to the Yano switch circuit.)

Yano I/O Board Functional Description
– Receives logic controls for the motors from the SV203.
– Converts them into powered control for Yano's motors.
– Reads in the status of Yano's switches, open or closed.
– Converts this to a voltage, 6V for open and 0V for closed, and sends it back to the SV203.

Yano I/O Board Interface Description
– Motor Control Input – the logic input for the H-bridges that determines motor direction and movement. The pins are paired off, two pins per H-bridge and one bridge per motor:
  – Mouth: SV5 and SV6
  – Cheeks: SV3 and SV4
  – Eyes: SV1 and SV2
– Motor Outputs – three two-pin ports, one for each motor; each pin carries either Vcc or Gnd. If both pins are at Vcc (the default state), there is no potential between them and the motor does not turn. If one pin drops to Gnd, the motor turns one way, and vice versa for the other pin.
– Sensor Inputs – these ports connect directly to Yano's switches. The mouth and cheek motors each have two limit switches to determine when they have run far enough in each direction. The eye motor can run a complete 360-degree rotation, and so has just a single sensor that is triggered when the eye motor is at a particular place in the rotation.
– Sensor Outputs – the interface back to the SV203; it has five pins, each of which is set to 6V for an open switch and 0V for a closed switch. They are paired off according to which motor they are the limit switches for:
  – Mouth: AD3 and AD4
  – Cheeks: AD1 and AD2
  – Eyes: AD5
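
To make the pin pairing concrete, the following sketch builds on the hypothetical send() helper above and composes the SV203 command strings that pulse a given motor in a given direction; the 300 ms default pulse length is an illustrative assumption.

```python
# Sketch: compose SV203 commands from the H-bridge pin pairs listed above.
# Both pins of a pair high = motor stopped; clearing one pin runs the
# motor in one direction, clearing the other runs it the opposite way.
MOTOR_PINS = {"eyes": (1, 2), "cheeks": (3, 4), "mouth": (5, 6)}

def pulse_command(motor, direction, ms=300):
    """Return a command that runs `motor` one way for `ms` milliseconds.
    `direction` selects which pin of the pair to clear (0 or 1)."""
    pin = MOTOR_PINS[motor][direction]
    return f"PC{pin}D{ms}PS{pin}"  # clear pin, delay, set it high again

# pulse_command("mouth", 0) -> "PC5D300PS5"; pass the result to send().
```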

Part 4: Yano

Yano Functional Description
Yano has three motors powered by the Yano I/O Board: one for the mouth, one for the cheeks, and one to control the eyelids, eyebrows, and ears. When the mouth and cheek motors reach their endpoints (i.e., fully open or fully closed), they close a switch to indicate that the limit has been reached. These switches are read by the Yano I/O Board.

Yano’s interfaces are the motor controls, and the switch feedbacks. The wires are coded as follows: –Motors: Red/Black – Eyes Green/Black – Cheeks White/Black – Mouth –Sensors: Red/Green/Brown – Mouth Gray/Yellow/Pink – Cheeks Green/Yellow/Red/Brown – Eyes Yano Interface Description

Validation and Testing Procedures
– Calibration Test – calibrate the motors, then run the motors to their limits and back to see if they stay calibrated (see the sketch after this list).
– Expression Test – change from any one expression to any other expression; the face should show the desired expression each time.
– Speech Test – using a sample sound file, make sure Yano produces the right mouth movements for the differences in sound volume consistently and accurately.
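
A sketch of what one calibration pass could look like in code, reusing the hypothetical send() and read_switch() helpers from the SV203 sketch: drive the mouth motor toward one limit and poll its switch until it closes. The polling interval and the open/closed threshold are assumptions.

```python
# Sketch of one calibration pass: run the mouth motor toward a limit
# and poll its switch until it closes. The 128 threshold (midpoint of
# the 0-255 A/D range) and 10 ms polling interval are assumptions.
import time

def calibrate_mouth():
    send("PC5")                      # start the mouth motor in one direction
    try:
        while read_switch(3) > 128:  # AD3 reads ~255 open, ~0 closed
            time.sleep(0.01)
    finally:
        send("PS5")                  # always stop the motor
```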

Validation and Testing Procedures
How do we know we accomplished our goal?
– Calibration – we are able to know the exact position of each motor at any given time while the software is running.
– Expressions – we can produce and move between a predefined set of believable and readable facial expressions.
– Speech – Yano is consistently able to move his mouth in concurrence with a wave file; the movement and amount of opening is believable and realistic.

Itemized Budget

Part                                  Quantity   Cost
Computer                              1          N/A
Yano                                  1          $65.49
SV203 Microcontroller                 1          $59.98
TC4424 H-bridges                      9          $9.33
Serial Cable                          1          $11.99
Breadboard                            1          ?
?-pin .100" Female Locking Connector  6          ?
?-pin .100" Female Locking Connector  1          ?
?-pin .100" Female Locking Connector  1          ?
?-pin .100" Female Locking Connector  2          ?
?-pin .100" Male Locking Connector    4          ?
?-pin .100" Male Locking Connector    3          ?
?-pin .100" Male Locking Connector    1          ?
?-pin .100" Male Locking Connector    1          $1.49
1kΩ Resistor                          5          ?
? µF Capacitor                        1          ?
? µF Capacitor                        1          $0.10
Green Wire                            24         $1.00
Red Wire                              17         $1.00
Black Wire                            12         $1.00
Total                                            $199.26

Timeline of Tasks

Thank you, Applied Materials and the National Science Foundation.