Activity 3: Multimodality HMI for Hands-free Control of an Intelligent Wheelchair. L. Wei, T. Theodoridis, H. Hu, D. Gu, University of Essex, 27 January 2012.

Presentation transcript:

Activity 3: Multimodality HMI for Hands-free Control of an Intelligent Wheelchair. L. Wei, T. Theodoridis, H. Hu, D. Gu, University of Essex, 27 January 2012, Ecole Centrale de Lille. Part-financed by the European Regional Development Fund.

1. Outline of the task within the context of the project
1) To develop novel multimodal human-machine interfaces by integrating voice, gesture, brain and muscle signals.
2) To understand:
   - the user who interacts with the system
   - the system (the computer technology and its usability)
   - the interaction between the user and the system
3) To strike a proper balance between:
   - Functionality: defined by the set of actions or services that the system provides to its users, assessed through the system's usability.
   - Usability: the range and degree to which the system can be used efficiently and adequately by its intended users.

1. Outline of the task within the context of the project
System Integration (diagram): voice, gesture and EEG inputs (via electrodes), together with GPS, gyro and laser sensors, feed the Multimodal HMI, which links to Activity 1 (Navigation) and Activity 2 (Communication).

II. Main results – Gesture-based HMI
System Software Structure (diagram)

System GUI

Experiment 1 (map): planned routes between Docking Area A and Docking Area B, with a wood box barrier and the pitch boundary marked.

Experiment 1 results: (upper) multi-modality control; (lower) joystick control.

Experiment 2: planned Task 2 map for the indoor experiment (Fig. 10).

Experiment 2 results: (left) multi-modality control; (right) joystick control.

II. Main results – Voice-based HMI
Task: to use voice recognition for controlling a wheelchair
Purpose: to aid people with limited physical capability
Software: the Microsoft Speech SDK
Hardware: the Essex robotic wheelchair
Experimentation: the Essex robotic arena
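The slides name the task and platform but not the command-to-motion mapping itself. As an illustration only, the sketch below shows one plausible way the five spoken commands used later (Forward, Back, Left, Right, Stop) could be turned into velocity set-points for a differential-drive wheelchair; the DriveCommand type, the numeric speeds and the mapVoiceCommand helper are hypothetical and not part of the Essex system.

```cpp
#include <string>
#include <iostream>

// Hypothetical velocity set-point for a differential-drive wheelchair.
struct DriveCommand {
    double linear;   // m/s, forward positive
    double angular;  // rad/s, counter-clockwise positive
};

// Map a recognized voice command to a drive set-point.
// The numeric values are illustrative; a real system would use speeds
// tuned to the chair and to the user's comfort.
DriveCommand mapVoiceCommand(const std::string& cmd) {
    if (cmd == "Forward") return { 0.4,  0.0 };
    if (cmd == "Back")    return {-0.2,  0.0 };
    if (cmd == "Left")    return { 0.0,  0.5 };
    if (cmd == "Right")   return { 0.0, -0.5 };
    return { 0.0, 0.0 };   // "Stop" or anything unrecognized
}

int main() {
    DriveCommand d = mapVoiceCommand("Left");
    std::cout << "v=" << d.linear << " m/s, w=" << d.angular << " rad/s\n";
    return 0;
}
```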

Speech Recognition Structure
Driving components:
- Start: capture the voice command
- Sampling: sample the voice signal in real time
- Calculate energy: validate the signal's presence
- Calculate zero-crossing rate: validate the signal's changes
- Calculate entropy: validate the signal's utterance
- Speech recognition by parser: Microsoft Speech SDK
- Driving: Forward, Back, Left, Right, Stop
(Flowchart: Start -> Sampling real-time signals -> Calculate energy -> Calculate zero-crossing rate -> Calculate entropy -> Speech recognition by parser -> Driving)
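The slide lists the per-frame checks only by name. The following is a minimal sketch, assuming a frame of normalized samples in [-1, 1], of how short-time energy, zero-crossing rate and an amplitude-histogram entropy could be computed to decide whether a frame contains speech before it is handed to the recognizer; the thresholds in frameContainsSpeech are placeholders, not values from the Essex implementation.

```cpp
#include <vector>
#include <cmath>
#include <cstddef>

// Short-time energy of one frame of samples in [-1, 1].
double frameEnergy(const std::vector<double>& x) {
    double e = 0.0;
    for (double s : x) e += s * s;
    return e / x.size();
}

// Zero-crossing rate: fraction of adjacent sample pairs that change sign.
double zeroCrossingRate(const std::vector<double>& x) {
    int crossings = 0;
    for (std::size_t i = 1; i < x.size(); ++i)
        if ((x[i - 1] >= 0.0) != (x[i] >= 0.0)) ++crossings;
    return static_cast<double>(crossings) / (x.size() - 1);
}

// Entropy of a coarse amplitude histogram; low entropy suggests a
// structured (voiced) frame rather than broadband noise.
double amplitudeEntropy(const std::vector<double>& x, int bins = 16) {
    std::vector<double> hist(bins, 0.0);
    for (double s : x) {
        int b = static_cast<int>((s + 1.0) / 2.0 * (bins - 1));
        if (b < 0) b = 0;
        if (b >= bins) b = bins - 1;
        hist[b] += 1.0;
    }
    double h = 0.0;
    for (double c : hist) {
        if (c > 0.0) {
            double p = c / x.size();
            h -= p * std::log2(p);
        }
    }
    return h;
}

// Placeholder thresholds; a real system would tune them to the
// microphone and the environment.
bool frameContainsSpeech(const std::vector<double>& frame) {
    return frameEnergy(frame) > 0.01 &&
           zeroCrossingRate(frame) < 0.4 &&
           amplitudeEntropy(frame) < 3.5;
}
```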

Microsoft Speech SDK
Features:
- Developed by Microsoft's Speech Technologies Group
- Aims to recognize audio speech and perform text-to-speech synthesis
- The API can be used from common programming languages, including C++
FFTW Core:
- FFTW is a ready-made library for computing the discrete Fourier transform (DFT)
- A C library developed at MIT, callable from C++
- Used to increase the running speed
Recognition Accuracy:
- Five commands are employed for control
- Exceptional recognition accuracy
- Adequate real-time control

Command   Accuracy
Forward   90%
Back      93%
Right     92%
Left      86%
Stop      90%
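FFTW is only named on the slide. The fragment below is a small sketch showing how one audio frame's magnitude spectrum could be obtained with FFTW3's real-to-complex transform, the kind of spectral step a speech front end might use; the frame length and the absence of windowing are simplifying assumptions, not details of the Essex system.

```cpp
#include <fftw3.h>
#include <vector>
#include <cmath>

// Magnitude spectrum of one real-valued audio frame using FFTW3.
// Build with: g++ frame.cpp -lfftw3 -lm
std::vector<double> magnitudeSpectrum(std::vector<double> frame) {
    const int n = static_cast<int>(frame.size());
    std::vector<double> mag(n / 2 + 1);

    fftw_complex* out = static_cast<fftw_complex*>(
        fftw_malloc(sizeof(fftw_complex) * (n / 2 + 1)));
    fftw_plan plan = fftw_plan_dft_r2c_1d(n, frame.data(), out, FFTW_ESTIMATE);

    fftw_execute(plan);
    for (int k = 0; k <= n / 2; ++k)
        mag[k] = std::hypot(out[k][0], out[k][1]);   // |X[k]|

    fftw_destroy_plan(plan);
    fftw_free(out);
    return mag;
}
```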

Testing Results
Environment 1: a simple corridor with no obstacles. Task: reach the destination at the same horizontal coordinate as the origin.
Environment 2: an open area with two obstacles. Task: avoid the obstacles in a zigzag fashion and return to the origin.
Average completion time: ≈ 2 min in Environment 1 and ≈ 3.5 min in Environment 2.

III. Future challenges and the work to be done
1) A novel multi-modal HMI will be developed by integrating voice control, gesture control, and brain- and muscle-actuated control, in order to meet the needs of different users.
2) The novel navigation and control algorithms developed in Activity 1 will be integrated into the wheelchair, including map-building, path planning, obstacle avoidance, self-localization, trajectory generation, etc.
3) An integrated communications system will allow the confidential data handling developed in Activity 2 to be made available on the intelligent wheelchair.