Integrating Mobile and Cloud Computing for Electromyography (EMG) and Inertial Measurement Unit (IMU)-based Neural-Machine Interface
Victor Delaplaine, Ricardo Colin, Danny Ceron, Paul Leung
Advisor: Dr. Xiaorong Zhang | Mentor: Alex David
ASPIRES Summer 2018 | Computer Engineering
San Francisco State University | Cañada College

Project Outline: Motivation and Background; Research Goal and Specific Tasks; Design and Implementation; Experimental Results; Conclusion; Future Work. (The presentation follows the structure of the final paper.)

Motivation: An estimated 32 million amputees around the world. 11.7 million patients go to physical therapy for a spectrum of issues. There are no affordable solutions that can help these patients.

Neural-Machine Interface (NMI): An NMI uses neural activities to control machines. Human neural control system → Neural-Machine Interface → external devices. It takes sensor signals from the body and relays them from the human to machines.

EMG-based NMI EMG (Electromyographic) signals: Effective bioelectric signals for expressing movement intent

EMG Pattern Recognition: Processing EMG signals to classify gestures has two phases. In the training phase, a classifier is learned from labeled gesture data; in the testing phase, the trained classifier is used to predict the gesture of incoming data. (The pictures show some applications of pattern recognition.)
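To make the two phases concrete, here is a minimal sketch in Java, assuming feature vectors have already been extracted from windowed EMG. The nearest-centroid classifier is only an illustrative stand-in for the classifiers actually used in MyoHMI; the class and method names are ours, not the project's.

```java
import java.util.*;

// Minimal two-phase pattern recognition sketch: "train" learns one centroid
// per gesture from labeled EMG feature vectors; "predict" assigns a new
// feature vector to the nearest centroid. MyoHMI's real classifiers are more
// sophisticated; this only illustrates the training/testing split.
public class GestureClassifierSketch {
    private final Map<String, double[]> centroids = new HashMap<>();

    // Training phase: average the feature vectors collected for each gesture.
    public void train(Map<String, List<double[]>> labeledFeatures) {
        for (Map.Entry<String, List<double[]>> e : labeledFeatures.entrySet()) {
            List<double[]> samples = e.getValue();
            double[] centroid = new double[samples.get(0).length];
            for (double[] s : samples)
                for (int i = 0; i < s.length; i++) centroid[i] += s[i] / samples.size();
            centroids.put(e.getKey(), centroid);
        }
    }

    // Testing phase: predict the gesture of an unseen feature vector.
    public String predict(double[] features) {
        String best = null;
        double bestDist = Double.MAX_VALUE;
        for (Map.Entry<String, double[]> e : centroids.entrySet()) {
            double d = 0;
            for (int i = 0; i < features.length; i++) {
                double diff = features[i] - e.getValue()[i];
                d += diff * diff;
            }
            if (d < bestDist) { bestDist = d; best = e.getKey(); }
        }
        return best;
    }
}
```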

Requirements for EMG-based NMIs: Fast → needs to work in real time (lag time < 200 ms). Portable → can be taken anywhere. Reliable → predicts gestures accurately. Durable & Robust → withstands everyday occurrences like sweat and shifts of the armband. No existing NMI meets all of these requirements, so we are developing a system that does.
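A small sketch of how the lag-time requirement could be checked, reusing the illustrative GestureClassifierSketch above; the synthetic training data exists only so the example runs and is not from the project.

```java
import java.util.*;

// Quick check of the real-time requirement: time one classification against
// the 200 ms lag budget. The two-sample "training set" below is synthetic.
public class LatencyCheck {
    public static void main(String[] args) {
        GestureClassifierSketch model = new GestureClassifierSketch();
        Map<String, List<double[]>> dummy = new HashMap<>();
        dummy.put("rest", Arrays.asList(new double[8]));
        dummy.put("fist", Arrays.asList(new double[]{1, 1, 1, 1, 1, 1, 1, 1}));
        model.train(dummy);

        long start = System.nanoTime();
        String gesture = model.predict(new double[8]);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        System.out.println("Predicted " + gesture + " in " + elapsedMs + " ms (budget: 200 ms)");
    }
}
```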

Research Goal Develop an open, low-cost, portable and flexible research platform (MyoHMI) for developing EMG and IMU-based NMI by integrating edge and cloud computing techniques

Previous Work: The previous approach to uploading data to the cloud was unorganized.

Specific Tasks: Improve the previously developed MyoHMI software by integrating an IMU tab, integrating cloud computing, creating a website to make the project open source, and creating a User-Independent Pattern Recognition experiment.

MyoHMI - System Architecture: Myo Armband → raw EMG and IMU data → feature extraction (featured data) → database/server in the cloud → cloud pattern recognition (classification model) → gesture output → applications such as virtual reality and prosthetics.

Myo Armband: 8 EMG sensors surrounding the forearm; a 9-axis Inertial Measurement Unit (IMU) that measures acceleration, angular velocity, and the magnetic field; Bluetooth Low Energy (BLE) wireless communication.

Integrating the IMU: Possible uses with MyoHMI include detecting hand and arm motion and orientation. Possible applications include VR/AR rehabilitation games, sign language recognition, and control of assistive robots and prostheses.

IMU Tab: Collects two sets of IMU data from the armband. The data is used to calculate the motion of the armband, and the values are displayed on screen. An artificial horizon is used to visualize the tilt of the armband.
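As a rough illustration, the tilt shown by an artificial horizon can be estimated from a single accelerometer sample as below; the exact calculation in the IMU tab may differ, and the axis convention depends on how the armband is worn.

```java
// Estimate roll and pitch (in degrees) from one accelerometer sample.
// Assumes gravity dominates the reading when the armband is roughly static.
public final class TiltEstimator {
    public static double rollDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(ay, az));
    }

    public static double pitchDegrees(double ax, double ay, double az) {
        return Math.toDegrees(Math.atan2(-ax, Math.sqrt(ay * ay + az * az)));
    }

    public static void main(String[] args) {
        // Example: armband lying flat, gravity along +z.
        System.out.printf("roll=%.1f pitch=%.1f%n",
                rollDegrees(0.0, 0.0, 1.0), pitchDegrees(0.0, 0.0, 1.0));
    }
}
```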

Integrating Data Storage: Set up Elastic Compute Cloud (EC2) on AWS; set up Amazon Relational Database Service (RDS) on AWS; set up a MySQL database; and included a method in MyoHMI to store users' EMG data. EC2 is a web service that provides secure, resizable compute capacity in the cloud and is designed to make web-scale cloud computing easier for developers. RDS makes it easy to set up, operate, and scale a relational database in the cloud.
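A hypothetical sketch of how MyoHMI could write a user's extracted EMG features to the MySQL database on Amazon RDS via JDBC; the endpoint, credentials, table, and column names are placeholders, not the project's actual schema, and the MySQL Connector/J driver is assumed to be on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Store one extracted EMG feature vector for a user in a MySQL database
// hosted on Amazon RDS. Endpoint, credentials, table and column names are
// placeholders; the real MyoHMI schema may differ.
public class EmgFeatureStore {
    private static final String URL =
            "jdbc:mysql://example-endpoint.rds.amazonaws.com:3306/myohmi";

    public static void insertFeatures(String userId, String gesture, double[] features)
            throws SQLException {
        String sql = "INSERT INTO emg_features (user_id, gesture, feature_csv) VALUES (?, ?, ?)";
        try (Connection conn = DriverManager.getConnection(URL, "dbUser", "dbPassword");
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            // Serialize the feature vector as comma-separated values.
            StringBuilder csv = new StringBuilder();
            for (int i = 0; i < features.length; i++) {
                if (i > 0) csv.append(',');
                csv.append(features[i]);
            }
            stmt.setString(1, userId);
            stmt.setString(2, gesture);
            stmt.setString(3, csv.toString());
            stmt.executeUpdate();
        }
    }
}
```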

User Login: A user login allows us to store each user's data in the cloud: raw data, extracted feature data, and trained models.

Website: Showcases the product and provides installation guidance; shows basic instructions on how to use the app; is open to the community to make the app open source; helps us grow our EMG feature database; and increases User-Independent Pattern Recognition accuracy.

Website Menu Content: Home; How to Use (guidance on how to download and install the app); Contact Us; Survey.

Website - Home Page

Website - How to Use

Android Application - Preview

Experiment - User-Independent Pattern Recognition: If we collect data from multiple users and train one model on all of that data, we hope this model can be applied to any person. We conducted an experiment in which we collected data from several users, developed a model based on all of the collected data, and tested each user's data against it.

Experimental Controls The armband must be in a consistent position across all subjects. Align the light on the armband with the middle finger

Experimental Protocol: 10 subjects; train 8 gestures (Rest, Fist, Point, Open Hand, Wave In, Wave Out, Supination, Pronation); upload to the cloud (tap the "Cloud Image" button).

Experimental Results: Developed a Java program, to be run in the cloud, that trains one model from 9 subjects (excluding 1). Data from the excluded subject is then run against this model to see how it performs. We would like accuracy around 90 percent or above for the system to be usable, and that is not yet feasible.
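A sketch of the leave-one-subject-out evaluation described above: for each subject, train on the other nine subjects' data and test on the held-out subject. It reuses the illustrative GestureClassifierSketch from earlier; the real cloud-side Java program may be structured differently.

```java
import java.util.*;

// Leave-one-subject-out evaluation: each subject's data is a map from
// gesture label to the feature vectors recorded for that gesture.
public class LeaveOneSubjectOut {

    public static double[] evaluate(List<Map<String, List<double[]>>> subjects) {
        double[] accuracies = new double[subjects.size()];
        for (int held = 0; held < subjects.size(); held++) {
            // Training phase: merge data from all subjects except the held-out one.
            Map<String, List<double[]>> trainData = new HashMap<>();
            for (int s = 0; s < subjects.size(); s++) {
                if (s == held) continue;
                subjects.get(s).forEach((gesture, samples) ->
                        trainData.computeIfAbsent(gesture, g -> new ArrayList<>()).addAll(samples));
            }
            GestureClassifierSketch model = new GestureClassifierSketch();
            model.train(trainData);

            // Testing phase: classify every sample from the held-out subject.
            int correct = 0, total = 0;
            for (Map.Entry<String, List<double[]>> e : subjects.get(held).entrySet()) {
                for (double[] sample : e.getValue()) {
                    total++;
                    if (e.getKey().equals(model.predict(sample))) correct++;
                }
            }
            accuracies[held] = total == 0 ? 0.0 : (double) correct / total;
        }
        return accuracies;
    }
}
```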

Experimental Results - User Survey
Question topics: Responsiveness, Accuracy, Ease of Use, Aesthetic. Average ratings: 4.25, 4.125, 4.0.
Table 1. Average rating of the mobile application from 10 users.

Conclusion: Integrated the Inertial Measurement Unit (IMU) GUI, which needs further testing and implementation. Successfully implemented MySQL for cloud computing. Created a website that provides instructions and makes the project open source. Created a User-Independent Pattern Recognition experiment: about 50% accuracy with 10 test subjects; more subjects are needed. (This slide revisits the specific tasks listed earlier and presents the results of each task and how each one worked out.)

Future Work: Fully implement the IMU for gesture recognition; improve the website instructions; gather additional training data for User-Independent Pattern Recognition; improve the accuracy of gesture recognition.

Questions?

Resources
How a MEMS gyro works: https://www.youtube.com/watch?v=WNf_kdrfeB4
Inertial Measurement Units I: https://stanford.edu/class/ee267/lectures/lecture9.pdf
Introduction to IMU: http://students.iitk.ac.in/roboclub/lectures/IMU.pdf
The standard coordinate 3-space system, aka 6DoF: http://dsky9.com/rift/vr-tech-6dof/
Design, Analysis, and Control of Prosthetic Hands: http://bretl.csl.illinois.edu/prosthetics/
Amputee statistics: https://web.stanford.edu/class/engr110/2011/LeBlanc-03a.pdf
Physical therapy: https://www.webpt.com/blog/post/7-thought-provoking-facts-about-physical-therapy-you-cant-ignore

Poster: MyoHMI Architecture and Background

Integrating Mobile and Cloud Computing for Electromyography (EMG) and Inertial Measurement Unit (IMU)-based Neural-Machine Interface
Alex David1, Victor Delaplaine2, Ricardo Colin2, Danny Ceron2, Paul Leung2; Advisor: 1Dr. Xiaorong Zhang, Computer Engineering Department; 1San Francisco State University, 2Cañada College

Motivation: There are an estimated 6.7 million amputees around the world. Developing an inexpensive, intuitive, and reliable platform would help these limbless patients.

Goal: Anticipate human motion intention in real time from wearable sensors using basic machine learning algorithms.

Objectives: Develop an open, low-cost, portable, and flexible research platform for developing EMG pattern-recognition-based NMIs by integrating edge and cloud computing techniques. Take advantage of Amazon Web Services storage and find a good method to store data so it is readily accessible for user-independent pattern recognition. Integrate IMU data into MyoHMI so that the predicted gesture can be more precise.

MyoHMI architecture: Myo Armband → raw EMG and IMU data → feature extraction → Amazon Web Services → cloud pattern recognition → gesture classification (supervised machine learning) → applications such as virtual reality and prosthetics.

EMG: The application, created in Visual Studio, receives raw EMG from the Myo armband over Bluetooth Low Energy (BLE). The armband has 8 EMG sensors streaming at 200 Hz and a 9-axis IMU streaming at around 50 Hz.

Feature calculator: Data windowing uses an overlapping sliding window, which allows the application to obtain a denser data set; the receiving device can take in many sets of data and perform analysis before waiting for an acknowledgment or for the current window to finish. Feature extraction, classification, and cross-validation are performed on the windowed data. (A sketch of the overlapping sliding-window scheme is given after this poster summary.)

Experiment and results: 10 test subjects each perform 1 trial of 8 gestures. Following the training phase, data for all gestures is recorded and stored in a MySQL database hosted on AWS. Raw EMG, raw IMU, and the selected features are shown in the app.

Conclusions: The User-Independent Pattern Recognition module in the application has an estimated accuracy of 50% across 10 different subjects when testing the fist gesture. The most effective way to store the data is a structured query language (SQL) database.

References: Zhang, X., Chen, X., Li, Y., Wang, K., and Yang, J., "A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors," IEEE Transactions on Systems, Man, and Cybernetics, 41(6), 2011. ASPIRES paper, 2017. Decision tree, Wikipedia: https://en.wikipedia.org/wiki/Decision_tree

Acknowledgements: This project is supported by the US Department of Education through the Minority Science and Engineering Improvement Program (MSEIP, Award No. P120A150014) and through the Hispanic-Serving Institution Science, Technology, Engineering, and Mathematics (HSI STEM) Program, Award No. P031C110159. We would like to thank Dr. Amelito Enriquez from Cañada College for the opportunity to participate in this internship and for guiding us through the whole program. We would also like to express our appreciation to Dr. Xiaorong Zhang from San Francisco State University, our faculty advisor, and to our San Francisco State graduate mentor, Alexander David, for all his guidance and advice throughout the internship.
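A sketch of the overlapping sliding-window feature calculator described in the poster, computing one common EMG feature (mean absolute value) per channel per window; the window length and increment are parameters, and the values suggested in the comments are illustrative rather than MyoHMI's exact settings.

```java
import java.util.ArrayList;
import java.util.List;

// Overlapping sliding-window feature calculator: slide a fixed-length window
// over the raw EMG stream with an increment smaller than the window so that
// consecutive windows overlap, and compute the mean absolute value (MAV)
// per channel per window.
public class SlidingWindowFeatures {

    public static List<double[]> meanAbsoluteValue(double[][] emg, int windowLen, int increment) {
        // emg[sample][channel]: raw EMG, e.g. 8 channels sampled at 200 Hz from the Myo.
        List<double[]> features = new ArrayList<>();
        for (int start = 0; start + windowLen <= emg.length; start += increment) {
            double[] mav = new double[emg[0].length];
            for (int i = start; i < start + windowLen; i++)
                for (int ch = 0; ch < mav.length; ch++)
                    mav[ch] += Math.abs(emg[i][ch]) / windowLen;
            features.add(mav);
        }
        return features;
    }
}
```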