Brain-Machine Interface (BMI) System Identification
Siddharth Dangi and Suraj Gowda
– BMIs decode neural activity into control signals for prosthetic limbs
– Aim to improve quality of life for severely disabled patients suffering from neurological injuries and disease
– Restore a human's ability to move and communicate with the world

“Center-Out” Training Task
– Monkey uses a joystick to move a cursor to targets
– Record neural firing rates and cursor kinematic data
– Train a decoding algorithm on the collected data to predict cursor kinematics
– Switch cursor control from the joystick to the decoder

Echo-State Network (ESN)
– Problem: the relationship between neural signals and limb kinematics is highly nonlinear
– Idea: create a large, recurrent neural network with random weights, which can be used to learn the input-output behavior of a nonlinear system
– Training connections inside the reservoir is difficult and computationally expensive, so supervised learning is used to train only the output-layer weights (see the sketch below)
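
A minimal NumPy sketch of this idea. The reservoir size, weight scaling, and ridge-regression readout are illustrative assumptions, not parameters from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N_RES, N_IN, N_OUT = 300, 50, 4  # reservoir units, input neurons, kinematic outputs (assumed sizes)

# Fixed random weights: neither W_in nor W_res is ever trained.
W_in = rng.normal(scale=0.1, size=(N_RES, N_IN))
W_res = rng.normal(size=(N_RES, N_RES))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius < 1 keeps the reservoir stable

def run_reservoir(Y):
    """Drive the reservoir with firing rates Y (T x N_IN); return states (T x N_RES)."""
    states, r = np.zeros((len(Y), N_RES)), np.zeros(N_RES)
    for t, y in enumerate(Y):
        r = np.tanh(W_in @ y + W_res @ r)
        states[t] = r
    return states

def train_readout(Y, X, lam=1e-2):
    """Supervised learning of the output layer only: ridge regression
    from reservoir states to kinematics X (T x N_OUT)."""
    R = run_reservoir(Y)
    return np.linalg.solve(R.T @ R + lam * np.eye(N_RES), R.T @ X)

# Prediction on new data: X_hat = run_reservoir(Y_new) @ W_out
```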

Kalman Filter-based Methods
– Standard model: $x_t = A x_{t-1} + w_t$, $y_t = C x_t + q_t$, where $x_t$ is the kinematic state at time $t$, $y_t$ is the vector of firing rates at time $t$, and $w_t$, $q_t$ are Gaussian noise variables
– State prediction: $\hat{x}_{t|t-1} = A \hat{x}_{t-1|t-1}$
– Adaptive Kalman filter: allow the parameters to auto-adjust via stochastic gradient descent
– Combined Kalman-ESN method: weight the two estimates based on their error variances
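
A sketch of one cycle of the standard Kalman recursion under this model, in NumPy. The `adapt_C` and `combine` functions are hedged illustrations of the "auto-adjust" and "weight by error variances" bullets; the slides do not give the authors' exact update rules:

```python
import numpy as np

def kalman_step(x, P, y, A, W, C, Q):
    """One predict/update cycle for x_t = A x_{t-1} + w_t (cov W),
    y_t = C x_t + q_t (cov Q)."""
    x_pred = A @ x                       # state prediction
    P_pred = A @ P @ A.T + W
    S = C @ P_pred @ C.T + Q             # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
    x_new = x_pred + K @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    return x_new, P_new

def adapt_C(C, x, y, mu=1e-4):
    """Assumed stochastic-gradient auto-adjustment of the observation
    matrix -- one plausible reading of the adaptive-Kalman bullet."""
    return C + mu * np.outer(y - C @ x, x)

def combine(x_kf, var_kf, x_esn, var_esn):
    """Assumed inverse-variance weighting of the Kalman and ESN estimates."""
    w = var_esn / (var_kf + var_esn)
    return w * x_kf + (1 - w) * x_esn
```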

LMS and Wiener Filter
– Model: $x_n = W y_n + e_n$, where $x_n$ is the kinematic state at time $n$, $y_n$ is the vector of firing rates at time $n$, $W$ is the matrix of filter weights, and $e_n$ is the error term at time $n$
– Wiener filter: rewrite the model equation by tiling the collected data into matrices $X = [x_1 \cdots x_N]$ and $Y = [y_1 \cdots y_N]$; the closed-form solution for the weight matrix is $W = X Y^\top (Y Y^\top)^{-1}$
– Least-Mean Squares (LMS): gradient-descent solution for the weight matrix, $W_{n+1} = W_n + \mu\, e_n y_n^\top$
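
Both solutions are short enough to state directly in NumPy; the step size `mu` is an illustrative choice:

```python
import numpy as np

def wiener_fit(X, Y):
    """Closed-form weights from tiled data X (d x N) and Y (k x N):
    W = X Y^T (Y Y^T)^{-1} minimizes ||X - W Y||^2."""
    return X @ Y.T @ np.linalg.inv(Y @ Y.T)

def lms_fit(X, Y, mu=1e-3):
    """LMS: one gradient-descent pass over the samples, updating W
    with each error e_n = x_n - W y_n."""
    W = np.zeros((X.shape[0], Y.shape[0]))
    for n in range(X.shape[1]):
        e = X[:, n] - W @ Y[:, n]        # error term at time n
        W += mu * np.outer(e, Y[:, n])   # W <- W + mu * e_n * y_n^T
    return W
```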

Performance Results

Prediction Results
Simulation parameters:
– Trained decoders for 100 seconds on training data
– Measured Mean-Squared Error (MSE) and Correlation Coefficient (CC) on 40 seconds of new data

Prediction Method        Position MSE   Velocity MSE   Position CC   Velocity CC
LMS Filter
Wiener Filter
Echo-State Network
Adaptive Kalman
Standard Kalman Filter
Combined Kalman-ESN
(numeric cell values were not preserved in the transcript)

Classification of Neural State
– Neural firing-rate signals can be treated as (behavior-driven) state-space trajectories
– Experiment: use logistic regression to classify trajectories into higher-level states (e.g., planning vs. not planning)
– Classes: $c = 1$ (planning) vs. $c = 0$ (not planning)
– Logistic regression model: $P(c = 1 \mid y_n) = 1 / (1 + e^{-w^\top y_n})$
– Parameters fit with an online estimation algorithm
– Results: 93.1% classification accuracy; all errors were “false alarms” just before/after planning periods
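
A minimal online fit of this model. The stochastic-gradient update and learning rate are assumptions; the slides only state that an online estimation algorithm was used:

```python
import numpy as np

def online_logreg(Y, labels, eta=0.01):
    """Stochastic gradient ascent on the log-likelihood of
    P(c = 1 | y_n) = 1 / (1 + exp(-w^T y_n)).
    Y: T x d firing-rate vectors; labels: T binary labels (planning = 1)."""
    w = np.zeros(Y.shape[1])
    for y, c in zip(Y, labels):
        p = 1.0 / (1.0 + np.exp(-w @ y))
        w += eta * (c - p) * y  # per-sample log-likelihood gradient
    return w

# Classify a new sample as "planning" when 1 / (1 + np.exp(-w @ y)) > 0.5
```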

Conclusions
Ranking the methods by the MSE of their position predictions shows that:
– “Pure” linear regression models (LMS and Wiener) need more training time to perform well
– Kalman-based models that maintain a state-space/dynamics model outperform those that do not
– The combination of linear (Kalman) and nonlinear (ESN) methods performs best, better than any single method alone