Author: Sotetsu Koyamada, Yumi Shikauchi, et al. (Kyoto University)


Deep Learning of fMRI big data: a novel approach to subject-transfer decoding
Submitted to Neural Networks SI: NN Learning in Big Data, February 3, 2015
Speaker: Tian kai
Date: 2015/4/10

Contents
- Briefing Introduction
- Data Description
- Model
- Analysis for Trained Decoder
- Results
- Some Comments

Briefing Introduction
- The problem: brain decoding, i.e. mapping brain activities to brain states.
- The difficulty: large variation in brain activity across individuals.
- Possible applications: brain-machine interfaces (BMI), neural rehabilitation, therapy of mental disorders.
[Figure: brain activities → decoder → brain states]

Briefing Introduction: More Details
1. The problem can be cast as a classification problem.
2. It is difficult to obtain sufficient data from a single person to build a reliable decoder.
3. Hence the idea of subject-transfer decoding: train on data pooled across many subjects and apply the decoder to a new subject.

fMRI Data
Data acquisition: Human Connectome Project (HCP), 499 healthy adults
- TR = 720 ms
- TE = 33.1 ms
- flip angle = 52°
- FOV = 208 × 180 mm
- 72 slices
- resolution: 2.0 × 2.0 mm
Preprocessing: removal of spatial artifacts and distortions; within-subject cross-modal registration; reduction of the bias field; alignment to standard space.
Feature dimension: 116

fMRI Data
Each participant was asked to perform seven tasks, one per category:
- Emotion
- Gambling
- Language
- Motor
- Relational
- Social
- Working Memory

Model
A deep neural network (DNN) classifier. [Architecture diagram on slide]
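The slide shows the DNN only as a diagram; a minimal numpy sketch of such a feed-forward decoder is given below. The layer sizes (two hidden layers of 64 units) are assumptions for illustration, not the paper's architecture; only the input width (116 features) and output width (7 task classes) come from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical layer sizes: 116 fMRI features -> two hidden layers -> 7 task classes.
sizes = [116, 64, 64, 7]
weights = [rng.normal(0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    """Forward pass of the decoder: ReLU hidden layers, softmax output over tasks."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(h @ W + b)
    return softmax(h @ weights[-1] + biases[-1])

probs = forward(rng.normal(size=(5, 116)))  # 5 dummy feature vectors
```

Each row of `probs` is a distribution over the seven task categories; training would minimize cross-entropy against the true task labels.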

Subject-transfer Decoding
Select 100 persons from the 499 individuals (call this set D), such that they are
1) unrelated to each other, and
2) successfully completed all seven cognitive tasks twice.
Split D into 10 folds (train / validation / test).
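The point of the 10-fold split here is that folds are made of whole subjects, so the test subjects are never seen during training. A minimal stdlib sketch of such a subject-wise split (the fold assignment scheme is an assumption for illustration):

```python
import random

def subject_folds(subject_ids, n_folds=10, seed=0):
    """Split subjects (not individual scans) into folds, so no subject
    appears in more than one fold -- required for subject-transfer evaluation."""
    ids = list(subject_ids)
    random.Random(seed).shuffle(ids)
    return [ids[i::n_folds] for i in range(n_folds)]

folds = subject_folds(range(100))              # the 100 selected HCP subjects
test_fold = folds[0]                           # held-out test subjects
valid_fold = folds[1]                          # validation subjects
train_ids = [s for f in folds[2:] for s in f]  # remaining 8 folds for training
```

Evaluating the decoder only on `test_fold` subjects measures how well it transfers to people it has never seen.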

Analysis for Trained Decoder
Sensitivity analysis. Sensitivity map: [formula shown as an image on the slide]

Principal sensitivity analysis (PSA): compute the direction v in the input space along which f is most sensitive.
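The formulas missing from the slides can be reconstructed from the published PSA formulation (a hedged sketch: here q denotes the input distribution and f an output of the trained decoder):

```latex
% Sensitivity map: squared sensitivity of f to the i-th input
r_i = \int q(x) \left( \frac{\partial f(x)}{\partial x_i} \right)^2 dx

% PSA generalizes this to an arbitrary unit direction v:
r(v) = \int q(x) \left( v^\top \nabla f(x) \right)^2 dx
     = v^\top K v,
\qquad
K = \int q(x)\, \nabla f(x)\, \nabla f(x)^\top \, dx

% PSA problem: maximize r(v) = v^\top K v subject to \|v\| = 1
```

The classical sensitivity map is recovered as the diagonal of K, i.e. r_i = K_{ii}.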

The solution to this problem is the maximal eigenvector of K. This vector is defined as the principal sensitivity map (PSM).
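Numerically, K can be estimated from per-sample input gradients of the decoder and the PSM taken as its leading eigenvector. A minimal numpy sketch, assuming the gradients are already available (here they are synthetic, with input dimension 3 made deliberately dominant):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in: gradients df/dx for 200 samples of a 116-dim input.
grads = rng.normal(size=(200, 116))
grads[:, 3] += 10.0  # make input dimension 3 dominate the sensitivity

K = grads.T @ grads / len(grads)      # Monte Carlo estimate of K = E[∇f ∇fᵀ]
eigvals, eigvecs = np.linalg.eigh(K)  # K is symmetric PSD; eigh sorts ascending
psm = eigvecs[:, -1]                  # maximal eigenvector = principal sensitivity map

top_input = int(np.argmax(np.abs(psm)))  # most influential input dimension
```

In the paper's setting the entries of the PSM are indexed by brain regions, so `top_input` would point at the region the decoder relies on most.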

Results
Baseline methods:
- logistic regression
- SVMs with linear and RBF kernels
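Of the baselines listed, logistic regression is the simplest. A minimal multinomial (softmax) version in numpy on synthetic stand-in data, matching the problem shape (116 features, 7 task classes); the data and training details here are illustrative assumptions, not the study's setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for the decoding problem: 116 features, 7 task classes.
n, d, k = 700, 116, 7
centers = rng.normal(0, 2.0, (k, d))     # one well-separated center per class
y = rng.integers(0, k, n)
X = centers[y] + rng.normal(size=(n, d))

W = np.zeros((d, k))
b = np.zeros(k)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Plain batch gradient descent on the cross-entropy loss.
onehot = np.eye(k)[y]
for _ in range(200):
    p = softmax(X @ W + b)
    grad = (p - onehot) / n
    W -= 0.5 * X.T @ grad
    b -= 0.5 * grad.sum(axis=0)

accuracy = (softmax(X @ W + b).argmax(axis=1) == y).mean()
```

The SVM baselines would replace this with a linear-kernel or RBF-kernel classifier over the same 116-dim features.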

Results
They investigated how the decoder's performance changes with the size of the training dataset.

Results
Principal sensitivity analysis (PSA) over regions of interest (ROI). [Figure on slide]

Results
PSA maps. [Figure on slide]

Some Comments
About this paper:
1. What counts as "big data" here?
2. Is there any methodological innovation?
3. Deep learning as a vehicle for transfer learning.