ISMB/ECCB 2017
Chromatin Accessibility Prediction via Convolutional Long Short-Term Memory Networks with k-mer Embedding
Xu Min, Wanwen Zeng, Ning Chen, Ting Chen*, Rui Jiang*
Presenter: Xu Min, Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China (minxueric@gmail.com)

Contents: Background, Method, Results.

Background

What is chromatin accessibility? (Wang et al., 2016) It is measured by biological experiments such as DNase-seq, FAIRE-seq, and ATAC-seq, which are expensive and time-consuming.

Previous work. Computational methods mainly fall into two classes:
k-mer-based methods, e.g. kmer-SVM (Lee et al., 2011) and gkm-SVM (Ghandi et al., 2014). Pros: feature sets defined for arbitrary-length sequences. Cons: capture only local motif patterns.
Deep learning-based (CNN-based) methods, e.g. DeepBind (Alipanahi et al., 2015), DeepSEA (Zhou and Troyanskaya, 2015), Basset (Kelley et al., 2016), and DeepEnhancer (Min et al., 2016). Pros: detect motifs automatically, with superior performance. Cons: rely on one-hot encoding and require fixed-length input.
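As a quick illustration of the one-hot input that these CNN-based models consume (a generic sketch, not code from any of the cited tools), each base maps to a 4-dimensional indicator vector, so a sequence becomes a fixed-size L x 4 matrix and the network's input length must be chosen in advance:

```python
import numpy as np

BASE_INDEX = {"A": 0, "C": 1, "G": 2, "T": 3}

def one_hot_dna(seq, length):
    """Encode a DNA string as a (length, 4) one-hot matrix; truncate or zero-pad to `length`."""
    mat = np.zeros((length, 4), dtype=np.float32)
    for i, base in enumerate(seq[:length]):
        if base in BASE_INDEX:          # unknown bases (e.g. 'N') stay all-zero
            mat[i, BASE_INDEX[base]] = 1.0
    return mat
```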

Our idea: regard DNA sequences as sentences, borrowing from sentence classification (Kim, 2014) and word embedding: Skip-gram (Mikolov et al., 2013) and GloVe (Pennington et al., 2014).

Method

Our approach: convolutional long short-term memory networks with k-mer embedding, combining the two classes of methods. We regard a DNA sequence as a sequence of k-mers and train k-mer vectors with GloVe; we then use a convolutional LSTM network for supervised classification of input sequences (a tokenization sketch follows).
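A minimal sketch of the k-mer "sentence" view (an illustration, not the authors' code); the values k=6 and stride s=2 are those reported on the experimental-setup slide:

```python
def dna_to_kmer_sentence(seq, k=6, stride=2):
    """Split a DNA sequence into overlapping k-mer 'words'."""
    return [seq[i:i + k] for i in range(0, len(seq) - k + 1, stride)]

# Example: dna_to_kmer_sentence("ACGTACGTAC") -> ['ACGTAC', 'GTACGT', 'ACGTAC']
```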

Feature learning in three stages (k-mer embedding, convolution, and bidirectional LSTM), trained with a classification loss function (a standard candidate is sketched below).
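The loss function itself was shown only as an image on the slide. For a binary accessible/inaccessible label, a standard choice (an assumption here, not confirmed by the transcript) is the binary cross-entropy:

```latex
% Assumed binary cross-entropy over N training sequences;
% y_i is the accessibility label and \hat{y}_i the predicted probability.
\mathcal{L} = -\frac{1}{N}\sum_{i=1}^{N}\Big[\, y_i \log \hat{y}_i + (1 - y_i)\log(1 - \hat{y}_i) \Big]
```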

k-mer embedding with GloVe: embedding representations are learned mainly from the co-occurrence statistics of k-mers by minimizing the GloVe cost function (reproduced below). The slide also shows the resulting unsupervised k-mer embeddings.
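The cost function appeared as an image on the slide; the standard GloVe objective (Pennington et al., 2014), written here over the k-mer vocabulary V, is:

```latex
% GloVe cost over the k-mer vocabulary V; X_{ij} counts co-occurrences of k-mers i and j
% within the context window, and f is the usual weighting function.
J = \sum_{i,j=1}^{|V|} f(X_{ij}) \left( w_i^{\top} \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x / x_{\max})^{\alpha} & \text{if } x < x_{\max} \\ 1 & \text{otherwise} \end{cases}
```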

Bidirectional LSTM: produces fixed-length output features and learns long-range relationships within DNA sequences.
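A minimal Keras sketch of an embedding, convolution, and bidirectional-LSTM classifier of this kind (the 100-dimensional embedding matches the experimental-setup slide; the convolution and LSTM sizes are illustrative assumptions, not the authors' settings):

```python
# Sketch of an embedding -> Conv1D -> BLSTM -> sigmoid classifier (not the authors' exact code).
from tensorflow.keras import layers, models

def build_model(vocab_size, embedding_dim=100, conv_filters=64, lstm_units=64):
    return models.Sequential([
        # k-mer IDs -> 100-d vectors; weights can be initialized from the GloVe embeddings.
        layers.Embedding(vocab_size, embedding_dim),
        # Convolution scans for local, motif-like patterns over the embedded k-mers.
        layers.Conv1D(conv_filters, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # The bidirectional LSTM summarizes long-range dependencies into a fixed-length vector.
        layers.Bidirectional(layers.LSTM(lstm_units)),
        layers.Dropout(0.5),
        # Binary output: accessible vs. inaccessible region.
        layers.Dense(1, activation="sigmoid"),
    ])
```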

Novelties. We fuse the informative k-mer features into a deep neural network by embedding k-mers into a low-dimensional vector space. We are able to handle variable-length DNA sequences as input and capture long-distance dependencies thanks to LSTM units.

Results

Experimental setup. Dataset: ENCODE DNase-seq experiments for 6 cell lines, split into train:validation:test sets in the ratio 0.85:0.05:0.10 (one way to implement such a split is sketched below).
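One way to reproduce the reported 0.85:0.05:0.10 split with scikit-learn (the transcript does not say how the authors implemented their split; `sequences` and `labels` are placeholder arrays):

```python
from sklearn.model_selection import train_test_split

# First carve off 15% of the data, then split that 15% into 5% validation and 10% test.
X_train, X_rest, y_train, y_rest = train_test_split(sequences, labels, test_size=0.15, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=2 / 3, random_state=0)
```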

Experimental setup (continued). Unsupervised training of the k-mer embedding: k-mer length k=6, splitting stride s=2; GloVe with window size 15 and embedding dimension 100. Supervised deep learning: implemented in Keras + Theano; RMSprop with learning rate 0.001 and batch size 3000; early stopping with at most 60 iterations and patience 5 (a training sketch with these settings follows).
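Putting the stated hyperparameters together, a hedged training sketch (it reuses the build_model sketch above; the binary cross-entropy loss is an assumption, consistent with the loss sketched earlier):

```python
# Training configuration built from the hyperparameters listed on the slide.
from tensorflow.keras.callbacks import EarlyStopping
from tensorflow.keras.optimizers import RMSprop

model = build_model(vocab_size=4 ** 6 + 1)  # 6-mer vocabulary plus one padding index (assumption)
model.compile(optimizer=RMSprop(learning_rate=0.001),
              loss="binary_crossentropy",     # assumed loss, see above
              metrics=["accuracy"])

model.fit(X_train, y_train,
          validation_data=(X_val, y_val),
          batch_size=3000,
          epochs=60,                          # "maximum iterations = 60"
          callbacks=[EarlyStopping(monitor="val_loss", patience=5)])
```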

Model evaluation (results figures).

Visualization of k-mer embedding (figures).

Efficacy of k-mer embedding (figure).

Efficacy of convolution and BLSTM (figure).

Sensitivity analysis (figure).

Conclusion. Main contributions:
We introduce an effective embedding representation of input DNA sequences into the deep learning framework, using the unsupervised learning algorithm GloVe.
We can handle variable-length sequences as input and capture complex long-range dependencies on them by exploiting the BLSTM network.
We show that our model achieves state-of-the-art performance in sequence classification tasks compared with other recent methods, including gkm-SVM and DeepSEA.

References.
Alipanahi, B. et al. (2015) Predicting the sequence specificities of DNA- and RNA-binding proteins by deep learning. Nat. Biotechnol., 33(8), 831–838.
Ghandi, M. et al. (2014) Enhanced regulatory sequence prediction using gapped k-mer features. PLoS Comput. Biol., 10, e1003711.
Kelley, D.R. et al. (2016) Basset: learning the regulatory code of the accessible genome with deep convolutional neural networks. Genome Res., 26(7), 990–999.
Kim, Y. (2014) Convolutional neural networks for sentence classification. In: Conference on Empirical Methods in Natural Language Processing (EMNLP), ACL, pp. 1746–1751.
Lee, D. et al. (2011) Discriminative prediction of mammalian enhancers from DNA sequence. Genome Res., 21, 2167–2180.
Mikolov, T. et al. (2013) Distributed representations of words and phrases and their compositionality. In: Advances in Neural Information Processing Systems (NIPS), Curran Associates, pp. 3111–3119.
Min, X. et al. (2016) DeepEnhancer: predicting enhancers by convolutional neural networks. In: IEEE International Conference on Bioinformatics and Biomedicine (BIBM), IEEE, pp. 637–644.
Pennington, J. et al. (2014) GloVe: global vectors for word representation. In: EMNLP, pp. 1532–1543.
Wang, Y. et al. (2016) Modeling the causal regulatory network by integrating chromatin accessibility and transcriptome data. Natl. Sci. Rev., 3(2), 240–251.
Zhou, J. and Troyanskaya, O.G. (2015) Predicting effects of noncoding variants with deep learning-based sequence model. Nat. Methods, 12, 931–934.

Thank you! Q&A. Travel fellowship generously supported by HitSEQ COSI: High Throughput Sequencing Algorithms & Applications.