The experiments based on Recurrent Neural Networks


The experiments based on Recurrent Neural Networks
2018-10-23 | Raymond ZHAO Wenlong

Content
- Background
- The experiments based on RNN-LSTM (Long Short-Term Memory)
- TODO

Background
- See the introduction to Intelligent Configurations (iCON)
- Goal: design and develop a new product-configuration approach for the e-commerce industry that elicits customer needs
- The underlying task is text classification

The experiments based on LSTM (on the Amazon dataset)
- LSTM performs better than the CNN and SWEM algorithms
- Updated the source code textClassifierRNN.py from Richard Liao
- Why does it work? An RNN exploits sequential (time-series) information, which makes it well suited to text and speech analysis (a sketch follows this slide)
- Need to learn more about RNN-LSTM
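
To make the RNN point concrete, here is a minimal Keras sketch of an LSTM text classifier in the spirit of textClassifierRNN.py; the vocabulary size, sequence length, layer sizes, and five-class output are illustrative assumptions, not the settings used in the experiments.

```python
# Minimal Keras LSTM text classifier (all hyperparameters are assumptions).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 20000   # assumed vocabulary size
MAX_LEN = 200        # assumed review length after padding
NUM_CLASSES = 5      # e.g. 1-5 star ratings (assumption)

model = Sequential([
    Embedding(VOCAB_SIZE, 128),   # word ids -> dense vectors
    LSTM(100),                    # reads the review step by step, carrying state
    Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Dummy batch only to show the expected input shape: (batch, MAX_LEN) word ids.
x = np.random.randint(0, VOCAB_SIZE, size=(32, MAX_LEN))
y = np.random.randint(0, NUM_CLASSES, size=(32,))
model.fit(x, y, epochs=1)
```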

Text Classification
- Two main types of DNN architectures: CNN and RNN (from the paper "Comparative Study of CNN and RNN for Natural Language Processing" by Wenpeng Yin et al., 2017)
- Benchmark: sentiment classification (SentiC) on the Stanford Sentiment Treebank of movie reviews
- The deciding factor is how much the task depends on long-range semantics versus local feature detection:
  => For tasks where the length of the text matters, RNN variants make sense; such tasks include question answering, translation, etc.
  => For tasks where feature detection in text matters more, for example spotting angry terms, sadness, abuse, or named entities, convnets work well (a CNN sketch follows this slide)
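
For contrast, a hedged sketch of the CNN side of the comparison: a 1D convolution acts as an n-gram feature detector over the embedded sequence, and max-pooling keeps the strongest match wherever it occurs. Filter size and counts are assumptions for illustration.

```python
# Minimal Keras 1D-CNN text classifier (illustrative hyperparameters).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Conv1D, GlobalMaxPooling1D, Dense

model = Sequential([
    Embedding(20000, 128),
    # Each filter slides over 5-word windows: an n-gram feature detector.
    Conv1D(filters=128, kernel_size=5, activation="relu"),
    # Keep only each filter's strongest activation, wherever it fired.
    GlobalMaxPooling1D(),
    Dense(5, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

x = np.random.randint(0, 20000, size=(4, 200))
print(model.predict(x).shape)  # (4, 5)
```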

TODO
- Build up knowledge about RNN-LSTM
- Run all experiments on the data preprocessed by Dr. David
- Try LSTM with attention? (a PyTorch sketch follows this slide)
- Use PyTorch v1.0 (backed by Facebook): a lower-level approach with more flexibility => custom layers; Keras is a high-level API
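
A minimal sketch of what "LSTM with attention" could look like as a custom PyTorch module, which is the kind of lower-level flexibility the slide alludes to. The additive attention pooling and all sizes here are assumptions for illustration, not a design from the deck.

```python
# Hypothetical PyTorch sketch: LSTM encoder with attention pooling.
import torch
import torch.nn as nn

class AttentiveLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=128,
                 hidden_dim=100, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Learns a score per time step; softmax turns scores into weights.
        self.attn = nn.Linear(hidden_dim, 1)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                     # x: (batch, seq_len) word ids
        out, _ = self.lstm(self.embed(x))     # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn(out), dim=1)  # (batch, seq_len, 1)
        context = (weights * out).sum(dim=1)  # weighted sum over time steps
        return self.fc(context)               # (batch, num_classes)

model = AttentiveLSTMClassifier()
logits = model(torch.randint(0, 20000, (32, 200)))
print(logits.shape)  # torch.Size([32, 5])
```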

Thanks