1
The experiments based on Recurrent Neural Networks
Raymond ZHAO Wenlong
2
Content
Background
The experiments based on RNN-LSTM (Long Short-Term Memory)
TODO
3
Background
See the Introduction - Intelligent Configurations (iCON) Design: develop a new product configuration approach for the e-commerce industry to elicit customer needs.
Text Classification
4
The experiments based on LSTM (on Amazon Dataset)
Results are better than the CNN and SWEM algorithms.
Updated the source code textClassifierRNN.py from Richard Liao.
Why? The RNN algorithm uses time-series (word-order) information, which makes it well suited to text and speech analysis.
Need to know more about RNN-LSTM.
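To make the experiment concrete, below is a minimal sketch of an LSTM text classifier in Keras, in the spirit of Richard Liao's textClassifierRNN.py. The vocabulary size, sequence length, and number of classes are illustrative assumptions, and the data loading is assumed to happen elsewhere.

```python
# Minimal LSTM text-classifier sketch (assumed hyperparameters, not the
# exact values used in the Amazon-dataset experiments).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, Dense

MAX_WORDS = 20000   # vocabulary size (assumption)
MAX_LEN = 200       # review length after padding (assumption)
NUM_CLASSES = 5     # e.g. 1-5 star ratings (assumption)

model = Sequential([
    # Learn a dense vector for each word index.
    Embedding(input_dim=MAX_WORDS, output_dim=100, input_length=MAX_LEN),
    # A bidirectional LSTM reads the review in both directions, so the
    # final representation keeps the time-series (word-order) information.
    Bidirectional(LSTM(100)),
    Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Usage (x_train: padded word-index matrix, y_train: integer labels):
# model.fit(x_train, y_train, batch_size=64, epochs=5, validation_split=0.1)
```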
5
Text Classification Two main types of DNN architectures:
CNN & RNN from the paper “Comparative Study of CNN and RNN for Natural Language Processing” by Wenpeng Yin etc., 2017 Sentiment Classification (SentiC) on Stanford Sentiment Treebank of movie reviews how much the task is dependent upon long semantics or feature detection =>For tasks where length of text is important, it makes sense to go with RNN variants. These types of tasks include: question-answering, translation etc. =>For tasks where feature detection in text is more important, for example, searching for angry terms, sadness, abuses, named entities etc. Convnets work well.
6
TODO
Knowledge about RNN-LSTM
All experiments
Data preprocessing from Dr. David
LSTM with attention?
Use PyTorch v1.0 (supported by Facebook): a lower-level approach with more flexibility => custom layers; Keras is a high-level API.
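As a sketch of the lower-level flexibility PyTorch offers, below is a minimal LSTM classifier with a simple additive attention layer over the hidden states. The layer sizes and the attention form are assumptions for illustration, not a design prescribed by these slides.

```python
# Minimal PyTorch sketch: LSTM classifier with a custom attention layer
# (assumed sizes; shown only to illustrate custom-layer flexibility).
import torch
import torch.nn as nn

class AttentionLSTMClassifier(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=100,
                 hidden_dim=100, num_classes=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Custom attention: score each time step, then take a weighted sum.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        self.out = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, x):                    # x: (batch, seq_len) word ids
        h, _ = self.lstm(self.embed(x))      # h: (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.attn(h), dim=1)  # (batch, seq_len, 1)
        context = (weights * h).sum(dim=1)   # (batch, 2*hidden)
        return self.out(context)             # unnormalized class scores

# Usage:
# model = AttentionLSTMClassifier()
# logits = model(torch.randint(0, 20000, (8, 200)))  # batch of 8 reviews
```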
7
Thanks