Recurrent Neural Networks


Today: the Recurrent Neural Network Cell, (unrolled) Recurrent Neural Networks, and LSTMs, Bi-LSTMs, and Stacked Bi-LSTMs.

Recurrent Neural Network Cell: the RNN cell takes the previous hidden state $h_0$ and the input $x_1$ and produces a new hidden state $h_1$.

Recurrent Neural Network Cell: the new hidden state is computed as $h_1 = \tanh(W_{hh} h_0 + W_{hx} x_1)$.

Recurrent Neural Network Cell: the cell computes $h_1 = \tanh(W_{hh} h_0 + W_{hx} x_1)$ and an output $y_1 = \mathrm{softmax}(W_{hy} h_1)$.
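A minimal NumPy sketch of these two equations (the weight names follow the slide's notation; the function name rnn_step is illustrative, not from the talk):

    import numpy as np

    def rnn_step(h_prev, x, W_hh, W_hx, W_hy):
        # h_1 = tanh(W_hh h_0 + W_hx x_1)
        h = np.tanh(W_hh @ h_prev + W_hx @ x)
        # y_1 = softmax(W_hy h_1), shifted by the max logit for numerical stability
        logits = W_hy @ h
        e = np.exp(logits - logits.max())
        return h, e / e.sum()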

Recurrent Neural Network Cell: with the output head attached, one step maps $(h_0, x_1)$ to $(h_1, y_1)$.

Recurrent Neural Network Cell, a worked example over the vocabulary {a, b, c, d, e}: the input $x_1 = [0\ 0\ 1\ 0\ 0]$ is the one-hot encoding of "c", the initial hidden state is all zeros, $h_0 = [0\ 0\ 0\ 0\ 0\ 0\ 0]$, the cell produces $h_1 = [1\ 2\ 0\ 3\ 0\ 0\ 1]$, and the output $y_1 = [0.1,\ 0.05,\ 0.05,\ 0.1,\ 0.7]$ is a probability distribution over the next character.

Generating Samples from the Recurrent Neural Network Cell
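The slide gives no procedure here; a common recipe is to sample a character from $y_t$ and feed its one-hot encoding back in as the next input. A sketch using rnn_step from above (sample_chars and the vocab argument are illustrative assumptions):

    def sample_chars(h, x, W_hh, W_hx, W_hy, vocab, n_steps, seed=0):
        rng = np.random.default_rng(seed)
        out = []
        for _ in range(n_steps):
            h, y = rnn_step(h, x, W_hh, W_hx, W_hy)
            i = rng.choice(len(vocab), p=y)   # sample from the softmax distribution
            out.append(vocab[i])
            x = np.zeros(len(vocab))          # feed the sampled character back in
            x[i] = 1.0
        return "".join(out)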

LSTM Cell (Long Short-Term Memory): like the RNN cell but with a second recurrent state; the cell maps the previous hidden state $h_0$, the previous cell state $c_0$, and the input $x_1$ to new states $h_1$ and $c_1$.
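The slide shows only this interface. For reference, a sketch of the standard LSTM gate equations, which are not spelled out on the slide (here W and b pack the four gates into one matrix and bias):

    def lstm_step(h_prev, c_prev, x, W, b):
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
        z = W @ np.concatenate([h_prev, x]) + b
        i, f, o, g = np.split(z, 4)           # input, forget, output, candidate
        c = sigmoid(f) * c_prev + sigmoid(i) * np.tanh(g)   # new cell state
        h = sigmoid(o) * np.tanh(c)                         # new hidden state
        return h, c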

Recurrent Neural Network Cell (recap): one step with the output head, $(h_0, x_1) \mapsto (h_1, y_1)$.

Recurrent Neural Network Cell (recap): the state update alone, $(h_0, x_1) \mapsto h_1$.

(Unrolled) Recurrent Neural Network: the same cell is applied at every time step, threading the hidden state $h_0 \to h_1 \to h_2 \to h_3$. Here the inputs $x_1, x_2, x_3$ encode the characters "c", "a", "t", and the output $y_1$ after the final step predicts the next character, "<<space>>".
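Unrolling is just a loop that applies the same weights at every step; a sketch reusing rnn_step (run_rnn is an illustrative name):

    def run_rnn(h, xs, W_hh, W_hx, W_hy):
        hs, ys = [], []
        for x in xs:                          # same cell, same weights, every step
            h, y = rnn_step(h, x, W_hh, W_hx, W_hy)
            hs.append(h)
            ys.append(y)
        return hs, ys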

(Unrolled) Recurrent Neural Network as a word-level language model: the inputs are the words "the", "cat", "likes", and the output at each step predicts the next word: "cat", "likes", "eating".

(Unrolled) Recurrent Neural Network for sequence classification: the network reads "the", "cat", "likes", but only the final hidden state $h_3$ feeds a single output $y_1$, here a positive/negative sentiment rating.

(Unrolled) Recurrent Neural Network: the same character-level model drawn compactly, reading "c", "a", "t" and predicting "<<space>>" as the output $y_1$.

Bidirectional Recurrent Neural Network: the BRNN reads the inputs "the", "cat", "wants" in both directions, so the output at each position can use context from the whole sequence; here the outputs are "gato", "quiere", "comer".
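A sketch of the bidirectional idea under the same assumptions: one RNN reads the sequence left to right, a second reads it right to left, and the two states at each position are concatenated (run_birnn is an illustrative name):

    def run_birnn(h0_fwd, h0_bwd, xs, fwd_weights, bwd_weights):
        hs_f, _ = run_rnn(h0_fwd, xs, *fwd_weights)        # left to right
        hs_b, _ = run_rnn(h0_bwd, xs[::-1], *bwd_weights)  # right to left
        hs_b = hs_b[::-1]                                  # realign positions
        return [np.concatenate([f, b]) for f, b in zip(hs_f, hs_b)]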

Stacked Recurrent Neural Network: two RNN layers, where the hidden states of the lower layer form the input sequence of the upper layer; the inputs here are "c", "a", "t", and an output $y$ is read from the top layer at every step.
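Stacking feeds each layer's hidden-state sequence to the layer above as its input; a sketch (run_stacked_rnn and the per-layer weight tuples are illustrative assumptions):

    def run_stacked_rnn(h0s, xs, layers):
        seq = xs
        for h0, (W_hh, W_hx, W_hy) in zip(h0s, layers):
            seq, ys = run_rnn(h0, seq, W_hh, W_hx, W_hy)   # states feed the next layer
        return seq, ys                                     # top layer's states and outputs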

Bidirectional Stacked Recurrent Neural Network: the two ideas combined; each layer runs in both directions and its concatenated states feed the layer above, again over the inputs "c", "a", "t".

Questions?