A Sentence Interaction Network for Modeling Dependence between Sentences. Biao Liu, Minlie Huang, Tsinghua University.


Motivation: The semantic relation between sentences is crucial for NLP tasks such as answer selection.

Motivation: We want to model the semantic relation between two sentences. Example: Q: What do cats look like? A: Cats have large eyes and furry bodies.

Motivation Convolutional Neural Network Architectures for Matching Natural Language Sentences

Motivation Dynamic Pooling and Unfolding Recursive Autoencoders for Paraphrase Detection

Background RNN
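The slide only names the RNN background. As a minimal illustrative sketch (toy dimensions and random weights, not the authors' model), a vanilla RNN updates a hidden state one token at a time:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    """One vanilla RNN step: h_t = tanh(W_xh x_t + W_hh h_prev + b_h)."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

# Toy dimensions for illustration only.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
W_xh = rng.standard_normal((d_h, d_in))
W_hh = rng.standard_normal((d_h, d_h))
b_h = np.zeros(d_h)

h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # a 5-token "sentence"
    h = rnn_step(x, h, W_xh, W_hh, b_h)
```

The final hidden state `h` serves as a fixed-size summary of the whole sequence.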

Background LSTM
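The slide only names the LSTM background. A common formulation (sketched here with a packed-weight layout that is an assumption for compactness, not the paper's notation) adds input, forget, and output gates plus a cell state:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    """One LSTM step. W, U, b pack the input (i), forget (f), output (o)
    and candidate (g) transforms as 4*d_h stacked rows (assumed layout)."""
    z = W @ x_t + U @ h_prev + b
    d = h_prev.shape[0]
    i = sigmoid(z[:d])           # input gate
    f = sigmoid(z[d:2*d])        # forget gate
    o = sigmoid(z[2*d:3*d])      # output gate
    g = np.tanh(z[3*d:])         # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Toy run over a 5-token sequence.
rng = np.random.default_rng(1)
d_in, d_h = 4, 3
W = rng.standard_normal((4 * d_h, d_in))
U = rng.standard_normal((4 * d_h, d_h))
b = np.zeros(4 * d_h)
h, c = np.zeros(d_h), np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):
    h, c = lstm_step(x, h, c, W, U, b)
```

The gates let the network decide, per dimension, how much of the past to keep and how much of the current input to admit.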

Method Step 1: Use an LSTM to model each of the two sentences.

Method Step 2: Introduce interactions

Method Step 2
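The slides do not spell out the interaction formula, so the following is a purely illustrative guess at the general idea (gating one sentence's hidden states on a summary of the other); the function names and shapes here are assumptions, not the paper's exact mechanism:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def interact(h1, h2_summary, W_g, U_g, b_g):
    """Illustrative only: modulate a hidden state of sentence 1 with a
    sigmoid gate conditioned on a summary vector of sentence 2."""
    g = sigmoid(W_g @ h1 + U_g @ h2_summary + b_g)
    return g * h1

# Toy shapes.
rng = np.random.default_rng(2)
d_h = 3
W_g = rng.standard_normal((d_h, d_h))
U_g = rng.standard_normal((d_h, d_h))
b_g = np.zeros(d_h)
h1 = rng.standard_normal(d_h)
h2_summary = rng.standard_normal(d_h)
h1_interacted = interact(h1, h2_summary, W_g, U_g, b_g)
```

The point of such a gate is that each dimension of a word's representation can be amplified or suppressed depending on the other sentence.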

Method: Complex semantic relations can be modeled with a vector. Through gates, different words can receive different weights for classification.

Method: SIN is powerful at modeling interactions between words, but not strong enough for phrase-level interactions. SIN-CONV adds a convolutional layer to model phrases.
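As a rough sketch of what a convolutional layer over word vectors does (toy shapes and a simple valid-padding tanh convolution; the filter width and nonlinearity are assumptions, not taken from the paper):

```python
import numpy as np

def conv1d_phrases(X, F, width=3):
    """Slide a filter bank F over windows of `width` consecutive word
    vectors, producing one phrase vector per window (valid padding)."""
    n, d = X.shape
    windows = [X[i:i + width].ravel() for i in range(n - width + 1)]
    return np.tanh(np.stack(windows) @ F.T)

# 6 words of dimension 4; 5 filters, each over a width-3 window.
rng = np.random.default_rng(3)
X = rng.standard_normal((6, 4))
F = rng.standard_normal((5, 3 * 4))
phrases = conv1d_phrases(X, F)
```

Each row of `phrases` represents one 3-word window, so interactions can then be computed between phrases rather than single words.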

Experiments, Answer Selection: select the correct answers from a set of candidates for a given question.
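The task setup can be sketched generically as ranking candidate answer representations against the question representation; the cosine-similarity scorer below is a common baseline choice for illustration, not necessarily the scoring function used in the paper:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def select_answer(q_vec, cand_vecs):
    """Rank candidate answer vectors by similarity to the question
    vector and return the index of the best one plus all scores."""
    scores = [cosine(q_vec, c) for c in cand_vecs]
    return int(np.argmax(scores)), scores

# Toy example: the first candidate points the same way as the question.
q = np.array([1.0, 0.0])
cands = [np.array([0.9, 0.1]), np.array([0.0, 1.0])]
best, scores = select_answer(q, cands)
```

Here `best` picks the candidate whose representation is most aligned with the question's.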

Experiments Answer Selection Results

Experiments Dialogue Act Analysis To identify the dialogue act of a sentence in a dialogue.

Experiments Dialogue Act Analysis

Interaction Mechanism Analysis

Thanks