A Fast Unified Model for Parsing and Sentence Understanding

Presentation transcript:

A Fast Unified Model for Parsing and Sentence Understanding (Compilation theory) Antoine SOUSTELLE - Pierre RAINERO 02/12/2018

Summary
Problem statement
SPINN overview
Shift-reduce parser
SPINN and TreeLSTM
Advantages of SPINN
Conclusion

Problem statement

SPINN overview
SPINN: Stack-augmented Parser-Interpreter Neural Network.
Combines parsing and interpretation in a single model, built around a shift-reduce parser.

Shift-reduce parser
Example: parsing A = B + 3 with a toy grammar (assign ← id = sums; sums ← sums + value | value; value ← int | id):

Step | Parse stack       | Look ahead | Unscanned | Parser action
0    |                   | id         | = B + 3   | Shift
1    | id                | =          | B + 3     | Shift
2    | id =              | id         | + 3       | Shift
3    | id = id           | +          | 3         | Reduce by value ← id
4    | id = value        | +          | 3         | Reduce by sums ← value
5    | id = sums         | +          | 3         | Shift
6    | id = sums +       | int        | eof       | Shift
7    | id = sums + int   | eof        |           | Reduce by value ← int
8    | id = sums + value | eof        |           | Reduce by sums ← sums + value
9    | id = sums         | eof        |           | Reduce by assign ← id = sums
10   | assign            | eof        |           | Done
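The parse above can be sketched in code. This is a minimal greedy shift-reduce loop for the slide's toy grammar, not a general parser: `shift_reduce` is a hypothetical helper, and the `"=" in stack` test is a shortcut standing in for real table-driven reduce decisions.

```python
# Minimal greedy shift-reduce loop for the slide's toy grammar:
#   assign <- id "=" sums ; sums <- sums "+" value | value ; value <- int | id
# The '"=" in stack' test is a shortcut that stops the assignment target
# from being reduced; a real parser would consult a parse table instead.

def shift_reduce(tokens):
    stack, buffer = [], list(tokens) + ["eof"]
    while True:
        if stack[-1:] == ["int"] or (stack[-1:] == ["id"] and "=" in stack):
            stack[-1] = "value"                     # value <- int | id
        elif stack[-3:] == ["sums", "+", "value"]:
            stack[-3:] = ["sums"]                   # sums <- sums + value
        elif stack[-1:] == ["value"]:
            stack[-1] = "sums"                      # sums <- value
        elif stack[-3:] == ["id", "=", "sums"] and buffer[0] == "eof":
            stack[-3:] = ["assign"]                 # assign <- id = sums
        elif buffer[0] != "eof":
            stack.append(buffer.pop(0))             # shift next token
        else:
            return stack

# "A = B + 3" tokenised by category:
print(shift_reduce(["id", "=", "id", "+", "int"]))  # ['assign']
```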

SPINN
Designed to produce a vector representation of a sentence as its output.
SPINN = shift-reduce parsing + a Tree-LSTM (tree-structured Long Short-Term Memory) neural network.
The neural network improves sentence encoding by exploiting syntactic structure: sentences are trees, not just linear sequences.
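The shift-reduce control flow carries over directly to SPINN: SHIFT pushes a word vector onto the stack, REDUCE composes the top two stack entries into one. A toy sketch of that data flow, where `compose` is a stand-in tanh rather than the paper's learned Tree-LSTM composition function, and the word vectors and transition sequence are made up for illustration:

```python
import numpy as np

# Sketch of SPINN's stack/buffer control flow. The real model composes
# (h, c) Tree-LSTM states with learned weights; `compose` here is a toy
# stand-in (element-wise tanh of a sum) just to show the data flow.

def compose(left, right):
    return np.tanh(left + right)            # placeholder for the Tree-LSTM cell

def spinn_encode(word_vecs, transitions):
    stack, buffer = [], list(word_vecs)
    for op in transitions:                  # e.g. derived from a binary parse
        if op == "SHIFT":
            stack.append(buffer.pop(0))     # push next word vector
        else:                               # "REDUCE"
            right, left = stack.pop(), stack.pop()
            stack.append(compose(left, right))
    return stack[0]                         # single vector for the sentence

vecs = [np.ones(4) * i for i in (1, 2, 3)]  # three made-up word vectors
ops = ["SHIFT", "SHIFT", "REDUCE", "SHIFT", "REDUCE"]
sentence_vec = spinn_encode(vecs, ops)
print(sentence_vec.shape)                   # (4,)
```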

Advantages of SPINN
Most words have multiple senses or meanings: SPINN uses the surrounding context to disambiguate them (a hybrid of parsing and interpretation).
Supports batched computation, giving a speedup of up to 25x over other tree-structured models.
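The batching claim can be illustrated with a toy lockstep loop. This is an assumption-laden sketch, not the paper's "thin stack" implementation: real SPINN fuses the inner per-example loop into masked matrix operations over the whole batch.

```python
import numpy as np

# Toy illustration of why SPINN batches well: a binary parse of an
# n-word sentence always takes exactly 2n - 1 transitions, so a whole
# batch can be driven in lockstep, one transition index at a time.

def run_batch(batch_vecs, batch_ops):
    stacks = [[] for _ in batch_vecs]
    buffers = [list(v) for v in batch_vecs]
    for t in range(len(batch_ops[0])):      # same length for every example
        for i, ops in enumerate(batch_ops): # the real model fuses this loop
            if ops[t] == "SHIFT":           # into one masked matrix op
                stacks[i].append(buffers[i].pop(0))
            else:                           # REDUCE: compose top two entries
                right, left = stacks[i].pop(), stacks[i].pop()
                stacks[i].append(np.tanh(left + right))
    return [s[0] for s in stacks]           # one vector per sentence

ops = ["SHIFT", "SHIFT", "REDUCE", "SHIFT", "REDUCE"]   # 2*3 - 1 = 5
batch = [[np.zeros(2)] * 3, [np.ones(2)] * 3]
out = run_batch(batch, [ops, ops])
print(len(out), out[0].shape)               # 2 (2,)
```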

Conclusion
Recent work (2016), part of the broader expansion of deep learning.

Bibliography
https://arxiv.org/abs/1603.06021
https://fr.slideshare.net/tuvistavie/tree-lstm
https://www.wikipedia.org/