A Deep Memory Network for Chinese Zero Pronoun Resolution


A Deep Memory Network for Chinese Zero Pronoun Resolution Qingyu Yin, Yu Zhang, Weinan Zhang and Ting Liu Research Center for Social Computing and Information Retrieval, Harbin Institute of Technology, Harbin, China

1 Introduction

What is a Zero Pronoun?
- A zero pronoun (ZP) is a gap in a sentence: a position where something is omitted but still understood.
- The gap stands in for certain entities that can be recovered from context.

Introduction
- Goal of this paper: recover the ZP in a sentence, i.e. link it to its antecedent noun phrase (the entity it refers to).
- Example: 我吃了一个苹果,<ZP> 很甜。 ("I ate an apple, <ZP> was very sweet." — the ZP refers to the apple.)
- ZPs are ubiquitous in Chinese: overt pronouns account for about 96% of pronoun uses in English but only 64% in Chinese.

A common way to do ZP resolution: treat it as a classification problem.
- Given a ZP (我吃了一个苹果,<ZP> 很甜), select a set of candidate NPs.
- Mention-pair approach: classify each pair independently —
  (ZP, NP1 鸭梨 "pear"), (ZP, NP2 苹果 "apple"), …, (ZP, NPn 小明的书 "Xiao Ming's book")
- The resolver links the ZP to NP2 (the apple).
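The mention-pair setup above can be sketched as a loop that scores each (ZP, NP) pair and links the ZP to the best-scoring candidate. The scoring function and the toy candidate vectors here are hypothetical stand-ins, not the paper's trained model:

```python
import numpy as np

def resolve_zp(zp_vec, candidates, score_fn):
    """Mention-pair resolution: score each (ZP, NP) pair independently
    and link the ZP to the highest-scoring candidate NP."""
    scores = [score_fn(zp_vec, np_vec) for _, np_vec in candidates]
    best = int(np.argmax(scores))
    return candidates[best][0]

# Toy example: cosine similarity as a stand-in pairwise scorer.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

zp = np.array([1.0, 0.0])
cands = [("pear", np.array([0.0, 1.0])),
         ("apple", np.array([0.9, 0.1])),
         ("Xiao Ming's book", np.array([-1.0, 0.5]))]
print(resolve_zp(zp, cands, cosine))  # -> apple
```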

Challenges of ZP resolution: semantic information is easily overlooked.
- ZPs are difficult to represent: unlike overt pronouns, a zero pronoun carries no descriptive information such as gender (男 male / 女 female) or number (单数 singular / 复数 plural).
- The gap must therefore be represented with the components that are available, e.g. its context: <ZP> 很甜 ("<ZP> is very sweet").

Challenges of ZP resolution (continued): besides context, the potential candidate antecedents are themselves a source of information.
- Only a subset of the candidate antecedents is actually relevant, so the model should select the important candidates and utilize them to build up the representation of the ZP.
- A memory network lets the model use the importance of each candidate explicitly.

2 The approach

The approach — representing the ZP
- Semantic information is overlooked because a ZP has no descriptive information and no actual content of its own.
- The ZP is therefore represented by its contextual information: in "<ZP> tastes sweet", the context suggests referents like apples rather than books.

The approach — ZP-centered LSTM
- Two LSTMs are employed: one models the context preceding the ZP, the other models the context following it; their outputs are combined into the ZP representation.
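A minimal numpy sketch of the ZP-centered idea, with randomly initialized weights and toy embeddings (the real model is trained; this only illustrates the wiring): one LSTM reads the tokens before the gap left-to-right, another reads the tokens after the gap right-to-left, and the two final hidden states are concatenated into r(ZP).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_run(xs, W, b, h_dim):
    """Run a single-layer LSTM over a list of input vectors; return last h."""
    h = np.zeros(h_dim)
    c = np.zeros(h_dim)
    for x in xs:
        z = W @ np.concatenate([x, h]) + b          # all four gates at once
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

rng = np.random.default_rng(0)
x_dim, h_dim = 4, 3
W_pre = rng.normal(scale=0.1, size=(4 * h_dim, x_dim + h_dim))
W_fol = rng.normal(scale=0.1, size=(4 * h_dim, x_dim + h_dim))
b = np.zeros(4 * h_dim)

preceding = [rng.normal(size=x_dim) for _ in range(3)]   # tokens before <ZP>
following = [rng.normal(size=x_dim) for _ in range(2)]   # tokens after <ZP>

# Preceding context left-to-right; following context right-to-left (toward the gap).
r_zp = np.concatenate([lstm_run(preceding, W_pre, b, h_dim),
                       lstm_run(following[::-1], W_fol, b, h_dim)])
print(r_zp.shape)  # (6,)
```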

The approach — representing candidate NPs
- Simple options: average the NP's content words, or use the head word of the NP.
- Instead, an LSTM-based approach models each NP, capturing both its content information and its context information.

The approach — memory network
- Memory: { r(np1), r(np2), r(np3), …, r(npn) }, the representations of the candidate NPs.
- The network attends over this memory to select the NP that fills the gap (ZP).
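The multi-hop attention over the memory can be sketched as follows. The query-update rule (folding the attended summary back into the query each hop) is a common memory-network pattern and is an assumption here, not necessarily the paper's exact formulation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def memory_hops(query, memory, n_hops):
    """memory: (n_candidates, d) matrix of candidate NP representations r(np_i).
    Each hop attends over the memory and folds the attended summary back
    into the query; returns the final attention distribution over candidates."""
    attn = None
    for _ in range(n_hops):
        attn = softmax(memory @ query)        # importance of each candidate
        query = query + attn @ memory         # update query with attended memory
    return attn

rng = np.random.default_rng(1)
memory = rng.normal(size=(4, 5))              # 4 candidate NPs, dimension 5
query = rng.normal(size=5)                    # ZP representation
attn = memory_hops(query, memory, n_hops=3)
print(attn.sum())  # ~1.0
```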

The approach — attention scores
- For each candidate npi, an unnormalized score is computed from the ZP representation r(ZP), the candidate representation r(npi), and a pairwise feature vector ve(ZP, npi):
  si = tanh(W · [r(ZP); r(npi); ve(ZP, npi)])
- A softmax layer over all candidate NPs then yields the final attention scores.
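The scoring step above, sketched in numpy with hypothetical dimensions. The vector v that reduces the tanh layer's output to a scalar per candidate is an assumption added for concreteness (the slide writes the score as tanh(W ...) directly):

```python
import numpy as np

def attention_scores(r_zp, r_nps, feats, W, v):
    """s_i = v . tanh(W [r(ZP); r(np_i); ve(ZP, np_i)]), normalized by softmax.
    v projects the hidden layer to a scalar score per candidate (assumption)."""
    raw = []
    for r_np, ve in zip(r_nps, feats):
        h = np.tanh(W @ np.concatenate([r_zp, r_np, ve]))
        raw.append(float(v @ h))
    raw = np.array(raw)
    e = np.exp(raw - raw.max())               # softmax over all candidates
    return e / e.sum()

rng = np.random.default_rng(2)
d_zp, d_np, d_f, d_h, n = 3, 3, 2, 4, 5
W = rng.normal(size=(d_h, d_zp + d_np + d_f))
v = rng.normal(size=d_h)
scores = attention_scores(rng.normal(size=d_zp),
                          [rng.normal(size=d_np) for _ in range(n)],
                          [rng.normal(size=d_f) for _ in range(n)],
                          W, v)
print(scores.sum())  # ~1.0
```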

3 Experimental results

Experimental Results
- Data set: OntoNotes 5.0
- Baseline: Chen Chen and Ng, "Chinese zero pronoun resolution: A deep learning approach," ACL 2016.

                         R     P     F
  Baseline system       51.0  51.4  51.2
  Our approach (hop 1)  53.0  53.3  53.1
  Our approach (hop 2)  53.7  54.0  53.9
  Our approach (hop 3)  54.2  54.1
  Our approach (hop 4)  54.4  54.7  54.3

Experimental Results
- Effectiveness of modeling ZPs and NPs:

                         R     P     F
  ZPContextFree         52.0  51.7  51.9
  AntContextAvg         51.4  51.5
  AntContHead           52.2  52.5  52.3
  Our approach (hop 1)  53.0  53.3  53.1

Experimental Results
- Visualization of the attention weights.

4 Conclusion

Conclusion
- An effective memory network for modeling ZPs: contextual information is captured by the ZP-centered LSTM, and candidate antecedents are modeled with multi-layer (multi-hop) attention in the memory network.
Future work
- Better embeddings, including embeddings for unknown (UNK) words.
- Avoiding feature engineering.

Thanks! Q&A