Text Summarization via Semantic Representation 吳旻誠 2014/07/16

Gist-content Question Asks for the main idea of the talk. The correct answer describes the overall theme of the content, while the distractors refer only to small portions of the content.

Gist-content Question Q. Which of the following is closest to the main idea of this talk?
– (A) We've had three explanations for why we might sleep.
– (B) When you're tired, and you lack sleep, you have poor memory, poor creativity, increased impulsiveness, and overall poor judgment.
– (C) If you have good sleep, it increases your concentration, attention, decision-making, creativity, social skills, and health.
– (D) You do not do anything much while you're asleep.

Gist-content question generation Treat the most important sentence as the main idea of the talk. Use LexRank to measure the importance of sentences.

LexRank Measures the importance of sentences. Graph-based model (undirected): nodes represent sentences, and edges are weighted by the cosine similarity between the sentences they connect.

LexRank
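For reference, the continuous LexRank score (following Erkan & Radev, 2004), where d is the damping factor, N the number of sentences, adj[u] the set of sentences adjacent to u, and cos the (IDF-modified) cosine similarity:

```latex
p(u) = \frac{d}{N} + (1 - d) \sum_{v \in \mathrm{adj}[u]}
       \frac{\cos(u, v)}{\sum_{z \in \mathrm{adj}[v]} \cos(z, v)} \, p(v)
```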

Conditions that should be satisfied For the sentence scores to converge to a unique stationary distribution, the transition matrix must be: Stochastic. Irreducible. Aperiodic.

LexRank
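A minimal power-iteration sketch of the scoring (an illustrative numpy implementation, not the original LexRank code; it assumes dense TF-IDF sentence vectors and omits the similarity threshold the paper applies when building the graph):

```python
import numpy as np

def lexrank(tfidf, d=0.15, tol=1e-6):
    """Score sentences by power iteration on the similarity graph.

    tfidf: (n_sentences, n_terms) array of TF-IDF sentence vectors.
    d:     random-jump probability (the d/N term in the formula above).
    """
    # Cosine similarity between every pair of sentences.
    unit = tfidf / np.linalg.norm(tfidf, axis=1, keepdims=True)
    sim = unit @ unit.T

    # Row-normalize so each row sums to 1 (stochastic), then mix in a
    # uniform jump to make the chain irreducible and aperiodic.
    n = sim.shape[0]
    matrix = d / n + (1 - d) * sim / sim.sum(axis=1, keepdims=True)

    # Power iteration: the scores converge to the stationary
    # distribution of the Markov chain, i.e. the LexRank values.
    scores = np.full(n, 1.0 / n)
    while True:
        updated = matrix.T @ scores
        if np.linalg.norm(updated - scores) < tol:
            return updated
        scores = updated
```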

Similarity Between Sentences But… what is the similarity between the following sentences? – I will fully support you. – I'll back you up all the way. They share almost no words, so a surface-level cosine similarity is near zero even though the two sentences mean the same thing.
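A quick check (an illustrative scikit-learn snippet, not part of the talk) shows how low the bag-of-words cosine comes out:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

sentences = ["I will fully support you.", "I'll back you up all the way."]
vectors = CountVectorizer().fit_transform(sentences)
print(cosine_similarity(vectors)[0, 1])  # ~0.19: only "you" is shared
```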

Deep Learning A family of machine learning algorithms that learn representations of data. Has been applied to fields such as computer vision, automatic speech recognition, and natural language processing.

Reducing the Dimensionality of Data with Neural Networks (Hinton & Salakhutdinov, 2006)

Word2Vec An open-source tool from Google. Computes vector representations of words. Provides efficient implementations of the – Continuous Bag-of-Words (CBOW) architecture. – Skip-gram architecture.
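A minimal sketch of training word vectors (using the gensim reimplementation rather than Google's original C tool; the corpus and parameters are placeholders):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
corpus = [["i", "will", "fully", "support", "you"],
          ["i'll", "back", "you", "up", "all", "the", "way"]]

model = Word2Vec(corpus, vector_size=100, window=5, min_count=1,
                 sg=1)          # sg=1 selects Skip-gram; sg=0 selects CBOW
vector = model.wv["support"]    # a 100-dimensional word vector
similar = model.wv.most_similar("support", topn=3)
```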

Continuous Bag-of-Words Model (CBOW) Predicts the current word from its surrounding context words; the context vectors are averaged, so word order does not influence the prediction.

Skip-gram Model Predicts the surrounding context words given the current word, the mirror image of CBOW.
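For reference, the Skip-gram training objective from Mikolov et al. (2013): given a training sequence w_1, …, w_T, maximize the average log-probability of the context words within a window of size c around each word:

```latex
\frac{1}{T} \sum_{t=1}^{T} \sum_{-c \le j \le c,\; j \ne 0} \log p(w_{t+j} \mid w_t)
```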

Softmax
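In word2vec this probability is a softmax over the whole vocabulary of W words, where v_w and v'_w are the input and output vector representations of w; evaluating it costs time proportional to W, which motivates the next slide:

```latex
p(w_O \mid w_I) =
  \frac{\exp\!\left({v'_{w_O}}^{\top} v_{w_I}\right)}
       {\sum_{w=1}^{W} \exp\!\left({v'_w}^{\top} v_{w_I}\right)}
```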

Hierarchical Softmax Uses a binary tree representation of the output layer with the W words as its leaves. Each word w can be reached by an appropriate path from the root of the tree, so evaluating its probability requires only about log2(W) node evaluations instead of W.
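For reference, the hierarchical softmax probability from Mikolov et al. (2013): n(w, j) is the j-th node on the path from the root to w, L(w) is the path length, ch(n) is an arbitrary fixed child of n, and [[x]] is 1 if x is true and -1 otherwise:

```latex
p(w \mid w_I) = \prod_{j=1}^{L(w)-1}
  \sigma\!\left( [\![\, n(w, j+1) = \mathrm{ch}(n(w, j)) \,]\!]
                 \cdot {v'_{n(w,j)}}^{\top} v_{w_I} \right)
```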

Sentence Representations Now we have representations for words, but how can we represent sentences with them? A recursive deep learning model was proposed by the Stanford Natural Language Processing Group.

Recursive Autoencoder
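A minimal sketch of the recursive autoencoder's composition step (illustrative numpy code with random placeholder weights, not the Stanford implementation; a greedy merge order is used here for simplicity, whereas the paraphrase model follows the parse tree):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 100                                            # word-vector dimensionality
W_enc = rng.normal(scale=0.01, size=(d, 2 * d))    # encoder weights
W_dec = rng.normal(scale=0.01, size=(2 * d, d))    # decoder weights

def compose(c1, c2):
    """Merge two child vectors into a parent vector and score it."""
    parent = np.tanh(W_enc @ np.concatenate([c1, c2]))
    # Decode the parent back into its children; the reconstruction
    # error is the training signal for the autoencoder.
    recon = np.tanh(W_dec @ parent)
    error = np.sum((recon - np.concatenate([c1, c2])) ** 2)
    return parent, error

# Greedy bottom-up pass over a sentence of word vectors: repeatedly
# merge the adjacent pair with the lowest reconstruction error until
# a single vector represents the whole sentence.
words = [rng.normal(size=d) for _ in range(5)]
while len(words) > 1:
    pairs = [compose(words[i], words[i + 1]) for i in range(len(words) - 1)]
    i = int(np.argmin([e for _, e in pairs]))
    words[i:i + 2] = [pairs[i][0]]
sentence_vector = words[0]
```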

Dynamic Pooling
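A minimal sketch of dynamic pooling (illustrative; assumes the pairwise distance matrix between the two sentences' tree-node vectors is already computed, and that each sentence contributes at least n_p nodes):

```python
import numpy as np

def dynamic_pool(dist, n_p=15):
    """Pool a variable-size distance matrix into a fixed n_p x n_p grid.

    dist holds pairwise distances between all tree-node vectors of two
    sentences, so its shape varies with sentence length; min-pooling
    over a roughly even grid of regions yields a fixed-size matrix.
    """
    rows = np.array_split(np.arange(dist.shape[0]), n_p)
    cols = np.array_split(np.arange(dist.shape[1]), n_p)
    pooled = np.empty((n_p, n_p))
    for i, r in enumerate(rows):
        for j, c in enumerate(cols):
            pooled[i, j] = dist[np.ix_(r, c)].min()
    return pooled
```

The pooled matrix has the same shape for every sentence pair, so it can be fed to a fixed-size classifier for paraphrase detection.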

Paraphrase Identification (State of the art)