Semantic History Embedding in Online Generative Topic Models
Pu Wang (presenter)
Authors: Loulwah AlSumait, Daniel Barbará, Carlotta Domeniconi
Department of Computer Science, George Mason University
SDM 2009

Outline
- Introduction and related work
- Online LDA (OLDA)
- Parameter generation: sliding history window, contribution weights
- Experiments
- Conclusion and future work

Introduction
- When a topic is observed at a certain time, it is more likely to appear again in the future.
- Previously discovered topics therefore hold important information about the underlying structure of the data.
- Incorporating this information into future knowledge discovery can enhance the inferred topics.

Related Work
- Q. Sun, R. Li et al. (ACL): an LDA-based Fisher kernel to measure the semantic similarity between blocks of text in LDA documents
- X. Wang et al. (ICDM 2007): the Topical N-Gram model, which automatically identifies feasible N-grams based on the context that surrounds them
- X. Phan et al. (IW3C2): a classifier trained on a small set of labeled documents together with an LDA topic model estimated from Wikipedia

Tracking Topics: Online LDA (OLDA)
[Graphical model of OLDA over two consecutive streams S_t and S_{t+1} (time between t and t+1 = ε): each stream is modeled by LDA with its own priors α^t and β^t, and the inferred topic structure feeds three tasks: topic evolution tracking, priors construction for the next stream, and emerging topic detection (an emerging topic list).]

Inference Process
- Parameter generation: the priors for the current stream are constructed from historic observations.
- Given these priors, inference on the current stream reduces to a simple inference problem, solved with Gibbs sampling.
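Not in the slides: a minimal sketch of this inference step in Python/NumPy (the authors used Matlab), showing collapsed Gibbs sampling for one stream. `gibbs_lda` and its arguments are hypothetical names, and `beta_prior` stands for the history-derived priors constructed on the following slides.

```python
import numpy as np

def gibbs_lda(docs, K, V, beta_prior, alpha=0.1, iters=200, seed=0):
    """Collapsed Gibbs sampling for one stream.
    docs: list of word-id lists; beta_prior: K x V prior matrix
    (in OLDA, generated from the topic history)."""
    rng = np.random.default_rng(seed)
    D = len(docs)
    ndk = np.zeros((D, K))                    # document-topic counts
    nkw = np.zeros((K, V))                    # topic-word counts
    nk = np.zeros(K)                          # topic totals
    z = [rng.integers(K, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):            # initialize counts
        for i, w in enumerate(doc):
            k = z[d][i]
            ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    beta_sum = beta_prior.sum(axis=1)         # per-topic prior mass
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                   # remove the current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta_prior[:, w]) / (nk + beta_sum)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k                   # resample and restore counts
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    phi = (nkw + beta_prior) / (nk + beta_sum)[:, None]  # topic-word estimates
    return phi, ndk
```

The returned `phi` (one word distribution per topic) is exactly what the later slides feed back into the history.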

Topic Evolution Tracking
- Topic alignment over time
- Handles changes in lexicon and topic drift

Example of topics aligned over time (P(topic), then top words with P(word|topic)):
Time t:   Topic 1 (0.65): bank (0.44), money (0.35), loan (0.21)
          Topic 2 (0.35): factory (0.53), production (0.34), labor (0.13)
Time t+1: Topic 1 (0.43): bank (0.50), credit (0.32), money (0.18)
          Topic 2 (0.57): factory (0.48), cost (0.32), manufacturing (0.20)

Sliding History Window
- Consider all topic-word distributions within a "sliding history window" of size δ
- Alternatives for keeping track of history at time t:
  - full memory: δ = t
  - short memory: δ = 1
  - intermediate memory: δ = c
- [Figure: the evolution matrix of a topic, with rows indexed by the dictionary and columns by time, records that topic's distribution over words across the window.]
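As a sketch of the bookkeeping this implies, one evolution matrix per topic can be stored as a fixed-length window of past topic-word distributions; the class name `EvolutionMatrix` and its interface are assumptions for illustration:

```python
from collections import deque
import numpy as np

class EvolutionMatrix:
    """Keeps the last delta word distributions of one topic
    (delta = t gives full memory, 1 short memory, c intermediate)."""
    def __init__(self, V, delta):
        self.V = V
        self.cols = deque(maxlen=delta)     # the oldest column drops out automatically
    def push(self, phi_k):
        assert phi_k.shape == (self.V,)
        self.cols.append(phi_k.copy())      # append this stream's distribution
    def as_matrix(self):
        return np.stack(self.cols, axis=1)  # V x (number of columns kept so far)
```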

Contribution Control
- Evolution tuning parameters ω: individual weights of the models in the window
  - Decaying history: ω_1 < ω_2 < … < ω_δ (recent streams weigh more)
  - Equal contributions: ω_1 = ω_2 = … = ω_δ
- Total weight of history (vs. weight of the new observations):
  - Balanced weights (sum = 1)
  - Biased toward the past (sum > 1)
  - Biased toward the future (sum < 1)
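One way to build such a weight vector ω is sketched below; the linear ramp used for the decaying scheme is an illustrative assumption, since the slide fixes only the ordering of the weights, not their shape:

```python
import numpy as np

def history_weights(delta, scheme="decay", total=1.0):
    """Weights over the delta models in the window.
    scheme="equal": omega_1 = ... = omega_delta
    scheme="decay": omega_1 < ... < omega_delta (recent streams count more)
    total: 1 for balanced, > 1 biased toward the past, < 1 toward the future."""
    w = np.ones(delta) if scheme == "equal" else np.arange(1, delta + 1, dtype=float)
    return total * w / w.sum()
```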

Parameter Generation
- Priors on the topic distributions over words at time t+1: for each topic k, β_k^{t+1} is a weighted combination of the columns of its evolution matrix, β_k^{t+1} = B_k^t ω
- The topic distribution at t+1 is then generated from this prior
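Putting the two previous sketches together, the prior-generation step could look like the following (a hedged sketch, not the authors' implementation):

```python
import numpy as np

def beta_prior_next(evo_matrices, omega):
    """beta_k^{t+1} = B_k^t @ omega for each topic k: a weighted combination
    of the columns of that topic's evolution matrix. The sum of omega sets
    how much pseudo-count mass the history contributes relative to the
    new observations."""
    rows = []
    for m in evo_matrices:            # one EvolutionMatrix per topic
        B = m.as_matrix()             # V x (columns kept so far)
        w = omega[-B.shape[1]:]       # align weights with the columns available
        rows.append(B @ w)
    return np.stack(rows)             # K x V prior, usable as beta_prior above
```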

Experimental Design
- Implementation: "Matlab Topic Modeling Toolbox" by Mark Steyvers and Tom Griffiths
- Datasets:
  - NIPS proceedings: 1,740 papers, 13,649 unique words, 2,301,375 word tokens; 13 streams of 90 to 250 documents each
  - Reuters: news from 26-FEB-1987 to 19-OCT-1987; 10,337 documents, 12,112 unique words, 793,936 word tokens; 30 streams (29 of 340 documents, 1 of 517)
- Baselines:
  - OLDA_fixed: no memory
  - OLDA (ω(1)): short memory
- Performance evaluation: perplexity, measured on the documents of the next year or stream
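For reference, a standard perplexity computation on a held-out stream is sketched below, assuming per-document mixtures `theta` and topic-word distributions `phi` from the trained model; the paper's exact evaluation protocol may differ in detail:

```python
import numpy as np

def perplexity(docs, theta, phi):
    """exp(-log-likelihood per token); lower is better.
    docs: list of word-id lists; theta: D x K mixtures; phi: K x V topics."""
    log_lik, n_tokens = 0.0, 0
    for d, doc in enumerate(docs):
        p_w = theta[d] @ phi          # p(w|d) for every word in the vocabulary
        log_lik += np.log(p_w[doc]).sum()
        n_tokens += len(doc)
    return float(np.exp(-log_lik / n_tokens))
```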

Reuters: OLDA with fixed β vs. OLDA with semantic β
[Perplexity plot; the no-memory (fixed-β) baseline is marked.]

Reuters: OLDA with different window sizes and weights
- Increasing the window size enhanced prediction
- Incremental history information (δ > 1, sum > 1) did not improve topic estimation at all
[Perplexity plot comparing short memory, equal contributions, and incremental history information as the window size grows.]

NIPS: OLDA with different window sizes
- Increasing the window size enhanced prediction w.r.t. short memory
- Window sizes greater than 3 enhanced prediction
[Perplexity plot with no-memory and short-memory baselines, also showing the effect of the total weight.]

NIPS: OLDA with different total weights
- Models with a lower total weight resulted in better prediction
[Perplexity plot: no-memory baseline, sum of weights = 1, and decreasing sums of weights.]

NIPS & Reuters OLDA with Different Total Weight Variable sum(ω) δ = 2 Decrease total sum of weights Increase total sum of weights

NIPS: OLDA with equal vs. decaying history contributions
[Perplexity plot.]

Conclusions
- Studied the effect of embedding semantic information in LDA topic modeling of text streams
- Parameter generation is based on the topical structures inferred in the past
- Semantic embedding enhances OLDA prediction
- Examined the effects of the total influence of the history, the history window size, and equal vs. decaying contributions
- Future work:
  - use of prior knowledge
  - effect of embedded historic semantics on detecting emerging and/or periodic topics