Joint Inference of Multiple Label Types in Large Networks
Deepayan Chakrabarti, Stanislav Funiak, Jonathan Chang, Sofus A. Macskassy

Profile Inference  Profile:  Hometown: Palo Alto  High School: Gunn  College: Stanford  Employer: Facebook  Current city: Sunnyvale  Hobbies, Politics, Music, …  A complete profile is a boon:  People are easily searchable  Tailored news recommendations  Group recommendations  Ad targeting (especially local)  How can we fill in missing profile fields? ? ? ? 2

Profile Inference  Use the social network  and the assumption of homophily  Friendships form between “similar” people Infer missing labels to maximize similarity u v1v1 v2v2 v3v3 v4v4 v5v5 H = Palo Alto E = Microsoft H = MPK E = FB H = Atlanta E = Google H = Palo Alto E = ? H = ? E = ? 3

Previous Work  Random walks [Talukdar+/09, Baluja+/08]  Statistical Relational Learning [Lu+/03, Macskassy+/07]  Relational Dependency Networks [Neville+/07]  Latent models [Palla+/12]  Either:  too generic; require too much labeled data;  do not handle multiple label types;  are outperformed by label propagation [Macskassy+/07] 4

Previous Work  Label Propagation [Zhu+/02, Macskassy+/07]  “Propagate” labels through the network  Probability (I have hometown H) = fraction of my friends whose hometown is H  Iterate until convergence  Repeat for current city, college, and all other label types u v1v1 v2v2 v3v3 v4v4 v5v5 H = Palo Alto H = MPK H = Atlanta H = Palo Alto H = ? H = Palo Alto (0.5) MPK (0.25) Atlanta (0.25) H = Palo Alto (…) MPK (…) Atlanta (…) 5

Problem
[Figure: u has H = ?, CC = ?; u’s hometown friends share H = Calcutta, while other friends have CC = Bangalore and CC = Berkeley; per-type propagation predicts H = Calcutta, CC = Bangalore for u]
• Interactions between label types are not considered

The EdgeExplain Model  Instead of taking friendships as given, explain friendships using labels  A friendship u ∼ v is explained if: u and v share the same hometown OR current city OR high school OR college OR employer 7

The EdgeExplain Model
[Figure: u’s hometown friends share H = Calcutta; a current-city friend has CC = Berkeley; another friend has H = Calcutta, CC = Bangalore]
• We set H and CC so as to jointly explain all friendships

The EdgeExplain Model
• Find f to maximize ∏_{u∼v} explained(f_u, f_v)
  – explain all friendships
  – via a “soft” OR over label types
  – f_u holds one probability distribution per label type for user u

The EdgeExplain Model
• Find f to maximize ∏_{u∼v} explained(f_u, f_v)
• explained(f_u, f_v) = softmax_{t∈T}( is_reason_t(f_ut, f_vt) )   — is u ∼ v explained by some label type t?
• is_reason_t(f_ut, f_vt) = ∑_{ℓ∈L(t)} f_utℓ · f_vtℓ   — chance of sharing a label of type t
• softmax_{t∈T}( is_reason_t(f_ut, f_vt) ) = σ( α · ∑_{t∈T} is_reason_t(f_ut, f_vt) + c )   — sigmoid for the softmax
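The three formulas above can be sketched directly in code. Note that α and c are free parameters of the model; the values below (α = 10, c = −1) are illustrative assumptions, not fitted values from the slides:

```python
import math

def is_reason(f_ut, f_vt):
    """is_reason_t: chance that u and v share a label of type t, i.e.
    the dot product of their label distributions for that type."""
    return sum(p * f_vt.get(lbl, 0.0) for lbl, p in f_ut.items())

def explained(f_u, f_v, alpha=10.0, c=-1.0):
    """Soft OR over label types: a sigmoid of the summed per-type
    agreement (alpha and c here are illustrative, not fitted)."""
    s = sum(is_reason(f_u[t], f_v[t]) for t in f_u)
    return 1.0 / (1.0 + math.exp(-(alpha * s + c)))

# Two users who share a hometown but not a current city: one shared
# label type is already enough to (softly) explain the friendship.
f_u = {"hometown": {"Calcutta": 1.0}, "current_city": {"Berkeley": 1.0}}
f_v = {"hometown": {"Calcutta": 1.0}, "current_city": {"Bangalore": 1.0}}
print(explained(f_u, f_v))  # close to 1
```

Because the objective multiplies explained(f_u, f_v) over all edges, labels are chosen jointly across types rather than one type at a time.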

The EdgeExplain Model
[Figure: u starts with H = ?, CC = ?; friends have H = Calcutta / CC = Bangalore and CC = Berkeley; bars show each edge’s ∑_t is_reason_t under the softmax formula above]

The EdgeExplain Model
[Figure: setting H = Calcutta raises ∑_t is_reason_t on u’s hometown friendships; CC is still unset]

The EdgeExplain Model
[Figure: adding CC = Bangalore gives only a small marginal gain, since those edges are already explained by H = Calcutta]

The EdgeExplain Model
[Figure: CC = Berkeley yields more gain, since it explains the otherwise-unexplained current-city friendship]

The EdgeExplain Model
• α controls the slope of the sigmoid:
  – high α → steep → one reason per edge is enough
  – low α → linear → consider multiple reasons per edge
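The effect of α is easy to see numerically (a small sketch; the α values below are arbitrary illustrations): with a steep sigmoid, a second or third shared label type adds almost nothing, while a near-linear one keeps rewarding extra reasons.

```python
import math

def soft_or(reason_sum, alpha, c=0.0):
    """Sigmoid soft-OR over the summed is_reason_t values."""
    return 1.0 / (1.0 + math.exp(-(alpha * reason_sum + c)))

# High alpha saturates after one shared type (one reason per edge is
# enough); low alpha stays roughly linear, crediting extra reasons.
for n_reasons in (1, 2, 3):
    print(n_reasons,
          round(soft_or(n_reasons, alpha=10.0), 4),  # steep
          round(soft_or(n_reasons, alpha=0.5), 4))   # near-linear
```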

Experiments  1.1B users of the Facebook social network  O( 10M ) labels  5-fold cross-validation  Measure recall  Did we get the correct label in our top prediction? Top-3?  Inference:  proximal gradient descent  implemented via message-passing in Apache Giraph [Ching/13]  Sparsify graph by considering K closest friends by age 16

Results (varying closest friends K)
• K = 100 or K = 200 closest friends is best
• K = 400 hurts; these friendships are probably due to other factors
[Chart: lift of EdgeExplain over K = 20 for hometown, current city, high school, college, and employer]

Results (versus Label Propagation)
• Joint modeling helps most for employer
• Significant gains for high school and college as well
[Chart: lift of EdgeExplain over Label Propagation for hometown, current city, high school, college, and employer]

Conclusions  Assumption: each friendship has one reason  Model: explain friendships via user attributes  Results: up to 120% lift for 1 and 60% for 19

Results (effect of α)
• High α is best: one reason per friendship is enough
[Chart: lift of EdgeExplain over α = 0.1 for hometown, current city, high school, college, and employer]
