
Progress Report ekker

Problem Definition In cases such as object recognition, we cannot include all possible objects in the training set, so transfer learning can deal with this kind of problem. Here we divide the complete transfer-learning task into two steps: node (link) classification, then transfer to another domain.

Related Solution Graph labeling – SNARE: A Link Analytic System for Graph Labeling and Risk Detection, Mary McGlohon et al., KDD. Markov Logic Network – Markov Logic Networks, Matthew Richardson and Pedro Domingos, Machine Learning 2006.

Overview of Graph Labeling

Given: 1. A graph G = (V, E), where V is the set of entities and E the set of interactions between them. 2. A binary class label X. 3. A set of flags based on node attributes. Output: a mapping between each node and its class label. Information about a node is inferred from its neighbors.

Overview of Graph Labeling Information about a node v_i is inferred from its neighbors v_j. Upon convergence, belief scores are determined by the node-i potential φ_i(x_i), the edge potentials ψ_ij between i and j, and the messages m_{ji} sent to node i: b_i(x_i) ∝ φ_i(x_i) · Π_{j∈N(i)} m_{ji}(x_i), where each message folds in the edge potential ψ_ij.
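This message-passing scheme can be sketched as loopy belief propagation over binary labels. The chain graph, the node potentials, and the homophily edge potential below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def belief_propagation(edges, node_pot, edge_pot, n_iters=20):
    """Loopy belief propagation for binary node labels.

    edges:    list of undirected (i, j) pairs
    node_pot: (n, 2) array, node potentials phi_i(x_i)
    edge_pot: (2, 2) array, edge potential psi(x_i, x_j), shared by all edges
    """
    n = node_pot.shape[0]
    # m[(j, i)] is the message from j to i, indexed by the label of i
    msgs = {(j, i): np.ones(2) for (a, b) in edges for (j, i) in [(a, b), (b, a)]}
    nbrs = {i: [] for i in range(n)}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)

    for _ in range(n_iters):
        new = {}
        for (j, i) in msgs:
            # product of messages into j from all of j's neighbors except i
            incoming = np.ones(2)
            for k in nbrs[j]:
                if k != i:
                    incoming *= msgs[(k, j)]
            m = edge_pot.T @ (node_pot[j] * incoming)
            new[(j, i)] = m / m.sum()  # normalize for numerical stability
        msgs = new

    # belief: b_i(x_i) proportional to phi_i(x_i) * product of incoming messages
    beliefs = node_pot.copy()
    for i in range(n):
        for j in nbrs[i]:
            beliefs[i] *= msgs[(j, i)]
    return beliefs / beliefs.sum(axis=1, keepdims=True)

# toy 3-node chain: node 0 strongly prefers label 0; homophily edge potential
node_pot = np.array([[0.9, 0.1], [0.5, 0.5], [0.5, 0.5]])
edge_pot = np.array([[0.8, 0.2], [0.2, 0.8]])
b = belief_propagation([(0, 1), (1, 2)], node_pot, edge_pot)
```

With the homophily potential, node 0's strong prior propagates along the chain, so the uncommitted nodes also lean toward label 0.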

Overview of Markov Logic Network Use first-order logic to capture the relations (attributes) of the data. Use the entities (constants in predicates) and the formulas to build up the MLN network. Learn the weight of each formula. Use the MLN to infer the probability of a query.

Overview of Markov Logic Network Two constants: Anna (A) and Bob (B). Ground atoms: Smokes(A), Smokes(B), Cancer(A), Cancer(B), Friends(A,A), Friends(A,B), Friends(B,A), Friends(B,B). Each formula carries a weight. Use the MLN to infer queries such as P(Smokes(A) => Cancer(A) | MLN).
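As a rough illustration of how such a query could be answered, the sketch below enumerates all 2^8 truth assignments of the ground atoms above and scores each world x by exp(Σ_i w_i n_i(x)), where n_i(x) counts the true groundings of formula i. The two formulas (Smokes(x) ⇒ Cancer(x) and Friends(x,y) ⇒ (Smokes(x) ⇔ Smokes(y))) follow the classic smoking example; the weight values are assumed, not learned, and the query shown is P(Cancer(A) | Smokes(A)) rather than the implication query on the slide:

```python
import math
from itertools import product

# Ground atoms for constants A, B (illustrative smoking example)
ATOMS = ["Sm_A", "Sm_B", "Ca_A", "Ca_B", "Fr_AA", "Fr_AB", "Fr_BA", "Fr_BB"]
WEIGHTS = [1.5, 1.1]  # assumed formula weights, not learned here

def implies(p, q):
    return (not p) or q

def formula_counts(w):
    """n_i(x): number of true groundings of each weighted formula in world w."""
    n1 = sum(implies(w["Sm_" + c], w["Ca_" + c]) for c in "AB")
    n2 = sum(implies(w["Fr_" + a + b], w["Sm_" + a] == w["Sm_" + b])
             for a in "AB" for b in "AB")
    return [n1, n2]

def world_weight(w):
    # unnormalized probability of world x: exp(sum_i w_i * n_i(x))
    return math.exp(sum(wi * ni for wi, ni in zip(WEIGHTS, formula_counts(w))))

def prob(query, evidence):
    """Exact P(query | evidence) by enumerating all 2^8 possible worlds."""
    num = den = 0.0
    for bits in product([False, True], repeat=len(ATOMS)):
        w = dict(zip(ATOMS, bits))
        if all(w[k] == v for k, v in evidence.items()):
            wt = world_weight(w)
            den += wt
            if w[query]:
                num += wt
    return num / den

p = prob("Ca_A", {"Sm_A": True})  # P(Cancer(A) | Smokes(A))
```

Exhaustive enumeration is only feasible for toy domains like this one; real MLN engines rely on MCMC or lifted inference instead.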

Ideas An MLN uses weights and first-order logic to capture the characteristics of the data. Could we extend the graph-labeling method to be more general? – In real data, there is more than one type of relation between nodes, and the node labels are not only binary either. => How to do graph labeling on a heterogeneous network.

Recommendation over a Heterogeneous Social Network Recommendation over a heterogeneous social network, Jin Zhang, Jie Tang, et al., WAIM 2008. This paper's goal is to investigate recommendation on a general heterogeneous Web social network. – Browsing: make recommendations while a person is browsing one object. – Search: recommend objects of different types when a person searches for one type of object by a query.

Approach Global importance estimation. – Similar to PageRank. – Concerned with a homogeneous graph.
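This PageRank-style global importance estimation can be sketched by power iteration on a toy directed graph; the graph and the damping factor below are illustrative:

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Global importance scores by power iteration (PageRank-style).

    adj: (n, n) adjacency matrix, adj[i, j] = 1 if i links to j.
    d:   damping factor.
    """
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    # row-normalize; dangling nodes (no out-links) jump uniformly
    P = np.where(out > 0, adj / np.where(out == 0, 1, out), 1.0 / n).T
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (P @ r)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

# toy graph: 0 -> 1, 0 -> 2, 1 -> 2, 2 -> 0
adj = np.array([[0, 1, 1],
                [0, 0, 1],
                [1, 0, 0]], dtype=float)
r = pagerank(adj)
```

Node 2 ends up most important: it receives links from both other nodes, while node 1 receives only half of node 0's weight.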

Pair-wise Learning Algorithm Build up a transition graph connecting the homogeneous subgraphs (one per node type).

Pair-wise Learning Algorithm Build up a transition matrix between each pair of node types. – For example, for the graph in the previous figure we may have 13 transition matrices. The algorithm can then use the transition probabilities and the transition matrices to compute the scores. – But to compute the scores, we first need to compute the transition probabilities.
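One way to sketch the score computation is to assemble the per-type-pair transition matrices, scaled by their transition probabilities λ, into one block matrix and run a random walk with restart. The author/paper schema, the matrices, and the λ values below are hypothetical, chosen only to make the mechanics concrete:

```python
import numpy as np

def hetero_importance(blocks, lam, sizes, order, restart=0.15, n_iter=500):
    """Importance scores on a heterogeneous graph.

    blocks[(s, t)]: row-stochastic transition matrix from nodes of type s
                    to nodes of type t.
    lam[(s, t)]:    probability of moving from type s to type t
                    (sum over t of lam[(s, t)] = 1 for each s) -- these are
                    the transition probabilities the algorithm learns.
    sizes, order:   node count per type, and type layout in the big matrix.
    """
    offset, n = {}, 0
    for ty in order:
        offset[ty] = n
        n += sizes[ty]
    P = np.zeros((n, n))
    for (s, t), m in blocks.items():
        P[offset[s]:offset[s] + sizes[s],
          offset[t]:offset[t] + sizes[t]] = lam[(s, t)] * m
    r = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        r = restart / n + (1 - restart) * (P.T @ r)
    return r

# hypothetical toy schema: 2 authors (A), 3 papers (P)
A_P = np.array([[0.5, 0.5, 0.0],
                [0.0, 0.5, 0.5]])      # author -> papers written
P_A = np.array([[1.0, 0.0],
                [0.5, 0.5],
                [0.0, 1.0]])           # paper -> authors
P_P = np.array([[0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0],
                [1.0, 0.0, 0.0]])      # paper -> cited paper
blocks = {("A", "P"): A_P, ("P", "A"): P_A, ("P", "P"): P_P}
lam = {("A", "P"): 1.0, ("P", "A"): 0.4, ("P", "P"): 0.6}
r = hetero_importance(blocks, lam, {"A": 2, "P": 3}, ["A", "P"])
```

Because each λ row sums to one and each block is row-stochastic, the combined matrix stays row-stochastic and the scores form a proper distribution.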

Pair-wise Learning Algorithm To learn the transition probabilities λ: – Use training data A = {(i, j)}, selected pairs of objects of the same type where the importance score of i is larger than that of j. – Try to make the importance scores produced by the random-walk algorithm consistent with the training data.
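A crude sketch of this learning step: with two relation types interpolated by a single λ, grid-search for the λ whose random-walk scores violate the fewest training pairs. The matrices, the training pairs, and the grid search are illustrative only; the paper's actual method may optimize λ differently (e.g. with a gradient-based objective):

```python
import numpy as np

def scores(P, restart=0.15, n_iter=300):
    """Stationary importance scores of a row-stochastic matrix P."""
    n = P.shape[0]
    r = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        r = restart / n + (1 - restart) * (P.T @ r)
    return r

def learn_lambda(P1, P2, pairs, grid=np.linspace(0, 1, 101)):
    """Pick lambda so the random-walk scores respect the training pairs.

    P1, P2: two row-stochastic transition matrices (e.g., two relation
            types); the walk follows P1 w.p. lambda and P2 w.p. 1-lambda.
    pairs:  training set A = {(i, j)} where object i should score
            higher than object j.
    """
    best_lam, best_viol = None, None
    for lam in grid:
        r = scores(lam * P1 + (1 - lam) * P2)
        viol = sum(r[i] <= r[j] for i, j in pairs)  # violated pairs
        if best_viol is None or viol < best_viol:
            best_lam, best_viol = lam, viol
    return best_lam, best_viol

# toy relations: P1 concentrates importance on nodes 0 and 1, P2 on node 2
P1 = np.array([[0, 1, 0], [1, 0, 0], [1, 0, 0]], dtype=float)
P2 = np.array([[0, 0, 1], [0, 0, 1], [0, 0, 1]], dtype=float)
lam, viol = learn_lambda(P1, P2, pairs=[(0, 2), (1, 2)])
```

Since the training pairs say nodes 0 and 1 outrank node 2, the search settles on a λ that weights the first relation heavily enough to satisfy both pairs.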