1
DeepWalk: Online Learning of Social Representations
Bryan Perozzi, Rami Al-Rfou, Steven Skiena (Stony Brook University), KDD 2014
2
Graph Representation
For machine learning tasks on graphs, the first step is to extract features. The adjacency matrix is the obvious choice, but it is too sparse and high-dimensional. Instead, we can create features by transforming the adjacency matrix into a lower-dimensional latent representation.
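As an illustration of what "lower-dimensional latent representation" means, here is a minimal sketch using truncated SVD, a classical baseline (not DeepWalk's method); the tiny graph is illustrative:

```python
import numpy as np

# A tiny undirected graph as a sparse, binary adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Truncated SVD: keep the top-d singular directions as a d-dimensional
# latent representation of each node (one row per node).
d = 2
U, S, _ = np.linalg.svd(A)
embedding = U[:, :d] * S[:d]      # shape: (num_nodes, d)
print(embedding)
```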
3
Their algorithm: DeepWalk
DeepWalk learns a latent representation of a graph's vertices using deep learning techniques developed for language modeling
4
Example: Zachary's Karate Club network
5
Language modeling
Learn a representation of a word from documents (word co-occurrence), e.g., word2vec. The learned representations capture inherent structure; the classic example is the analogy vec(king) - vec(man) + vec(woman) ≈ vec(queen).
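A minimal sketch of that analogy query with gensim, assuming a pretrained word2vec model is available (the file name here is a placeholder):

```python
from gensim.models import KeyedVectors

# Load pretrained word vectors (path is a placeholder; any word2vec-format
# file works, e.g., the GoogleNews vectors).
wv = KeyedVectors.load_word2vec_format("vectors.bin", binary=True)

# king - man + woman ~= queen: vector arithmetic recovers the analogy.
print(wv.most_similar(positive=["king", "woman"], negative=["man"], topn=1))
```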
6
From language modeling to graphs
Idea: nodes <--> words; node sequences <--> sentences. Node sequences are generated with random walks: short random walks play the role of sentences. Connection: word frequency in a natural language corpus follows a power law, and vertex frequency in random walks on scale-free graphs also follows a power law, so the same modeling machinery applies (see the sketch below).
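A minimal sketch of this pipeline using networkx and gensim; the walk count, walk length, and Word2Vec hyperparameters are illustrative, not the paper's exact settings:

```python
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walks(G, num_walks=10, walk_length=40):
    """Generate short truncated random walks, one 'sentence' per walk."""
    walks = []
    for _ in range(num_walks):
        for node in G.nodes():
            walk = [node]
            while len(walk) < walk_length:
                walk.append(random.choice(list(G.neighbors(walk[-1]))))
            walks.append([str(v) for v in walk])  # Word2Vec expects strings
    return walks

G = nx.karate_club_graph()
walks = random_walks(G)

# Skip-gram (sg=1) over the walks: nodes play the role of words.
model = Word2Vec(walks, vector_size=64, window=5, sg=1, min_count=0)
print(model.wv["0"][:5])  # learned 64-d representation of node 0
```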
7
Framework
8
Representation Mapping: each vertex v ∈ V is mapped to a d-dimensional vector Φ(v)
9
Deep Learning Structure: Skip-gram model [Mikolov+2013]
Skip-gram: the input to the model is w_i, and the outputs are its context words, e.g., w_{i-1}, w_{i-2}, w_{i+1}, w_{i+2}. In DeepWalk, the input is the vertex representation Φ(v_i), and the model predicts the vertices that co-occur with v_i in the walk (see the worked example below).
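A small worked example of the (input, context) training pairs skip-gram extracts, here with window size 2 over a toy walk (the walk itself is illustrative):

```python
def skipgram_pairs(sentence, window=2):
    """Yield (input word, context word) training pairs for skip-gram."""
    for i, w in enumerate(sentence):
        for j in range(max(0, i - window), min(len(sentence), i + window + 1)):
            if j != i:
                yield (w, sentence[j])

walk = ["v1", "v2", "v3", "v4", "v5"]
for pair in skipgram_pairs(walk):
    print(pair)  # e.g., ('v3', 'v1'), ('v3', 'v2'), ('v3', 'v4'), ('v3', 'v5')
```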
10
Experiments: Node Classification
Some nodes have labels, some don't. Datasets: BlogCatalog, Flickr, YouTube.
11
Results: BlogCatalog
12
Learning to Rank Short Text Pairs with Convolutional Deep Neural Networks
Aliaksei Severyn and Alessandro Moschitti (Google Inc.; Qatar Computing Research Institute), SIGIR 2015
13
Learning to Rank Text Pairs
Task: given a pair of texts (e.g., a question and a candidate answer), extract similarity features and rank the candidate answers. Previous work on QA built complex feature-based models. This paper uses deep learning to avoid hand-designing complex similarity features.
14
Deep Learning Methods
Rely only on words as input: words are represented as vectors (word2vec), sentence representations are built automatically from the input words alone, and these representations capture syntactic/semantic information.
15
Architecture
16
ConvNet Sentence Model
Answer Part
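A minimal PyTorch sketch of a convolutional sentence model in the spirit of this architecture: a wide 1-D convolution over word embeddings followed by max pooling yields a fixed-size sentence vector. All dimensions and names are illustrative:

```python
import torch
import torch.nn as nn

class SentenceConvNet(nn.Module):
    def __init__(self, vocab_size, emb_dim=50, num_filters=100, width=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # Wide 1-D convolution over word positions, one feature map per filter.
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=width,
                              padding=width - 1)

    def forward(self, token_ids):            # token_ids: (batch, seq_len)
        x = self.emb(token_ids)              # (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))         # (batch, num_filters, seq_len')
        return x.max(dim=2).values           # max pooling -> (batch, num_filters)

net = SentenceConvNet(vocab_size=1000)
sentence = torch.randint(0, 1000, (2, 12))   # batch of 2 sentences, 12 tokens
print(net(sentence).shape)                   # torch.Size([2, 100])
```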
17
Encoding word overlap for answers
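The idea is to mark, for each word in the answer, whether it also occurs in the question, and feed these indicators alongside the word embeddings. A minimal sketch with naive whitespace tokenization (the helper name is ours):

```python
def overlap_features(question, answer):
    """1.0 for each answer word that also occurs in the question, else 0.0."""
    q_words = set(question.lower().split())
    return [1.0 if w in q_words else 0.0 for w in answer.lower().split()]

print(overlap_features("who wrote hamlet", "hamlet was written by shakespeare"))
# [1.0, 0.0, 0.0, 0.0, 0.0]
```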
18
Pair Similarity
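The paper computes pair similarity with a bilinear form, sim(x_q, x_a) = x_q^T M x_a, where M is a learned matrix. A minimal sketch, with random vectors standing in for the ConvNet sentence representations:

```python
import torch

d = 100                                      # sentence vector dimension (illustrative)
M = torch.randn(d, d, requires_grad=True)    # learned similarity matrix

x_q = torch.randn(d)                         # question representation from the ConvNet
x_a = torch.randn(d)                         # answer representation from the ConvNet

sim = x_q @ M @ x_a                          # a single scalar similarity score
print(sim.item())
```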
19
Pair Representation
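The pair representation concatenates the question vector, the similarity score, the answer vector, and any additional features into a single vector for the final hidden and softmax layers. A sketch continuing the snippet above; extra_feats is a hypothetical placeholder:

```python
extra_feats = torch.tensor([0.4])            # e.g., word-overlap counts (illustrative)

# Joint pair representation: [x_q; sim; x_a; extra features]
x_join = torch.cat([x_q, sim.reshape(1), x_a, extra_feats])
print(x_join.shape)                          # torch.Size([202])
```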
20
Experiments Data: Answer Sentence Selection [Wang et al. 2007]
21
Results