1 Inductive Representation Learning on Large Graphs
William L. Hamilton, Rex Ying, Jure Leskovec
Presented by Keshav Balasubramanian

2 Outline
- Main goal: generating node embeddings
- Survey of past methods
- GCNs
- GraphSAGE: algorithm, optimization and learning, aggregators
- Experiments and results
- Conclusion

3 Node Embeddings
- Fixed-length vector representations of nodes in a graph (similar to word embeddings)
- Can be used in downstream machine learning and graph mining tasks
- Need to learn small, dense embeddings from higher-dimensional graph information

4 Methods
- Non-deep-learning models: DeepWalk, node2vec
  - Random-walk-based approaches (uniform walks in DeepWalk, biased walks in node2vec)
  - Attempt to linearize a graph by treating the walks as sentences (see the sketch below)
- Deep learning models: vanilla graph neural networks, GCNs
  - Neural networks learn the representations
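To make the linearization idea concrete, here is a minimal Python sketch (the function, toy graph, and parameter names are illustrative, not taken from either paper): generate truncated random walks and treat each walk as a "sentence" of node tokens that a skip-gram model such as word2vec can embed. node2vec differs mainly in biasing the transition probabilities with its return and in-out parameters p and q.

```python
import random

def random_walks(adj, walk_length=10, walks_per_node=5, seed=0):
    """Generate unbiased truncated random walks (DeepWalk-style).

    adj: dict mapping each node to a list of its neighbors.
    Returns a list of walks; each walk is a list of node ids that
    can be treated as a 'sentence' for a skip-gram model.
    """
    rng = random.Random(seed)
    walks = []
    for _ in range(walks_per_node):
        for start in adj:
            walk = [start]
            while len(walk) < walk_length:
                nbrs = adj[walk[-1]]
                if not nbrs:              # dead end: stop the walk early
                    break
                walk.append(rng.choice(nbrs))
            walks.append(walk)
    return walks

# Toy graph: a 4-cycle. Feeding `walks` (nodes as tokens) to a
# word2vec implementation yields DeepWalk-style node embeddings.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
print(random_walks(adj, walk_length=5, walks_per_node=1)[:2])
```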

5 Graph Convolutional Networks
- Type of deep learning model that learns node representations
- Two approaches:
  - Spectral: operates in the spectral domain of the graph, specifically on the graph Laplacian and adjacency matrices
  - Spatial: convolution is defined directly on the spatial neighborhood of a node (see the sketch below)
- Drawbacks of the spectral approach:
  - Involves operating on the entire Laplacian
  - Transductive: generalizes poorly to unseen nodes
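As a point of reference, here is a NumPy sketch of the layer-wise propagation rule popularized by Kipf and Welling's GCN (a sketch under the usual formulation, not code from either paper). Note that every layer multiplies by the full normalized adjacency matrix, which is exactly the whole-graph dependence that makes the approach hard to apply inductively.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) learnable weights.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # node degrees
    D_inv_sqrt = np.diag(d ** -0.5)
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0)      # ReLU nonlinearity

# Toy example: 3-node path graph, 2-d features, 2-d output.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.random.randn(3, 2)
W = np.random.randn(2, 2)
print(gcn_layer(A, H, W).shape)  # (3, 2)
```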

6 GraphSAGE
- Spatial and inductive graph convolutional network
- Nodes learn from their neighborhood in the graph
- A fixed-size neighborhood is randomly sampled for each node (see the sampling sketch below)
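A minimal sketch of the fixed-size sampling step; the function name and the sample-with-replacement convention for low-degree nodes are my assumptions, not taken from the authors' code:

```python
import random

def sample_neighborhood(adj, v, S, rng=random):
    """Uniformly sample a fixed-size set of S neighbors of node v.

    Sampling with replacement when v has fewer than S neighbors keeps
    the per-node memory and compute footprint constant regardless of
    the true degree (assumed convention).
    """
    nbrs = adj[v]
    if len(nbrs) >= S:
        return rng.sample(nbrs, S)                  # without replacement
    return [rng.choice(nbrs) for _ in range(S)]     # with replacement
```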

7 Algorithm
GraphSAGE forward pass (Algorithm 1 in the paper)
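The transcript drops the pseudocode figure here, so the following is a NumPy sketch of the embedding-generation loop (Algorithm 1 in the paper) using a mean aggregator. Variable names and the toy usage are mine; a real implementation batches this computation and trains the weight matrices.

```python
import numpy as np

def graphsage_forward(adj, X, Ws, sample_size=5, rng=np.random.default_rng(0)):
    """Sketch of GraphSAGE embedding generation with a mean aggregator.

    adj: dict mapping each node to a non-empty list of neighbors
    X:   (n, d0) input feature matrix; h^0_v = x_v
    Ws:  list of K weight matrices; W_k has shape (2 * d_{k-1}, d_k)
    """
    h = {v: X[v] for v in adj}                        # h^0_v = x_v
    for W in Ws:                                      # layers k = 1..K
        h_next = {}
        for v in adj:
            # 1. sample a fixed-size neighborhood N_k(v), with replacement
            idx = rng.integers(len(adj[v]), size=sample_size)
            nbrs = [adj[v][i] for i in idx]
            # 2. aggregate the neighbors' layer-(k-1) states
            h_nbr = np.mean([h[u] for u in nbrs], axis=0)
            # 3. concatenate self and neighborhood states, transform, ReLU
            z = np.maximum(np.concatenate([h[v], h_nbr]) @ W, 0)
            # 4. l2-normalize the result, as in the paper
            h_next[v] = z / (np.linalg.norm(z) + 1e-8)
        h = h_next
    return h                                          # z_v = h^K_v

# Toy usage: 3-node path graph, 4-d features, two layers of width 4.
adj = {0: [1], 1: [0, 2], 2: [1]}
X = np.random.randn(3, 4)
Ws = [np.random.randn(8, 4), np.random.randn(8, 4)]
print(graphsage_forward(adj, X, Ws)[0].shape)  # (4,)
```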

8 Learning the parameters of the network
- Supervised:
  - Final embeddings are passed through a dense layer to get class probabilities (e.g., labelling the nodes of a graph)
  - End goal is labelling the nodes, not generating the embeddings
- Unsupervised:
  - Uses a loss function that ensures "nearby" nodes get similar representations and "far-away" nodes get dissimilar ones (see the sketch below)
  - Negative sampling enforces the latter
  - Nearness is defined by a co-occurrence score on fixed-length random walks
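The unsupervised objective is the paper's graph-based loss (Eq. 1). A small sketch, with illustrative shapes and Q value:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def unsupervised_loss(z_u, z_v, z_neg, Q=5.0):
    """Graph-based unsupervised loss for one positive pair:

    J(z_u) = -log(sigmoid(z_u . z_v))
             - Q * E_{v_n ~ P_n} [log(sigmoid(-z_u . z_{v_n}))]

    z_u, z_v: embeddings of nodes that co-occur on a fixed-length
              random walk (pulled together)
    z_neg:    (num_neg, d) embeddings of negative samples drawn from
              a noise distribution P_n (pushed apart)
    """
    pos = -np.log(sigmoid(z_u @ z_v))
    neg = -Q * np.mean(np.log(sigmoid(-(z_neg @ z_u))))
    return pos + neg
```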

9 Aggregators
- Mean: take the element-wise mean of the representations in {h_v^{k-1}} ∪ {h_u^{k-1}, ∀u ∈ N(v)}, then transform (a convolutional, GCN-like rule)
- Pooling: feed each neighbor's vector through a fully connected layer, then apply an element-wise max: AGGREGATE_k = max({σ(W_pool · h_u^{k-1} + b), ∀u ∈ N(v)})
- LSTM: run an LSTM over a random permutation of the neighbors (expressive, but not inherently permutation-invariant)
Sketches of the first two follow.
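Minimal NumPy sketches of the first two aggregators (the LSTM aggregator needs a recurrent cell and is omitted); function and parameter names are illustrative:

```python
import numpy as np

def mean_aggregator(h_self, h_nbrs, W):
    """Mean aggregator: element-wise mean of the node's own vector and
    its sampled neighbors' vectors, followed by a linear map + ReLU."""
    m = np.mean(np.vstack([h_self[None, :], h_nbrs]), axis=0)
    return np.maximum(m @ W, 0)

def pooling_aggregator(h_nbrs, W_pool, b_pool):
    """Pooling aggregator: each neighbor vector goes through a
    one-layer MLP, then an element-wise max is taken across neighbors,
    making the result independent of neighbor ordering."""
    return np.max(np.maximum(h_nbrs @ W_pool + b_pool, 0), axis=0)
```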

10 Experiments and Results
Evaluated on three inductive benchmarks: classifying papers in a Web of Science citation graph, predicting communities of Reddit posts, and classifying protein functions across disjoint protein-protein interaction (PPI) graphs. The GraphSAGE variants outperformed the DeepWalk and raw-feature baselines on all three.

11 Conclusion
Proposed GraphSAGE, an inductive framework that learns node representations with spatial graph convolutions, aggregating features from sampled local neighborhoods so that embeddings generalize to unseen nodes.

