Inductive Representation Learning on Large Graphs
William L. Hamilton, Rex Ying, Jure Leskovec
Presented by Keshav Balasubramanian

Outline
- Main goal: generating node embeddings
- Survey of past methods
- GCNs
- GraphSAGE
- Algorithm
- Optimization and learning
- Aggregators
- Experiments and results
- Conclusion

Node Embeddings
- Fixed-length vector representations of nodes in a graph (similar to word embeddings)
- Can be used in downstream machine learning and graph mining tasks
- Need to learn smaller, dense embeddings from higher-dimensional graph information

Methods
- Non-deep-learning models: DeepWalk, node2vec
  - Random-walk-based approaches (node2vec biases the walks)
  - Attempt to linearize the graph by treating the walks as sentences (see the sketch below)
- Deep learning models: vanilla graph neural networks, GCNs
  - Neural networks learn the representations
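A minimal sketch of the random-walk "linearization" idea behind DeepWalk and node2vec (node2vec additionally biases the transition probabilities); the resulting walks are fed to a word2vec-style skip-gram model as if they were sentences. Function and variable names here are illustrative, not taken from either paper's code.

```python
import random

def random_walk(adj, start, length, rng=random):
    """Unbiased walk: at each step, move to a uniformly random neighbor."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:
            break
        walk.append(rng.choice(neighbors))
    return walk

def build_corpus(adj, walks_per_node=10, length=40):
    """Each walk is one 'sentence'; node IDs play the role of words."""
    return [random_walk(adj, node, length)
            for node in adj
            for _ in range(walks_per_node)]
```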

Graph Convolutional Networks
- Type of deep learning model that learns node representations
- Two approaches:
  - Spectral: operates in the spectral domain of the graph, specifically on the graph Laplacian and adjacency matrices (a minimal layer sketch follows this list)
  - Spatial: the convolution is defined directly on the spatial neighborhood of a node
- Drawbacks of the spectral approach:
  - Requires operating on the entire Laplacian, i.e., the whole graph
  - Transductive; generalizes poorly to unseen nodes
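As an illustration of the spectrally motivated family, here is a minimal NumPy sketch of a single GCN layer of the Kipf & Welling form H' = ReLU(Â H W), with Â = D^{-1/2}(A + I)D^{-1/2}; variable names are illustrative. Note that it needs the full normalized adjacency matrix, which is exactly the drawback noted above.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt       # symmetric normalization
    return np.maximum(0.0, A_norm @ H @ W)         # ReLU nonlinearity
```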

GraphSAGE
- A spatial, inductive graph convolutional network
- Nodes learn from their neighborhood in the graph
- A fixed-size neighborhood is randomly sampled for each node

Algorithm
GraphSAGE forward pass: initialize each node's representation with its input features; then, for each of K layers, aggregate the sampled neighbors' representations, concatenate the result with the node's own representation, apply a weight matrix and a nonlinearity, and L2-normalize. The final layer's outputs are the node embeddings (see the sketch below).
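A minimal NumPy sketch of this forward pass, assuming the mean aggregator; adj (a dict mapping each node to its neighbor list), features, weights, and sample_size are illustrative names, not from the authors' released code.

```python
import numpy as np

def sample_neighbors(adj, v, sample_size, rng):
    """Uniformly sample a fixed-size neighborhood (with replacement if the node
    has fewer neighbors than sample_size; isolated nodes fall back to a self-loop)."""
    neigh = adj[v] if adj[v] else [v]
    return rng.choice(neigh, size=sample_size, replace=len(neigh) < sample_size)

def graphsage_forward(adj, features, weights, sample_size=10, seed=0):
    """weights: one matrix per layer; weights[k] has shape (2 * d_in, d_out),
    where d_in is the representation size produced by the previous layer."""
    rng = np.random.default_rng(seed)
    h = {v: features[v] for v in adj}                         # h^0 = input features
    for W in weights:                                         # layers k = 1 .. K
        h_next = {}
        for v in adj:
            neigh = sample_neighbors(adj, v, sample_size, rng)
            h_neigh = np.mean([h[u] for u in neigh], axis=0)  # AGGREGATE (mean)
            concat = np.concatenate([h[v], h_neigh])          # CONCAT(h_v, h_N(v))
            out = np.maximum(0.0, W.T @ concat)               # sigma = ReLU
            h_next[v] = out / (np.linalg.norm(out) + 1e-12)   # L2-normalize
        h = h_next
    return h                                                  # z_v = h_v^K
```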

Learning the parameters of the network
- Supervised:
  - Final embeddings are passed through a dense layer to get class probabilities
  - E.g., labelling the nodes of a graph
  - The end goal is labelling the nodes, not generating the embeddings
- Unsupervised:
  - Uses a loss function that ensures "nearby" nodes have similar representations and "far away" nodes have dissimilar representations
  - Negative sampling is used to enforce the latter
  - Nearness is defined by a co-occurrence score on a fixed-length random walk
  - (The loss has the form shown below)
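The unsupervised, graph-based loss in the paper has the following form, where z_u is node u's embedding, v is a node that co-occurs with u on a fixed-length random walk, sigma is the sigmoid, P_n is the negative-sampling distribution, and Q is the number of negative samples:

J_{\mathcal{G}}(z_u) = -\log \sigma(z_u^\top z_v) \;-\; Q \cdot \mathbb{E}_{v_n \sim P_n(v)} \log \sigma(-z_u^\top z_{v_n})

The first term pulls co-occurring nodes together; the second pushes the embedding away from the negative samples.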

Aggregators
- Mean: elementwise mean of the sampled neighbors' representations
- Pooling: each neighbor's representation is passed through a fully connected layer, then an elementwise max-pooling is applied across neighbors
- LSTM-based aggregator: an LSTM applied to a random permutation of the neighbors (more expressive, but not permutation-invariant)
(Sketches of the first two appear below.)
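Minimal NumPy sketches of the mean and max-pooling aggregators, assuming neighbor_feats is an (n, d) array of sampled neighbor representations; W_pool and b_pool are illustrative parameter names.

```python
import numpy as np

def mean_aggregator(neighbor_feats):
    """Elementwise mean over the sampled neighbors: (n, d) -> (d,)."""
    return np.mean(neighbor_feats, axis=0)

def pool_aggregator(neighbor_feats, W_pool, b_pool):
    """Max-pooling aggregator: a fully connected layer + ReLU per neighbor,
    followed by an elementwise max across neighbors: (n, d) -> (d_pool,)."""
    transformed = np.maximum(0.0, neighbor_feats @ W_pool + b_pool)
    return transformed.max(axis=0)
```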

Experiments and Results

Conclusion
Proposed an inductive framework to learn node representations based on spatial graph convolutions, which generalizes to nodes unseen during training.