Heterogeneous Graph Convolutional Network


Heterogeneous Graph Convolutional Network (paper-sharing report), presented by 乔子越, 2019-06-10

References
[1] Kipf, T. N., & Welling, M. (2017). Semi-supervised classification with graph convolutional networks. ICLR 2017. arXiv:1609.02907.
[2] Schlichtkrull, M., Kipf, T. N., Bloem, P., van den Berg, R., Titov, I., & Welling, M. (2018). Modeling relational data with graph convolutional networks. In European Semantic Web Conference (pp. 593-607). Springer, Cham.

Graph Convolutional Network [1]
The Graph Convolutional Network and related graph neural network models can be understood as special cases of a simple differentiable message-passing framework that operates on local graph neighborhoods: each layer combines a node's own embedding with an aggregate of its neighbors' embeddings to produce the node's new embedding.

Graph Convolutional Network
Aggregator (convolution layer):
1) Overall (layer-wise): $H^{(l+1)} = \sigma\left(\tilde{D}^{-1/2}\tilde{A}\tilde{D}^{-1/2} H^{(l)} W^{(l)}\right)$, where $\tilde{A} = A + I$ is the adjacency matrix with added self-loops and $\tilde{D}$ is its diagonal degree matrix.
2) Node level: $h_u^{(l+1)} = \sigma\left(\sum_{v \in \tilde{N}(u)} \frac{1}{\sqrt{|\tilde{N}(u)|\,|\tilde{N}(v)|}}\, W^{(l)} h_v^{(l)}\right)$, where $\tilde{N}(u) = N(u) \cup \{u\}$ and $|\tilde{N}(u)|$ denotes the degree of node $u$ (counting the self-loop).
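The GCN convolution described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the toy graph, the one-hot input features, the all-ones weight matrix, and the choice of ReLU as the non-linearity are all assumptions made for the example.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN convolution layer (Kipf & Welling):
    H' = relu(D~^{-1/2} A~ D~^{-1/2} H W), with A~ = A + I."""
    A_tilde = A + np.eye(A.shape[0])            # add self-loops
    d = A_tilde.sum(axis=1)                     # degrees under A~
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))      # D~^{-1/2}
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt   # symmetric normalization
    return np.maximum(A_hat @ H @ W, 0.0)       # ReLU non-linearity

# toy graph: 3 nodes, edges 0-1 and 1-2
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H = np.eye(3)            # one-hot input features (assumption)
W = np.ones((3, 2))      # illustrative weight matrix (assumption)
H1 = gcn_layer(A, H, W)  # new embeddings, one 2-d vector per node
```

Stacking several such layers lets each node's embedding absorb information from multi-hop neighborhoods.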

Graph Convolutional Network
We can feed these embeddings into any loss function and run stochastic gradient descent to train the aggregation parameters. The output model is task specific:
1) Supervised task (e.g., node classification): softmax cross-entropy $L = -\sum_{u \in V_{\text{label}}} \log \mathrm{softmax}(z_u)_{y_u}$, evaluated over the labeled nodes.
2) Unsupervised task (network embedding): random-walk objectives (node2vec, DeepWalk) or graph factorization, i.e., train the model so that "similar" nodes have similar embeddings.
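For the supervised case, the semi-supervised loss is just cross-entropy restricted to the labeled nodes. A minimal NumPy sketch, assuming the model outputs one row of logits per node and a boolean mask marks which nodes carry labels:

```python
import numpy as np

def masked_cross_entropy(logits, labels, mask):
    """Softmax cross-entropy averaged over labeled nodes only."""
    z = logits - logits.max(axis=1, keepdims=True)          # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    per_node = -log_probs[np.arange(len(labels)), labels]   # -log p(y_u | z_u)
    return per_node[mask].mean()                            # average over labeled set

# 4 nodes, 3 classes; only the first two nodes are labeled
logits = np.zeros((4, 3))                 # uniform predictions
labels = np.array([0, 1, 2, 0])
mask = np.array([True, True, False, False])
loss = masked_cross_entropy(logits, labels, mask)  # = log(3) for uniform logits
```

Gradients of this loss with respect to the layer weights are what stochastic gradient descent updates.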

Relational Graph Convolutional Network [2]
Applied to graphs with multiple relation types (heterogeneous networks).
Aggregator (convolution layer): $h_i^{(l+1)} = \sigma\left(\sum_{r \in R}\sum_{j \in N_i^r} \frac{1}{c_{i,r}} W_r^{(l)} h_j^{(l)} + W_0^{(l)} h_i^{(l)}\right)$, where $N_i^r$ is the set of neighbors of node $i$ under relation $r$ and $c_{i,r}$ is a normalization constant (e.g., $|N_i^r|$).
Output model:
1) Link prediction (recovery of missing facts, i.e., subject-predicate-object triples).
2) Entity classification (recovery of missing entity attributes).
Both are supervised tasks; R-GCN offers no solution for heterogeneous network embedding (the unsupervised task).
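The R-GCN aggregator differs from plain GCN in that every relation gets its own weight matrix and its own normalized adjacency. A minimal NumPy sketch; the toy adjacency matrices and weight shapes are assumptions for illustration:

```python
import numpy as np

def rgcn_layer(adj_per_relation, H, W_rel, W_self):
    """One R-GCN layer (Schlichtkrull et al.): relation-specific weights W_r,
    messages normalized by the per-relation degree c_{i,r} = |N_i^r|,
    plus a self-connection term W_0 h_i."""
    out = H @ W_self                                # self-connection W_0 h_i
    for A_r, W_r in zip(adj_per_relation, W_rel):
        deg = A_r.sum(axis=1, keepdims=True)        # c_{i,r}
        norm = np.divide(A_r, deg,                  # row-normalize, skipping
                         out=np.zeros_like(A_r),    # nodes with no r-neighbors
                         where=deg > 0)
        out += norm @ H @ W_r                       # relation-specific messages
    return np.maximum(out, 0.0)                     # ReLU

# toy heterogeneous graph: 3 nodes, 2 relation types
A1 = np.array([[0., 1., 0.], [1., 0., 0.], [0., 0., 0.]])  # relation 1
A2 = np.array([[0., 0., 1.], [0., 0., 0.], [1., 0., 0.]])  # relation 2
H = np.eye(3)
W_rel = [np.ones((3, 2)), np.ones((3, 2))]  # one W_r per relation (assumption)
W_self = np.ones((3, 2))
H1 = rgcn_layer([A1, A2], H, W_rel, W_self)
```

The paper also proposes basis and block-diagonal decompositions of the $W_r$ matrices to keep the parameter count manageable when there are many relations; this sketch omits that.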

Heterogeneous Graph Convolutional Network Embedding
Aggregator (convolution layer): Output model:
RW: paths generated by a random walk guided by meta-paths and relation weights.
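The slide only names the RW component, so the following is a generic sketch of a meta-path-guided random walk, not the paper's exact procedure. The `neighbors` adjacency structure (keyed by node and target type) and the example node names are hypothetical:

```python
import random

def metapath_walk(neighbors, start, metapath, length, rng=random):
    """Random walk that cycles through a symmetric meta-path of node types,
    e.g. ['author', 'paper', 'author']; neighbors[(u, t)] lists the
    neighbors of node u that have type t (hypothetical structure)."""
    walk = [start]
    for i in range(1, length):
        next_type = metapath[i % (len(metapath) - 1)]   # cycle A-P-A-P-...
        candidates = neighbors.get((walk[-1], next_type), [])
        if not candidates:
            break                                       # dead end: stop early
        walk.append(rng.choice(candidates))
    return walk

# toy bibliographic graph: authors a1, a2 and paper p1
neighbors = {
    ("a1", "paper"): ["p1"],
    ("a2", "paper"): ["p1"],
    ("p1", "author"): ["a1", "a2"],
}
walk = metapath_walk(neighbors, "a1", ["author", "paper", "author"],
                     length=5, rng=random.Random(0))
```

In methods of this family, the generated paths feed a skip-gram-style objective so that nodes co-occurring on meta-path walks get similar embeddings; relation weights would bias the `rng.choice` step toward heavier relations.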

Heterogeneous Graph Convolutional Network Embedding
An application of HGCN embedding: author name disambiguation.