Manuel Gomez Rodriguez


Machine Learning for Dynamic Social Network Analysis
Manuel Gomez Rodriguez
Max Planck Institute for Software Systems
University of Sydney, January 2017

Interconnected World
Social networks, the World Wide Web, information networks, transportation networks, protein interactions, the Internet of Things.

Many discrete events in continuous time
[Infographic: Qmee, 2013]

Variety of processes behind these events
Events are (noisy) observations of a variety of complex dynamic processes, across a wide range of temporal scales:
- Product reviews and sales on Amazon
- News spread on Twitter
- A user gains recognition on Quora
- A video becomes viral on YouTube
- Article creation in Wikipedia

Example I: Idea adoption / viral marketing
[Diagram: a cascade over a follower network (an edge S → D means D follows S), with users Bob, Christine, Beth, Joe and David adopting between 3.00pm and 4.15pm; Friggeri et al., 2014]
Such cascades can have an impact in the offline world.

Example II: Information creation & curation
[Diagram: content events such as additions, refutations, questions, answers and upvotes]

Example III: Learning trajectories
[Timeline: a 1st-year computer science student's events over time, drawn from parallel processes such as Introduction to Programming (for/do-while loops, if … else, how to write switch, define functions, class inheritance, class destructor, private functions, plot library), Discrete Math (set theory, logic, graph theory, geometry) and Project Presentation (PowerPoint vs. Keynote, PP templates, export pptx to pdf)]

Detailed traces of activity
The availability of event traces boosts a new generation of data-driven models and algorithms.
[Plot: retweets of #greece over time]

Previously: discrete-time models & algorithms
Discrete-time models artificially introduce epochs (Epoch 1, Epoch 2, Epoch 3, Epoch 4, …), which raises several questions:
1. How long should each epoch be? The data is very heterogeneous.
2. How should events be aggregated within an epoch?
3. What if there is no event within an epoch?
4. Time is treated as an index or conditioning variable, so time-related queries are hard to answer.

Outline of the Seminar
Representation: Temporal point processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps
Applications: Models
1. Information propagation
2. Opinion dynamics
3. Information reliability
4. Knowledge acquisition
Applications: Control
1. Influence maximization
2. Activity shaping
3. When-to-post
Slides/references: learning.mpi-sws.org/sydney-seminar

Representation: Temporal Point Processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps

Temporal point processes
A temporal point process is a random process whose realization consists of discrete events localized in time.
[Figure: discrete events along a time axis, with history H(t); each event is represented by a Dirac delta function]
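The slide's formal definition is rendered as an image in this transcript; a standard way to write it, using the counting process N(t) (notation reconstructed, not copied from the slide), is:

```latex
N(t) = \sum_{t_i \in \mathcal{H}(t)} \mathbb{I}[t \ge t_i],
\qquad
dN(t) = \sum_{t_i \in \mathcal{H}(t)} \delta(t - t_i)\, dt,
```

where $\mathcal{H}(t)$ denotes the history of events up to time $t$ and $\delta$ is the Dirac delta function.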

Model time as a random variable
[Figure: the density gives the probability of an event in [t, t+dt), and the survival function the probability of no event before t, given the history H(t)]
Likelihood of a timeline: the product of the conditional densities of the observed event times.
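The density and likelihood formulas on this slide are images; in standard notation (a reconstruction, with f* the density conditioned on the history H(t) and S* the survival function), they read:

```latex
f^*(t) := f\big(t \mid \mathcal{H}(t)\big), \qquad
S^*(t) := \mathbb{P}\big(\text{no event before } t \mid \mathcal{H}(t)\big), \qquad
\mathcal{L}(t_1, \dots, t_n) = \prod_{i=1}^{n} f^*(t_i).
```

When the observation window extends to some time $T > t_n$, the likelihood is usually multiplied by the survival term $S^*(T)$ for the empty interval after the last event.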

Problems with parametrizing the density
1. The likelihood is not concave in the parameters w.
2. It is difficult for model design and interpretability:
   - densities need to integrate to 1;
   - combining timelines results in density convolutions.

Intensity function
Intensity: the probability of an event in [t, t+dt), given that no event occurred before t and given the history H(t).
[Figure: density, survival function and intensity along a time axis]

Advantages of parametrizing the intensity
1. The likelihood is concave in the parameters w.
2. Suitable for model design and interpretable:
   - intensities only need to be nonnegative;
   - combining timelines results in intensity additions.
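Written out (the slide's formula is an image), the log-likelihood of a timeline {t_1, …, t_n} on [0, T] in terms of the intensity is:

```latex
\log \mathcal{L} = \sum_{i=1}^{n} \log \lambda^*(t_i) \;-\; \int_{0}^{T} \lambda^*(\tau)\, d\tau,
```

which is concave in $\lambda^*$, and hence in $w$ for suitable (e.g. linear) parametrizations of the intensity.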

Relation between f*, F*, S* and λ*
The intensity λ* is the central quantity we will use!
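The relations themselves appear as images on the slide; reconstructed from standard point-process theory, they are:

```latex
\lambda^*(t) = \frac{f^*(t)}{S^*(t)} = \frac{f^*(t)}{1 - F^*(t)}, \qquad
S^*(t) = \exp\!\left(-\int_{t_n}^{t} \lambda^*(\tau)\, d\tau\right), \qquad
f^*(t) = \lambda^*(t)\, S^*(t),
```

where $t_n$ is the last event before $t$.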

Representation: Temporal Point Processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps

Poisson process
Intensity of a Poisson process: constant over time.
Observations:
- Intensity independent of history
- Uniformly random occurrence
- Inter-event times follow an exponential distribution
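The slide's formulas are images; as an illustrative sketch (not code from the talk), a homogeneous Poisson process with constant intensity `rate` can be sampled by drawing i.i.d. exponential inter-event times:

```python
import random

def sample_poisson(rate, T, rng):
    """Sample event times of a homogeneous Poisson process with constant
    intensity `rate` on (0, T] via i.i.d. exponential inter-event times."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate)  # inter-event gap ~ Exp(rate)
        if t > T:
            return events
        events.append(t)
```

The expected number of events on (0, T] is rate × T, which gives a quick sanity check for the sampler.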

Inhomogeneous Poisson process
Intensity of an inhomogeneous Poisson process: a deterministic function of time.
Observation:
- Intensity independent of history

Nonparametric inhomogeneous Poisson process
Intensity given by a positive combination of (Gaussian) RBF kernels.
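As a hedged sketch (function names and parameters are illustrative, not from the slides), an RBF-mixture intensity can be sampled by thinning, using the sum of the kernel weights as an upper bound on the intensity:

```python
import math
import random

def rbf_intensity(t, centers, weights, sigma):
    """lambda(t): positive combination of Gaussian RBF kernels."""
    return sum(w * math.exp(-((t - c) ** 2) / (2 * sigma ** 2))
               for c, w in zip(centers, weights))

def sample_thinning(intensity, lam_max, T, rng):
    """Thinning (rejection) sampling: propose events from a homogeneous
    Poisson process with rate lam_max >= intensity(t); keep each proposal
    with probability intensity(t) / lam_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)
        if t > T:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)
```

Each Gaussian kernel is bounded by its weight, so `sum(weights)` is always a valid dominating rate.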

Terminating (or survival) process
Intensity of a terminating (or survival) process.
Observation:
- Limited number of occurrences

Self-exciting (or Hawkes) process
Intensity of a self-exciting (or Hawkes) process: a baseline plus a triggering kernel summed over the past events in the history H(t).
Observations:
- Clustered (or bursty) occurrence of events
- Intensity is stochastic and history-dependent

How do we sample from a Hawkes process?
Thinning procedure (similar to rejection sampling):
1. Sample from a Poisson process whose intensity upper-bounds the Hawkes intensity.
2. Keep each sample with probability equal to the ratio of the true intensity to the bound.
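The procedure above can be sketched concretely for an exponential triggering kernel (Ogata-style thinning; parameter names are mine, not from the talk). Between events the intensity λ*(t) = μ + α Σᵢ exp(−β(t − tᵢ)) only decays, so its value just after the current time is a valid upper bound until the next accepted event:

```python
import math
import random

def hawkes_intensity(t, events, mu, alpha, beta):
    """lambda*(t) = mu + alpha * sum_{ti < t} exp(-beta * (t - ti))."""
    return mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events if ti < t)

def sample_hawkes(mu, alpha, beta, T, rng):
    """Ogata's thinning: the exponential-kernel intensity decays between
    events, so lambda*(t+) upper-bounds it until the next accepted event."""
    t, events = 0.0, []
    while True:
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t > T:
            return events
        if rng.random() < hawkes_intensity(t, events, mu, alpha, beta) / lam_bar:
            events.append(t)
```

The process is stable (finite expected number of events per unit time) when α/β < 1; with α = 0 the sampler reduces exactly to a homogeneous Poisson process.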

Summary
Building blocks to represent different dynamic processes:
- Poisson processes
- Inhomogeneous Poisson processes
- Terminating point processes
- Self-exciting point processes

Representation: Temporal Point Processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps

Superposition of processes
Sample each process's next event and take the minimum (the earliest proposal wins). The superposed process has an additive intensity: the sum of the individual intensities.

Mutually exciting process
[Figure: timelines and histories for Bob and Christine; each user's events raise the other's intensity]
Observation: clustered occurrence affected by neighbors.

Mutually exciting terminating process
[Figure: timelines and histories for Bob and Christine]
Observation: clustered occurrence affected by neighbors.

Representation: Temporal Point Processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps

Marked temporal point processes
A marked temporal point process is a random process whose realization consists of discrete marked events localized in time.
[Figure: events with marks along a time axis, with history H(t)]

Independent identically distributed marks
Distribution for the marks.
Observations:
- Marks independent of the temporal dynamics
- Independent and identically distributed (i.i.d.)

Dependent marks: SDEs with jumps
Marks given by a stochastic differential equation with jumps: a drift term plus an event-influence term.
Observations:
- Marks dependent on the temporal dynamics
- Defined for all values of t
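As a minimal deterministic sketch (the drift and jump functions are hypothetical; the talk's actual equations are images), a mark driven by dx = drift(x) dt + jump(x) dN(t) can be simulated with an Euler scheme that drifts continuously and applies a jump at each event time:

```python
def simulate_jump_sde(x0, drift, jump, event_times, T, dt):
    """Euler scheme for dx = drift(x) dt + jump(x) dN(t):
    the mark x drifts continuously and jumps whenever an event occurs,
    so x(t) is defined for all t, not only at event times."""
    t, x, i = 0.0, x0, 0
    events = sorted(event_times)
    path = [(t, x)]
    while t < T:
        t_next = min(t + dt, T)
        while i < len(events) and t < events[i] <= t_next:
            x += drift(x) * (events[i] - t)  # drift up to the event
            t = events[i]
            x += jump(x)                     # apply the jump
            i += 1
        x += drift(x) * (t_next - t)         # drift over the rest of the step
        t = t_next
        path.append((t, x))
    return path
```

With a mean-reverting drift such as drift(x) = −x, the mark decays between events and is pushed up by each jump, which is the qualitative behavior the slide describes.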

Dependent marks: distribution + SDE with jumps
Distribution for the marks, with parameters driven by an SDE with jumps (drift plus event influence).
Observations:
- Marks dependent on the temporal dynamics
- The distribution represents an additional source of uncertainty

Mutually exciting + marks
[Figure: timelines for Bob and Christine; each mark evolves with a drift term plus a neighbor-influence term]
Observation: marks affected by neighbors.

Outline of tutorial
This lecture. Representation: Temporal point processes
1. Intensity function
2. Basic building blocks
3. Superposition
4. Marks and SDEs with jumps
Next lecture. Applications: Models
1. Information propagation
2. Opinion dynamics
3. Information reliability
4. Knowledge acquisition
Applications: Control
1. Influence maximization
2. Activity shaping
3. When-to-post
Slides/references: learning.mpi-sws.org/sydney-seminar