CS8803-NS Network Science, Fall 2013
Instructor: Constantine Dovrolis (constantine@gatech.edu)
http://www.cc.gatech.edu/~dovrolis/Courses/NetSci/
Disclaimers
The following slides include only the figures or videos that we use in class; they do not include the detailed explanations, derivations, or descriptions covered in class. Many of the figures are copied from open sources on the Web. I do not claim any intellectual property for this material.
Outline
- Network science and statistics
- Four important problems:
  1. Sampling from large networks
  2. Sampling bias in traceroute-like probing
  3. Network inference based on temporal correlations
  4. Prediction of missing and spurious links
Also learn about:
- Traceroute-like network discovery
- Two nice examples of constructing hypothesis tests:
  – one based on an interesting Chernoff bound
  – the other based on the Pearson chi-squared goodness-of-fit test
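The Pearson chi-squared goodness-of-fit test mentioned above can be illustrated with a minimal sketch in Python. The die-roll counts and the hard-coded critical value are illustrative assumptions, not material from the course:

```python
def chi_squared_statistic(observed, expected):
    """Pearson chi-squared goodness-of-fit statistic:
    sum of (O_i - E_i)^2 / E_i over all bins."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical example: are these 60 die rolls consistent with a fair die?
observed = [8, 12, 9, 11, 10, 10]   # counts per face
expected = [60 / 6] * 6             # expected counts under the uniform model
stat = chi_squared_statistic(observed, expected)

# 95th percentile of chi^2 with 5 degrees of freedom (standard table value).
CRITICAL_95_DF5 = 11.07
reject = stat > CRITICAL_95_DF5     # fail to reject H0 here: stat = 1.0
```

In practice one would use a library routine (e.g. a chi-squared test from a statistics package) rather than a table constant; the sketch just makes the statistic explicit.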
Also learn about:
- Stochastic graph models and how to fit them to data in a Bayesian framework
- Maximum Likelihood Estimation (MLE)
- Markov Chain Monte Carlo (MCMC) sampling
- The Metropolis-Hastings rule
- Area Under the ROC Curve (AUC) evaluation of a classifier
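A minimal random-walk Metropolis-Hastings sampler, as a sketch of the MCMC idea above. The target (a standard normal), the step size, and the burn-in length are illustrative choices, not from the slides:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + N(0, step),
    accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step)
        # The proposal is symmetric, so the Hastings ratio reduces
        # to the ratio of target densities (difference of log densities).
        if math.log(rng.random()) < log_target(x_new) - log_target(x):
            x = x_new
        samples.append(x)
    return samples

# Sample from a standard normal; log density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
burned = samples[5000:]            # discard burn-in
mean = sum(burned) / len(burned)   # should be close to 0
```

The same accept/reject rule works for graph models: the state is a graph (or its parameters), and the proposal perturbs it, e.g. by rewiring an edge.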
Appendix – some background
ROC curves and the Area Under the Curve (AUC)
http://gim.unmc.edu/dxtests/roc3.htm
http://www.intechopen.com/books/data-mining-applications-in-engineering-and-medicine/examples-of-the-use-of-data-mining-methods-in-animal-breeding
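The AUC can be computed without drawing the ROC curve at all, via its rank-statistic interpretation: it equals the probability that a randomly chosen positive outscores a randomly chosen negative (the Mann-Whitney U formulation). A sketch with hypothetical scores:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve as the probability that a random
    positive scores higher than a random negative; ties count 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for positive and negative examples.
pos = [0.9, 0.8, 0.6]
neg = [0.7, 0.4, 0.3]
result = auc(pos, neg)   # 8/9 ≈ 0.889
```

The double loop is O(|pos|·|neg|); for large score sets one would sort once and use ranks instead.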
Markov Chain Monte Carlo sampling – the Metropolis-Hastings algorithm
http://upload.wikimedia.org/wikipedia/commons/5/5e/Metropolis_algorithm_convergence_example.png
http://en.wikipedia.org/wiki/Metropolis%E2%80%93Hastings_algorithm
The result of three Markov chains running on the 3D Rosenbrock function using the Metropolis-Hastings algorithm. The algorithm samples from regions where the posterior probability is high, and the chains begin to mix in these regions. The approximate position of the maximum has been illuminated. Note that the red points are the ones that remain after the burn-in process; the earlier ones have been discarded.
Also learn about:
- More advanced coupling metrics (than Pearson’s cross-correlation): coherence, synchronization likelihood, wavelet coherence, Granger causality, directed transfer function, and others
- The bootstrap to calculate a p-value, and the frequency-domain bootstrap for time series
- The Fisher transformation
- A result from Extreme Value Theory
- The multiple testing problem: the False Discovery Rate (FDR) and the linear step-up FDR control method
- Pink noise
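The linear step-up FDR control method in the list above (the Benjamini-Hochberg procedure) is short enough to sketch directly; the p-values below are made up for illustration:

```python
def benjamini_hochberg(p_values, q=0.05):
    """Linear step-up FDR control: sort the m p-values, find the
    largest k with p_(k) <= (k/m) * q, and reject hypotheses 1..k."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank / m * q:
            k_max = rank
    rejected = set(order[:k_max])
    return [i in rejected for i in range(m)]

# Hypothetical p-values from m = 5 tests.
pvals = [0.001, 0.008, 0.039, 0.041, 0.6]
decisions = benjamini_hochberg(pvals, q=0.05)
# Only the first two survive FDR control at q = 0.05.
```

Note the "step-up" subtlety: a p-value can exceed its own threshold and still be rejected if some larger p-value below it in rank passes, which is why the loop keeps the largest qualifying rank rather than stopping at the first failure.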
Appendix – some background
Cross-correlation
http://paulbourke.net/miscellaneous/correlate/
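Cross-correlation at a range of lags is the baseline coupling metric the list above compares against. A sketch (the toy series is an assumption) that recovers a known delay of two samples:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def cross_correlation(x, y, max_lag):
    """Correlate x against y shifted by each lag d >= 0:
    r(d) = corr(x[0 : n-d], y[d : n])."""
    return [pearson(x[:len(x) - d], y[d:]) for d in range(max_lag + 1)]

# y is x delayed by 2 samples, so the peak should appear at lag 2.
x = [0, 1, 0, 0, 2, 0, 1, 0, 0, 3, 0, 0]
y = [0, 0] + x[:-2]
r = cross_correlation(x, y, max_lag=3)   # r[2] == 1.0
```

Assessing whether such a peak is statistically significant is exactly where the bootstrap and Fisher-transformation machinery above comes in.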
Fisher transformation http://en.wikipedia.org/wiki/File:Fisher_transformation.svg
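A worked sketch of the Fisher transformation in Python; the correlation value and sample size are illustrative assumptions:

```python
import math

def fisher_z(r):
    """Fisher transformation z = atanh(r) = 0.5 * ln((1+r)/(1-r)).
    Makes the sampling distribution of a sample correlation
    approximately normal, with standard error 1/sqrt(n - 3)."""
    return 0.5 * math.log((1 + r) / (1 - r))

# Approximate two-sided test of H0: rho = 0 for r = 0.45, n = 50.
r, n = 0.45, 50
z = fisher_z(r)                          # atanh(0.45) ≈ 0.485
z_score = z / (1 / math.sqrt(n - 3))     # ≈ 3.3 standard errors from 0
```

The transform stretches the bounded interval (-1, 1) onto the whole real line, which is why the normal approximation works far better on z than on r itself.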
P-value in one-sided hypothesis tests http://us.litroost.net/?p=889
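For a test statistic with an (approximately) standard normal null distribution, the one-sided p-value is just a tail probability, computable with the complementary error function. A small sketch (the statistic value is an assumption):

```python
import math

def one_sided_p(z):
    """Upper-tail p-value for a standard-normal test statistic:
    P(Z >= z) = 0.5 * erfc(z / sqrt(2))."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p = one_sided_p(1.645)   # ≈ 0.05, the usual 5% one-sided threshold
```

When the null distribution is not known in closed form, the same tail probability is estimated empirically, which is what the bootstrap slide below does.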
Bootstrapping
http://www-ssc.igpp.ucla.edu/personnel/russell/ESS265/Ch9/autoreg/node6.html
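A minimal percentile-bootstrap sketch: resample the data with replacement, recompute the statistic each time, and read confidence limits off the quantiles of the replicates. The dataset and replicate count are illustrative assumptions:

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    n = len(data)
    reps = sorted(stat([rng.choice(data) for _ in range(n)])
                  for _ in range(n_boot))
    lo = reps[int(alpha / 2 * n_boot)]          # 2.5% quantile
    hi = reps[int((1 - alpha / 2) * n_boot) - 1]  # 97.5% quantile
    return lo, hi

mean = lambda xs: sum(xs) / len(xs)
data = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5, 2.3, 2.7]  # sample mean 2.35
lo, hi = bootstrap_ci(data, mean, seed=1)
```

Note that this i.i.d. resampling destroys temporal dependence; for time series one resamples blocks or, as the list above mentions, works in the frequency domain.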
1-over-f noise (pink noise) http://www.scholarpedia.org/article/1/f_noise
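One standard way to approximate 1/f noise in code is the Voss-McCartney algorithm (an illustrative choice here, not from the slides): sum several white-noise rows where row k is refreshed only every 2^k samples, so slow rows pile power into low frequencies:

```python
import random

def voss_pink_noise(n, n_rows=8, seed=0):
    """Voss-McCartney approximation of 1/f (pink) noise: the output
    at each step is the sum of n_rows uniform(-1, 1) sources, where
    row k is redrawn only every 2^k samples."""
    rng = random.Random(seed)
    rows = [rng.uniform(-1, 1) for _ in range(n_rows)]
    out = []
    for t in range(1, n + 1):
        for k in range(n_rows):
            if t % (1 << k) == 0:   # row k updates every 2^k samples
                rows[k] = rng.uniform(-1, 1)
        out.append(sum(rows))
    return out

noise = voss_pink_noise(4096)
```

Each octave of the spectrum gets roughly equal power, which is the defining property of pink noise; with n_rows = 8 the approximation covers about 8 octaves.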