CS8803-NS Network Science Fall 2013 Instructor: Constantine Dovrolis

Disclaimers
The following slides include only the figures or videos that we use in class; they do not include the detailed explanations, derivations, or descriptions covered in class. Many of the following figures are copied from open sources on the Web. I do not claim any intellectual property for the following material.

Outline
- Network science and statistics
- Four important problems:
  1. Sampling from large networks
  2. Sampling bias in traceroute-like probing
  3. Network inference based on temporal correlations
  4. Prediction of missing & spurious links

Also learn about:
- Traceroute-like network discovery
- Two nice examples of constructing hypothesis tests:
  - One is based on an interesting Chernoff bound
  - The other is based on the Pearson chi-squared goodness-of-fit test (see the sketch below)
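As background for the second test, here is a minimal sketch of a Pearson chi-squared goodness-of-fit test using scipy.stats.chisquare; the counts and binning are hypothetical, not taken from the course material.

```python
# A hedged sketch, not from the course: Pearson chi-squared goodness-of-fit
# test comparing observed per-bin counts against counts expected under a
# model. The counts here are hypothetical.
import numpy as np
from scipy.stats import chisquare

observed = np.array([48, 31, 12, 6, 3])   # e.g., node counts per degree bin
expected = np.array([50, 30, 12, 5, 3])   # counts expected under the model

# chisquare requires both count vectors to have the same total;
# rescale the expected counts to the observed total just in case.
expected = expected * observed.sum() / expected.sum()

stat, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-squared = {stat:.2f}, p-value = {p_value:.3f}")
```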

Also learn about:
- Stochastic graph models and how to fit them to data in a Bayesian framework
- Maximum-likelihood estimation (a minimal fitting example follows this list)
- Markov-Chain-Monte-Carlo (MCMC) sampling
- The Metropolis-Hastings rule
- Area-under-the-ROC-curve (AUC) evaluation of a classifier
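The slides do not pin maximum-likelihood fitting to one model at this point; purely as a minimal sketch, the snippet below computes the ML estimate of the edge probability p for an Erdős–Rényi G(n, p) graph, which is just the observed edge density.

```python
# A minimal sketch, assuming an Erdos-Renyi G(n, p) model chosen purely
# for illustration: the maximum-likelihood estimate of the edge
# probability p is the observed edge density.
import math

def mle_edge_probability(n_nodes: int, n_edges: int) -> float:
    """ML estimate of p in G(n, p): observed edges / possible edges."""
    return n_edges / math.comb(n_nodes, 2)

# Hypothetical graph: 100 nodes, 495 edges.
print(mle_edge_probability(100, 495))  # 0.1
```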

Appendix – some background

ROC and Area Under the Curve (AUC)
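As a quick reminder of what the area under the ROC curve measures (a minimal sketch, not from the slides): AUC equals the probability that a uniformly chosen positive example is scored above a uniformly chosen negative one, so it can be computed directly from correctly ordered positive/negative pairs. The scores and labels below are made up for illustration.

```python
# A minimal sketch, not from the slides: AUC via the rank (Mann-Whitney)
# formulation. Scores and labels are hypothetical.
import numpy as np

def auc(scores, labels):
    """Area under the ROC curve from correctly ordered +/- pairs."""
    scores, labels = np.asarray(scores), np.asarray(labels)
    pos, neg = scores[labels == 1], scores[labels == 0]
    correct = (pos[:, None] > neg[None, :]).sum()   # pairs ranked correctly
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count one half
    return (correct + 0.5 * ties) / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.6, 0.4, 0.3, 0.1]
labels = [1,   1,   0,   1,   0,   0]
print(auc(scores, labels))  # ~0.889
```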

Markov Chain Monte Carlo sampling – Metropolis-Hastings algorithm

[Figure] The result of three Markov chains running on the 3D Rosenbrock function using the Metropolis-Hastings algorithm. The algorithm samples from regions where the posterior probability is high, and the chains begin to mix in these regions. The approximate position of the maximum has been illuminated. Note that the red points are the ones that remain after the burn-in process; the earlier ones have been discarded.
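A minimal sketch of the Metropolis-Hastings rule itself, assuming a 1-D standard-normal target and a symmetric Gaussian proposal (both are illustrative choices; the figure above uses the 3-D Rosenbrock function instead):

```python
# A minimal sketch of the Metropolis-Hastings rule. The 1-D standard
# normal target and Gaussian proposal width are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    return np.exp(-0.5 * x**2)       # unnormalized N(0, 1) density

def metropolis_hastings(n_samples, step=1.0, burn_in=1000):
    x, samples = 0.0, []
    for i in range(n_samples + burn_in):
        proposal = x + step * rng.normal()            # symmetric proposal
        if rng.random() < min(1.0, target(proposal) / target(x)):
            x = proposal                              # accept the move
        if i >= burn_in:                              # discard burn-in draws
            samples.append(x)
    return np.array(samples)

draws = metropolis_hastings(10_000)
print(draws.mean(), draws.std())     # should be close to 0 and 1
```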

Also learn about:
- More advanced coupling metrics (than Pearson's cross-correlation)
  - Coherence, synchronization likelihood, wavelet coherence, Granger causality, directed transfer function, and others
- Bootstrap to calculate a p-value
  - And frequency-domain bootstrap for time series
- The Fisher transformation
- A result from Extreme Value Theory
- The Multiple Testing Problem
  - False Discovery Rate (FDR)
  - The linear step-up FDR control method (sketched below)
- Pink noise
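For the FDR item, here is a minimal sketch of the linear step-up control method (Benjamini-Hochberg): sort the m p-values, find the largest k with p_(k) <= (k/m)q, and reject the k smallest. The p-values below are hypothetical.

```python
# A minimal sketch, not from the slides: the linear step-up FDR control
# method (Benjamini-Hochberg). The p-values below are hypothetical.
import numpy as np

def benjamini_hochberg(p_values, q=0.05):
    """Boolean mask of hypotheses rejected at FDR level q."""
    p = np.asarray(p_values, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresholds = q * np.arange(1, m + 1) / m      # (k/m) * q for k = 1..m
    below = p[order] <= thresholds
    rejected = np.zeros(m, dtype=bool)
    if below.any():
        k = np.nonzero(below)[0].max()            # largest passing rank
        rejected[order[: k + 1]] = True           # reject k+1 smallest p-values
    return rejected

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.310, 0.900]
print(benjamini_hochberg(pvals))  # rejects only the two smallest here
```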

Appendix – some background

Fisher transformation
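A minimal sketch, assuming the standard use case of a confidence interval for a sample correlation: the Fisher transformation z = arctanh(r) is approximately normal with standard error 1/sqrt(n - 3), so an interval built on the z-scale can be mapped back with tanh. The r and n values below are made up.

```python
# A minimal sketch: an approximate confidence interval for a correlation
# coefficient via the Fisher z-transform. The r and n values are hypothetical.
import numpy as np
from scipy.stats import norm

def fisher_ci(r, n, alpha=0.05):
    """Approximate (1 - alpha) CI for a correlation r from n samples."""
    z = np.arctanh(r)                    # Fisher transformation
    se = 1.0 / np.sqrt(n - 3)            # approximate standard error of z
    z_crit = norm.ppf(1 - alpha / 2)
    lo, hi = z - z_crit * se, z + z_crit * se
    return np.tanh(lo), np.tanh(hi)      # map back to the r scale

print(fisher_ci(r=0.6, n=50))
```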

P-value in one-sided hypothesis tests
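A minimal sketch for an upper-tailed test: the one-sided p-value is the null probability of a test statistic at least as extreme as the observed one. For a z-test this is the normal survival function; the observed value below is hypothetical.

```python
# A minimal sketch for an upper-tailed z-test; the observed statistic
# is hypothetical.
from scipy.stats import norm

z_observed = 2.1
p_value = norm.sf(z_observed)   # sf(z) = P(Z >= z) under the null
print(f"one-sided p-value: {p_value:.4f}")  # ~0.0179
```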

Bootstrapping
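A minimal sketch of the nonparametric bootstrap on synthetic data (here used for a percentile confidence interval for a mean rather than a p-value): resample the data with replacement many times and read off quantiles of the resampled statistic.

```python
# A minimal sketch of the nonparametric bootstrap on synthetic data:
# resample with replacement to approximate the sampling distribution of
# the mean, then read off a 95% percentile confidence interval.
import numpy as np

rng = np.random.default_rng(1)
data = rng.exponential(scale=2.0, size=100)   # synthetic skewed sample

n_boot = 10_000
boot_means = np.array([
    rng.choice(data, size=len(data), replace=True).mean()
    for _ in range(n_boot)
])
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap 95% CI for the mean: ({lo:.2f}, {hi:.2f})")
```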

1-over-f noise (pink noise)
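A minimal sketch (an illustrative construction, not from the slides) of generating 1/f noise: take white Gaussian noise, scale each Fourier amplitude by 1/sqrt(f) so that power falls off as 1/f, and transform back.

```python
# A minimal sketch: pink (1/f) noise by spectral shaping. Scale each
# Fourier amplitude of white noise by 1/sqrt(f) so that power ~ 1/f,
# then invert the transform.
import numpy as np

rng = np.random.default_rng(2)

def pink_noise(n):
    spectrum = np.fft.rfft(rng.normal(size=n))
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]              # avoid division by zero at DC
    spectrum /= np.sqrt(freqs)       # amplitude ~ 1/sqrt(f) => power ~ 1/f
    pink = np.fft.irfft(spectrum, n)
    return pink / pink.std()         # normalize to unit variance

x = pink_noise(4096)
print(x.mean(), x.std())             # mean near 0, std exactly 1
```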