Global Synchronization in Sensornets
Jeremy Elson, Richard Karp, Christos Papadimitriou, Scott Shenker

Clocks in networks
- Difficult problem, largely ignored in practice

…but in sensornets:
- synchronization is crucial
- transmission delays are negligible, and so very tight synchronization can (in principle) be achieved

The model
- each vertex i has a local offset T_i; T_0 = 0
- a synchronization signal is sent at (unknown) time U_k and is received by vertex i at time U_k + T_i + e_ki
- the error e_ki is Gaussian with mean zero and known variance V_ki
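A minimal simulation of this measurement model may help make the notation concrete. The network size, offsets, and variances below are illustrative assumptions, not values from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5                                # number of nodes; node 0 is the reference (T_0 = 0)
T = np.concatenate(([0.0], rng.uniform(-1.0, 1.0, n - 1)))   # true local offsets

def observe_signal(U_k, V):
    """One synchronization signal sent at (unknown) time U_k.

    Node i records its arrival at U_k + T_i + e_ki, where e_ki is
    zero-mean Gaussian with known variance V[i].
    """
    e = rng.normal(0.0, np.sqrt(V))
    return U_k + T + e

V = np.full(n, 0.01)                 # known per-node variances (assumed equal here)
arrivals = observe_signal(U_k=123.456, V=V)

# Pairwise differences cancel the unknown send time U_k:
# arrivals[i] - arrivals[j] is an unbiased estimate of T_i - T_j
# with variance V[i] + V[j].
print(arrivals[1] - arrivals[2], "vs true", T[1] - T[2])
```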

- Need to estimate T_i - T_j
- The estimate must be globally consistent: (T_i - T_j) + (T_j - T_k) + (T_k - T_i) = 0
- Must minimize total variance (or maximize likelihood?)

Variance minimization by flows
- The unbiased estimators of T_i - T_j are flows from i to j in the network
- The minimum-variance flow corresponds to the min-cost flow with edge costs V_ij x^2
- But this is the same as the power in electric networks!
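A sketch of the electrical correspondence, under the assumption that each edge of the network carries one pairwise offset measurement with known variance V_ij: treating every such edge as a resistor of resistance V_ij, the variance of the best unbiased estimate of T_i - T_j equals the effective resistance between i and j, which can be read off the pseudoinverse of the weighted graph Laplacian. The function and toy instance below are illustrative:

```python
import numpy as np

def effective_resistance(edges, variances, n, i, j):
    """Effective resistance between nodes i and j when each edge (u, v)
    is a resistor of resistance V_uv (conductance 1 / V_uv).

    Under the flow/electrical correspondence on the slides, this equals
    the variance of the minimum-variance unbiased estimate of T_i - T_j.
    """
    L = np.zeros((n, n))                       # weighted graph Laplacian
    for (u, v), V_uv in zip(edges, variances):
        w = 1.0 / V_uv                         # conductance
        L[u, u] += w
        L[v, v] += w
        L[u, v] -= w
        L[v, u] -= w
    Lplus = np.linalg.pinv(L)                  # Moore-Penrose pseudoinverse
    return Lplus[i, i] + Lplus[j, j] - 2.0 * Lplus[i, j]

# Toy example: a triangle on nodes {0, 1, 2}; the variances are illustrative.
edges = [(0, 1), (1, 2), (0, 2)]
variances = [0.01, 0.02, 0.04]
print(effective_resistance(edges, variances, n=3, i=0, j=2))
```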

- effective variance ↔ effective resistance!
- min-variance estimate ↔ current

Computation?
- By analog computer…
- By approximation of min-cost flow; idea: cost = x^2, capacities = ∞

Maximum likelihood?
- error is Gaussian → likelihood is exp(−sum of squares) → the optimum is given by least squares
- Surprise: same estimates as with minimum variance!
- The random-walk method gives the same expectation (but huge variance)
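A sketch of the least-squares view under the same one-measurement-per-edge assumption: weighted least squares on the pairwise differences, with T_0 pinned to 0 and weights 1/V_ij, gives the Gaussian ML estimate of all offsets at once, and the resulting pairwise differences are globally consistent by construction. Names and numbers below are illustrative:

```python
import numpy as np

def ml_offsets(edges, diffs, variances, n):
    """Weighted least-squares (= ML under Gaussian errors) estimate of offsets.

    edges[k] = (i, j); diffs[k] is a noisy measurement of T_i - T_j with
    known variance variances[k].  T_0 is pinned to 0 as the reference.
    """
    m = len(edges)
    A = np.zeros((m, n))
    for k, (i, j) in enumerate(edges):
        A[k, i] = 1.0
        A[k, j] = -1.0
    w = 1.0 / np.sqrt(np.asarray(variances))   # whitening weights
    Aw = A[:, 1:] * w[:, None]                 # drop column 0: T_0 = 0
    bw = np.asarray(diffs) * w
    T_rest, *_ = np.linalg.lstsq(Aw, bw, rcond=None)
    return np.concatenate(([0.0], T_rest))

# Toy example (same triangle as before; the numbers are illustrative):
edges = [(0, 1), (1, 2), (0, 2)]
diffs = [0.50, 0.31, 0.83]        # noisy observations of T_i - T_j
variances = [0.01, 0.02, 0.04]
print(ml_offsets(edges, diffs, variances, n=3))
```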

Synchronization design
- Suppose that we have a network
- How do you design an optimum synchronization protocol which minimizes
  – the variance of the estimates (assume it is bounded)
  – total synchronization activity (messages)?

Resistive network design
- You are given a graph
- Allocate metal to the edges (resistance inversely proportional to weight)
- so as to achieve small effective resistance between all nodes (or a small weighted sum)
- with the minimum amount of copper

Theorem: The optimum can be found in polynomial time
- Idea: minimize Σ x_i subject to x ∈ K
- K: all weight allocations x that achieve effective resistance ≤ b
- Surprise: K is convex, and has a polynomial-time separation oracle
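The slides do not spell out the oracle. One plausible construction, assuming the edge weights x are conductances, uses the fact that effective resistance is convex in the conductances together with the standard sensitivity formula dR/dx_e = -(phi_u - phi_v)^2, where phi is the potential vector for a unit current. This is a sketch, not necessarily the authors' construction, and the toy instance is illustrative:

```python
import numpy as np

def separation_oracle(x, edges, pairs, b, n):
    """Separation oracle for K = {x >= 0 : R_eff(i, j; x) <= b for all pairs}.

    x[e] is the conductance (amount of metal) on edge e.  Effective
    resistance is convex in the conductances, so if some pair violates
    the bound, the gradient inequality R_eff(y) >= R_eff(x) + grad . (y - x)
    yields a hyperplane separating x from K.
    Returns None if x is in K, else (grad, rhs) describing the cut
    grad . (y - x) <= rhs that every y in K satisfies.
    """
    L = np.zeros((n, n))
    for e, (u, v) in enumerate(edges):
        L[u, u] += x[e]; L[v, v] += x[e]
        L[u, v] -= x[e]; L[v, u] -= x[e]
    Lplus = np.linalg.pinv(L)
    for (i, j) in pairs:
        d = np.zeros(n)
        d[i], d[j] = 1.0, -1.0
        phi = Lplus @ d                        # node potentials for unit current i -> j
        R = d @ phi                            # effective resistance between i and j
        if R > b:
            # dR/dx_e = -(phi_u - phi_v)^2  (standard sensitivity formula)
            grad = np.array([-(phi[u] - phi[v]) ** 2 for (u, v) in edges])
            return grad, b - R                 # cut: grad . (y - x) <= b - R
    return None                                # x is feasible: all resistances <= b

# Toy example: path 0-1-2, require R_eff(0, 2) <= 1.
edges = [(0, 1), (1, 2)]
print(separation_oracle(np.array([1.0, 1.0]), edges, pairs=[(0, 2)], b=1.0, n=3))
```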

Also:
- Clocks with different drifts t_i = a_i t + b_i can also be synchronized, by a reduction to the equal-drift case (as it turns out, by setting T_i = log a_i)
- A soon-to-be-deployed distributed clock synchronization protocol is based on these ideas