Quickest Detection of a Change Process Across a Sensor Array Vasanthan Raghavan and Venugopal V. Veeravalli Presented by: Kuntal Ray.

Outline
- Introduction
- Problem Formulation
- DP Framework
- Recursion for Sufficient Statistics
- Structure for the Optimal Stopping Rule
- References


INTRODUCTION Sensors take observations and respond to a disruptive change in the environment. The goal is to detect this change point as quickly as possible, subject to false-alarm constraints. Equivalently, given a sequence of observations whose densities change at an unknown time, that change has to be detected.

Two approaches to change detection: Bayesian approach ▫The change point is assumed to be a random variable with a known prior density ▫The goal is to minimize the expected detection delay subject to a bound on the false-alarm probability Minimax approach ▫The goal is to minimize the worst-case delay subject to a lower bound on the mean time between false alarms

Introduction… Significant advances in the theory of change detection have been made in the single-sensor setting. Extensions of this framework to the multi-sensor case, where the information available for decision making is decentralized, have also been studied. These works assume that the statistical properties of the sensors’ observations change at the same time. In many scenarios, however, it is more suitable to consider the case where each sensor’s observations may change at a different point in time.

Introduction… One application of such a model is the detection of pollutants, where the change process is governed by the movement of the agent through the medium under consideration. This paper considers the Bayesian version of this problem and assumes that the point of disruption is a random variable with a geometric distribution.

INTRODUCTION… Assume L sensors placed in an array. The fusion center has complete information about the observations. This is applicable when ample bandwidth is available for communication between the sensors and the fusion center.

INTRODUCTION… The goal of the fusion center is to come up with a strategy to declare the change, subject to false-alarm constraints. Toward this goal, the authors pose the problem in a dynamic programming (DP) framework and first obtain sufficient statistics for the DP under consideration. They then establish a recursion for the sufficient statistics that generalizes the recursion established in their previous paper. Following the logic of that work, they establish the optimality of a more general stopping rule for change detection.


PROBLEM FORMULATION Consider L sensors that observe an L-dimensional discrete-time stochastic process. Disruption in the sensing environment occurs at a random time Γ1. Hence the density of the observations at each sensor undergoes a change from the null density f0 to the alternate density f1.

Problem Formulation… Previous work considers the change to be instantaneous at all sensors at time Γ1. In this paper, the authors consider a change process that evolves across the sensor array, with the change seen by the l-th sensor at time Γl. They also assume that the evolution of the change process is Markovian across the sensors.

Problem Formulation… Under this model, the change point is a geometric random variable with parameter ρ: ▫P({Γ1 = m}) = ρ(1 − ρ)^m, m ≥ 0 As ρ → 1, an instantaneous disruption has high probability of occurrence. As ρ → 0, the change point is uniformized, in the sense that the disruption is (almost) equally likely to happen at any time.
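As a sanity check on this prior, the pmf above can be sketched in Python; ρ = 0.2 is an illustrative value, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.2  # illustrative geometric parameter, assumed for this sketch

# pmf of the change point: P(Gamma1 = m) = rho * (1 - rho)**m, m >= 0
def change_point_pmf(m, rho):
    return rho * (1.0 - rho) ** m

# NumPy's geometric sampler is supported on {1, 2, ...}, so subtract 1
# to match the {0, 1, ...} support used on the slide.
samples = rng.geometric(rho, size=100_000) - 1

# The pmf sums to 1 and the mean of the change point is (1 - rho) / rho.
total = sum(change_point_pmf(m, rho) for m in range(1000))
```

For ρ close to 1 the sampled change points concentrate near 0 (near-instantaneous disruption); for small ρ the distribution flattens out, matching the slide's two limiting regimes.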

Problem Formulation… The observations at every sensor are independent and identically distributed (i.i.d.) conditioned on the change hypothesis corresponding to that sensor: ▫Zk ∼ i.i.d. f0 if k < Γ, and i.i.d. f1 if k ≥ Γ Consider a centralized, Bayesian setup where a fusion center has complete knowledge of the observations from all the sensors: ▫Ik = {Z1, ..., Zk}
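This observation model can be simulated for a single sensor with illustrative densities, say f0 = N(0, 1) and f1 = N(1, 1); these Gaussians are assumptions for the sketch, not the paper's choices:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_sensor(gamma, n, rng):
    """Observations are i.i.d. f0 before the change time gamma, i.i.d. f1 after.
    Here f0 = N(0, 1) and f1 = N(1, 1) are illustrative assumptions."""
    z = rng.standard_normal(n)  # f0 samples
    z[gamma:] += 1.0            # shift the mean after the change -> f1
    return z

z = simulate_sensor(gamma=50, n=200, rng=rng)
pre_mean, post_mean = z[:50].mean(), z[50:].mean()
```

The post-change sample mean sits visibly above the pre-change one, which is what any change-detection statistic ultimately exploits.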

Problem Formulation… The fusion center decides whether a change has happened based on the information Ik available to it at time instant k (equivalently, it provides a stopping time τ).

Problem Formulation… The two conflicting performance measures for change detection are: ▫The probability of false alarm  PFA = P({τ < Γ1}) ▫The average detection delay  EDD = E[(τ − Γ1)^+], where x^+ = max(x, 0)

Problem Formulation The conflict between these two measures is captured by the Bayes risk, defined as ▫R(c) = PFA + c · EDD ▫for an appropriate choice of the per-unit delay cost c The goal of the fusion center is to come up with a strategy (a stopping time τ) that minimizes the Bayes risk.
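The trade-off between the two terms of the risk can be seen with a minimal Monte Carlo sketch, using a deterministic stopping time and illustrative values of ρ and c (both assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
rho, c = 0.2, 0.01          # illustrative prior parameter and delay cost
n_trials = 100_000

# Geometric change points on {0, 1, ...}, as in the slides.
gamma = rng.geometric(rho, size=n_trials) - 1

def bayes_risk(tau, gamma, c):
    """R(c) = PFA + c * EDD for a (here deterministic) stopping time tau."""
    pfa = np.mean(tau < gamma)                  # P(tau < Gamma1)
    edd = np.mean(np.maximum(tau - gamma, 0))   # E[(tau - Gamma1)^+]
    return pfa + c * edd

# Stopping later lowers the false-alarm probability but raises the delay.
risks = {t: bayes_risk(t, gamma, c) for t in (0, 5, 20)}
```

Note that stopping at τ = 0 gives EDD = 0 but PFA = P(Γ1 ≥ 1) = 1 − ρ; the optimal rule must balance the two terms rather than drive either to zero.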


DP Framework In their previous paper, the authors rewrote the Bayes risk in a form amenable to dynamic programming. The state of the system at time k is the vector ▫Sk = [Sk,1, ..., Sk,L] with Sk,l denoting the state at sensor l. The state Sk,l can take the value 1 (post-change), 0 (pre-change), or t (terminal). The system goes to the terminal state t once a change-point decision τ has been declared.


Recursion for Sufficient Statistics First consider the case where the changes at all sensors happen at the same instant. In this setting, it can be shown that the random variable P({Γ1 ≤ k} | Ik) serves as the sufficient statistic for the dynamic program and affords a recursion. This paper, however, considers the general case.
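For this instantaneous-change special case, the recursion for P({Γ1 ≤ k} | Ik) is the classical single-sensor (Shiryaev) posterior update. A minimal sketch, with illustrative Gaussian densities f0 = N(0, 1) and f1 = N(1, 1) (assumptions, not the paper's choices):

```python
import math

def norm_pdf(x, mu):
    """Standard-variance Gaussian density, used as an illustrative f0/f1."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def shiryaev_update(p, z, rho,
                    f0=lambda z: norm_pdf(z, 0.0),
                    f1=lambda z: norm_pdf(z, 1.0)):
    """One step of the classical recursion for p_k = P(Gamma1 <= k | I_k).
    The prior mass first moves forward by rho (the geometric parameter),
    then a Bayes update incorporates the new observation z."""
    q = p + (1.0 - p) * rho          # P(Gamma1 <= k | I_{k-1})
    num = q * f1(z)
    return num / (num + (1.0 - q) * f0(z))

# A run of post-change-looking observations (near 1.0) pushes the
# posterior probability of change toward 1.
p = 0.0
for z in [1.1, 0.9, 1.2, 1.0, 0.8]:
    p = shiryaev_update(p, z, rho=0.1)
```

In the general change-process model of this paper, a single scalar no longer suffices; the sufficient statistic must track the posterior over the vector of per-sensor change states, which is what the generalized recursion provides.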

Recursion for Sufficient Statistics But here we consider the general case, i.e., slow propagation of the change across the sensor array.


Structure for Optimal Stopping Rule
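In the classical single-sensor setting, the optimal stopping rule thresholds the posterior probability of change; the paper establishes the optimality of a more general rule with this structure. A minimal sketch of that baseline threshold rule, where the densities f0 = N(0, 1), f1 = N(1, 1), the parameter ρ, and the threshold are all illustrative assumptions:

```python
import math

def first_crossing(observations, rho, threshold):
    """Stop the first time the posterior probability of change crosses
    the threshold; return that (0-indexed) time, or None if it never does.
    f0 = N(0, 1) and f1 = N(1, 1) are illustrative assumptions."""
    pdf = lambda x, mu: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    p = 0.0
    for k, z in enumerate(observations):
        q = p + (1.0 - p) * rho          # propagate the geometric prior
        num = q * pdf(z, 1.0)
        p = num / (num + (1.0 - q) * pdf(z, 0.0))
        if p >= threshold:
            return k                     # declare the change at time k
    return None

# Pre-change samples near 0 followed by post-change samples near 1.
tau = first_crossing([0.1, -0.2, 0.0, 1.1, 0.9, 1.2, 1.0],
                     rho=0.1, threshold=0.8)
```

The rule stops a few samples after the statistics shift, illustrating the delay/false-alarm balance set by the threshold: raising it lowers PFA at the cost of a larger EDD.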


References
[1] M. Basseville and I. V. Nikiforov, Detection of Abrupt Changes: Theory and Applications. Prentice Hall, Englewood Cliffs.
[2] T. L. Lai, “Sequential changepoint detection in quality control and dynamical systems,” J. R. Statist. Soc. B, vol. 57, no. 4, pp. 613–658.
[3] G. Lorden, “Procedures for reacting to a change in distribution,” Ann. Math. Statist., vol. 42, pp. 1897–1908.
[4] G. V. Moustakides, “Optimal stopping times for detecting changes in distributions,” Ann. Statist., vol. 14, pp. 1379–1387.
[5] M. Pollak, “Optimal detection of a change in distribution,” Ann. Statist., vol. 13, pp. 206–227.
[6] A. N. Shiryaev, “On optimum methods in quickest detection problems,” Theory Probab. Appl., vol. 8, pp. 22–46.
[7] A. N. Shiryaev, Optimal Stopping Rules. Springer-Verlag, NY.
[8] A. G. Tartakovsky, Sequential Methods in the Theory of Information Systems. Radio i Svyaz’, Moscow, 1991 (in Russian).