Monitoring rivers and lakes [IJCAI ‘07]


1 Monitoring rivers and lakes [IJCAI ‘07]
Need to monitor large spatial phenomena: temperature, nutrient distribution, fluorescence, … NIMS, Kaiser et al. (UCLA). Goal: predict the phenomenon at unobserved locations. [Figure: predicted temperature over depth and location across the lake; color indicates actual temperature.]

2 Monitoring water networks [J Wat Res Mgt 2008]
Contamination of drinking water could affect millions of people. Using a simulator from EPA and Hach sensors, place sensors to detect contaminations ("Battle of the Water Sensor Networks" competition, ~$14K). Where should we place sensors to quickly detect contamination?

3 Example reward functions
Entropy: Reward[P(X)] = −H(X) = Σ_x P(x) log₂ P(x)
Expected mean squared prediction error (EMSE): Reward[P(X)] = −(1/n) Σ_s Var(X_s)
Many other objectives are possible and useful…
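
As a concrete illustration (not from the slides), these two rewards could be evaluated as follows for a discrete distribution given as a probability vector and for per-location posterior variances:

```python
import math

def entropy_reward(probs):
    """Reward[P(X)] = -H(X) = sum_x P(x) * log2 P(x): the negative Shannon entropy."""
    return sum(p * math.log2(p) for p in probs if p > 0)

def emse_reward(posterior_variances):
    """Reward[P(X)] = -(1/n) * sum_s Var(X_s): negative expected mean squared
    prediction error, given the posterior variance at each location s."""
    n = len(posterior_variances)
    return -sum(posterior_variances) / n

# A nearly deterministic distribution scores higher (closer to 0) than a uniform one.
print(entropy_reward([0.9, 0.05, 0.05]))   # ~ -0.57
print(entropy_reward([1/3, 1/3, 1/3]))     # ~ -1.58
print(emse_reward([0.2, 0.4, 0.1]))        # ~ -0.23
```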

4 Properties and Examples
Submodularity: Properties and Examples

5 Key observation: Diminishing returns
Compare placement A = {Y1, Y2} with placement B = {Y1, …, Y5}. Adding a new sensor Y′ to the small placement A helps a lot; adding Y′ to the larger placement B doesn't help much.

6 Set functions
Finite ground set V = {1, 2, …, n}; set function F: 2^V → ℝ. Will always assume F(∅) = 0 (w.l.o.g.). Assume a black box that can evaluate F for any input A; approximate (noisy) evaluation of F is OK (e.g., [37]).
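
To make the black-box assumption concrete, here is a small sketch (the class name and the example function are my own, not from the slides) that wraps an arbitrary evaluation routine, caches results, and normalizes so that F(∅) = 0:

```python
class SetFunction:
    """Black-box set function F: 2^V -> R, cached and normalized so F(empty) = 0."""

    def __init__(self, ground_set, evaluate):
        self.V = frozenset(ground_set)
        self._evaluate = evaluate
        self._offset = evaluate(frozenset())   # shift so that F(empty set) = 0
        self._cache = {}

    def __call__(self, A):
        A = frozenset(A)
        if A not in self._cache:               # evaluations may be expensive, so cache them
            self._cache[A] = self._evaluate(A) - self._offset
        return self._cache[A]

# Example: V = {1,...,5} and F(A) = sqrt(|A|), a simple submodular function.
F = SetFunction(range(1, 6), lambda A: len(A) ** 0.5)
print(F(set()), F({1, 2}), F({1, 2, 3, 4, 5}))   # 0.0, ~1.41, ~2.24
```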

7 Submodular set functions
Submodularity: for A ⊆ B ⊆ V and s ∉ B, F(A ∪ {s}) − F(A) ≥ F(B ∪ {s}) − F(B).
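
Since the definition quantifies over all A ⊆ B and s ∉ B, it can be verified by brute force on small ground sets; the checker below is an illustrative sketch, not from the slides:

```python
from itertools import combinations

def is_submodular(F, V, tol=1e-12):
    """Brute-force check of the definition: for all A ⊆ B ⊆ V and s ∉ B,
    F(A ∪ {s}) - F(A) >= F(B ∪ {s}) - F(B)."""
    subsets = [frozenset(c) for r in range(len(V) + 1) for c in combinations(V, r)]
    for B in subsets:
        for A in subsets:
            if not A <= B:
                continue
            for s in set(V) - B:
                if F(A | {s}) - F(A) < F(B | {s}) - F(B) - tol:
                    return False
    return True

# A concave function of |A| has diminishing returns; a convex one does not.
print(is_submodular(lambda A: len(A) ** 0.5, set(range(4))))  # True
print(is_submodular(lambda A: len(A) ** 2, set(range(4))))    # False
```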

8 Set cover is submodular
With A = {S1, S2} and B = {S1, S2, S3, S4}, adding a new set S′ gives F(A ∪ {S′}) − F(A) ≥ F(B ∪ {S′}) − F(B): the larger collection B already covers more elements, so S′ contributes less.
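
The same comparison can be made numerically for a coverage function; the specific sets S1, …, S4 and S_prime below are made up for illustration:

```python
def coverage(collection):
    """Set cover objective: F(A) = number of elements covered by the union of A."""
    return len(set().union(*collection)) if collection else 0

# Hypothetical ground sets; S_prime overlaps heavily with what B already covers.
S1, S2, S3, S4 = {1, 2, 3}, {3, 4, 5}, {5, 6, 7}, {7, 8, 9}
S_prime = {2, 3, 9, 10}

A = [S1, S2]
B = [S1, S2, S3, S4]
gain_A = coverage(A + [S_prime]) - coverage(A)   # 2: elements 9 and 10 are new to A
gain_B = coverage(B + [S_prime]) - coverage(B)   # 1: only element 10 is new to B
print(gain_A, gain_B)                            # diminishing returns: gain_A >= gain_B
```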

9 Example: Mutual information
Given random variables X1, …, Xn, define F(A) = I(X_A; X_{V∖A}) = H(X_{V∖A}) − H(X_{V∖A} | X_A). Lemma: the mutual information F(A) is submodular.
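
For discrete variables, F(A) can be computed directly from a joint probability table via the identity I(X_A; X_{V∖A}) = H(X_A) + H(X_{V∖A}) − H(X_V); the sketch below uses a tiny made-up joint distribution, purely for illustration:

```python
from collections import defaultdict
from itertools import product
import math

def entropy(dist):
    """Shannon entropy (bits) of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    """Marginalize a joint over X_0..X_{n-1} onto the variable indices in idx."""
    out = defaultdict(float)
    for outcome, p in joint.items():
        out[tuple(outcome[i] for i in idx)] += p
    return out

def mutual_information(joint, n, A):
    """F(A) = I(X_A; X_{V\\A}) = H(X_A) + H(X_{V\\A}) - H(X_V)."""
    rest = [i for i in range(n) if i not in A]
    return entropy(marginal(joint, sorted(A))) + entropy(marginal(joint, rest)) - entropy(joint)

# Toy joint over three binary variables; the probabilities are made up.
joint = {outcome: 1 / 8 for outcome in product([0, 1], repeat=3)}   # start independent
joint[(0, 0, 0)] += 0.075
joint[(1, 1, 1)] -= 0.075                                           # introduce correlation

print(mutual_information(joint, 3, {0}))      # I(X_0; X_1, X_2)
print(mutual_information(joint, 3, {0, 1}))   # I(X_0, X_1; X_2)
```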

10 Closedness properties
F1, …, Fm submodular functions on V and λ1, …, λm > 0. Then F(A) = Σ_i λ_i F_i(A) is submodular! Submodularity is closed under nonnegative linear combinations: an extremely useful fact! For example, if F_θ(A) is submodular for every θ, then Σ_θ P(θ) F_θ(A) is submodular. Multicriterion optimization: F1, …, Fm submodular, λ_i ≥ 0 ⇒ Σ_i λ_i F_i(A) submodular.
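
A small illustration of the closedness property (the two component coverage functions and the scenario probabilities below are invented for the example):

```python
def coverage(covered_by, A):
    """Nodes covered when sensors are placed at the locations in A (a submodular F_theta)."""
    return len(set().union(*(covered_by[s] for s in A))) if A else 0

# Two hypothetical contamination scenarios, each defining its own coverage function,
# combined with nonnegative weights (here: assumed scenario probabilities).
covered_theta1 = {"a": {1, 2}, "b": {2, 3}, "c": {4}}
covered_theta2 = {"a": {5}, "b": {5, 6}, "c": {6, 7}}
P = {"theta1": 0.7, "theta2": 0.3}

def F(A):
    """F(A) = sum_theta P(theta) * F_theta(A): submodular, as a nonnegative
    linear combination of submodular functions."""
    return (P["theta1"] * coverage(covered_theta1, A)
            + P["theta2"] * coverage(covered_theta2, A))

print(F({"a"}), F({"a", "b"}), F({"a", "b", "c"}))   # 1.7 2.7 3.7
```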

11 Monotonicity
A set function F is called monotonic if A ⊆ B ⊆ V ⇒ F(A) ≤ F(B).

12 Approximate maximization
Greedy algorithm: start with A_0 = ∅; for i = 1 to k, set s_i := argmax_s F(A_{i−1} ∪ {s}) − F(A_{i−1}) and A_i := A_{i−1} ∪ {s_i}.
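
A direct implementation sketch of this greedy rule (the function names and the toy coverage instance are my own, not from the slides):

```python
def greedy_maximize(F, V, k):
    """Greedy selection: A_0 = ∅; at step i, add the element s with the largest
    marginal gain F(A ∪ {s}) - F(A)."""
    A = set()
    for _ in range(k):
        best_s, best_gain = None, float("-inf")
        for s in set(V) - A:
            gain = F(A | {s}) - F(A)
            if gain > best_gain:
                best_s, best_gain = s, gain
        A.add(best_s)
    return A

# Toy coverage instance: each candidate location covers a hypothetical "footprint".
footprints = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {6}}

def F(A):
    return len(set().union(*(footprints[s] for s in A))) if A else 0

print(greedy_maximize(F, footprints, 2))   # {'a', 'c'}, covering all six elements
```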

13 One reason submodularity is useful
Theorem [Nemhauser et al. '78]: the greedy algorithm gives a constant-factor approximation, F(A_greedy) ≥ (1 − 1/e) F(A_opt), i.e., at least ~63% of optimal. Greedy gives a near-optimal solution! For information gain, this guarantee is the best possible unless P = NP [Krause & Guestrin '05].
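
On a small instance the guarantee can be sanity-checked against exhaustive search; this is only an illustrative sketch (the coverage instance is made up), not part of the theorem:

```python
import math
from itertools import combinations

footprints = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {1, 6, 7}}
k = 2

def F(A):
    return len(set().union(*(footprints[s] for s in A))) if A else 0

# Greedy, as on the previous slide: k picks of the largest marginal gain.
A = set()
for _ in range(k):
    A.add(max(set(footprints) - A, key=lambda s: F(A | {s}) - F(A)))

# Exhaustive search over all subsets of size k.
A_opt = max((frozenset(c) for c in combinations(footprints, k)), key=F)

print(F(A), F(A_opt), F(A) >= (1 - 1 / math.e) * F(A_opt))   # last value: True
```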

14 Temperature data from sensor network
[Figure: performance of greedy vs. optimal placements on temperature data from a sensor network.] Greedy is empirically close to optimal. Why?

15 Example: Sensor Placement

16 Lazy greedy algorithm [Minoux ’78]
First iteration as usual. Keep an ordered list of marginal benefits δ_i from the previous iteration. Re-evaluate δ_i only for the top element; if δ_i stays on top, use it, otherwise re-sort. [Figure: priority queue of benefits δ_s(A).]
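
A common way to implement this lazy evaluation is with a max-heap of possibly stale marginal benefits; the sketch below (my own code, following the idea on the slide) relies on submodularity to guarantee that a stale benefit can only overestimate the fresh one:

```python
import heapq

def lazy_greedy(F, V, k):
    """Lazy greedy [Minoux '78]: keep marginal benefits from earlier iterations in a
    max-heap and re-evaluate only the top element. By submodularity, a stale benefit
    can only overestimate the fresh one, so a re-checked top element is safe to pick."""
    A = set()
    base = F(set())
    # Heap entries: (-benefit, element, iteration at which the benefit was computed).
    heap = [(-(F({s}) - base), s, 0) for s in V]
    heapq.heapify(heap)
    for it in range(1, k + 1):
        while True:
            neg_gain, s, stamp = heapq.heappop(heap)
            if stamp == it:                  # benefit is fresh for this round: take it
                A.add(s)
                break
            gain = F(A | {s}) - F(A)         # re-evaluate only the top element
            heapq.heappush(heap, (-gain, s, it))
    return A

footprints = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {6}}

def F(A):
    return len(set().union(*(footprints[s] for s in A))) if A else 0

print(lazy_greedy(F, footprints, 2))   # same answer as naive greedy: {'a', 'c'}
```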

17 Result of lazy evaluation
[Figure: running time (minutes) vs. number of sensors selected, comparing exhaustive search (all subsets), naive greedy, and fast (lazy) greedy; lower is better.]

18 Data dependent bounds [Minoux ’78]
Suppose A is a candidate solution to argmax F(A) s.t. |A| ≤ k, and let A* = {s1, …, sk} be an optimal solution. For each s ∈ V∖A, let δ_s = F(A ∪ {s}) − F(A), and order the elements so that δ_1 ≥ δ_2 ≥ … ≥ δ_n. Then
F(A*) ≤ F(A ∪ A*) = F(A) + Σ_i [F(A ∪ {s1,…,si}) − F(A ∪ {s1,…,si−1})] ≤ F(A) + Σ_i (F(A ∪ {si}) − F(A)) = F(A) + Σ_i δ_{si} ≤ F(A) + Σ_{i=1..k} δ_i.
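
The data-dependent bound is cheap to compute once the marginal gains δ_s are available; below is a sketch using the same made-up coverage instance as in the earlier greedy examples:

```python
def data_dependent_bound(F, V, A, k):
    """Minoux-style bound: F(A*) <= F(A) + sum of the k largest marginal gains
    δ_s = F(A ∪ {s}) - F(A) over s ∈ V \\ A."""
    deltas = sorted((F(A | {s}) - F(A) for s in set(V) - A), reverse=True)
    return F(A) + sum(deltas[:k])

footprints = {"a": {1, 2, 3}, "b": {3, 4}, "c": {4, 5, 6}, "d": {6}}

def F(A):
    return len(set().union(*(footprints[s] for s in A))) if A else 0

A = {"a", "c"}                                              # candidate solution, e.g. from greedy
print(F(A), data_dependent_bound(F, footprints, A, k=2))    # 6 6: A is certified optimal here
```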

19 Bounds on optimal solution [Krause et al., J Wat Res Mgt ’08]
[Figure: sensing quality F(A) vs. number of sensors placed on the water networks data, showing the offline (Nemhauser) bound, the data-dependent bound, and the greedy solution; higher is better.] Submodularity gives data-dependent bounds on the performance of any algorithm.

20 The pSPIEL Algorithm [K, Guestrin, Gupta, Kleinberg IPSN ‘06]
pSPIEL: efficient nonmyopic algorithm (padded Sensor Placements at Informative and cost-Effective Locations). Select starting and ending locations s1 and sB. Decompose the sensing region into small, well-separated clusters. Solve the cardinality-constrained problem per cluster (greedy). Combine the solutions using an orienteering algorithm. Smooth the resulting path.

21 Locality
For two sets separated by at least distance r:

22 Example: Sensor Placement

23 pSPIEL

24 Approximation

