Slide 1: Dynamic control of sensor networks with inferential ecosystem models
Jim Clark, Environment, Biology, Statistics
Pankaj Agarwal, Computer Science
David Bell, Environment
Carla Ellis, Computer Science
Paul Flikkema, Electrical Engineering, NAU
Alan Gelfand, Statistics
Gabriel Katul, Environment
Kamesh Munagala, Computer Science
Gavino Puggioni, Computer Science
Adam Silberstein, Computer Science
Jun Yang, Computer Science
Slide 2: Motivation
- Understanding forest response to global change (climate, CO2)
- Forces at many scales: complex interactions, lagged responses
- Uneven data needs: occasionally dense, at different scales
- Wireless networks can provide dense data across landscapes
Slide 3: Ecosystem models that could use wireless data
- Physiology: photosynthesis (PSN) and respiration responses to weather and climate
- C/H2O/energy: atmosphere/biosphere exchange (pool sizes, fluxes)
- Biodiversity: differential demographic responses to weather/climate, CO2, H2O
Slide 4: Physiological responses to weather
[Diagram: fast, fine-scale processes (PSN, respiration, sap flux, allocation) driven by precipitation, light, temperature, CO2, H2O, N, and P]
Slide 5: Sensors for ecosystem variables
- Soil moisture W_{j,t}
- Precipitation P_t
- Evaporation E_{j,t}
- Transpiration Tr_{j,t}
- Drainage D_t
- Light I_{j,t}
- Temperature T_{j,t}
- Vapor pressure deficit (VPD) V_{j,t}
These variables feed models of physiology, C/H2O/energy exchange, demography, and biodiversity.
Slide 6: WisardNet, a wireless network
- Multihop, self-organizing wireless connections (node to node to gateway)
- Sensors for light, soil and air temperature, soil moisture, sap flux
- Tower weather station
- Minimal in-network processing
Slide 7: Mapped stands
- All life history stages: seed rain, seed banks, seedlings, saplings, mature trees
- Interventions: canopy gaps, nutrient additions, herbivore exclosures, fire
- Environmental monitoring: canopy photos, soil moisture, temperature, wireless sensor networks, remote sensing
Slide 8: Fluid topology
[Map: nodes, sensors, and the gateway in the Blackwood Division, Duke Forest]
Slide 9: The goods and the bads
The good:
- Potential to collect dense data
- Adapts to changing communication potential
The bad:
- Most data are uninformative, redundant, or both
- Battery life of weeks to months, depending on transmission rate
- Checking and replacing batteries is the primary maintenance cost of the network
Slide 10: …the ugly
Unreliable!
[Plot: battery life at 13 nodes, with periods where the network was partially down, failures, and junk readings]
Slide 11: A dynamic control problem
What is an observation worth? (How do we quantify learning?)
The answer recognizes:
- The transmission cost of an observation
- The need to assess value in (near) real time, based on model(s)
- Minimal in-network computation capacity
- Use of (mostly) local information, with the potential for periodic out-of-network input
Slide 12: A framework for data collection
'Predict or collect': transmit an observation only if it could not have been predicted by a model.
- Decisions must be fast (real time)
- Must rely on (mostly) local information
- Goal: minimize transmission
A minimal sketch of the node-side rule follows.
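To make the rule concrete, here is a minimal Python sketch of the node-side decision loop; the predictor interface and all names are illustrative assumptions, not the deployed WisardNet code.

```python
# Minimal sketch of the node-side 'predict or collect' rule.
# The predictor interface and all names are illustrative assumptions.

def predict_or_collect(readings, predict, acceptable_error):
    """Yield only the observations the in-network model fails to
    predict within the acceptable error; suppress the rest."""
    for t, z in enumerate(readings):
        z_hat = predict(t)  # cheap in-network point prediction
        if abs(z - z_hat) > acceptable_error:
            yield t, z      # transmit: the model missed this one
        # otherwise suppress: the gateway can reconstruct the reading
        # to within the acceptable error from the model alone
```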
Slide 13: Predictability of ecosystem data
Where could a model stand in for data?
[Figure: example traces, from slow, predictable variables to events that are less predictable]
Slide 14: Which observations are 'informative'?
- Shared vs unique data features (within nodes, among nodes)
- Can relationships among variables and nodes be exploited?
- Are there slow, predictable relationships (e.g., among light, precipitation, and soil moisture)?
Slide 15: Model-dependent learning
- Exploit relations in space, in time, and with other variables
- Learn from previous data collection
- If data are predictable, they have reduced value
[Plot: photosynthetically active radiation (PAR) at 3 nodes over 3 days of observations]
Slide 16: Controlling measurement with models
Inferential modeling concerns:
- Some parameters are 'local', some 'global'
- Estimates of global parameters require transmission
- Data cannot arrive faster than the model converges
Simple rules for local control of transmission:
- Rely mostly on local variables
- Periodic updating from out of network
- 'Transmit if you can't predict'
Slide 17: In-network data suppression
- An 'acceptable error' A
- The standard reactive model transmits based on change in the raw signal
- Alternative: transmit based on whether the observation is 'predictable'
Notation:
- {z}_j: local sensor data (no transmission)
- {·, z, w}_t: global data, periodically updated from the full model
- M_F: the full, out-of-network model
- M_I: the simplified, in-network model
A sketch contrasting the two rules follows.
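A small sketch of both rules on one node's stream; `predict` stands in for the simplified in-network model M_I, whose parameters would be refit periodically out of network by M_F. The interfaces are assumptions.

```python
# Sketch contrasting reactive and model-based suppression on one
# node's readings; interfaces are illustrative assumptions.

def reactive_keep(z, A):
    """Reactive rule: keep a reading when it differs by more than A
    from the last reading that was kept."""
    kept, last = [], None
    for t, zt in enumerate(z):
        if last is None or abs(zt - last) > A:
            kept.append((t, zt))
            last = zt
    return kept

def predictive_keep(z, predict, A):
    """Model-based rule: keep a reading only when the in-network
    prediction misses it by more than A."""
    return [(t, zt) for t, zt in enumerate(z) if abs(zt - predict(t)) > A]
```

Under the reactive rule a predictable diurnal cycle is transmitted over and over; under the model-based rule it is suppressed, and only surprises cost energy.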
Slide 18: The out-of-network model is complex
[Graphical model: process states {y, E, Tr, D}_{t-1}, {y, E, Tr, D}_t, {y, E, Tr, D}_{t+1} linked by time effects; sensor data z_{j,t} and sparse calibration data enter through measurement errors; process parameters, location effects, and hyperparameters (heterogeneity) sit above the process, which carries process error]
Slide 19: Soil moisture example
- Simulated process, parameters unknown
- Simulated data: TDR calibration with known error (sparse); 5 sensors with unknown error/drift (dense, but unreliable)
- Out of network: estimate the process and parameters
- In network: use those estimates for prediction
- Transmit only when the prediction error exceeds a threshold
A simulation sketch of this setup follows.
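A hedged simulation sketch of the setup in Python; every constant (noise scales, rain probability, drift rate) is an illustrative assumption, not a value from the study.

```python
# Simulation sketch: latent soil-moisture process, sparse unbiased TDR
# calibration, and 5 dense sensors with unknown gain and accumulating
# drift. All constants are illustrative.

import numpy as np

rng = np.random.default_rng(0)
T, J = 200, 5

# Latent process: slow drying punctuated by occasional precipitation.
y = np.empty(T)
y[0] = 0.3
for t in range(1, T):
    rain = rng.random() < 0.05
    y[t] = 0.95 * y[t - 1] + 0.15 * rain + rng.normal(0, 0.005)

# Sparse TDR calibration: unbiased, known error, every 20th step.
w_times = np.arange(0, T, 20)
w = y[w_times] + rng.normal(0, 0.01, w_times.size)

# Dense sensors: per-sensor gain plus a random-walk drift.
gain = rng.normal(1.0, 0.1, J)
drift = np.cumsum(rng.normal(0, 0.002, (T, J)), axis=0)
z = gain * y[:, None] + drift + rng.normal(0, 0.01, (T, J))
```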
Slide 20: Model summary
- Process model (latent soil moisture dynamics)
- Sensor model for each sensor j
- Random effects
- TDR calibration
- Inference (out of network)
- Predictions sent back to node j
[The equations for each component appeared as images on the slide]
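The equations themselves did not survive extraction. The following is a sketch of one plausible state-space form consistent with the surrounding slides (latent process y_t, sensor gains beta_j, accumulating drift gamma_{j,t}, calibration w_t); it is an assumption, not the authors' exact specification.

```latex
% A sketch assuming a linear Gaussian state-space form; the symbols
% are illustrative, not the specification from the slide.
\begin{align*}
\text{Process:}          \quad & y_t = f(y_{t-1}, P_t) + \epsilon_t,
                                \quad \epsilon_t \sim N(0, \sigma^2) \\
\text{Sensor } j:        \quad & z_{j,t} = \beta_j y_t + \gamma_{j,t} + e_{j,t},
                                \quad e_{j,t} \sim N(0, \tau^2) \\
\text{Rand eff (drift):} \quad & \gamma_{j,t} = \gamma_{j,t-1} + \eta_{j,t},
                                \quad \eta_{j,t} \sim N(0, \nu^2) \\
\text{TDR calibration:}  \quad & w_t = y_t + u_t,
                                \quad u_t \sim N(0, \kappa^2), \ \kappa^2 \text{ known} \\
\text{Back to node } j:  \quad & \hat{z}_{j,t} = \hat{\beta}_j \hat{y}_t + \hat{\gamma}_{j,t}
\end{align*}
```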
Slide 21: Simulated process and data
[Plot: 'truth' y with 95% CI; colors: the 5 sensors z; red dots: calibration w. Inset: drift parameter estimates and truth (dashed lines)]
Slide 22: Process parameters
[Plot: estimates from the training phase and truth (dashed lines)]
Slide 23: Keepers based on acceptable error
- Transmit an observation only when the in-network prediction, built from plug-in parameter values, misses it by more than the acceptable error
- Keepers (40% of observations): very few until drift accumulates
- Increasing drift reduces predictive capacity
A sketch of the rule applied to the simulated data follows.
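Continuing the simulation sketch above, a rough illustration of why keepers are rare early and accumulate with drift; the plug-in values and node-side forecasts here are stand-ins (assumptions), not the fitted estimates.

```python
# Continues the simulation sketch; A and the plug-in stand-ins are
# illustrative. Drift is never tracked in-network, so the prediction
# error grows as drift accumulates and more readings become keepers.

A = 0.05
beta_hat = gain + rng.normal(0, 0.02, J)   # stand-in for trained gains
y_hat = y + rng.normal(0, 0.01, T)         # stand-in for node forecasts

keep = np.abs(z - beta_hat * y_hat[:, None]) > A
print("keep fraction, first half: ", keep[: T // 2].mean())
print("keep fraction, second half:", keep[T // 2 :].mean())
```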
Slide 24: Better 'data', fewer 'observations'
- The reanalysis model treats suppressed readings as missing data
- Known constraint on the missing data: a suppressed reading must lie within the acceptable error of its prediction
[Plot: 'truth' y with 95% CI; colors: the 5 sensors z; red dots: calibration w]
A sketch of this constrained imputation follows.
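One way to encode that constraint in the reanalysis, sketched under an assumed Gaussian sensor error; a full reanalysis would draw these values inside the model fit rather than one at a time.

```python
# Sketch of imputing a suppressed (never-transmitted) reading. Its
# absence proves it fell within the acceptable error A of the node's
# prediction, so we sample from a truncated, not unconstrained, error.

from scipy.stats import truncnorm

def impute_suppressed(z_hat, sigma, A, rng=None):
    """Draw a value for a suppressed reading from a Gaussian centred
    on the node's prediction z_hat, truncated to [z_hat - A, z_hat + A]."""
    a, b = -A / sigma, A / sigma   # truncation bounds in standard units
    return z_hat + sigma * truncnorm.rvs(a, b, random_state=rng)
```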
Slide 25: Additional variables
Slide 26: 'Collect or predict'
- Inferential ecosystem models: a currency for assessing learning
- In-network simplicity: point predictions based on local information, with periodic out-of-network inputs
- Out of network: predictive distributions for all variables (the 'reanalysis' step)
- A role for inference in data collection, not just data analysis
Slide 27: Advantages over 'reactive' data collection in wireless networks
- 'Change' in a variable is not directly linked to its information content. For example, all soil moisture sensors may change at similar rates, making them largely redundant.
- 'Predictability' emphasizes change that contains information.
- The capacity to predict an observation summarizes its value and ensures it can be estimated to known precision in the reanalysis.