1
Embedded Networked Sensing for Environmental Monitoring: Applications and Challenges
Deborah Estrin
Director, Center for Embedded Networked Sensing (CENS)
Professor, UCLA Computer Science Department
Work summarized here is largely that of students and staff at CENS
2
Embedded Networked Sensing Potential
- Micro-sensors, on-board processing, and wireless interfaces are feasible at very small scale, so phenomena can be monitored "up close"
- Enables spatially and temporally dense environmental monitoring
- Embedded Networked Sensing will reveal previously unobservable phenomena
Application examples: contaminant transport; ecosystems and biocomplexity; marine microorganisms; seismic structure response
3
ENS enabled by Networked Sensor Node Developments
- LWIM III (UCLA, 1996): geophone, RFM radio, PIC, star network
- AWAIRS I (UCLA/RSC, 1998): geophone, DS/SS radio, StrongARM, multi-hop networks
- Sensor Mote (UCB, 2000): RFM radio, Atmel
- Medusa MK-2 (UCLA NESL, 2002)
Predecessors: DARPA Packet Radio program; USC-ISI Distributed Sensor Network (DSN) project
4
ENS: Technology Design Themes
- Long-lived systems that can be untethered (wireless) and unattended
- Communication will be the persistent primary consumer of scarce energy resources (Mote: 720 nJ/bit to transmit vs. 4 nJ/op; see the sketch after this list)
- Autonomy requires robust, adaptive, self-configuring systems
- Leverage data processing inside the network: exploit computation near the data to reduce communication and achieve scalability; collaborative signal processing; achieve the desired global behavior with localized algorithms (distributed control)
- "The network is the sensor" (Manges & Smith, Oak Ridge National Laboratory, 10/98)
Requires robust distributed systems of hundreds of physically embedded, unattended, and often untethered devices.
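A quick back-of-the-envelope sketch using the energy figures quoted above (720 nJ/bit to transmit, 4 nJ per operation). The compression scenario in the example is hypothetical, not a measurement from the slides; it only illustrates why in-network processing pays off.

```python
# Back-of-the-envelope energy trade-off from the slide's mote figures:
# transmitting one bit (~720 nJ) costs about as much as ~180 CPU operations
# (~4 nJ each), so spending computation to shrink the payload usually wins.

TX_NJ_PER_BIT = 720.0   # radio transmit energy per bit (slide figure)
CPU_NJ_PER_OP = 4.0     # processor energy per operation (slide figure)

def net_saving_nj(bits_saved: int, ops_spent: int) -> float:
    """Energy saved by in-network processing that removes `bits_saved` bits
    from the radio at a cost of `ops_spent` CPU operations."""
    return bits_saved * TX_NJ_PER_BIT - ops_spent * CPU_NJ_PER_OP

if __name__ == "__main__":
    # Hypothetical example: averaging 10 samples (16 bits each) into one
    # 16-bit reading saves 144 bits; assume ~100 ops to compute the average.
    print(f"1 transmitted bit ~= {TX_NJ_PER_BIT / CPU_NJ_PER_OP:.0f} CPU ops")
    print(f"net saving: {net_saving_nj(144, 100):.0f} nJ")
```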
5
ENS Architecture Drivers
Drivers: energy and scalability; heterogeneity of devices; smaller component size and cost; varied and variable environments
Technical capabilities: embeddable microsensors; networked info-mechanical systems; distributed signal and information processing; adaptive self-configuring wireless systems
6
CENS Systems under design/construction
- Biology/biocomplexity: microclimate monitoring; triggered image capture; canopy-net (Wind River Canopy Crane site)
- Contaminant transport: County of Los Angeles Sanitation Districts (CLASD) wastewater recycling project, Palmdale, CA
- Seismic monitoring: 50-node ad hoc, wireless, multi-hop seismic network; structure response in the USGS-instrumented Factor Building with augmented wireless sensors
7
Ecosystem Monitoring
Sensor system logical components:
- Tasking and configuration (sample rates, event definition, triggering)
- Data transport
- Device management, sample manipulation, and caching with timing
- Duty cycling
Other important examples of habitat monitoring systems: Berkeley/Intel GDI and botanical gardens
8
Extensible Sensing System (ESS) Software*
Tiered architecture components:
- Mica2 motes (8-bit microcontrollers running TinyOS) with a Sensor Interface Board hosting in situ sensors
- Microservers: solar powered, 32-bit processors running Linux
- Pub/sub bus over 802.11 to databases, visualization and analysis tools, GUI/Web interfaces
- Data multicast over the Internet on a publish-and-subscribe bus system (called Subject Servers) to databases, GUIs, other data analysis tools, and clients (a toy sketch of the pattern follows)
* Osterweil, Rahimi, Mysore, Wimbrow
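The publish-and-subscribe pattern underlying the ESS back end can be illustrated with a minimal in-process sketch. The SubjectBus class and topic names below are invented for illustration; they are not the actual Subject Server API, which runs over 802.11/IP between microservers and back-end clients.

```python
# Toy pub/sub bus: subscribers register a callback for a subject, and
# publishers push data without knowing who consumes it -- the decoupling
# that lets databases, GUIs, and analysis tools all listen to the same feed.

from collections import defaultdict

class SubjectBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, subject: str, callback):
        """Register a callback to receive every message on `subject`."""
        self._subscribers[subject].append(callback)

    def publish(self, subject: str, message: dict):
        """Deliver `message` to every subscriber of `subject`."""
        for callback in self._subscribers[subject]:
            callback(message)

bus = SubjectBus()
bus.subscribe("ess/microclimate", lambda m: print("DB insert:", m))
bus.subscribe("ess/microclimate", lambda m: print("GUI update:", m))
bus.publish("ess/microclimate", {"node": 12, "temp_C": 18.3, "rh_pct": 64})
```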
9
Common theme: local adaptation and redundancy
- Irregular deployment and environment
- Dynamic network topology
- Hand configuration will fail: scale, variability, maintenance
Event detection; localization and time synchronization; calibration; programming model; information transport, aggregation, and storage; long-lived, self-configuring systems
10
Network Architecture: Can we adapt Internet protocols and the "end to end" architecture?
- The Internet routes data using IP addresses in packets and lookup tables in routers
- Humans get data by "naming data" to a search engine, with many levels of indirection between name and IP address
- This works well for the Internet and for supporting person-to-person communication
- Embedded, energy-constrained (untethered, small-form-factor), unattended systems can't tolerate the communication overhead of that indirection
11
Directed Diffusion*: Data-Centric Routing
A data-centric approach has the right scaling properties:
- Name data (not nodes) with externally relevant attributes (data type, time, location of node, SNR, etc.), as illustrated in the sketch below
- Diffuse requests and responses across the network using application-driven routing (e.g., geo-sensitive)
- Support in-network aggregation and processing
Not end-to-end data delivery; not just a database query
* Heidemann et al., SOSP '01; ** Krishnamachari et al., '02
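To make the "name data, not nodes" idea concrete, here is a minimal sketch of attribute-based interest matching. It is not the directed diffusion implementation or its API; it only illustrates matching an interest's attribute constraints against a data sample's attributes, with invented attribute names.

```python
# Minimal sketch of data-centric naming: an "interest" is a set of attribute
# constraints, and a node responds (setting up a gradient toward the sink)
# only if its data's attributes match -- no node addresses are involved.

def matches(interest: dict, data_attrs: dict) -> bool:
    """True if every attribute constraint in the interest is satisfied."""
    for key, constraint in interest.items():
        value = data_attrs.get(key)
        if value is None:
            return False
        if callable(constraint):
            if not constraint(value):
                return False
        elif value != constraint:
            return False
    return True

# Hypothetical interest: temperature readings from nodes inside a rectangle.
interest = {
    "type": "temperature",
    "lat": lambda v: 34.06 <= v <= 34.08,
    "lon": lambda v: -118.46 <= v <= -118.44,
}

sample = {"type": "temperature", "lat": 34.07, "lon": -118.45, "value_C": 21.4}
print(matches(interest, sample))   # True -> this node's data would be pulled
```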
12
Diffusion: One-Phase Pull*
[Diagram: interests flow from the sink toward the sources, gradients are set up, and data is routed back along them]
- Optimized version of general diffusion (Heidemann et al.)
- Pulls data out to only one sink at a time (saves energy)
- Used in the ecosystem application over Mica2 motes: TinyDiffusion (Osterweil et al.)
13
Voronoi Scoping: Restricted Floods from Multiple Sinks*
Benefits of multiple sinks:
- Reduce average path length
- Equalize load over multiple trees
- Tiered architecture, redundancy
BUT: a linear increase in interests flooded!
Voronoi clusters partition the topology; each subset contains the nodes closest to its associated sink (see the sketch below):
- Nodes only forward interests from their closest sink
- No overlap between floods
- Motes receive interests from their closest sink
- Scalable: as both tiers grow, the load per mote remains constant
[Screenshot: live network (EmStar/EmView), 3 sinks, 55 motes, color-coded clusters]
* With Henri Dubois-Ferrière, EPFL
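A sketch of the Voronoi-scoping idea, assuming only a connectivity graph is known: a joint breadth-first search from all sinks claims each node for its closest sink (in hops), and interests are then forwarded only within that cluster. The function and the toy topology are illustrative, not the live EmStar implementation.

```python
# Joint BFS from all sinks: each node is claimed by whichever sink's flood
# reaches it first (fewest hops), so per-sink interest floods never overlap.

from collections import deque

def voronoi_clusters(adjacency: dict, sinks: list) -> dict:
    """Map each node to its closest sink by hop count; ties go to the sink
    whose flood arrives first in BFS order."""
    owner = {s: s for s in sinks}
    queue = deque(sinks)
    while queue:
        node = queue.popleft()
        for neighbor in adjacency[node]:
            if neighbor not in owner:          # not yet claimed by any sink
                owner[neighbor] = owner[node]  # inherit the claiming sink
                queue.append(neighbor)
    return owner

# Hypothetical 6-node topology with two sinks, A and B.
adjacency = {
    "A": ["1", "2"], "B": ["3", "4"],
    "1": ["A", "2"], "2": ["A", "1", "3"],
    "3": ["B", "2", "4"], "4": ["B", "3"],
}
print(voronoi_clusters(adjacency, ["A", "B"]))
# e.g. {'A': 'A', 'B': 'B', '1': 'A', '2': 'A', '3': 'B', '4': 'B'}
```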
14
Multi-hop data extraction characteristics using TinyDiffusion
- Collected basic network characteristics to verify readiness for sensor deployment
- Average system loss rates analyzed over fixed intervals and related to nodes with various average, minimum, and maximum hop counts (under 3% end to end); see the arithmetic sketch below
- Additional nodes deployed to augment the persistent ESS topology to study effects such as the loss experienced by nodes introduced with less ground clearance
- The UCB/Intel GDI deployment reports good results from their fielded burrow-monitoring system using the same mote platform
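To relate per-hop loss to the end-to-end figure above, a small arithmetic sketch. The per-hop rates in the example are invented; only the "under 3% end to end" number comes from the slide. Assuming independent hops that each deliver with probability p, an n-hop path delivers with probability p^n.

```python
# Illustrative arithmetic only: end-to-end loss compounds with hop count
# when each hop delivers independently with probability (1 - per_hop_loss).

def end_to_end_loss(per_hop_loss: float, hops: int) -> float:
    """End-to-end loss rate over `hops` independent hops."""
    return 1.0 - (1.0 - per_hop_loss) ** hops

for hops in (1, 2, 3):
    print(f"{hops} hop(s), 1% per-hop loss -> "
          f"{end_to_end_loss(0.01, hops):.1%} end-to-end loss")
```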
15
Characterizing wireless channels*
- Great variability over distance (vertical spread in the measurements at 50-80% of the communication range)
- Reception rate is not normally distributed around a mean and standard deviation
- The real communication channel is not circular
- 5 to 30% asymmetric links, not correlated with distance or transmission power; the primary cause is differences in hardware calibration (receiver sensitivity, energy levels, etc.); see the sketch below for one way to quantify asymmetry
- Time variations are correlated with mean reception rate, not with distance from the transmitter
* Cerpa, Busek et al.
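One way to quantify the asymmetric-link observation is to compare the packet reception rate measured in each direction of a link. The measurements and the 25% margin below are invented for illustration, not values from the study.

```python
# Flag a link as asymmetric when its two directions' packet reception
# rates differ by more than a chosen margin.

def asymmetric_links(reception: dict, margin: float = 0.25) -> list:
    """`reception` maps (sender, receiver) -> reception rate in [0, 1];
    returns unordered pairs whose two directions differ by more than `margin`."""
    flagged = []
    for (a, b), rate_ab in reception.items():
        rate_ba = reception.get((b, a))
        if rate_ba is not None and a < b and abs(rate_ab - rate_ba) > margin:
            flagged.append((a, b, rate_ab, rate_ba))
    return flagged

reception = {
    ("n1", "n2"): 0.95, ("n2", "n1"): 0.90,   # roughly symmetric link
    ("n1", "n3"): 0.80, ("n3", "n1"): 0.30,   # asymmetric link
}
print(asymmetric_links(reception))   # [('n1', 'n3', 0.8, 0.3)]
```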
16
Research Challenge: Networked Info-Mechanical Systems (NIMS)*
NIMS architecture: robotic, aerial access to the full 3-D environment; enables sample acquisition
- Coordinated mobility enables self-awareness of sensing uncertainty
- Sensor diversity: diversity in sensing resources, locations, perspectives, topologies; enables reconfiguration to reduce uncertainty and to calibrate
- NIMS infrastructure enables speed and efficiency, provides low-uncertainty mobility, and provides resource transport for a sustainable presence
* Kaiser, Pottie, Estrin, Srivastava, Sukhatme, Villasenor
17
Broadband ad hoc seismic array*
Core requirement: multi-hop time synchronization to eliminate dependence on GPS access at every node
* P. Davis
18
GPS is the usual way to time-sync data collection, but satellites are blocked in some interesting places: under foliage, in canyons, underwater, indoors
Sensor networks can propagate time from nodes that have a sky view to those that don't
Enabling technology: "RBS", a new form of synchronization that exploits the nature of a wireless channel to achieve exceptional precision*
* Elson et al., OSDI 12/02
19
Time Synchronization in Sensor Networks
- Also crucial in many other contexts: ranging, tracking, beamforming, security, MAC, aggregation, etc.
- Global time is not always needed
- NTP is often not accurate or flexible enough; requirements are diverse
- New ideas: local timescales, receiver-receiver sync (sketched below), multi-hop time translation, post-facto sync
- Mote implementation: ~10 μs error on a single hop; error grows slowly over hops
[Diagram: sender-side delays through the NIC and physical medium cancel when two receivers compare their own arrival timestamps ("I saw it at t=4" / "I saw it at t=5"); a multi-hop graph shows time translated across overlapping broadcast domains]
* Elson et al., OSDI 12/02
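A sketch of the receiver-receiver idea behind RBS: because two receivers timestamp the same broadcast, sender-side nondeterminism cancels, and their relative offset can be estimated from the differences of their local arrival times. The timestamps below are invented; a real implementation would also fit clock drift (e.g., by least squares), which this sketch omits.

```python
# Receiver-receiver sync: average the per-broadcast differences between two
# receivers' local arrival timestamps to estimate their clock offset.

def estimate_offset(times_a: list, times_b: list) -> float:
    """Average clock offset (B - A) over paired arrival timestamps of the
    same reference broadcasts."""
    diffs = [b - a for a, b in zip(times_a, times_b)]
    return sum(diffs) / len(diffs)

# Hypothetical local timestamps (seconds) for five reference broadcasts.
a = [4.000010, 5.000020, 6.000012, 7.000018, 8.000015]
b = [5.000011, 6.000019, 7.000014, 8.000017, 9.000016]
offset = estimate_offset(a, b)
print(f"receiver B runs ~{offset:.6f} s ahead of receiver A")
```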
20
Contaminant Transport Monitoring: Palmdale Pivot Study*
- Regulators require proof that the nitrate-laden treated water will not impact groundwater if used for irrigation; monitoring wells cost $75K each
- A vertical array of sensors will measure the rate of diffusion of water and the nitrate levels
- Observed nitrate levels and a local model will trigger and contribute to a field-wide estimate of hazardous nitrate levels
- The field-wide estimate of concentrations and trends is fed back to control sprinkler quantity
* T. Harmon
21
Research Challenge: Distributed Representation, Storage, Processing
- In-network interpretation of spatially distributed data: statistical or model-based filtering (sketched below); in-network "event" detection and reporting; direct queries toward nodes with relevant data; trigger autonomous behavior based on events
- Expensive operations: high-end sensors or sampling; robotic sensing and sampling; support for pattern-triggered data collection
- Multi-resolution data storage and retrieval: index data for easy temporal and spatial searching; spatial and temporal pattern matching; triggers in terms of global statistics (e.g., a distribution); exploit tiered architectures
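As one illustration of statistical or model-based filtering, the sketch below uses an invented running-mean "model" that suppresses unsurprising samples and reports only those the model fails to predict; the same test could serve as a trigger for denser collection. The data, tolerance, and model are illustrative, not from CENS systems.

```python
# Model-based filtering: report only samples that deviate from a running-mean
# model by more than a tolerance; everything else stays in the network.

def filter_stream(samples, tolerance=1.5):
    """Yield only the samples the running-mean model does not predict well."""
    mean = None
    for t, value in samples:
        if mean is None:
            mean = value
            yield (t, value)             # always report the first sample
        elif abs(value - mean) > tolerance:
            yield (t, value)             # surprising: report / trigger collection
        mean = 0.9 * mean + 0.1 * value  # update the running-mean model

stream = [(0, 20.0), (1, 20.2), (2, 20.1), (3, 24.5), (4, 24.7), (5, 20.3)]
print(list(filter_stream(stream)))       # [(0, 20.0), (3, 24.5), (4, 24.7)]
```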
22
Tiered Data Processing*
- Processing uses a two-tiered network: the task is divided into local computation and cluster-head computation
- The scope of local computation depends on the relative cost of local (blue-to-blue) and cluster-head (blue-to-red) communication
- Example (see the sketch below): identify regions over which a large gradient is occurring; locally, large gradients are detected and traversed (up to some scope); gradient paths over a length threshold are identified and reported; each cluster head combines the identification results and classifies
* T. Schoellhammer et al.
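A sketch of the two-tier gradient example under simplifying assumptions (a linear chain of motes and an invented 2 °C threshold): tier-one nodes report only large local gradients, and the cluster head combines those reports into a region classification. This is an illustration of the division of labor, not the cited system.

```python
# Two-tier processing: cheap local gradient checks at each mote, then a
# single cluster-head pass that merges the reports into a region.

GRADIENT_THRESHOLD = 2.0   # hypothetical per-hop temperature gradient (deg C)

def local_reports(readings: dict, neighbors: dict) -> list:
    """Tier 1: each node reports (node, neighbor, gradient) when large."""
    reports = []
    for node, value in readings.items():
        for nbr in neighbors[node]:
            grad = readings[nbr] - value
            if grad > GRADIENT_THRESHOLD:
                reports.append((node, nbr, grad))
    return reports

def cluster_head_classify(reports: list) -> set:
    """Tier 2: cluster head marks the nodes on any large-gradient path."""
    region = set()
    for node, nbr, _ in reports:
        region.update((node, nbr))
    return region

readings = {"n1": 20.0, "n2": 20.5, "n3": 24.0, "n4": 27.0}
neighbors = {"n1": ["n2"], "n2": ["n1", "n3"], "n3": ["n2", "n4"], "n4": ["n3"]}
print(cluster_head_classify(local_reports(readings, neighbors)))  # {'n2','n3','n4'}
```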
23
Research Challenge: Calibration, or lack thereof
- Storage, forwarding, aggregation, and triggering are useless unless data values are calibrated
- Calibration = correcting systematic errors
- Sources of error: noise, systematic bias; causes: manufacturing, environment, age, crud
- Traditional in-factory calibration is not sufficient: it cannot account for the coupling of sensors to their environment
- Nearer term: identify faulty sensors and flag their data, discarding it for in-network processing (sketched below); there is significant concern that faulty sensors can wreak havoc on in-network processing
[Figure: example temperature readings from un-calibrated sensors, factory-calibrated sensors at T0, and factory-calibrated sensors later affected by dust]
* Bychkovskiy, Megerian, Potkonjak
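A sketch of the nearer-term fault-flagging idea: compare each sensor against the median of co-located neighbors and discard readings that deviate too far. The tolerance is arbitrary and the readings only loosely mirror the slide's figure; this is not the cited calibration method.

```python
# Flag sensors whose readings disagree sharply with co-located neighbors so
# their data can be excluded from in-network aggregation.

from statistics import median

def flag_outliers(readings: dict, tolerance: float = 3.0) -> set:
    """Return node ids whose reading deviates from the neighborhood median
    by more than `tolerance` (same units as the readings)."""
    med = median(readings.values())
    return {node for node, value in readings.items() if abs(value - med) > tolerance}

# Illustrative values: two sensors read 85 and 61 deg F among neighbors
# clustered near 70 deg F.
readings = {"s1": 70, "s2": 85, "s3": 69, "s4": 73, "s5": 61, "s6": 72}
print(flag_outliers(readings))   # {'s2', 's5'}
```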
24
Research Challenge: Macroprogramming*
How to specify what, where, and when? Data modality and representation; spatial/temporal resolution; frequency and extent
How to describe the desired processing? Aggregation, interpolation, model parameters; triggering across modalities and nodes; adaptive sampling
Primitives (one is sketched below): annotated topology/resource discovery; region identification and characterization; intra-region coordination/synchronization; system health data and alerts; topology and resources (energy, link, storage); sensor data management (buffering, timing); ...
* Greenstein, Culler, Kohler, ...
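To suggest what a region primitive might look like at the program level, here is a toy region-selection and intra-region aggregation sketch. The `region` and `region_reduce` names and the network state are invented for illustration, not the API of any actual macroprogramming system.

```python
# Write against the whole network as one program: select a region of nodes
# by predicate, then aggregate an attribute across that region.

def region(nodes: dict, predicate) -> dict:
    """Select the nodes whose state satisfies the predicate."""
    return {nid: state for nid, state in nodes.items() if predicate(state)}

def region_reduce(members: dict, key: str, combine):
    """Aggregate one attribute across the region's members."""
    values = [state[key] for state in members.values()]
    return combine(values)

# Hypothetical network state: position and soil-moisture reading per node.
nodes = {
    1: {"x": 0, "soil_moisture": 0.31},
    2: {"x": 5, "soil_moisture": 0.44},
    3: {"x": 9, "soil_moisture": 0.47},
}
wet = region(nodes, lambda s: s["soil_moisture"] > 0.4)
print(sorted(wet), region_reduce(wet, "soil_moisture", lambda v: sum(v) / len(v)))
# [2, 3] 0.455
```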
25
Lessons
- Channel models: simplistic circular channel models can be very deceiving, so experimentation and emulation are critical
- Named data: it is the right model, but it's only a small step toward the bigger problem of macroprogramming
- Duty cycling: critical from the outset, and tricky to get right (granularity; level: application or communication)
- Tiered architectures: one size doesn't fit all, and maybe it doesn't fit any; the distribution of resources (energy, storage, communication, CPU) across the distributed system is an interesting problem
It's all a lot harder, and even more interesting, than it looked 5 years ago
26
Follow up regarding IT aspects
- Embedded Everywhere: A Research Agenda for Networked Systems of Embedded Computers, Computer Science and Telecommunications Board, National Research Council, Washington, D.C., http://www.cstb.org/
- Conferences: ACM SenSys (Nov 03), WSNA (today), IPSN, SNPA (ICC), MobiHoc, MobiCom, MobiSys, SIGCOMM, INFOCOM, SOSP, OSDI, ASPLOS, ICASSP, ...
- Who's involved: active research programs in many CS (networking, databases, systems, theory, languages) and EE (low power, signal processing, communications, information theory) departments; industrial research activities at Intel, PARC, Sun, HP, Agilent, Motorola, ...; startup activity at Crossbow, Sensicast, Dust Inc, Ember, ...
- Related funding programs: DARPA SensIT and NEST; NSF ITR, Sensors and Sensor Networks