Remote Sensing of Optically Shallow Waters: Retrieval of Bathymetry, Bottom Classification, and Water Optical Properties from Hyperspectral Imagery, by Curtis D. Mobley


Remote Sensing of Optically Shallow Waters Retrieval of Bathymetry, Bottom Classification, and Water Optical Properties from Hyperspectral Imagery Curtis D. Mobley Sequoia Scientific, Inc. Bellevue, Washington, USA Work underway with W. Paul Bissett et al., Florida Environmental Research Institute, Tampa, Florida

Overview ● What's the problem? ● How the terrestrial RS people solve a similar problem ● Why the terrestrial solution doesn't work for the ocean ● Airborne hyperspectral remote sensing ● One way to attack the ocean problem via a spectrum-matching and look-up-table (LUT) methodology ● One example analysis ● Error estimates for the retrieved information

The Problem Many areas of the coastal ocean are difficult to map (shallow water, dangerous coral reefs, denied access) and monitor (for detection of storm effects on bathymetry and bottom vegetation; ecosystem changes in water quality or bottom vegetation due to human inputs or climate change) at high spatial and temporal resolution. We need to extract environmental information (bathymetry, bottom type, water-column inherent optical properties) from remote-sensing reflectances R_rs.

Terrestrial Thematic Mapping Generation of maps of vegetation type, land usage, population density, etc. is called "thematic mapping." It has been intensively studied for terrestrial mapping for >30 years (e.g., Landsat multispectral sensors).

A Common Terrestrial Solution (1) Build a library of measured R_rs spectra for various surface types (bare soil, grasslands, forest, pavement, healthy crops, diseased crops, etc.). Simple example: 2 wavelengths and 3 classes. [Figure: scatter plot of R_rs(λ1) vs. R_rs(λ2) showing separate clusters for bare soil, water, and forest]

A Common Terrestrial Solution (2) Compute class mean R_rs spectra and covariance matrices. [Figure: the same R_rs(λ1) vs. R_rs(λ2) scatter with the class mean marked for each of bare soil, water, and forest]

Math Details The mean spectrum for class m, having N_m spectra each with K wavelengths, is (dropping the rs subscript from R_rs)

  μ_m(i) = (1/N_m) Σ_{n=1…N_m} R_n(i),  i = 1, …, K

The K × K covariance matrix for class m is

  Σ_m(i,j) = (1/(N_m − 1)) Σ_{n=1…N_m} [R_n(i) − μ_m(i)] [R_n(j) − μ_m(j)],  i, j = 1, …, K

Σ_m(i,j) tells how R_rs at wavelength i covaries with R_rs at wavelength j [units of 1/sr² since R_rs has units of 1/sr]; Σ_m(i,i) is the variance at wavelength i.
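These class statistics can be sketched in numpy; the array sizes and random spectra below are illustrative assumptions, not a real spectral library:

```python
import numpy as np

# Hypothetical class library: N_m = 50 measured R_rs spectra for one
# class, each with K = 10 wavelengths (rows = spectra, cols = bands).
rng = np.random.default_rng(0)
class_spectra = rng.random((50, 10)) * 0.05   # R_rs values in 1/sr

# Class mean spectrum: mu_m(i) = (1/N_m) * sum_n R_n(i)
mu_m = class_spectra.mean(axis=0)             # shape (K,)

# K x K covariance matrix Sigma_m(i, j); np.cov uses the sample
# normalization 1/(N_m - 1) by default.
sigma_m = np.cov(class_spectra, rowvar=False) # shape (K, K)

# The diagonal Sigma_m(i, i) is the variance at wavelength i.
variances = np.diag(sigma_m)
```

In the classifier described next, these statistics are precomputed once per class.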

A Common Terrestrial Solution (3) Let I be an image R_rs spectrum (i.e., R_rs for a particular image pixel, after good atmospheric correction). In supervised classification, the object is to assign I to one of the predetermined classes of R_rs spectra. One powerful way to do this is maximum likelihood estimation (MLE), which says (trust me, you don't want to see the derivation) that I most likely belongs to the class m having the smallest value of

  D²_MLE(m) = ln|Σ_m| + (I − μ_m)ᵀ Σ_m⁻¹ (I − μ_m)

D²_MLE(m) is the "distance" between I and the mean spectrum for class m (|Σ_m| is the determinant of Σ_m; Σ_m⁻¹ is its inverse). Note: we compare the image spectrum I only with the mean spectrum from each predetermined class; this is very fast since the matrices are precomputed. For the details see Richards, J. A. and X. Jia, Remote Sensing Digital Image Analysis: An Introduction, Fourth Edition. Springer.
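A minimal numpy sketch of the MLE discriminant; the two-class means and covariances are made-up illustrative values, not library statistics:

```python
import numpy as np

def d2_mle(I, mu_m, sigma_m):
    """Gaussian ML discriminant between an image spectrum I and class m:
    D2_MLE(m) = ln|Sigma_m| + (I - mu_m)^T Sigma_m^(-1) (I - mu_m).
    The pixel is assigned to the class with the smallest D2_MLE."""
    diff = I - mu_m
    _, logdet = np.linalg.slogdet(sigma_m)    # numerically stable ln|Sigma_m|
    return logdet + diff @ np.linalg.solve(sigma_m, diff)

# Toy example: 2 wavelengths, 2 classes.
mu = [np.array([0.02, 0.04]), np.array([0.10, 0.08])]
cov = [np.eye(2) * 1e-4, np.eye(2) * 1e-4]
I = np.array([0.09, 0.07])                    # image pixel spectrum
best = min(range(2), key=lambda m: d2_mle(I, mu[m], cov[m]))
# 'best' indexes the most likely class for this pixel
```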

A Common Terrestrial Solution (4) In summary: use predetermined classes of surface types (trees, grass, etc.), use supervised classification, and use MLE to determine the "best fit" of the image spectra to the allowed classes. This works very well for classification of land surfaces, so now let's do the same thing for mapping of optically shallow waters.

Oceanic Thematic Mapping Can I extract bottom depth, bottom type, and water IOPs from imagery of optically shallow waters? Image acquired by NRL-DC as part of the CoBOP program.

MLE Does NOT Work for the Ocean Problem (or, perhaps, I’m just not smart enough to figure out how to make it work) We cannot define simple “sand,” “coral,” etc. bottom classes because of the effects of depth and water column IOPs on the bottom reflectance spectra. Every combination of a bottom type, depth, and water IOPs is an individual class.

Terrestrial vs. Shallow-water Remote Sensing ● Terrestrial thematic mapping retrieves only the surface type. For shallow waters, I need the bottom type AND the water depth AND the water IOPs, which is asking for much more and is thus a much more difficult problem. ● Land surfaces are often very reflective (R = …), so atmospheric correction is less critical (surface-reflected radiance is a bigger part of the measured total radiance). Oceans are generally very dark (R < 0.05), so very good atmospheric correction is required to obtain accurate R_rs. ● I thus need all of the information I can get, e.g., well-calibrated, hyperspectral R_rs.

Hyperspectral Airborne Imagery Hyperspectral imagery acquired from aircraft allows rapid, high-resolution observation of coastal waters. Rapid: imagery over large areas (>100 km²) can be acquired and processed in days. High resolution: ground resolution is ~1 m to a few meters, as desired. Cost: much less than data collection from small boats or diver observations. Hyperspectral: 30 or more bands with 10 nm or better resolution; typically >100 bands with ~5 nm resolution.

The PHILLS Sensor PHILLS: Portable Hyperspectral Imager for Low-Light Spectroscopy (developed in the 1990s by NRL-DC). A pushbroom scanning spectrometer: records calibrated hyperspectral radiances along a line perpendicular to the flight direction. There are several similar systems now in use (CASI, SAMPSON, etc.).

PHILLS Optical Design Camera optics image the ground onto the focal plane; a slit selects the across-track spatial dimension; a prism disperses the light ( nm); a 2D CCD records radiance as a function of across-track position (1024 pixels) and wavelength (128 or 256 bands). Radiance (and remote-sensing reflectance, after atmospheric correction) is built up as a function of (x, y, λ) as the plane flies over the ground scene.

PHILLS in Use

Spectrum Matching and Look-Up-Table R_rs Inversion (Mobley et al., Applied Optics, 44(17), ) [Figure: pixel R_rs extraction → search of a database of R_rs spectra → spectrum match. LUT retrieval for the example pixel: depth 2.75 m; 80% sand, 20% grass; IOP set #17]

Spectrum Matching Is Nothing New Google Scholar: 973 refs with "spectrum matching" (all fields, not just earth remote sensing). Authors in oceanography include Bachmann, Bissett, Davis, Goetz, Hoyer, Lee, Liu, Louchard, Lyzenga, Mobley, Sandidge, to name a few. There are extensive spectral reflectance libraries for terrestrial and manmade surface types, but not for the ocean. I have to build my library of R_rs spectra using HydroLight. Each R_rs spectrum corresponds to a known bottom depth, bottom reflectance (either a pure spectrum or a mixture of various bottom types), and a, b, and b_b spectra. (I currently have ~10⁶ R_rs spectra.)
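The LUT search itself is just a nearest-spectrum lookup. A brute-force numpy sketch, with a random stand-in database rather than actual HydroLight output:

```python
import numpy as np

# Hypothetical LUT: each database spectrum has a known depth, bottom
# class, and IOP set (in the real system these come from HydroLight).
rng = np.random.default_rng(1)
db_rrs = rng.random((1000, 60)) * 0.05        # 1000 spectra, 60 bands
db_depth = rng.uniform(0.5, 15.0, 1000)       # depths in meters
db_bottom = rng.integers(0, 5, 1000)          # bottom-class index

def lut_invert(pixel_rrs, db_rrs):
    """Index of the database spectrum closest to the pixel spectrum
    in the least-squares (Euclidean) sense."""
    d2 = np.sum((db_rrs - pixel_rrs) ** 2, axis=1)
    return int(np.argmin(d2))

pixel = db_rrs[123] + rng.normal(0.0, 1e-4, 60)  # noisy copy of entry 123
i = lut_invert(pixel, db_rrs)
retrieved = (db_depth[i], db_bottom[i])       # retrieved depth and bottom class
```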

Example: PHILLS Horseshoe Reef Image NRL-DC PHILLS image from ONR CoBOP program, May x899 pixels at ~1.3 m resolution. [Figure labels: Horseshoe Reef; ooid sand; mixed sediment, corals, turf algae, seagrass; Lee Stocking Island, Bahamas; dense seagrass]

Unconstrained and Constrained Inversions ● Unconstrained inversions: I know nothing about the environment, so I do a simultaneous retrieval of everything: bathymetry, bottom reflectance and type, and water-column absorption, scatter, and backscatter. ● Constrained inversions: I know the bathymetry and/or water optical properties, so I retrieve only what I don't know. Adding (correct) constraints adds information, so presumably the retrievals of the remaining unknowns will be improved.
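A constrained inversion can be sketched as searching only the subset of the LUT consistent with the known quantity. A minimal numpy sketch; the arrays, sizes, and depth tolerance are illustrative assumptions:

```python
import numpy as np

# Stand-in LUT (not actual HydroLight output).
rng = np.random.default_rng(2)
db_rrs = rng.random((1000, 60)) * 0.05        # R_rs spectra
db_depth = rng.uniform(0.5, 15.0, 1000)       # depths in meters

# Depth-constrained search: keep only database entries near a depth
# known from, e.g., an acoustic survey.
known_depth = 6.0                              # m (assumed known)
mask = np.abs(db_depth - known_depth) < 0.25   # tolerance is an assumption
subset = np.flatnonzero(mask)                  # allowed database indices

pixel = db_rrs[subset[0]]                      # an example pixel spectrum
d2 = np.sum((db_rrs[subset] - pixel) ** 2, axis=1)
best = subset[np.argmin(d2)]                   # best match among allowed depths
```

Fewer candidate spectra per pixel also means a faster search.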

Unconstrained Bathymetry Retrieval Black: NRL acoustic survey for ONR CoBOP program. Color: LUT unconstrained depth retrieval. Acoustic bathymetry coverage is a few meters along track and ~10 m cross-track resolution; interpolated to pixel level for depth-constrained retrievals.

Unconstrained LUT vs. Acoustic Bathymetry These "LUT errors" also include errors due to latitude-longitude calculations in mapping acoustic ping locations to image pixels (horizontal errors of several meters or more due to failure of the built-in navigation instrument).

Unconstrained LUT Bottom Classification [Photo: dense seagrass over a sand substrate]

Depth-Constrained Bottom Classification [Figure: depth-constrained inversion vs. unconstrained inversion] Some sand substrate with sparse vegetation changed to pure sediments.

Depth-Constrained Bottom Classification [Figure: depth-constrained inversion vs. unconstrained inversion] Some dense vegetation changed to pure corals or mixtures.

IOP-Constrained Retrievals Dots and squares: two sets of ac9 data from the Horseshoe Reef area. Lines: similar a and b from the LUT IOP database; the four backscatter curves have particle backscatter fractions of 0.01, 0.02, 0.03, and 0.04. To constrain the IOPs, assume that a and b are constant over the image area (probably wrong: CDOM decreases going offshore, and resuspended sediment is likely higher near shore).

IOP-Constrained Bathymetry [Figure: IOP-constrained inversion vs. unconstrained inversion]

IOP-Constrained Bathymetry [Figure: IOP-constrained inversion vs. unconstrained inversion] Constraining the IOPs gave slightly greater depths on average but did not greatly improve the bathymetry retrieval.

Depth- and IOP-Constrained Bottom Classification for Horseshoe Reef Divers laid down transects along the reef (within the polygon) and measured the bottom coverage via photography (very laborious)

Table columns: retrieval database | bare sand | dark sediment or sand and sparse grass | sediment mixed with grass, turf, macrophytes | pure coral | sediment mixed w/ coral and algae | depth errors (% error / rms err / % in ±1 m / % in ±25%).

7 (a,b) × 4 B_p; unconstrained depths: depth errors /1.16/66/92
4 (a,b) × 4 B_p; unconstrained depths: depth errors /1.20/65/90
4 (a,b), B_p = 0.02; unconstrained depths: depth errors /1.24/65/90
1 (a,b), B_p = 0.02; unconstrained depths: depth errors /1.39/61/86
7 (a,b) × 4 B_p; constrained depths: depth errors NA
4 (a,b) × 4 B_p; constrained depths: depth errors NA
4 (a,b), B_p = 0.02; constrained depths: depth errors NA
1 (a,b), B_p = 0.02; constrained depths: depth errors NA
measured (along diver transects): ~2.5 hardpan; ~14 turf + algae

from Lesser & Mobley, Coral Reefs (accepted)

kNN Error Analysis The previous figures were generated using the one "best fit" or "closest" database R_rs for each pixel (smallest Euclidean distance: D² = Σ_j [R_rs,db(λ_j) − R_rs,im(λ_j)]²). Many spectra in the database are very similar and correspond to slightly different environmental conditions (depths, bottom reflectances, IOPs). Noise in the image R_rs may cause different database spectra to be the closest match, and thus give different retrievals. Rather than using just the closest match, find the k closest matching database spectra (k Nearest Neighbors, kNN) and "vote" on the retrieval.
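The kNN step can be sketched in numpy; the stand-in database is an assumption, and k = 5 here just keeps the example small:

```python
import numpy as np

def knn_matches(pixel_rrs, db_rrs, k=50):
    """Indices of the k database spectra closest to the pixel spectrum,
    using D2 = sum_j [Rrs_db(lambda_j) - Rrs_im(lambda_j)]^2."""
    d2 = np.sum((db_rrs - pixel_rrs) ** 2, axis=1)
    return np.argsort(d2)[:k]                 # k smallest distances first

rng = np.random.default_rng(3)
db = rng.random((500, 30)) * 0.05             # stand-in R_rs database
pixel = db[7] + rng.normal(0.0, 1e-4, 30)     # noisy copy of entry 7
idx = knn_matches(pixel, db, k=5)             # idx[0] should be entry 7
```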

kNN Error Analysis [Figure: PHILLS R_rs (blue); best-fit (k = 1) database spectrum (red); k = 50 closest database spectra (green)]

kNN Error Analysis [Figure: depth distribution for the k = 50 closest spectra (red); best-fit Gaussian depth distribution (equal area; blue)]

kNN Error Analysis Best-fit (k = 1) retrieval: depth z_b = 6.00 m; bottom type = pure sea grass; IOPs = database set 49. k = 50 NN retrievals: depth: Gaussian fit: mean z_b = 5.89 m, std dev = 0.42 m; bottoms: 44 pure sea grass, 4 were 10% sand + 90% grass, 1 turf algae, 1 sargassum --> bottom is dense sea grass with good confidence; IOPs: 16 were set 42, 8 set 43, 15 set 49, and 11 spread over 4 other sets --> IOPs were probably close to database IOP sets 42, 43, or 49 (all of which are similar). Ground truth: depth = 5.78 m (acoustic); bottom type: dense sea grass (visual); IOPs were not measured at this pixel and time, but the retrievals are consistent with IOPs measured in this area.
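The voting and Gaussian summary can be sketched as follows; the neighbor depths and bottom labels are illustrative values, not the actual retrievals:

```python
import numpy as np
from collections import Counter

# Hypothetical retrievals from the k nearest database matches for one pixel.
neighbor_depths = np.array([5.9, 6.1, 5.8, 6.0, 5.7])        # meters
neighbor_bottoms = ["grass", "grass", "grass", "sand+grass", "grass"]

depth_mean = neighbor_depths.mean()           # retrieved depth
depth_std = neighbor_depths.std(ddof=1)       # quantitative error bar on depth
bottom_vote = Counter(neighbor_bottoms).most_common(1)[0][0]
# bottom_vote is "grass": 4 of the 5 neighbors agree on the bottom type
```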

kNN Error Analysis On average, using statistical estimation based on k ~ 50 NN gives ~ 25% reduction in rms depth errors (preliminary finding) kNN analysis also gives quantitative estimates of the errors in the retrievals, which is VERY IMPORTANT

Conclusions (1) For this image: Unconstrained retrievals of depth, bottom reflectance, and water-column IOPs are consistent with available ground truth. Constraining the bathymetry does not greatly change the bottom classification and IOP retrievals. Why not? Because the unconstrained bathymetry was already close to correct. Constraining the IOPs does not greatly change the retrieved bathymetry and bottom classification, because the unconstrained IOPs were already close to correct. Constraining both bathymetry and IOPs does not greatly change the retrieved bottom classification (ditto). This indicates that non-uniqueness was not a problem: the LUT did not find combinations of wrong depth, wrong bottom reflectance, and wrong IOPs that gave almost the same R_rs spectrum as the correct solution.

Conclusions (2) Adding constraints does, however, greatly improve the image processing time, because less of the LUT R_rs database needs to be searched for each pixel. For the Horseshoe Reef image (on a 2 GHz PC): unconstrained inversion: 71 minutes (>10¹⁰ R_rs comparisons); depth-constrained inversion: 25 min; IOP-constrained inversion: 27 min; depth- and IOP-constrained inversion: 3.5 min. Statistical estimation techniques based on kNN matching can provide quantitative error estimates on the retrieved information.

What is the Future of Remote Sensing? Ocean color remote sensing has a long history of multispectral satellite imagers (CZCS, SeaWiFS, MODIS, and, let us pray for success, NPOESS/VIIRS, and others from other countries). Hyperspectral sensors so far have been placed only on aircraft, and hyperspectral imagery is proving useful for RS of optically shallow waters and coastal waters (a very active research area in many countries). Why no hyperspectral sensors on satellites? It's technologically risky; it's politically risky; and the advantage of HS over MS has not been convincingly presented. IMHO, hyperspectral is not just hype. The future of ocean RS lies in hyperspectral sensors and new classes of retrieval algorithms that make use of both spectral magnitude and shape (neural networks, spectrum matching, etc.). And don't forget polarization….

Go forth now into your optical oceanography careers with great self confidence!!