Maa-57.2040 Kaukokartoituksen yleiskurssi General Remote Sensing Image restoration Autumn 2007 Markus Törmä

Digital image processing The image is manipulated using a computer: image → mathematical operation → new image Application areas: –Image restoration –Image enhancement –Image interpretation / classification

Image restoration Errors due to the imaging process are removed Geometric errors –the position of an image pixel is not correct when compared to the ground Radiometric errors –the measured radiation does not correspond to the radiation leaving the ground The aim is to form a faultless image of the scene

Image enhancement The image is made better suited for interpretation Different objects will be seen better → manipulation of image contrast and colors Different features (e.g. linear features) will be seen better → e.g. filtering methods Multispectral images: combination of image channels to compress and enhance information –ratio images –image transformations Necessary information is emphasized, unnecessary information is removed

Digital image processing Analog signal: –the phenomenon is described or measured continuously in time or space Digital signal: –the analog signal is sampled at some interval A 2-dimensional digital signal = a digital image

Digital image processing Image function f(x,y): a function of the spatial coordinates x and y The value of the function at position (x,y) corresponds to the brightness of the image at that position

Digital image processing A digital image consists of individual picture elements, pixels, which form a discrete lattice in the spatial domain (x,y) A digital image can also be represented using sine waves with different frequencies and amplitudes –called the frequency domain (u,v) –the Fourier transform is used to determine the frequencies Manipulation of a digital image: –Image (spatial) domain: pixel values are modified directly in the image using some algorithm –Frequency domain: the frequencies and amplitudes of the sine waves are modified
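
A minimal sketch of the two manipulation routes on a synthetic image, assuming NumPy: in the image domain the pixel values are changed directly, while in the frequency domain the amplitudes of the sine-wave components obtained with the 2-D Fourier transform are changed and the image is transformed back.

```python
import numpy as np

img = np.random.random((128, 128))

# Image (spatial) domain: modify pixel values directly, e.g. a contrast stretch
stretched = (img - img.min()) / (img.max() - img.min()) * 255.0

# Frequency domain: suppress the highest-frequency sine waves and transform back
F = np.fft.fftshift(np.fft.fft2(img))                      # amplitudes at frequencies (u, v)
u, v = np.meshgrid(np.arange(-64, 64), np.arange(-64, 64), indexing="ij")
F[np.hypot(u, v) > 40] = 0.0                               # remove high frequencies
smoothed = np.real(np.fft.ifft2(np.fft.ifftshift(F)))
```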

Sources of error Movement of the imaging platform: changes in height or speed, attitude of the satellite –the image provider should correct these Instrument: scanning or measurement principle, failures of the sensors, production methods or accuracy of the instrument, calibration of the instrument

Sources of error Atmosphere: attenuation of radiation and decrease of contrast, the image is blurred –difficult to correct Object: curvature of the Earth, rotation of the Earth, topography –it is possible to correct these quite well

Geometric correction The position of an image pixel is not correct when compared to the ground Errors due to the instrument, the movement of the imaging platform and the object are removed Known errors in geometry: –Earth curvature and rotation –Topography –Imaging geometry Rectification to a map projection –Geometric transformation –Interpolation of pixel digital numbers

Geometric correction Geometric correction is made –automatically using orbital parameters or –manually using ground control points Alternatives –orbital parameters –ground control points –orthocorrection The most accurate results are obtained by combining all of these

Raw image data It can be difficult to recognize ground features from raw image data, because they do not necessarily look the same as they do in nature

Raw image vs. corrected image

Geometric correction using orbital parameters Information about –the position of the satellite (XYZ) –its attitude (rotation angles) Correction can be of low quality due to poor orbital information Knowledge about the scanning geometry, the movement of the object and the topography (DEM) will increase accuracy

Geometric correction The accuracy of the correction depends on the quality of the information used Some examples of accuracy using orbital parameters: –NOAA AVHRR: 5 km –Spot 1-4: 350 m –Spot 5: 50 m Ground control points: accuracy should be better than 1 pixel –depends also on the mathematical model and topographic variations Orthocorrection is the most accurate

Geometric correction Manual correction: Ground control points (GCPs): –image coordinates are measured from the image –map coordinates from a map or a georeferenced image

Geometric correction GCPs are known and well-distinguished ground features –crossroads, buildings, small lakes, small islands, features along the waterline More GCPs are better When the transformation between the image coordinate system and the map coordinate system is defined using polynomials, the minimum number of points is: –1st degree polynomial: 3 points –2nd degree polynomial: 6 points
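
A minimal sketch of fitting a 1st-degree (affine) polynomial transformation from GCPs by least squares, assuming NumPy; the GCP coordinates below are hypothetical example values, and at least 3 points are needed for this fit (6 for a 2nd-degree polynomial).

```python
import numpy as np

# image coordinates (x, y) and the corresponding map coordinates (E, N) of the GCPs
img = np.array([[10., 20.], [500., 40.], [250., 480.], [30., 450.]])
map_ = np.array([[350010., 6700020.], [350500., 6700040.],
                 [350250., 6700480.], [350030., 6700450.]])

A = np.column_stack([np.ones(len(img)), img[:, 0], img[:, 1]])   # terms 1, x, y
coef_E, *_ = np.linalg.lstsq(A, map_[:, 0], rcond=None)          # E = a0 + a1*x + a2*y
coef_N, *_ = np.linalg.lstsq(A, map_[:, 1], rcond=None)          # N = b0 + b1*x + b2*y

def image_to_map(x, y):
    """Transform image coordinates to map coordinates with the fitted polynomial."""
    return (coef_E[0] + coef_E[1] * x + coef_E[2] * y,
            coef_N[0] + coef_N[1] * x + coef_N[2] * y)
```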

Example Erdas Imagine An old Landsat TM image is georeferenced to the same map coordinate system as a Landsat ETM image

Example 2nd degree polynomial 15 GCPs

Automatic correction The software searches for corresponding points in the image to be georeferenced and in an image that is already in the map coordinate system This can be based on –correlation between subimages –recognizable features (linear features like roads, or lakes) These points are used as GCPs The software produces many candidate points for a Landsat ETM image –the user has to select which ones can be used Automation is needed when there are many images and/or the correction has to be made daily

Topographic error A property of imaging systems that use a central projection The image of an object is in an incorrect place due to height variations

Orthocorrection The topographic error is removed by changing the perspective of the image from a central projection to an orthogonal projection A DEM is needed

Interpolation of digital numbers The digital numbers for the pixels of the corrected image must be interpolated from the uncorrected image Only seldom can a number from the uncorrected image be used directly Methods –nearest neighbor interpolation –bilinear interpolation –cubic convolution interpolation

Nearest neighbor interpolation Take the value of the closest pixel in the uncorrected image Easy to compute Values do not change The result can be inaccurate (figure: corrected image, original image)

Nearest neighbor interpolation The values of some pixels are chosen more than once, some not at all ”Piecewise” (blocky) image Linear features can disappear (figure: corrected image, original image)

Bilinear interpolation The 4 closest pixels in the uncorrected image are used Average weighted by distance Changes the digital numbers –corresponds to average filtering (figure: corrected image, original image)

Cubic convolution The 16 (4x4) closest pixels in the uncorrected image are used Smaller interpolation error than NN or BL interpolation (figure: corrected image, original image)
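
A minimal sketch of the three resampling methods, assuming NumPy and SciPy: the corrected grid is filled by sampling the uncorrected image with scipy.ndimage.map_coordinates, where order 0 is nearest neighbor, order 1 is bilinear, and order 3 is a cubic spline (used here as a stand-in for cubic convolution). The geometric transformation below is a hypothetical rotation plus shift.

```python
import numpy as np
from scipy.ndimage import map_coordinates

uncorrected = np.random.randint(0, 256, (400, 400)).astype(float)

# source (row, col) in the uncorrected image for every pixel of the corrected grid
rows, cols = np.meshgrid(np.arange(300), np.arange(300), indexing="ij")
theta = np.deg2rad(5.0)
src_r = np.cos(theta) * rows - np.sin(theta) * cols + 30.0
src_c = np.sin(theta) * rows + np.cos(theta) * cols + 10.0
coords = np.array([src_r, src_c])

nearest  = map_coordinates(uncorrected, coords, order=0)   # nearest neighbor
bilinear = map_coordinates(uncorrected, coords, order=1)   # bilinear
cubic    = map_coordinates(uncorrected, coords, order=3)   # cubic (spline)
```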

(Figure: original image, bilinear, cubic, nearest neighbor)

Image formation Scene f(x) Acquired image g(x) The scene f(x) is corrupted by the atmosphere and the instrument in the imaging process They act like a filter h(x)

Image formation Image acquisition can be modelled with the image degradation model: f(x) * h(x) + n(x) = g(x), where * denotes convolution g(x): acquired image h(x): filter corresponding to the averaging effects of the atmosphere and the instrument n(x): random errors due to the instrument and data transmission f(x): scene

Inverse filtering An error-free image of the scene f(x) should be recovered by applying the inverse process to the known image g(x) Image degradation model in the frequency domain: G(u) = F(u)H(u) + N(u) Ideal inverse filtering: Fe(u) = G(u)/H(u) - N(u)/H(u) In practice difficult to solve –zeros in H(u) –the effect of N(u) increases Usually the radiometric correction is divided into different phases, which are corrected individually
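
A minimal sketch of naive inverse filtering in the frequency domain, assuming NumPy; the 5x5 averaging filter and the noise level are hypothetical, and a small threshold guards against the near-zero values of H(u) mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.random((256, 256))                       # the scene f(x)
h = np.zeros((256, 256)); h[:5, :5] = 1.0 / 25   # 5x5 averaging filter h(x)
H = np.fft.fft2(h)

# simulate acquisition: g = f * h + n  (convolution done in the frequency domain)
g = np.real(np.fft.ifft2(np.fft.fft2(f) * H)) + 0.01 * rng.standard_normal((256, 256))

# inverse filtering: Fe(u) = G(u) / H(u); the noise term N(u)/H(u) remains amplified
G = np.fft.fft2(g)
H_safe = np.where(np.abs(H) < 1e-3, 1e-3, H)     # avoid division by (near) zero
f_estimate = np.real(np.fft.ifft2(G / H_safe))
```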

Radiometric correction In order that measurements taken with –different instruments –on different dates are comparable Aim: radiance or reflectance Radiance (W/m²/sr): –a physical quantity which describes the intensity of the radiation leaving the ground in some direction Reflectance: –radiance / incoming irradiance

Instrument calibration Instruments are calibrated before the satellite launch –measurements of known targets –the instrument response is monitored by measuring calibration targets inside the instrument or stable targets on the ground Each channel has the calibration coefficients GAIN and OFFSET –these can vary over time –the response decreases, so the same target looks darker

Instrument gain The pixel digital number is multiplied by the gain: radiance = DN * Gain Gain = (Lmax – Lmin) / 255 –Lmax: the largest radiance which can be measured by the instrument –Lmin: the smallest radiance which can be measured by the instrument

Instrument offset Background noise detected by the instrument The measurement when the instrument does not receive any radiation = Lmin

Radiometric correction Equation: R = (Lmax - Lmin)/255 * DN + Lmin, or R = Gain * DN + Offset
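
A minimal sketch of the DN-to-radiance conversion above, assuming NumPy and 8-bit data; the Lmin and Lmax values below are hypothetical channel calibration constants, not values from the lecture.

```python
import numpy as np

def dn_to_radiance(dn, lmin, lmax):
    """R = (Lmax - Lmin) / 255 * DN + Lmin, i.e. R = Gain * DN + Offset."""
    gain = (lmax - lmin) / 255.0
    return gain * dn.astype(float) + lmin

dn = np.random.randint(0, 256, (100, 100))        # stand-in for an image channel
radiance = dn_to_radiance(dn, lmin=-1.5, lmax=193.0)
```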

Other corrections Sun zenith angle: DN' = DN / sin(Sun elevation angle) = DN / cos(Sun zenith angle) The ground is illuminated differently when the Sun zenith angle changes –seasonal differences

Other corrections: Distance between the Sun and the Earth This distance varies according to the seasons The irradiance incoming to the ground, taking the Sun zenith angle into account, is E / d² · cos(sz), where E is the solar irradiance outside the atmosphere, d is the Sun–Earth distance in astronomical units and sz is the Sun zenith angle
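
A minimal sketch of these two normalisations, assuming NumPy, with d in astronomical units and the Sun angles in degrees; the numerical values in the example call are illustrative only.

```python
import numpy as np

def irradiance_at_ground(e_exo, d_au, sz_deg):
    """E / d^2 * cos(sz): exoatmospheric irradiance scaled by Sun-Earth distance and Sun angle."""
    return e_exo / d_au**2 * np.cos(np.deg2rad(sz_deg))

def sun_angle_correction(dn, sz_deg):
    """DN' = DN / cos(sz), i.e. division by the sine of the Sun elevation angle."""
    return dn / np.cos(np.deg2rad(sz_deg))

print(irradiance_at_ground(e_exo=1970.0, d_au=1.016, sz_deg=55.0))
```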

Atmospheric correction Absorption and scattering Diffuse skylight –due to atmospheric scattering –largest at shorter wavelengths (blue), decreases as the wavelength increases –dampens image contrast

Atmospheric correction REF: reflectance of the pixel Lsat: radiance measured by the instrument Lhaze: radiance due to atmospheric scattering (diffuse skylight) TAUv: atmospheric transmittance from the ground to the instrument E0: Sun spectral irradiance outside the atmosphere, including the effect of the distance between the Sun and the Earth: E0 = E / d², where E is the Sun spectral irradiance outside the atmosphere and d is the Sun–Earth distance in astronomical units sz: Sun zenith angle TAUz: atmospheric transmittance from the Sun to the ground Edown: downwelling irradiance at the ground due to atmospheric scattering
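
A minimal sketch of a radiance-to-reflectance conversion consistent with the variables above; the equation itself is not reproduced in the transcript, so the commonly used COST/DOS-type form is assumed here: REF = π (Lsat − Lhaze) / [ TAUv ( E0 cos(sz) TAUz + Edown ) ]. The parameter choices of the following slides (apparent reflectance, dark-object subtraction, Chavez) plug directly into this function.

```python
import numpy as np

def radiance_to_reflectance(l_sat, l_haze, tau_v, tau_z, e0, e_down, sz_deg):
    """REF = pi*(Lsat - Lhaze) / (TAUv*(E0*cos(sz)*TAUz + Edown)); E0 already includes d (E0 = E/d^2)."""
    sz = np.deg2rad(sz_deg)
    return np.pi * (l_sat - l_haze) / (tau_v * (e0 * np.cos(sz) * tau_z + e_down))

# Apparent reflectance model: TAUz = 1.0, TAUv = 1.0, Edown = 0.0, Lhaze = 0.0
# Dark-object subtraction:    as above, but Lhaze measured from a dark/shadowed target
# Chavez (modified DOS):      TAUz = cos(sz), TAUv = 1.0 or cos(view angle)
```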

Atmospheric correction Apparent reflectance model Removes the effects of the changing Sun–Earth distance and Sun zenith angle (changing imaging geometry) Does not remove atmospheric effects: absorption or scattering Parameters for the correction model: TAUz: 1.0, TAUv: 1.0, Edown: 0.0, Lhaze: 0.0

Atmospheric correction DARK-OBJECT SUBTRACTION It is assumed that there are areas in the image that are in shadow, so that all radiation reaching the instrument from these areas is due to diffuse skylight Parameters for the correction model: TAUz: 1.0 TAUv: 1.0 Edown: 0.0 Lhaze: radiance measured from a target which is in shadow (like the shadow of a cloud) or does not reflect radiation (water in the infrared)

Atmospheric correction CHAVEZ (modified DOS) Atmospheric transmittances are approximated by the angles of the imaging geometry Parameters for the correction model: TAUz: cos(sz) TAUv: 1.0 or cos(incidence angle) Edown: 0.0 Lhaze: radiance measured from a target which is in shadow (like the shadow of a cloud) or does not reflect radiation (water in the infrared)

Atmospheric correction More theoretical methods try to model how radiation travels in the atmosphere The aim is to model –atmospheric transmittance and absorption –scattering due to gases –reflectances due to the atmosphere, not the ground Difficult; requires a lot of computing and precise knowledge about the state of the atmosphere

VTT atmospheric correction SMAC: Simplified Method for Atmospheric Correction Removes Rayleigh scattering, absorption due to atmospheric gases and the effects of the imaging geometry (Sun angles and distance) Digital number → reflectance Needed: calibration coefficients of the instrument, Sun zenith and azimuth angles, amount of water vapour, ozone, atmospheric optical depth

VTT atmospheric correction Original Landsat ETM images The atmospheric optical depth can be estimated from the image if there are suitable channels The ETM7/ETM3 ratio should be about 0.4 for old coniferous forests

VTT atmospheric correction Corrected Landsat ETM mosaic Eastern Finland 7 images RGB: 321

VTT atmospheric correction The Landsat ETM mosaic of Northern Finland consists of 9 images

Dehazing image Based on the Tasselled Cap transformation –the TC4 image is sensitive to atmospheric effects Landsat-5 TM image: TC4 = c1*TM1 + c2*TM2 + c3*TM3 + c4*TM4 + c5*TM5 + c7*TM7 (Tasselled Cap haze-component coefficients) An image channel is corrected by subtracting TC4 from it: TMcx = TMx - (TC4 - TC40)*Ax TMcx: corrected digital number of channel x TMx: original digital number of channel x TC40: value of a haze-free pixel in TC4 Ax: correction factor determined from the image
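
A minimal sketch of this dehazing step, assuming NumPy; the Tasselled Cap coefficients and the per-channel correction factors Ax are left as inputs because their numerical values are not given here.

```python
import numpy as np

def dehaze(channels, tc4_coeffs, tc4_clear, a_factors):
    """channels / tc4_coeffs / a_factors: dicts keyed by channel name, e.g. 'TM1'...'TM7';
    tc4_clear: TC4 value of a haze-free pixel (TC4_0)."""
    tc4 = sum(tc4_coeffs[name] * channels[name].astype(float) for name in tc4_coeffs)
    return {name: channels[name].astype(float) - (tc4 - tc4_clear) * a_factors[name]
            for name in channels}
```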

Dehazing image Original and corrected TM1

Dehazing image Original and corrected TM2

Dehazing image Original and corrected TM3

Dehazing image Original and corrected TM4

Dehazing image Original and corrected TM5

Dehazing image Original and corrected TM7

Clouds and their shadows It is difficult to automatically remove clouds in the visible and infrared regions –thermal infrared helps –thick clouds are easy, thin clouds and haze are difficult Removal by masking –cloud interpretation –mark their area as ”Nodata” or 0 Problem: shadows –mixed with water areas –difficult to automate A relatively simple way: –interpret clouds using thresholding or clustering –make a cloud mask (1: cloud, 0: other) –dilate the mask –move the mask so that it also covers the shadows
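
A minimal sketch of that simple recipe, assuming NumPy and SciPy: threshold a bright band to detect clouds, dilate the mask, and shift a copy of it away from the Sun azimuth so that it also covers the shadows. The threshold value and the shadow shift below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import binary_dilation, shift

def cloud_and_shadow_mask(band, threshold=200, dilate_iterations=3, shadow_shift=(15, -10)):
    cloud = band > threshold                                   # 1: cloud, 0: other
    cloud = binary_dilation(cloud, iterations=dilate_iterations)
    shadow = shift(cloud.astype(float), shadow_shift, order=0) > 0.5
    return cloud | shadow                                      # mark these pixels as Nodata

band = np.random.randint(0, 256, (500, 500))
mask = cloud_and_shadow_mask(band)
```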

Topographic correction The imaging geometry (positions and angles between the instrument, the object and the radiation source) changes locally E.g. a deciduous forest on the sunny side of a hill, or on the same side as the instrument, looks brighter than a similar forest on the shadowed side Reflectance is highest when the slope is perpendicular to the incoming radiation

Topographic correction The correction of image pixel values is based on topographic variations; other characteristics are not taken into account Determine the angle i between the incoming radiation and the local surface normal → illumination image cos(i)
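
A minimal sketch of computing the illumination image cos(i) from a DEM, assuming NumPy and the standard relation cos(i) = cos(sz)·cos(slope) + sin(sz)·sin(slope)·cos(sun_azimuth − aspect); the slope/aspect sign conventions depend on the DEM orientation, so this is an assumption rather than the lecture's exact recipe.

```python
import numpy as np

def illumination_image(dem, pixel_size, sun_zenith_deg, sun_azimuth_deg):
    dz_dy, dz_dx = np.gradient(dem.astype(float), pixel_size)   # slope components
    slope = np.arctan(np.hypot(dz_dx, dz_dy))
    aspect = np.arctan2(-dz_dx, dz_dy)                          # one common aspect convention
    sz = np.deg2rad(sun_zenith_deg)
    az = np.deg2rad(sun_azimuth_deg)
    return (np.cos(sz) * np.cos(slope)
            + np.sin(sz) * np.sin(slope) * np.cos(az - aspect))
```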

Topographic correction Landsat ETM (RGB: 743) and DEM from NLS

Topographic correction Landsat ETM (RGB: 743) and illumination image

Topographic correction Variables of the equations: LO: original reflectance LC: corrected reflectance sz: Sun zenith angle i: angle between the incoming radiation and the local surface normal k: Minnaert coefficient, estimated from the image m: slope of the regression line between the illumination image and the image pixels, estimated from the image b: offset of the regression line between the illumination image and the image pixels, estimated from the image C: correction factor of the C-correction, C = b/m

Topographic correction Lambert cosine correction: it is assumed that the ground reflects radiation as a Lambertian surface, in other words the same amount in all directions LC = LO cos(sz) / cos(i)

Topographic correction Minnaert correction: the coefficient k is used to model the effect of different surfaces LC = LO [ cos(sz) / cos(i) ]^k

Topographic correction Ekstrand correction: the Minnaert coefficient k varies according to the illumination LC = LO [ cos(sz) / cos(i) ]^(k cos(i))

Topographic correction Statistical-empirical correction: the correction removes the correlation between the illumination image and the image channel LC = LO - m cos(i)

Topographic correction C-correction: the coefficient C should model the diffuse light LC = LO [ ( cos(sz) + C ) / ( cos(i) + C ) ]
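
A minimal sketch of the correction formulas above, assuming NumPy, applied per channel with a given illumination image cos_i; k, m and b would in practice be estimated from the image by regression, here they are plain inputs.

```python
import numpy as np

def lambert_cosine(l_o, cos_i, sz_deg):
    """LC = LO cos(sz) / cos(i)"""
    return l_o * np.cos(np.deg2rad(sz_deg)) / cos_i

def minnaert(l_o, cos_i, sz_deg, k):
    """LC = LO [cos(sz) / cos(i)]^k"""
    return l_o * (np.cos(np.deg2rad(sz_deg)) / cos_i) ** k

def c_correction(l_o, cos_i, sz_deg, m, b):
    """LC = LO [(cos(sz) + C) / (cos(i) + C)], with C = b/m"""
    c = b / m
    return l_o * (np.cos(np.deg2rad(sz_deg)) + c) / (cos_i + c)
```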

ATCOR 2/3 Developed by DLR (German Aerospace Center) –ATCOR 2/3 tries to remove components 1 and 3: 1. path radiance: radiation scattered by the atmosphere 2. reflected radiation from the viewed pixel 3. radiation reflected by the neighborhood and scattered into the view direction (adjacency effect) –Only component 2 contains information from the viewed pixel.

ATCOR 2/3 Atmospheric database Atmospheres with different vertical profiles of pressure, air temperature, humidity and ozone content Various aerosol types (rural, urban, maritime, desert) Visibilities (km) from hazy to very clear Ground elevations (km), extrapolated for higher regions Solar zenith angles (degrees) Tilt geometries with tilt angles up to 50 degrees are supported A discrete set of relative azimuth angles (increment 30 deg.) is provided for the atmospheric LUTs of tilt sensors; interpolation is applied if necessary Capability for mixing of atmospheres (water vapor, aerosol) The sensor-specific database is available for the standard multispectral sensors, i.e. Landsat TM, ETM+, SPOT, IRS-P6, etc.

ATCOR 2 Atmospheric correction for flat terrain SPECTRA module to determine the atmospheric parameters (aerosol type, visibility, water vapor) –compare retrieved scene reflectance spectra of various surface covers with library spectra –tune the correction so that the retrieved spectra resemble the library spectra Constant atmospheric conditions or spatially varying aerosol conditions Retrieval of the atmospheric water vapor column for sensors with water vapor bands (around 940/1130 nm). Example sensors: MOS-B, Hyperion. Statistical haze removal: a fully automatic algorithm that masks haze and cloud regions and removes haze over land areas.

ATCOR 2 De-shadowing of cloud or building shadow areas. Automatic classification of spectral surface reflectance (program SPECL2) using 10 surface cover templates –not a land use classification, but a reflectance-shape classification Surface emissivity and surface (brightness) temperature maps for thermal band sensors. Value added products: –vegetation index SAVI –LAI, FPAR –wavelength-integrated albedo –absorbed solar radiation flux –surface energy fluxes for thermal band sensors: net radiation, ground heat flux, latent heat, sensible heat flux

ATCOR 2 examples of haze removal IRS-1C Liss-3 scene recorded 25 June 1998 (Craux, France) Ikonos scene of Dresden, 18 August 2002

ATCOR 2 Dresden, the SPOT-3 sensor (22 April 1995) Left: surface reflectance after atmospheric correction (RGB = SPOT bands 3/2/1, NIR/Red/Green) Right: the results of the automatic spectral reflectance classification Color coding of the classification map: - dark to bright green: different vegetation covers - blue: water - brown: bare soil - grey: asphalt, dark sand/soil - white: bright sand/soil - red: mixed vegetation/soil - yellow: sunflower, rape, when blooming

ATCOR 3 Correction for mountainous regions The scene has to be ortho-rectified Processing eliminates the atmospheric / topographic effects and generates surface data (reflectance, temperature) corresponding to a flat terrain Same features as ATCOR 2, plus –Quick topographic correction (without atmospheric correction) –SKYVIEW: sky view factor calculation with a ray tracing program to determine the proportion of the sky hemisphere visible for each pixel of the terrain –SHADOW: cast shadow calculation depending on the solar zenith and azimuth angles, employing a ray tracing program

ATCOR 3 Components: 1. path radiance: radiation scattered by the atmosphere (photons without ground contact) 2. reflected radiation from the viewed pixel 3. adjacency radiation: ground-reflected radiation from the neighborhood, scattered into the view direction 4. terrain radiation reflected to the pixel (from opposite hills, according to the terrain view factor) Only component 2 contains information from the viewed pixel.

ATCOR 3 Top left: DEM (m a.s.l.) Top right: illumination image Bottom left: original TM data (bands 3/2/1 coded RGB) Bottom right: surface reflectance image after combined atmospheric and topographic correction Areas that appear dark in the original image due to low solar illumination are raised to the proper reflectance level in the corrected image Walchensee lake and the surrounding mountains in the Bavarian Alps TM scene acquired 28 July 1988

ATCOR 2/3 Output: Surface reflectance channels Surface (brightness) temperature, surface emissivity map Visibility index map (corresponds to total optical thickness at 550 nm) and aerosol optical thickness map Water vapor map (if required water vapor channels are available, e.g. at 940 nm) Surface cover map derived from template surface reflectance spectra (10 classes): a fast automatic spectral classification. Value added channels: SAVI, LAI, FPAR, albedo, radiation and heat fluxes

Errors in images Erroneous pixels or scanning lines are due to instrument or data transmission malfunctions Fill with neighboring data –a bad idea Treat as clouds –holes in the data

Errors in images Landsat TM 189/18

Errors in images: striping The instrument's channels have many detectors; striping results when they are badly calibrated relative to each other

Errors in images: striping IRS WiFS 36/22, channel 1 (red)

Jun Something wrong with data transmission?

Interactive restoration Interactive restoration of an image using the Fourier transform