Image Enhancements, Indices and Transformations
Remote Sensing Process: (A) Energy Source or Illumination; (B) Radiation and the Atmosphere; (C) Interaction with the Target; (D) Recording of Energy by the Sensor; (E) Transmission, Reception, and Processing; (F) Interpretation and Analysis; (G) Application (Reference: CCRS/CCT)
9 Applications: Agricultural Efficiency, Air Quality, Water Management, Disaster Management, Carbon Management, Aviation, Ecological Forecasting, Invasive Species, Coastal Management, Homeland Security, Energy Management, Public Health
10 Image enhancement Alteration of the image in such a way that the information contained in the image is easier to visually interpret or systematically analyze
11 Types of image enhancement Radiometric enhancement Spatial enhancement Spectral enhancement
13 Radiometric enhancement Compensates for inadequacies in the image contrast (too dark, too bright, too little difference between the brightness of features in the image) Attempts to optimize the distribution of pixel values over the radiometric range of the image
14 Radiometric enhancement Often increases contrast for some image pixels while decreasing it for others.
15 Types of radiometric enhancement 1. Linear stretch 2. Piecewise linear stretch 3. Histogram equalization (non-linear stretch)
16 Linear stretch Simple method that expands the range of original image pixel values to the full radiometric range of the image; Best applied to images where pixel values are normally distributed
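The minimum/maximum linear stretch can be sketched in a few lines of NumPy (the tiny 2×2 array and 8-bit output range are illustrative assumptions, not data from the slides):

```python
import numpy as np

def linear_stretch(img, out_min=0, out_max=255):
    """Expand the image's observed pixel range to the full output range."""
    in_min, in_max = img.min(), img.max()
    scaled = (img.astype(float) - in_min) / (in_max - in_min)
    return (scaled * (out_max - out_min) + out_min).round().astype(np.uint8)

# A low-contrast image whose values occupy only 60..120:
img = np.array([[60, 90], [100, 120]])
stretched = linear_stretch(img)  # now spans the full 0..255 range
```

Because the mapping is a single straight line, relative brightness differences between pixels are preserved; only the range is expanded.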
17 Minimum/maximum linear stretch
18 Minimum/maximum linear stretch: no stretch vs. linear stretch
19 Contrast stretching of predawn thermal infrared data of the Savannah River: original, minimum-maximum, and ±1 standard deviation stretches
20 Piecewise linear stretch Allows for enhancement of a specific range of pixel values
21 Piecewise linear stretch The slope of the linear contrast enhancement changes at each breakpoint; piecewise contrast stretching is sometimes referred to as using breakpoints
Piecewise Linear Contrast Stretching
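A piecewise stretch falls out directly from linear interpolation through breakpoint pairs; a minimal NumPy sketch (the breakpoints and image values are hypothetical):

```python
import numpy as np

# Piecewise linear stretch: map input values through (input, output)
# breakpoint pairs. The steep middle segment enhances the 50..100 range.
breaks_in  = [0, 50, 100, 255]   # hypothetical input breakpoints
breaks_out = [0, 10, 200, 255]   # corresponding output values
img = np.array([[25, 75], [100, 200]])
stretched = np.interp(img, breaks_in, breaks_out)
```

Values inside the 50..100 segment get most of the contrast gain, while the ranges on either side are compressed.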
23 Histogram equalization (non-linear stretch) Redistributes pixel values so that there are roughly the same number of pixels with each value within a range Applies greatest contrast enhancement at the peaks of the histogram
24 Histogram equalization: contrast enhancement is concentrated in the most populated range of the histogram, between the dark and light extremes
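Equalization amounts to mapping each pixel through the normalized cumulative histogram. A sketch for 8-bit data (the small test array is illustrative):

```python
import numpy as np

def equalize(img, levels=256):
    """Histogram equalization: remap pixels via the cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size                 # cumulative fraction, 0..1
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[img]                                # apply the lookup table

# A skewed image: most pixels crowd the dark end of the range
img = np.array([[10, 10, 10, 10],
                [10, 10, 50, 50],
                [50, 200, 200, 250]], dtype=np.uint8)
eq = equalize(img)
```

The heavily populated dark value (10) is pushed toward mid-grey, spreading the peak of the histogram, which is exactly where equalization applies the greatest enhancement.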
25 Histogram matching Convert the histogram of one image to match the histogram of another
26 Histogram matching rules General shape of histograms should be similar Relative dark/light features should be the same Spatial resolution should be the same Same relative distribution of land cover
27 Histogram matching rules Histogram matching is useful for matching data of the same or adjacent scenes that were scanned on separate days, or that differ slightly because of sun angle or atmospheric effects Especially useful for mosaicking or change detection
28 Histogram matching: input image + match image → LUT → output image
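The LUT in the diagram can be built by aligning the two images' cumulative histograms; a NumPy sketch under that assumption (the tiny arrays are illustrative, and real packages such as scikit-image provide this as `exposure.match_histograms`):

```python
import numpy as np

def match_histograms(src, ref):
    """Remap src so its histogram approximates ref's (LUT via CDF matching)."""
    s_vals, s_counts = np.unique(src, return_counts=True)
    r_vals, r_counts = np.unique(ref, return_counts=True)
    s_cdf = np.cumsum(s_counts) / src.size
    r_cdf = np.cumsum(r_counts) / ref.size
    # For each source quantile, take the reference value at the same quantile
    matched_vals = np.interp(s_cdf, r_cdf, r_vals)
    lut = dict(zip(s_vals, matched_vals))
    return np.vectorize(lut.get)(src)

src = np.array([0, 64, 128, 192])
ref = src + 50          # a uniformly brighter "match" image
out = match_histograms(src, ref)
```

With a reference that is simply 50 DN brighter, the matched output reproduces the reference values exactly, which is the behavior you want when matching adjacent scenes.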
29 Types of image enhancement Radiometric enhancement Spatial enhancement Spectral enhancement
30 Spatial enhancement Modifies pixel values based on the values of surrounding pixels Changes the “spatial frequency” of an image
31 Spatial frequency The number of changes in pixel value per unit distance for any particular part of an image Few changes – low frequency area Dramatic changes – high frequency area
32 Spatial frequency Determined by neighboring pixel brightness values rather than by an independent pixel value
33 Types of spatial enhancement 1. Convolution filtering 2. Resolution merge
34 Convolution filtering Process of assigning a new value for an image pixel based on a weighted average of surrounding pixels Can be used to visually enhance an image OR to prepare an image for classification
35 Kernel A matrix of coefficients used to average the value of each image pixel with the neighborhood of pixels surrounding it Kernel is systematically moved across the image and a new value is calculated for each input image pixel (at the center of the kernel)
36 Kernel
37 Convolution Formula

V = \frac{\sum_{i=1}^{q} \sum_{j=1}^{q} f_{ij} \, d_{ij}}{F}

where: f_{ij} is the kernel coefficient at column i, row j; d_{ij} is the pixel value at column i, row j; q is the dimension of the kernel (e.g., 3×3); F is the sum of the kernel coefficients (if the sum is 0, then F = 1); V is the output pixel value
38 Convolution Formula = [(-1 × 8) + (-1 × 6) + (-1 × 6) + (-1 × 2) + (16 × 8) + (-1 × 6) + (-1 × 2) + (-1 × 2) + (-1 × 8)] / (16 − 8) = 88 / 8 = 11
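The worked example above can be reproduced directly; a sketch of the single-pixel convolution step (the kernel and window values are taken from the slide):

```python
import numpy as np

def convolve_pixel(window, kernel):
    """Apply the convolution formula to one kernel-sized window."""
    f_sum = kernel.sum()
    if f_sum == 0:                 # zero-sum kernels: divide by 1 instead
        f_sum = 1
    return (window * kernel).sum() / f_sum

# The 3x3 high-frequency kernel and pixel window from the worked example
kernel = np.array([[-1, -1, -1],
                   [-1, 16, -1],
                   [-1, -1, -1]])
window = np.array([[8, 6, 6],
                   [2, 8, 6],
                   [2, 2, 8]])
out = convolve_pixel(window, kernel)   # 88 / (16 - 8) = 11
```

In practice the kernel is slid across the whole image and this computation is repeated for every pixel position.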
39 High-frequency (high-pass) kernel Increases spatial frequency; used to enhance "edges" between non-homogeneous groups of image pixels Not often used prior to classification
40 High-frequency (high-pass) kernel: before filtering vs. after filtering
41 Zero-sum kernel Sum of all kernel coefficients is zero Output pixel values are zero where neighboring input values are equal Low values become much lower, high values become much higher Used as an edge detector Can be biased to detect edges in a certain direction (the kernel shown is biased towards the south) Applications: stream delineation, fault mapping
42 Zero-sum kernel: before filtering vs. after filtering
43 Low-frequency (low-pass) kernel Kernel coefficients are usually equal Simply averages pixel values Results in increased pixel homogeneity and a "smoother" image Most widely used filtering mechanism Applications: smooth terrain; reduce noise; generalize land cover (post-classification) Kernel: 3×3 or 5×5
44 Low-frequency (low-pass) kernel: before filtering vs. after filtering
45 Resolution merge Using an image with high spatial resolution to increase the spatial resolution of a lower spatial resolution image of the same area (a.k.a., “pan sharpening”)
46 Resolution merge: original MS (30 m) + panchromatic (15 m) → output image (15 m); note that this changes the input image pixel values
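One simple pan-sharpening scheme is the Brovey transform, which rescales each multispectral band by the ratio of the panchromatic band to the sum of the MS bands. This is a sketch of that one technique, not the method used by any particular software package, and the tiny arrays are illustrative:

```python
import numpy as np

def brovey_sharpen(ms, pan):
    """Brovey transform pan-sharpening.

    ms  : (bands, rows, cols) multispectral array, resampled to pan's grid
    pan : (rows, cols) panchromatic array
    """
    total = ms.sum(axis=0)
    total[total == 0] = 1          # guard against division by zero
    return ms * (pan / total)

ms = np.array([[[30.0, 60.0]],     # 3 bands, 1 row, 2 columns
               [[60.0, 90.0]],
               [[90.0, 150.0]]])
pan = np.array([[90.0, 150.0]])
sharp = brovey_sharpen(ms, pan)
```

Note that the band sums of the output equal the panchromatic values, which illustrates the slide's caveat: the input pixel values are changed, so radiometric analysis should be done on the original MS data.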
47 Types of image enhancement Radiometric enhancement Spatial enhancement Spectral enhancement
48 Spectral enhancement Create, expand, transform, analyze or compress multiple bands of image data Can be used to both visually enhance data and prepare it for image classification
49 Types of spectral enhancement 1. Principal component analysis 2. Tasseled cap 3. Indices
50 Principal Components Analysis (PCA) Transforms a multi-band image into a series of uncorrelated images (“components”) that represent most of the information present in the original dataset Can be more useful for analysis than the original source data
51 Principal Components Analysis (PCA) The first one or two components represent most of the information (variance) present in the original image bands; PCA reduces data redundancy The first PC accounts for the maximum proportion of the variance; each succeeding PC accounts for the maximum proportion of the remaining variance Reduces dimensionality (i.e., the number of bands that need to be analyzed)
band #1 values vs. band #2 values: 1st component is the longest axis (AB); it minimizes the squared distance from each point to the line 2nd component is the 2nd longest axis (CD); it is "orthogonal" to, or completely uncorrelated with, the first axis Original image values are converted based on the equation defining the axis line
Landsat ETM+ (6 bands, excluding thermal & pan)
54 Principal Components Analysis (PCA) PCA seeks to generate uncorrelated images to reduce data redundancy In the example table, the pixel values in bands 1 through 3 are perfectly correlated (each is a linear function of band 1), while bands 4, 5 & 7 are less correlated with band 1 and so contain more unique information to contribute to the first PC
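The redundancy argument can be demonstrated numerically: for two perfectly correlated bands, all of the variance lands in the first component and the second is empty. A sketch via the covariance matrix's eigenvectors (the band values are illustrative):

```python
import numpy as np

def pca_transform(bands):
    """Project band vectors onto the eigenvectors of their covariance matrix.

    bands : (n_bands, n_pixels) array
    Returns (components, eigenvalues), sorted by decreasing variance.
    """
    centered = bands - bands.mean(axis=1, keepdims=True)
    cov = np.cov(centered)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: ascending eigenvalues
    order = eigvals.argsort()[::-1]          # largest variance first
    return eigvecs[:, order].T @ centered, eigvals[order]

# Two perfectly correlated bands (band 2 = 2 x band 1):
bands = np.array([[1.0, 2.0, 3.0, 4.0],
                  [2.0, 4.0, 6.0, 8.0]])
pcs, variances = pca_transform(bands)
```

Here the first eigenvalue carries 100% of the variance and the second component is numerically zero, which is why the correlated bands contribute little unique information.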
55 Tasseled cap transformation Transforms a multi-band image into a series of images optimized for vegetation studies, using coefficients specific to a particular sensor The images represent "brightness", "greenness", and "wetness" In vegetation studies: brightness is used to identify and measure soil; greenness is used to identify and measure vegetation; wetness is used to measure soil/vegetation moisture content
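The transform itself is just a matrix multiply of the band vector by the sensor-specific coefficients. The coefficients below are the commonly cited Landsat TM values (Crist & Cicone); verify them against the published table for your sensor before use, and the sample pixel is an illustrative assumption:

```python
import numpy as np

# Commonly cited Landsat TM tasseled cap coefficients (Crist & Cicone);
# rows are brightness, greenness, wetness over TM bands 1-5 and 7.
# Verify against published values for your sensor before real use.
TC = np.array([
    [ 0.3037,  0.2793,  0.4743,  0.5585,  0.5082,  0.1863],  # brightness
    [-0.2848, -0.2435, -0.5436,  0.7243,  0.0840, -0.1800],  # greenness
    [ 0.1509,  0.1973,  0.3279,  0.3406, -0.7112, -0.4572],  # wetness
])

def tasseled_cap(bands):
    """bands: (6, n_pixels) TM bands 1-5,7 -> (3, n_pixels) B/G/W images."""
    return TC @ bands

# One hypothetical vegetated pixel: low visible, high NIR reflectance
pixel = np.array([[0.1], [0.1], [0.1], [0.4], [0.2], [0.1]])
bgw = tasseled_cap(pixel)
```

Because the greenness row weights the NIR band (0.7243) positively and the visible bands negatively, a vegetated pixel with high NIR reflectance comes out with positive greenness.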
Tasseled cap transformation (Micale and Marrs 2006): healthy dense vegetation, bare soil, and water plot in distinct regions
58 Indices Create new images by mathematically combining the pixel values from multiple image bands Most often ratios of band values
59 Common uses of indices Mineral exploration Reduce radiometric differences Minimize shadow effects Vegetation analysis
60 Normalized Difference Vegetation Index (NDVI) A ratio of the red visible and near infrared bands Used widely as a measure of both the presence and health of vegetation Values range from -1 to +1
61 Normalized Difference Vegetation Index (NDVI) Based upon findings that the chlorophyll in plant leaves strongly absorbs red visible light (from 0.6 to 0.7 µm), while the cell structure of the leaves strongly reflects near-infrared light (from 0.7 to 1.1 µm)
63 Normalized Difference Vegetation Index (NDVI)

NDVI_p = \frac{NIR_p - R_p}{NIR_p + R_p}

where: NIR_p is the near-infrared response of pixel p; R_p is the visible red response of pixel p Example: NIR = 100, R = 50 → NDVI = 50/150 ≈ 0.333 NDVI is positive when NIR > R, negative when NIR < R Larger NDVI values result from larger differences between the NIR and Red bands Note that the software may scale the −1 to +1 NDVI values to 8-bit (0 to 255)
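The slide's example computes directly; a minimal NumPy sketch (the two-pixel arrays are illustrative, and the zero-denominator guard is an added assumption for robustness):

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - R) / (NIR + R), guarding against division by zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    denom[denom == 0] = 1e-9       # avoid divide-by-zero on dark pixels
    return (nir - red) / denom

nir = np.array([[100.0, 50.0]])
red = np.array([[50.0, 100.0]])
v = ndvi(nir, red)   # slide example: NIR=100, R=50 gives 50/150
```

Swapping the band values flips the sign, matching the rule that NDVI is positive when NIR > R and negative when NIR < R.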
Normalized Difference Vegetation Index (NDVI) The main difference between green and dry vegetation is the amount of visible red light absorbed
65 Normalized Difference Vegetation Index (NDVI): greyscale NDVI vs. pseudocolor NDVI
NASA MODIS global NDVI