
Peter Fox GIS for Science ERTH 4750 (98271) Week 7, Tuesday, March 6, 2012 Analysis of ‘continuous surfaces’ (filtering, slopes, shading)

Contents: Reading review; analysis of continuous surfaces (filtering, slopes, shading); projects; lab on Friday (assignment 2)

Reading for last week Geostatistics references: Kriging (Wikipedia), Bohling on Kriging, Bohling on Variograms. Query – help with SQL (Structured Query Language) – Chapter 8 in MapInfo User Guide (10.5): Selecting and Querying Data (p. )

Spatial analysis of continuous fields Filtering (smoothing = low-pass filter). A high-pass filter is the image with the low-pass (i.e. smoothed) component removed. In one dimension: V(i) = [ V(i-1) + 2 V(i) + V(i+1) ] / 4 – another weighted average.
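A minimal sketch of this 1-D low-pass/high-pass idea (illustrative Python/NumPy, not part of the original slides; the sample profile and the treatment of the endpoints are assumptions):

import numpy as np

v = np.array([2.0, 4.0, 3.0, 8.0, 6.0, 5.0, 7.0])    # a 1-D profile

# low-pass: weighted average [1, 2, 1] / 4 of each point and its neighbours
low = v.copy()
low[1:-1] = (v[:-2] + 2.0 * v[1:-1] + v[2:]) / 4.0   # endpoints left unchanged

# high-pass: the original signal with the smoothed component removed
high = v - low

print(low)
print(high)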


Square window (convolution, moving window) The new value for V is a weighted average of the points within a specified window: V_ij = f [ SUM_{k=i-m..i+m} SUM_{l=j-n..j+n} V_kl w_kl ] / SUM w_kl, where f is an operator and w is a weight.

Each cell can have the same or a different weight, but typically SUM w_kl = 1. For equal weighting, if n x m = 5 x 5 = 25, then each w = 1/25. Or a weight can be specified for each cell; for example, for a 3x3 window the weight array might be:
1/15  2/15  1/15
2/15  3/15  2/15
1/15  2/15  1/15
So V_ij = [ V_i-1,j-1 + 2V_i,j-1 + V_i+1,j-1 + 2V_i-1,j + 3V_i,j + 2V_i+1,j + V_i-1,j+1 + 2V_i,j+1 + V_i+1,j+1 ] / 15
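The same weighted moving window can be written as a 2-D convolution. A minimal sketch (illustrative Python with SciPy, not from the slides; the stand-in grid and the boundary mode are assumptions):

import numpy as np
from scipy import ndimage

# 3x3 smoothing kernel from the slide; the weights sum to 1
kernel = np.array([[1, 2, 1],
                   [2, 3, 2],
                   [1, 2, 1]], dtype=float) / 15.0

grid = np.random.default_rng(0).random((50, 50))      # stand-in raster

# low-pass (smoothed) surface; 'nearest' repeats edge cells at the border
smoothed = ndimage.convolve(grid, kernel, mode="nearest")

# high-pass surface: smoothing removed
high_pass = grid - smoothed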

[Figure: low pass = smoothing]

[Figure: low pass = smoothing vs. high pass = smoothing removed]

Modal filters The value (or type) at the center cell becomes the most common value among the surrounding cells. Example for a 3x3 window: A A B C A D C A B B A B C A C B C B A C -> A A A C C C B B B B A A C B C B B B A

Or you can use the minimum, maximum, or range. For example, the minimum: A A B C A D C A B B A B C A C B C B A C -> A A A A A A A A A B A A C B C B B B A (no PowerPoint animation hell…). Note: because it requires sorting the values in the window – a computationally intensive task – the modal filter is considerably less efficient than other smoothing filters.
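A minimal sketch of a 3x3 modal filter on an integer-coded categorical raster (illustrative Python with SciPy, not from the slides; the class codes, grid, and window size are assumptions):

import numpy as np
from scipy import ndimage

# categorical raster with classes coded 0..3 (e.g. A=0, B=1, C=2, D=3)
classes = np.random.default_rng(1).integers(0, 4, size=(20, 20))

def window_mode(values):
    # most common value in the moving window (ties go to the smallest code)
    return np.bincount(values.astype(int)).argmax()

modal = ndimage.generic_filter(classes, window_mode, size=3, mode="nearest")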

Median filter Median filters can be used to emphasize the longer-range variability in an image, effectively acting to smooth the image. This can be useful for reducing the noise in an image. The algorithm operates by calculating the median value (middle value in a sorted list) in a moving window centered on each grid cell. The median value is not influenced by anomalously high or low values in the distribution to the extent that the average is. As such, the median filter is far less sensitive to shot noise in an image than the mean filter.
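A minimal median-filter sketch showing its insensitivity to shot noise (illustrative Python with SciPy, not from the slides; the noisy grid and the 3x3 window are assumptions):

import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
grid = rng.random((100, 100))
grid[rng.random(grid.shape) < 0.01] = 10.0            # sprinkle in some shot noise

# median of each 3x3 moving window; largely ignores the isolated spikes
median = ndimage.median_filter(grid, size=3, mode="nearest")

# the mean filter smears the spikes into neighbouring cells instead
mean = ndimage.uniform_filter(grid, size=3, mode="nearest")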

[Figure: comparison of median, mean, and modal filters]

Median filter Because it requires sorting the values in the window, a computationally intensive task, the median filter is considerably less efficient than other smoothing filters. This may pose a problem for large images or large neighborhoods. Neighborhood size, or filter size, is determined by the user-defined x and y dimensions. These dimensions should be odd, positive integer values, e.g. 3, 5, 7, 9... You may also define the neighborhood shape as either squared or rounded. A rounded neighborhood approximates an ellipse; a rounded neighborhood with equal x and y dimensions approximates a circle.

Sobel filter Edge detection – performs a 3x3 or 5x5 Sobel edge-detection filter on a raster image. The Sobel filter is similar to the Prewitt filter in that it identifies areas of high slope in the input image through the calculation of slopes in the x and y directions. The Sobel edge-detection filter, however, gives more weight to nearer cell values within the moving window, or kernel.

Kernels In the case of the 3x3 Sobel filter, the x and y slopes are estimated by convolution with the following kernels:
X-direction:
-1  0  +1
-2  0  +2
-1  0  +1
Y-direction:
+1  +2  +1
 0   0   0
-1  -2  -1
Each grid cell in the output image is then assigned the square root of the sum of the squared x and y slopes.
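A minimal sketch of the Sobel gradient-magnitude computation (illustrative Python with SciPy, not from the slides; the input raster is an assumption):

import numpy as np
from scipy import ndimage

grid = np.random.default_rng(3).random((100, 100))    # stand-in raster

# slopes in the x (columns, axis=1) and y (rows, axis=0) directions
gx = ndimage.sobel(grid, axis=1, mode="nearest")
gy = ndimage.sobel(grid, axis=0, mode="nearest")

# edge strength: square root of the sum of the squared slopes
edges = np.hypot(gx, gy)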

Reading – Spatial Filters Do some web searches. What about temporal filtering??? – Trick question

Slopes Slope is the first derivative of the surface; aspect is the direction of the maximum change in the surface. The second derivatives are called the profile convexity and plan convexity. For a surface, the slope is that of a plane tangent to the surface at a point.

Gradient The gradient, written as del V, is a vector that contains both the slope and the aspect: del V = ( dV/dx, dV/dy ). For discrete data we often use finite differences to calculate the slope. On a grid, the first derivative at V_ij can be taken as the slope between the points at i-1 and i+1: dV_ij/dx = ( V_i+1,j – V_i-1,j ) / (2 dx)

Second derivative … is the slope of the slope. We take the change in slope between i+1 and i, and between i and i-1: d²V/dx² = [ ( V_i+1,j – V_i,j ) / dx – ( V_i,j – V_i-1,j ) / dx ] / dx = ( V_i+1,j – 2V_i,j + V_i-1,j ) / dx². The slope, which is the magnitude of del V, is: | del V | = [ ( dV/dx )² + ( dV/dy )² ]^(1/2)

Aspect (more important for GIS) The aspect is tan^-1 [ -(dV/dy) / (dV/dx) ], which gives the direction of the maximum slope. For example, this analysis is used in drainage networks because water flows in the direction of steepest descent. Places where the slope is zero could be drainage divides.
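A minimal slope-and-aspect sketch using centered finite differences (illustrative Python/NumPy, not from the slides; the elevation grid, cell size, and the aspect convention – measured from the +x axis rather than compass north – are assumptions):

import numpy as np

rng = np.random.default_rng(4)
z = rng.random((50, 50)) * 100.0      # stand-in elevation grid
dx = dy = 30.0                        # cell size in metres

# centered differences: np.gradient returns d/drow (y) first, then d/dcolumn (x)
dz_dy, dz_dx = np.gradient(z, dy, dx)

slope = np.hypot(dz_dx, dz_dy)                  # magnitude of the gradient
slope_deg = np.degrees(np.arctan(slope))        # slope as an angle

aspect = np.degrees(np.arctan2(dz_dy, dz_dx))   # direction of maximum slope, from +x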

[Figure: example slope map]

[Figure: and its aspect map]

Vectors The gradient of a 2-dimensional scalar field is a vector. This vector sometimes has physical meaning and you may want to map it. –For example, for topography data, the gradient is the uphill direction and a drainage vector will be in the direction opposite to the topography gradient. –For hydrogeology, water flow vectors are in the direction of the negative gradient of hydraulic head. –Wind blows opposite to the pressure gradient, etc.

Ah – MapInfo has a cousin MapInfo does not directly let you plot vectors, but you can do this fairly easily with MapBasic. Since the x-component of the gradient is dV/dx and the y-component is dV/dy, you can build a line object that starts at the grid point (X, Y) and ends at the point (X + dV/dx, Y + dV/dy) (some scaling is required, and you choose the arrow option for the line type).


We’ll do this in the lab (after break):

'-- gradients in x and y
dZdx = (zl(kx+1) - zl(kx-1)) / (x(ix+1) - x(ix-1))
dZdy = (zl(kx-nxf) - zl(kx+nxf)) / (y(iy-1) - y(iy+1))
'-- xc,yc is the centroid of the point (object) where the vector starts
xc = x(ix)
yc = y(iy)
'-- xe,ye is the endpoint of the vector, negative of the gradient
xe = xc - dZdx/vscale
ye = yc - dZdy/vscale
'-- make a line object from the center to the end of the vector
oVect = createline( xc, yc, xe, ye)
'-- make the line symbol a blue vector
alter object oVect Info obj_info_pen, makepen(1, 59, BLUE)
'-- insert line into table
insert into vect_table (Rate, Obj) values (slope, oVect)

Laplacian filters Some properties, such as groundwater level or temperature, follow Laplace’s equation: del²V = 0. In Cartesian coordinates this means: del²V = d²V/dx² + d²V/dy² = 0.

On a grid – finite differences del²V = [ ( V_i+1,j – V_i,j ) / dx – ( V_i,j – V_i-1,j ) / dx ] / dx + [ ( V_i,j+1 – V_i,j ) / dy – ( V_i,j – V_i,j-1 ) / dy ] / dy = 0. If dx = dy then we get: –V_i+1,j + V_i-1,j + V_i,j+1 + V_i,j-1 – 4V_i,j = 0, or –V_i,j = ( V_i+1,j + V_i-1,j + V_i,j+1 + V_i,j-1 ) / 4. Hence, where Laplace’s equation holds, each cell is simply the average of its four surrounding values!!!! Ah – when does dx = dy???
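A minimal sketch of using that averaging rule to relax a grid toward a solution of Laplace’s equation, with the boundary cells held fixed (illustrative Python/NumPy, not from the slides; the boundary values and iteration count are assumptions):

import numpy as np

# interior starts at zero; boundaries hold fixed values (e.g. groundwater heads)
V = np.zeros((50, 50))
V[0, :] = 10.0        # top boundary
V[-1, :] = 0.0        # bottom boundary
V[:, 0] = 5.0         # left boundary
V[:, -1] = 5.0        # right boundary

# Jacobi iteration: each interior cell becomes the average of its 4 neighbours
for _ in range(2000):
    V[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1] +
                            V[1:-1, :-2] + V[1:-1, 2:])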

Shaded Relief -> ViewShed A viewshed describes the set of points that are visible from a viewpoint. This is done by line-of-sight ray tracing. Shaded relief involves calculating the orientation of the surface relative to the line of sight or the illuminating feature. The angle at which the light hits the surface can be calculated from the topography t. If del t is the gradient of the topography, then the unit vector n normal to the surface is in the vertical plane and perpendicular to del t. The brightness of the reflected light will be largest when the surface is perpendicular to the illumination direction.

Excuse me?

What can I see?

Illuminate us … (sorry) If S is the illumination vector (having brightness and direction), then the reflected brightness will be proportional to n · S = |S| cos A, where A is the angle between n and S.
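A minimal shaded-relief (hillshade) sketch built from that dot product (illustrative Python/NumPy, not from the slides; the elevation grid, cell size, and sun position are assumptions):

import numpy as np

rng = np.random.default_rng(5)
z = rng.random((50, 50)) * 100.0          # stand-in elevation grid
dx = dy = 30.0

dz_dy, dz_dx = np.gradient(z, dy, dx)

# unit vector n normal to the surface: (-dz/dx, -dz/dy, 1) normalised
norm = np.sqrt(dz_dx**2 + dz_dy**2 + 1.0)
nx, ny, nz = -dz_dx / norm, -dz_dy / norm, 1.0 / norm

# illumination vector S from an assumed azimuth (from +x) and altitude
azimuth, altitude = np.radians(315.0), np.radians(45.0)
sx = np.cos(altitude) * np.cos(azimuth)
sy = np.cos(altitude) * np.sin(azimuth)
sz = np.sin(altitude)

# brightness proportional to n · S, clipped so faces turned away are dark
shade = np.clip(nx * sx + ny * sy + nz * sz, 0.0, 1.0)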

[Figure: shaded-relief map]

Summary Topics for GIS (for Science): filtering, slopes, enhancements. For learning purposes remember:
–Demonstrate proficiency in using geospatial applications and tools (commercial and open-source).
–Present verbally a relational analysis and interpretation of a variety of spatial data on maps.
–Demonstrate skill in applying database concepts to build and manipulate a spatial database, SQL, spatial queries, and integration of graphic and tabular data.
–Demonstrate intermediate knowledge of geospatial analysis methods and their applications.

Friday Mar. 9 Lab assignment session – two problems; I’ll put them up right after class today. Complete them in class and get signed off before leaving. 10% of grade.

Reading for this week NONE!! Happy spring break

Next classes Tuesday, March 20 – analysis of errors. Friday, March 23 – lab with material from this week (lab assignment, 10%). Note: March 30 – open lab (no assignment; work on projects, get help from Max).