Urban growth pattern of the San Antonio area


Urban growth pattern of the San Antonio area
San Antonio population as of April 1, 1980 – 785,880
Estimated San Antonio population for 1985 – 860,906 (the average of the 1980 and 1990 counts)
San Antonio population as of April 1, 1990 – 935,933
San Antonio population as of April 1, 2000 – 1,144,646
% change in population from 1980 to 1990 = 19%
% change in population from 1990 to 2000 = 22%
% change in population from 1980 to 2000 = 46%
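
For reference, the percentage figures above follow the standard percent-change formula; a quick sketch in plain Python (using the census counts on this slide) reproduces them:

# Percent change as used on this slide: (later - earlier) / earlier * 100
def percent_change(earlier, later):
    return (later - earlier) / earlier * 100.0

print(round(percent_change(785_880, 935_933)))      # 1980 -> 1990: 19
print(round(percent_change(935_933, 1_144_646)))    # 1990 -> 2000: 22
print(round(percent_change(785_880, 1_144_646)))    # 1980 -> 2000: 46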

Received the image of San Antonio for 1986 from Dr. Xie
Used the image of San Antonio for 2003 from the set of images provided in class by Dr. Xie
The 1986 image was larger than the 2003 image
Dr. Xie took the pixel locations of the top-left and bottom-right corners of the 2003 San Antonio image
Dr. Xie entered those locations into the 1986 image so that both images would have the same size and cover the same geographic area

San Antonio in 1986, after resizing to match the 2003 image

San Antonio in 2003

Braunig Lake – 1986 and 2003

1604 and I-10 Interchange – 1986 and 2003 (the quarry, now Fiesta Texas, is larger)

Kelly Air Force Base – 1986 and 2003

Hwy 281 and Loop 1604 – 1986 and 2003

San Antonio Airport – 1986 and 2003

ISODATA unsupervised classification
ISODATA calculates class means evenly distributed in the data space and then iteratively clusters the remaining pixels using minimum-distance techniques. Each iteration recalculates the means and reclassifies pixels with respect to the new means. Iterative class splitting, merging, and deleting are performed based on input threshold parameters. All pixels are classified to the nearest class unless a standard deviation or distance threshold is specified, in which case some pixels may remain unclassified if they do not meet the selected criteria. This process continues until the number of pixels in each class changes by less than the selected pixel-change threshold or the maximum number of iterations is reached.
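
The slide describes ENVI's built-in routine; as a rough illustration only, here is a simplified ISODATA-style loop in NumPy. The function name, thresholds, and the omission of class splitting are simplifications made up for this sketch, not ENVI's actual parameters:

# Simplified ISODATA-style clustering sketch (illustrative only, not ENVI's
# implementation; splitting of high-variance classes is omitted for brevity).
# `pixels` is an (n_pixels, n_bands) float array, e.g. a reshaped Landsat scene
# (for a full scene the distance step should be done in tiles to save memory).
import numpy as np

def isodata_sketch(pixels, n_classes=10, max_iter=5, min_pixels=50,
                   merge_dist=0.05, change_threshold=0.02):
    # Initial class means spread evenly through the data range of each band
    lo, hi = pixels.min(axis=0), pixels.max(axis=0)
    means = np.linspace(lo, hi, n_classes)            # (n_classes, n_bands)
    labels = np.full(len(pixels), -1)

    for _ in range(max_iter):
        # Assign every pixel to the nearest class mean (minimum distance)
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)

        # Stop once fewer than change_threshold of the pixels switch classes
        changed = np.mean(new_labels != labels)
        labels = new_labels
        if changed < change_threshold:
            break

        # Recompute the means, deleting classes with too few member pixels
        means = np.array([pixels[labels == c].mean(axis=0)
                          for c in range(len(means))
                          if np.sum(labels == c) >= min_pixels])

        # Merge pairs of class means that lie closer together than merge_dist
        kept, used = [], set()
        for i in range(len(means)):
            if i in used:
                continue
            group = [means[i]]
            for j in range(i + 1, len(means)):
                if j not in used and np.linalg.norm(means[i] - means[j]) < merge_dist:
                    group.append(means[j])
                    used.add(j)
            kept.append(np.mean(group, axis=0))
        means = np.array(kept)

    # Final minimum-distance assignment against the surviving class means
    dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return dists.argmin(axis=1), means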

ISODATA images of San Antonio with Maximum Iterations set to 5: 9 different classes (colors) for the 1986 image and 10 different classes (colors) for the 2003 image

Braunig Lake – ISODATA images, 1986 and 2003

1604 and I-10 Interchange – ISODATA images, 1986 and 2003

Pixel Count for ISODATA Difference (change matrix: columns = initial-state classes 1–9, rows = final-state classes 1–10; only the per-column class totals are legible)
Class 1 – 50435
Class 2 – 1264501
Class 3 – 1181173
Class 4 – 594792
Class 5 – 263209
Class 6 – 81589
Class 7 – 31816
Class 8 – 590326
Class 9 – 145951

Percentage for ISODATA Difference – the same change matrix expressed as percentages, with each initial-state class column totaling 100

K-Means unsupervised classification K-Means calculates initial class means evenly distributed in the data space and then iteratively clusters the pixels into the nearest class using a minimum distance technique. Each iteration recalculates class means and reclassifies pixels with respect to the new means. All pixels are classified to the nearest class unless a standard deviation or distance threshold is specified, in which case some pixels may be unclassified if they do not meet the selected criteria. This process continues until the number of pixels in each class changes by less than the selected pixel change threshold or the maximum number of iterations is reached.
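
Again for illustration only, a minimal NumPy sketch that mirrors this description (evenly spaced starting means, minimum-distance assignment, a pixel-change stopping rule); the function name and default values are assumptions for the sketch, not ENVI's:

# Minimal K-Means sketch mirroring the description above (illustrative only).
import numpy as np

def kmeans_sketch(pixels, k=5, max_iter=5, change_threshold=0.02):
    lo, hi = pixels.min(axis=0), pixels.max(axis=0)
    means = np.linspace(lo, hi, k)               # evenly spread starting means
    labels = np.full(len(pixels), -1)

    for _ in range(max_iter):
        # Minimum-distance assignment of every pixel to a class
        dists = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)

        changed = np.mean(new_labels != labels)
        labels = new_labels
        if changed < change_threshold:           # pixel-change threshold reached
            break

        # Recompute each class mean from its current member pixels
        for c in range(k):
            if np.any(labels == c):
                means[c] = pixels[labels == c].mean(axis=0)

    return labels, means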

K-Means images of San Antonio with Maximum Iterations set to 5: 5 different classes (colors)

Braunig Lake – K-Means images, 1986 and 2003

1604 and I-10 Interchange – K-Means images, 1986 and 2003

Pixel Count for K-Means Difference (change matrix: columns = initial-state classes 1–5; only the per-column class totals are legible)
Class 1 – 524655
Class 2 – 1289047
Class 3 – 1327614
Class 4 – 898480
Class 5 – 163996

Percentage for K-Means Difference – the same change matrix expressed as percentages, with each initial-state class column totaling 100

What I did to find the pixel difference for the ISODATA classification
On the ENVI Main Menu, click Classification -> Post Classification -> Class Statistics
Choose the classification input file – the ISODATA image for 1986
Choose the statistics input file – the ISODATA image for 1986
Select all items and ask for a text report
Repeat the steps for the ISODATA image for 2003
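
The same per-class pixel counts and percentages (and the percent differences on the next slide) could also be computed outside ENVI. A hedged NumPy sketch follows, with random placeholder arrays standing in for the actual 1986 and 2003 ISODATA class rasters:

# Per-class pixel counts, percentages, and 1986 -> 2003 percent differences.
# The class rasters below are placeholders; in practice they would be the
# ISODATA classification results for 1986 and 2003 loaded from disk.
import numpy as np

def class_stats(class_image):
    # Returns {class_value: (pixel_count, percentage_of_image)}
    classes, counts = np.unique(class_image, return_counts=True)
    total = counts.sum()
    return {int(c): (int(n), 100.0 * n / total) for c, n in zip(classes, counts)}

class_1986 = np.random.randint(1, 10, size=(500, 500))   # placeholder, classes 1-9
class_2003 = np.random.randint(1, 11, size=(500, 500))   # placeholder, classes 1-10

stats_1986 = class_stats(class_1986)
stats_2003 = class_stats(class_2003)

for c in sorted(stats_1986):
    n86, pct86 = stats_1986[c]
    print(f"1986 class {c}: {n86} points ({pct86:.4f}%)")

# Percent difference in pixel count for classes present in both years
for c in sorted(set(stats_1986) & set(stats_2003)):
    n86, _ = stats_1986[c]
    n03, _ = stats_2003[c]
    print(f"class {c}: {100.0 * (n03 - n86) / n86:+.0f}%")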

Class Stats Summary for 1986
Class 1 – 50565 points (1.2011%)
Class 2 – 1266857 points (30.0916%)
Class 3 – 1182127 points (28.0790%)
Class 4 – 595314 points (14.1405%)
Class 5 – 263499 points (6.2589%)
Class 6 – 81625 points (1.9388%)
Class 7 – 31823 points (0.7559%)
Class 8 – 592105 points (14.0643%)
Class 9 – 146085 points (3.4700%)

Class Stats Summary for 2003
Class 1 – 145531 points (3.8298%)
Class 2 – 584474 points (15.3809%)
Class 3 – 636504 points (16.7501%)
Class 4 – 577702 points (15.2027%)
Class 5 – 533950 points (14.0513%)
Class 6 – 467833 points (12.3114%)
Class 7 – 275965 points (7.2622%)
Class 8 – 146022 points (3.8427%)
Class 9 – 42091 points (1.1077%)
Class 10 – 389928 points (10.2613%)

% Difference in pixel count from 1986 to 2003
Class 1 = 188%
Class 2 = -54%
Class 3 = -46%
Class 4 = -3%
Class 5 = 103%
Class 6 = 473%
Class 7 = 767%
Class 8 = -75%
Class 9 = -71%