Accuracy Assessment of Thematic Maps: THEMATIC ACCURACY
What is Accuracy Assessment?
Map accuracy: the proportion of agreement between a classified map and reference data assumed to be correct.
Thematic precision: the level of detail that is mapped. A map distinguishing lodgepole pine, Douglas fir, ponderosa pine, etc. is more precise (but probably less accurate!) than a map showing only Forest.
Accuracy and precision are DIFFERENT!
Spatial Accuracy vs. Thematic Accuracy
Thematic accuracy is how well the class names on the map correspond to what is really on the ground.
Spatial accuracy quantifies errors in the locations of boundaries.
Thematic and spatial accuracy are related but are usually treated separately.
Steps for performing thematic accuracy assessment:
1) Develop a sampling scheme
2) Collect reference data
3) Compare the reference data to the classified map
4) Compute accuracy metrics and deliver them to map users
Sampling Schemes
Reference locations (selected much like training sites) must be unbiased.
Reference sites must be large enough to find with certainty on your classified image.
You must ensure that you can correctly identify the types at your reference sites.
You must visit LOTS of reference locations.
They MUST be different from the training sites used to create the original classified image. A sketch of one way to draw such a sample follows below.
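As a hedged illustration of drawing an unbiased sample, here is a minimal sketch of stratified random sampling of reference locations from a classified map. The array, class codes, seed, and per-class sample size are all hypothetical, not from the original presentation.

```python
# Minimal sketch: stratified random sampling of reference sites, assuming
# the classified map is a 2-D integer array of class codes.
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical 100 x 100 classified map: 0 = water, 1 = forest, 2 = urban
classified = rng.integers(0, 3, size=(100, 100))

n_per_class = 30  # hypothetical sample size per class (stratum)

reference_sites = {}
for cls in np.unique(classified):
    rows, cols = np.nonzero(classified == cls)            # pixels of this class
    picks = rng.choice(len(rows), size=n_per_class, replace=False)
    reference_sites[cls] = list(zip(rows[picks], cols[picks]))

# Each (row, col) location would then be checked in the field or against
# higher-resolution imagery -- independently of the training sites.
```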
Collecting Reference Data
Same as for the original training site data: field visits, other imagery, personal knowledge, etc.
Quantifying Accuracy: Comparing Mapped Types to Reference Types
Contingency tables = error matrices = confusion matrices.
Traditional accuracy statistics are calculated from the error matrix.
Contingency Table or Error Matrix
An error matrix is an n x n array, where n is the number of classes.
Rows: reference ("correct") data (n rows).
Columns: mapped classes (n columns).
Note that rows and columns can be switched, so you have to pay attention!
Error Analyses: Example
Create a map with 3 thematic classes (water, forest, and urban).
Collect 95 ground reference sites (water: 33, forest: 39, urban: 23).
Compare the reference class at each site to the mapped class at the same location.
Generate the error matrix.
Confusion Matrix

                        Classified image
Reference data    Water   Forest   Urban   Total
Water                21        5       7      33
Forest                6       31       2      39
Urban                 0        1      22      23
Total                27       37      31      95
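To make the bookkeeping concrete, here is a minimal sketch of tallying this error matrix from paired labels, assuming `reference` and `mapped` are equal-length lists of class names for the 95 sites (the function and variable names are hypothetical).

```python
# Minimal sketch: tallying an error matrix from (reference, mapped) pairs.
import numpy as np

classes = ["water", "forest", "urban"]
index = {c: i for i, c in enumerate(classes)}

def error_matrix(reference, mapped):
    """reference[k] and mapped[k] are the class names at reference site k."""
    m = np.zeros((len(classes), len(classes)), dtype=int)
    for ref, cls in zip(reference, mapped):
        m[index[ref], index[cls]] += 1   # rows = reference, cols = mapped
    return m

# The matrix from the slide, entered directly for the later examples:
m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])
```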
Classification Accuracy
There are lots of ways to look at the thematic accuracy of a classification:
Overall accuracy
Errors of omission
Errors of commission
User's accuracy
Producer's accuracy
Accuracy statistics (e.g., Kappa; a sketch follows below)
Fuzzy accuracy
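Since the list above mentions Kappa, here is a minimal sketch of Cohen's Kappa computed from the 3-class matrix on the previous slide (rows = reference, columns = mapped). The formula is standard, but applying it here is my addition, not part of the original slides.

```python
# Minimal sketch: Cohen's Kappa from an error matrix
# (rows = reference, columns = mapped).
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

n = m.sum()                  # 95 reference sites
p_o = np.trace(m) / n        # observed agreement (= overall accuracy)
# chance agreement expected from the row and column marginals
p_e = (m.sum(axis=1) * m.sum(axis=0)).sum() / n**2
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 2))       # ~0.67 for this matrix
```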
Overall Accuracy
Of all of the reference sites, what proportion were mapped correctly?
Easiest to understand, but it gives the least information to map users and map producers (us).
Overall Accuracy

                        Classified image
Reference data    Water   Forest   Urban   Total
Water                21        5       7      33
Forest                6       31       2      39
Urban                 0        1      22      23
Total                27       37      31      95

Correctly classified: 21 + 31 + 22 = 74
Total number of reference sites: 95
Overall accuracy = 74 / 95 = 77.9%
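As a quick check of the arithmetic above, a minimal sketch using the same matrix (rows = reference, as in this deck's convention):

```python
# Minimal sketch: overall accuracy = diagonal (correct) / total sites.
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

overall = np.trace(m) / m.sum()   # 74 / 95
print(f"{overall:.1%}")           # 77.9%
```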
Off-diagonal Elements
The off-diagonal elements of a contingency table tell us the most about how to improve our remote sensing classification!
Spend lots of time examining the ERRORS to figure out what went wrong.
Errors of Omission The type on the ground is not that type on the classified image – the real type is OMITTED from the classified image.
Omission Error

                        Classified image
Reference data    Water   Forest   Urban   Total
Water                21        5       7      33
Forest                6       31       2      39
Urban                 0        1      22      23
Total                27       37      31      95

For water: 5 + 7 = 12; 12 / 33 = 36%
For forest: 6 + 2 = 8; 8 / 39 = 20%
For urban: 0 + 1 = 1; 1 / 23 = 4%
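A minimal sketch of the same per-class omission computation, assuming rows are reference data as in the matrix above:

```python
# Minimal sketch: omission error = off-diagonal cells in each reference
# row, divided by that row's total.
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

row_totals = m.sum(axis=1)                        # [33, 39, 23]
omission = (row_totals - np.diag(m)) / row_totals
print(omission.round(2))                          # [0.36 0.21 0.04]
```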
Errors of Commission A type on the classified image is not that type on the ground – the type is COMMITTED to the classified image.
Commission Error

                        Classified image
Reference data    Water   Forest   Urban   Total
Water                21        5       7      33
Forest                6       31       2      39
Urban                 0        1      22      23
Total                27       37      31      95

For water: 6 + 0 = 6; 6 / 27 = 22%
For forest: 5 + 1 = 6; 6 / 37 = 16%
For urban: 7 + 2 = 9; 9 / 31 = 29%
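And the mirror-image computation for commission, again assuming columns are the mapped classes:

```python
# Minimal sketch: commission error = off-diagonal cells in each mapped
# column, divided by that column's total.
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

col_totals = m.sum(axis=0)                          # [27, 37, 31]
commission = (col_totals - np.diag(m)) / col_totals
print(commission.round(2))                          # [0.22 0.16 0.29]
```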
Producer’s Accuracy Map accuracy from the point of view of the map maker (PRODUCER). How often are real features on the ground correctly shown on the map?
Producer's Accuracy
Can be computed (and reported) for each thematic class.
Producer's accuracy = 100 - omission error:
Water: 100 - 36 = 64%
Forest: 100 - 20 = 80%
Urban: 100 - 4 = 96%
Equivalently, PA = # correct / row (reference) total.
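A minimal sketch of the row-total form of producer's accuracy (rows = reference, as in this deck's convention):

```python
# Minimal sketch: producer's accuracy = correct / reference (row) total,
# the complement of the omission error.
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

producers = np.diag(m) / m.sum(axis=1)
print(producers.round(2))     # [0.64 0.79 0.96]
```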
User's Accuracy
Accuracy from the point of view of a map USER (not the map maker).
How often is the type shown on the map actually there on the ground?
User's Accuracy
Can be computed (and reported) for each thematic class.
User's accuracy = 100 - commission error:
Water: 100 - 22 = 78%
Forest: 100 - 16 = 84%
Urban: 100 - 29 = 71%
Equivalently, UA = # correct / column (mapped) total.
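And the column-total form of user's accuracy:

```python
# Minimal sketch: user's accuracy = correct / mapped (column) total,
# the complement of the commission error.
import numpy as np

m = np.array([[21,  5,  7],
              [ 6, 31,  2],
              [ 0,  1, 22]])

users = np.diag(m) / m.sum(axis=0)
print(users.round(2))         # [0.78 0.84 0.71]
```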
Another Example

                           Classified image
Reference data   forest  bush  crop  urban  bare  water  unclass  Producer's acc.
forest              440    40     0      0    30      1        0             0.83
bush                 20   220     0      0    40     10       20             0.71
crop                 10     0   210     10    50     10       60             0.58
urban                20     0     0    240   100     10       40             0.56
bare                  0     0    10      0   230      0      100             0.88
water                 0    20     0      0     0    240       10             0.89
User's acc.        0.90  0.76  0.88   0.92  0.51   0.86
Overall accuracy: 73.15%
OK. Now let's work one out on the board.
Accuracy Assessment -- Summary
Absolutely critical to any remote sensing based mapping project.
Expensive: should be included in the budget from the start.
Requires collection of accurate reference data, either in the field or from higher-resolution data.
The analysis should include many aspects of accuracy to give users more information about the product.