26. Classification Accuracy Assessment


1 26. Classification Accuracy Assessment
3/16/01 Accuracy Assessment FR 4262

2 Landsat Classification of Twin Cities Metro Area Land Cover
[Map: Landsat land cover classification of the Twin Cities Metro Area; legend: Shrub & Herbaceous, Forest, Cultivated, Water, Urban (% Impervious, to 100); scale bar in miles, 4.5 to 36.]
How accurate is this classification?

3 Introduction
Classification accuracy: the accuracy of a classified image compared to an independent reference or standard; sometimes called "thematic" accuracy.
Why do accuracy assessment?

4 History of Accuracy Assessment
Accuracy was largely overlooked in the interpretation of aerial photography, a time-honored skill assumed to be correct.
Quantitative assessment approaches are relatively new and have been primarily associated with classification of digital satellite data; early work started in the mid-1970s.

5 Phases or Levels of Accuracy Assessment
"It looks good"
Non-site-specific comparison of the area of each class
Site-specific assessment by comparison to known areas and determination of overall percent correct
Use of an error matrix and statistics derived from it
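The most rigorous phase, the error matrix, can be sketched with plain Python. The class names and pixel counts below are hypothetical, chosen only to illustrate how overall percent correct falls out of the matrix diagonal.

```python
# Hypothetical 3-class error matrix: rows = classified label, columns = reference label.
classes = ["Forest", "Water", "Urban"]
matrix = [
    [45,  3,  2],   # classified Forest
    [ 4, 30,  1],   # classified Water
    [ 1,  2, 12],   # classified Urban
]

# Overall accuracy = correctly classified pixels (the diagonal) / all sampled pixels.
total = sum(sum(row) for row in matrix)
correct = sum(matrix[i][i] for i in range(len(classes)))
overall_accuracy = correct / total

print(f"Overall accuracy: {overall_accuracy:.1%}")  # 87 of 100 pixels correct
```

Off-diagonal cells carry the rest of the story: row sums reveal commission errors and column sums reveal omission errors, which is why the matrix supports far more statistics than a single percent-correct figure.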

6 Accuracy vs. Precision
Accuracy = agreement between a standard (assumed to be correct) and a classification
Precision = detail or specificity; variance
Bias = consistent difference between estimated and true values
[Target diagrams: accurate (unbiased) and precise; precise but not accurate (i.e., biased); accurate (unbiased) but not precise.]
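The distinction can be made numeric. Below, two hypothetical sets of repeated area estimates are compared against a known true value: one set is tightly clustered but biased, the other is unbiased but scattered.

```python
# True area of a class (hypothetical units), and two sets of repeated estimates.
true_area = 100.0
precise_but_biased   = [110.2, 109.8, 110.1, 109.9, 110.0]  # tight cluster, off target
accurate_not_precise = [ 92.0, 108.0,  99.0, 104.0,  97.0]  # centered, but scattered

def bias(estimates, truth):
    """Consistent difference between the mean estimate and the true value."""
    return sum(estimates) / len(estimates) - truth

def variance(estimates):
    """Spread of the estimates: low variance = high precision."""
    mean = sum(estimates) / len(estimates)
    return sum((e - mean) ** 2 for e in estimates) / len(estimates)

print(bias(precise_but_biased, true_area), variance(precise_but_biased))      # large bias, tiny variance
print(bias(accurate_not_precise, true_area), variance(accurate_not_precise))  # zero bias, large variance
```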

7 Sources of Classification Errors
Data acquisition errors: sensor performance, stability, view angle, atmosphere
Data processing errors: misregistration
Scene-dependent errors: resolution, mixed pixels
Misclassification: errors of omission and commission
Inaccurate reference data: temporal inconsistencies between the reference map and imagery; incorrectly identified data used for training the classifier (a sure cause of misclassification). If a reference data class is incorrect in the map used for accuracy assessment, it will be counted as an error even if the pixels are correctly classified.

8 Boundaries and Mixed vs. Pure Pixels

9 Example of Commission and Omission Errors
[Figure: Forest and Non-Forest areas separated by the true boundary.]

10 Example of Commission and Omission Errors, cont.
[Figure: the same area as classified, split into Classified as Forest and Classified as Non-Forest.]

11 Example of Commission and Omission Errors, cont.
[Figure: the classified Forest/Non-Forest boundary overlaid on the true boundary. Forest classified as Forest and Non-Forest classified as Non-Forest are correct classifications; Non-Forest classified as Forest is incorrect, an error of commission (of Forest) and omission (of Non-Forest).]
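Commission and omission errors can be read straight off an error matrix. In this sketch the forest/non-forest counts are hypothetical; commission error for Forest comes from its row (pixels labeled Forest that are really something else), omission error from its column (reference Forest pixels labeled something else).

```python
# Hypothetical 2-class error matrix: rows = classified label, columns = reference label.
matrix = {
    "Forest":     {"Forest": 80, "Non-Forest": 15},  # 15 Non-Forest pixels committed to Forest
    "Non-Forest": {"Forest": 10, "Non-Forest": 95},  # 10 Forest pixels omitted from Forest
}

# Commission error of Forest: share of the "classified Forest" row that is wrong.
forest_row_total = sum(matrix["Forest"].values())
commission_forest = matrix["Forest"]["Non-Forest"] / forest_row_total

# Omission error of Forest: share of the "reference Forest" column that was missed.
forest_col_total = matrix["Forest"]["Forest"] + matrix["Non-Forest"]["Forest"]
omission_forest = matrix["Non-Forest"]["Forest"] / forest_col_total

print(f"Forest commission error: {commission_forest:.1%}")  # 15 / 95
print(f"Forest omission error:   {omission_forest:.1%}")    # 10 / 90
```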

12 Factors Affecting Classification Accuracy
Parcel size and shape: small and/or narrow parcels have a higher proportion of mixed pixels
Number and uniformity of classes
Spectral-radiometric "contrast" among classes
Adequacy of training data
Quality of reference data: compatibility with the imagery in terms of date, scale, resolution, and classification system

13 Error Characteristics
Errors are not randomly distributed spatially or among classes; they are more likely to be systematic and preferential.
Errors are often not spatially isolated points but occur in groups of varied size and shape (i.e., an entire field or stand may be misclassified).
Errors may be related to the parcels to which they pertain (e.g., they may occur at the edges of fields).

14 Pre-Accuracy Assessment Considerations
How is the map information presented?
Classification scheme
Discrete vs. continuous data
Minimum mapping unit?

15 Pre-Accuracy Assessment Considerations
Reference data
Source (maps, photos, field data, images, …)
Existing vs. new data
Photo vs. ground
Resolution vs. classification
Assumption of 100% accuracy
How to collect? (procedures, cost)
When to collect (before, during, or after the project)?
Objectivity and consistency
Independence (equal probability)
Collection consistency
Inter-interpreter variation
Quality control
Spatial autocorrelation

16 Using Existing Data vs. Collecting New Data
Existing maps and data:
are seldom current
may have employed a different classification system
are often of unknown accuracy
Proceed with caution when using existing data and maps.

17 Accuracy Assessment Sampling Designs
Area-based assessment
Simple random
Systematic (simple, offset)
Stratified random (by class or other strata)
Stratified systematic (e.g., by quad)
Cluster sampling
Multi-stage sampling
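A minimal sketch of one of these designs, stratified random sampling by class, using only the standard library. The map extents and class names are hypothetical; the point is that each class (stratum) gets its own simple random sample, so rare classes are not under-sampled.

```python
import random

# Hypothetical classified map: class name -> list of (row, col) pixel coordinates.
class_pixels = {
    "Forest":     [(r, c) for r in range(20) for c in range(30)],
    "Water":      [(r, c) for r in range(20, 25) for c in range(30)],
    "Cultivated": [(r, c) for r in range(25, 40) for c in range(30)],
}

def stratified_random_sample(strata, n_per_class, seed=0):
    """Draw an equal-sized simple random sample from each class (stratum)."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    return {cls: rng.sample(pixels, n_per_class) for cls, pixels in strata.items()}

sample = stratified_random_sample(class_pixels, n_per_class=10)
for cls, points in sample.items():
    print(cls, len(points))  # 10 reference locations per class
```

Under simple (unstratified) random sampling the small Water class here would receive only a handful of the total sample by chance; stratifying guarantees every class enough reference points for its own row of the error matrix.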

18 Comparison of Non-Site-Specific and Site-Specific Measures of Accuracy
[Figure: a reference map with area of class A = 200, compared to two classified images, one with area of A = 198 and one with area of A = 180.]
A non-site-specific comparison looks only at the total area of each class; a site-specific comparison checks agreement location by location. Which is the better classification?
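Why area totals alone can mislead: a classification can match the reference area of a class exactly while putting that class in all the wrong places. A hypothetical one-dimensional strip of pixels makes the extreme case:

```python
# Hypothetical reference and classified strips of pixels (A = class A, B = other).
reference  = list("AAAAABBBBB")
classified = list("BBBBBAAAAA")  # class A shifted to the wrong half of the strip

# Non-site-specific measure: total area (pixel count) of class A.
area_ref = reference.count("A")
area_cls = classified.count("A")

# Site-specific measure: pixel-by-pixel agreement.
site_specific = sum(r == c for r, c in zip(reference, classified)) / len(reference)

print(area_ref, area_cls)  # areas agree perfectly (5 and 5)
print(site_specific)       # yet 0% of pixels are in the right place
```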

19 Example of Reference Sections for Training and Accuracy Assessment
Stratified systematic design with sections as sampling units

20 Units of Comparison for Site-Specific Accuracy Assessment
Pixels
Clusters of pixels (e.g., a 3 × 3 block)
Neighborhoods of pixels (e.g., crop fields or forest stands)
"Wall-to-wall" comparisons are a problem because of mixed pixels
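One common way to use a cluster as the unit of comparison is a majority rule: the 3 × 3 block agrees with the reference label if most of its pixels do, which dampens the effect of a few mixed or misregistered pixels. The block contents and labels below are hypothetical.

```python
from collections import Counter

# Hypothetical 3x3 block of classified pixel labels around a reference location
# (F = Forest, N = Non-Forest).
block = [
    ["F", "F", "N"],
    ["F", "F", "F"],
    ["N", "F", "F"],
]

# Majority rule: the block's label is its most common pixel label.
labels = [px for row in block for px in row]
majority_label, count = Counter(labels).most_common(1)[0]

reference_label = "F"
print(majority_label == reference_label)  # True: 7 of 9 pixels are "F"
```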

21 Common Mistakes
Inappropriate reference source (resolution, time, quality)
Samples not representative (too few; unequal probabilities)
Pre-screening reference areas (sampling only homogeneous areas)
"Fudging" samples ("correcting" data; rejection of "outliers" and "bad data")
Autocorrelation
Invalid sampling scheme (statistical bias)
Inter-interpreter variation
Non-blind reference data collection
Not planning for accuracy assessment
Not reporting enough information

22 Comparing Classifications to Training Data
Not a rigorous (good) approach: the result will likely be biased, because training data are generally classified more accurately than other (independent) data.
Instead, an independent sample of reference data is needed to compare the classification against.
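One simple way to guarantee that independence is to split the labeled reference points before classification begins, reserving one subset purely for accuracy assessment. The reference points and split fraction below are hypothetical; this is a sketch of the idea, not a prescribed procedure.

```python
import random

# Hypothetical labeled reference points: (pixel_id, class_label).
reference_points = [(i, "Forest" if i % 3 else "Water") for i in range(60)]

def split_reference(points, holdout_fraction=0.5, seed=0):
    """Reserve an independent subset for accuracy assessment;
    only the remainder may be used to train the classifier."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    shuffled = points[:]
    rng.shuffle(shuffled)
    n_holdout = int(len(shuffled) * holdout_fraction)
    return shuffled[n_holdout:], shuffled[:n_holdout]  # (training, assessment)

train, assess = split_reference(reference_points)
print(len(train), len(assess))        # 30 30
print(set(train).isdisjoint(assess))  # True: no point serves both roles
```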

23 Group Exercise
Describe a valid sampling scheme to assess the accuracy of a single 2011 Level 2 (i.e., several broad classes with one level of sub-classes) Landsat-based land cover classification of the Twin Cities Metro Area.

