1
Lessons Learned: Land Cover Mapping for the Southwest Regional Gap Analysis Project. John Lowry, Utah Project Coordinator, SWReGAP, College of Natural Resources, RSGIS Laboratory, Utah State University. R. Douglas Ramsey, Principal Investigator, Dept. of Fisheries, Wildlife and Range Science, Utah State University. 13th National GAP Analysis Program Meeting, Fort Collins, Colorado.
2
Presentation Overview:
I. Overview of SWReGAP
II. Lessons Learned: Methodology & Technical Applications
III. Lessons Learned: Coordination & Programmatic Issues
IV. Conclusion
3
SWReGAP Participants
4
[Map figure] State-based land cover mapping efforts, illustrated with "Mixed Salt Desert Shrub": 1992 map, 36 cover types; 1996 map, 42 cover types; 1999 map, 52 cover types; 2000 map, 52 cover types.
5
Objectives for Southwest Regional GAP:
- Regionally standardized data and mapping methods
- Regionally consistent land cover legend (vegetation classification system)
- Eco-regional emphasis rather than state-based emphasis
- Improvements in vegetation map product
6
II. Lessons Learned: Methodology & Technical Applications
- Mapping scale: spatial resolution & extent, and thematic resolution
- Predictor data: imagery & ancillary data
- Classification method: classification trees
- Training samples: sampling strategies
7
Mapping Scale: Spatial Resolution
- 5-state coverage = 85 Landsat 7 ETM+ scenes
- 30-meter resolution pixels
- 1 football field ~ 0.5 ha ~ 1.25 acres
8
Mapping Scale: Spatial Extent
- ~528,000 square miles
- ~340,000,000 acres
- ~1.5 billion pixels
- ~1/5th of the conterminous US
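As a sanity check on these figures, a short back-of-the-envelope calculation (Python, approximate conversion factors; not part of the original slides) reproduces the quoted acreage and pixel count:

```python
# Quick arithmetic check of the figures above (approximate unit conversions).
sq_miles = 528_000
acres = sq_miles * 640                      # ~338 million acres
sq_meters = sq_miles * 2.59e6               # 1 square mile is about 2.59e6 m^2
pixels = sq_meters / (30 * 30)              # 30 m x 30 m Landsat pixels
print(f"{acres:,} acres, {pixels:,.0f} pixels")   # ~1.5 billion pixels
```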
9
Mapping Scale: Spatial Extent
- ~528,000 square miles; ~340,000,000 acres; ~1.5 billion pixels; ~1/5th of the conterminous US
- 40 mapping zones: ecoregionally distinct, spectrally consistent, approximately state-based
10
Mapping Scale: Thematic Resolution (diagram courtesy Pat Comer, NatureServe). Classification levels span coarse to fine thematic resolution: NVC Class/Subclass (~10 units), NVC Formation, NVC Alliance (~1,800 units), NVC Association (~5,000 units), and NatureServe Ecological Systems (~700 units; ~300 natural/semi-natural types), with mapping efforts such as the Gap Analysis Program, the MRLC 2000 proposal, and National Park mapping targeting different levels along this gradient.
11
Thematic Resolution: Ecological Systems
Groups of plant communities and sparsely vegetated habitats unified by similar ecological processes, substrates, and/or environmental gradients.
12
Bioclimatic divisions differentiate Ecological Systems:
- Great Basin Pinyon-Juniper Woodland Ecological System: Pinus monophylla and Juniperus osteosperma
- Colorado Plateau Pinyon-Juniper Woodland Ecological System: Pinus edulis and/or Juniperus osteosperma
13
Similar substrates group Alliances:
Shadscale Shrubland Alliance + Four-wing Saltbush Alliance -> Inter-Mountain Basins Mixed Salt Desert Scrub Ecological System
14
Predictor Datasets: DEM-derived
- Elevation
- Landform
15
Predictor Datasets: Imagery-derived
- ETM+ bands 5, 4, 3
- July-Aug and Sept-Oct acquisition dates
16
Classification Method: Classification Trees
- Data-mining software for decision-making and exploratory data analysis
- Identifies complex relationships between multiple independent variables to predict a single categorical class
- Predictor variables may be categorical or continuous
- Recursively "splits" the predictor data to create prediction rules, i.e., a decision tree
17
Simplified Example: Splits on 2 variables
18
Simplified Example: Tree output for 2 variables
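For readers without the original figures, here is a minimal, hypothetical sketch of the two-variable example in Python with scikit-learn (not the S-PLUS/See5 tools used in the project); the predictor names, thresholds, and cover labels are invented for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)

# Hypothetical training samples: two predictors and a categorical cover label.
elevation = rng.uniform(1200, 3200, 300)     # meters
band5 = rng.uniform(20, 180, 300)            # arbitrary ETM+ band 5 values
cover = np.where(elevation > 2600, "subalpine conifer",
                 np.where(band5 < 80, "salt desert scrub", "pinyon-juniper"))

X = np.column_stack([elevation, band5])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, cover)

# The fitted tree is a small set of prediction rules (recursive splits).
print(export_text(tree, feature_names=["elevation", "band5"]))
```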
19
Tools for Spatially Applying Classification Trees
- Rulemaker (S-PLUS & Imagine), Vinod Chowdary (USU, MS Computer Science): http://www.gis.usu.edu/docs/projects/swgap/rulemaker.html
- STATMOD (S-PLUS & ArcView), Christine Garrard (USU, MS Biology): http://bioweb.usu.edu/gistools/statmod/
- Imagine CART Module (See5 & Imagine), EROS Data Center (Earth Satellite Corp): http://www.gis.usu.edu/%7Eregap/download/C5Module/
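These tools are listed as-is; as a rough illustration of what "spatially applying" a tree means, the sketch below (hypothetical, using NumPy arrays in place of Imagine or ArcView rasters) predicts a class for every pixel of a stack of co-registered predictor layers:

```python
import numpy as np

def predict_raster(tree, predictor_stack):
    """predictor_stack: array of shape (n_layers, n_rows, n_cols), one layer per predictor."""
    n_layers, n_rows, n_cols = predictor_stack.shape
    pixels = predictor_stack.reshape(n_layers, -1).T    # one row of predictor values per pixel
    labels = tree.predict(pixels)                       # apply the tree's rules to every pixel
    return labels.reshape(n_rows, n_cols)               # back to map (row, col) form

# e.g., with the tree from the previous sketch and two co-registered 30 m layers:
# cover_map = predict_raster(tree, np.stack([elevation_raster, band5_raster]))
```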
20
Training Samples: Strategy
- Ground-based opportunistic field sampling
- Sufficient data to assign an Alliance label to each site
21
Training Samples: Strategy
- Stratification based on a landform or spectral cluster map (see the sketch below)
- Other sampling strategies were later implemented to augment ground-collected field data
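A hedged sketch of one such stratification, assuming the stratum map (landform classes or spectral clusters) is available as a NumPy array of integer IDs; the function name and counts are illustrative only:

```python
import numpy as np

def stratified_pixels(stratum_raster, n_per_stratum, seed=0):
    """Pick up to n_per_stratum candidate pixel locations from each stratum."""
    rng = np.random.default_rng(seed)
    samples = []
    for stratum_id in np.unique(stratum_raster):
        rows, cols = np.nonzero(stratum_raster == stratum_id)
        n = min(n_per_stratum, len(rows))
        picks = rng.choice(len(rows), size=n, replace=False)
        samples.extend((int(stratum_id), int(rows[i]), int(cols[i])) for i in picks)
    return samples   # (stratum, row, col) locations to visit or photo-interpret
```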
22
What have we learned? Mapping Scale
- Start with a target legend in hand that is appropriate for the scale of the project
- Approximately 500 Alliances in the SW Region
- Number of cover types mapped in 1st-generation GAP: AZ 52, CO 52, NM 42, NV 65, UT 36
- Approximately 100 Ecological Systems in the SW Region
- Ecological Systems are an appropriate thematic scale, but must be used with mappability in mind
23
Some similar ecological processes may not be ideal for grouping:
Inland Saltgrass Alliance + Playa Sparsely Vegetated Alliance -> Inter-Mountain Basins Playa Ecological System
24
Inland Saltgrass Alliance: map at the Alliance level? Playa Sparsely Vegetated Alliance: large enough to map?
25
What have we learned? Predictor Datasets
- More predictor variables are not necessarily better
- Choose through intuition and experimentation
- Let the classification tree tell you (e.g., protree script); see the sketch below
- Scale and resolution should be consistent
- Some predictors are good but spatially incongruent
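One way to "let the tree tell you" with present-day tools (a scikit-learn stand-in for the S-PLUS protree idea; the predictor names and data here are invented) is to inspect the fitted tree's feature importances and treat low-scoring layers as candidates to drop:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

predictor_names = ["elevation", "landform", "etm_band3", "etm_band4", "etm_band5"]
rng = np.random.default_rng(1)
X = rng.normal(size=(500, len(predictor_names)))
y = (X[:, 0] + 0.5 * X[:, 4] > 0).astype(int)   # classes driven mainly by two predictors

tree = DecisionTreeClassifier(max_depth=5, random_state=0).fit(X, y)
for name, importance in sorted(zip(predictor_names, tree.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name:10s} {importance:.2f}")   # low-importance layers add little to the model
```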
26
What have we learned? Classification Methods
- Have the tools in place and understand the requirements of the classification tool
- Computational methods for improved classification trees: boosting & pruning (see the sketch below)
- Focus the classification tree on geographically and ecologically similar landscapes
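As a rough illustration of boosting and pruning with modern libraries (scikit-learn stand-ins, not the See5/S-PLUS implementations used in the project):

```python
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier

# Pruning: cost-complexity pruning penalizes overly complex trees to reduce overfitting.
pruned_tree = DecisionTreeClassifier(ccp_alpha=0.01, random_state=0)

# Boosting: combine many small trees, reweighting misclassified training samples.
boosted_trees = AdaBoostClassifier(n_estimators=50, random_state=0)

# Both are fit and applied the same way as a single tree:
# pruned_tree.fit(X_train, y_train); boosted_trees.fit(X_train, y_train)
```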
27
Ecologically and geographically focused Classification Trees
- Eco-regional zones
- Life zones: montane/subalpine vs. lowland
28
[Figure] Life Zones | Eco-regional Zones | Montane & subalpine classes
29
Ecologically and geographically focused Classification Trees
[Figure] Life Zones | Eco-regional Zones | Montane & subalpine classes + Lowland classes
30
Ecologically and geographically focused Classification Trees
[Figure] Life Zones | Eco-regional Zones | Montane & subalpine classes + Lowland classes + Un-modeled classes
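A minimal sketch of this zoned approach, assuming each training sample carries a mapping-zone and life-zone attribute (the dictionary keys and field names below are hypothetical):

```python
from sklearn.tree import DecisionTreeClassifier

def fit_zoned_trees(samples):
    """samples: list of dicts with 'zone', 'life_zone', 'predictors' (list of numbers), 'cover'."""
    trees = {}
    for key in {(s["zone"], s["life_zone"]) for s in samples}:
        subset = [s for s in samples if (s["zone"], s["life_zone"]) == key]
        X = [s["predictors"] for s in subset]
        y = [s["cover"] for s in subset]
        trees[key] = DecisionTreeClassifier(random_state=0).fit(X, y)
    return trees   # e.g. trees[("zone_16", "montane")] predicts only montane/subalpine classes
```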
31
What have we learned? Training Samples
Recognize there are two purposes of training samples:
1) To assess the landscape (what the legend should be)
2) To drive the classification model
The quality and quantity needed for the two differ:
1) Assessing the landscape requires high quality (ground data): sufficient species composition and environmental data to characterize the ecological system or alliance
2) Driving the classification model requires high quantity: quantity depends on the size of the area and the number and complexity of cover types. A minimum of 20, 50, or 200 per class? Proportional to occurrence! (See the sketch below.)
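A small worked example of "proportional to occurrence" allocation with a minimum floor per class (the cover types and areas below are made up for illustration):

```python
# Hypothetical class areas in hectares; allocation is proportional with a floor.
class_area_ha = {"pinyon-juniper": 4_000_000, "salt desert scrub": 2_500_000,
                 "subalpine conifer": 500_000, "playa": 20_000}

total_samples = 5_000
minimum_per_class = 20
total_area = sum(class_area_ha.values())

allocation = {cover: max(minimum_per_class, round(total_samples * area / total_area))
              for cover, area in class_area_ha.items()}
print(allocation)   # rare types keep the floor; common types get proportionally more
```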
32
What have we learned? Training Samples: A better approach
- Initial assessment of the landscape: stratified ground-based field work; use earlier products (1st-generation GAP)
- Collect samples to drive the classification model: remotely gathered sampling (aerial photography, videography, etc.); quick sampling approaches
33
III. Lessons Learned: Coordination & Programmatic Issues
Regionally coordinated… centrally implemented… or something else?
34
What have we learned? Advantages of a regionally coordinated, state-based model
- Divided the workload into five groups
- Local knowledge of vegetation and ecology
- Vested interest: doing work for "your constituents"
35
What have we learned? Disadvantages of a regionally coordinated, state-based model
- Multiple perspectives on, and understandings of, how to achieve project objectives
- Five separate proposals/contracts and timelines
- Different perspectives on minimum standards and what is achievable; examples: Alliance-level mapping and accuracy assessment; ecological modeling or remote sensing-based mapping?
36
Some Considerations… Meeting the Objectives
- The importance of mapping scale: spatial resolution & extent, and a realistic/appropriate thematic resolution (Ecological Systems)
- Clearly identified standards and expectations: if multiple groups are involved, expectations must be clearly understood and realistic
37
Some Considerations… Project Size and Scope
- The regional model is a good idea
- The geographical size of SWReGAP may be larger than is manageable under one project
- Rather than a state-based regional model, consider an eco-region-based regional model
38
IV. Conclusion: How are we doing?
- 2000: Work began
- 2001: Imagery acquisition & processing
- 2002: Field data collection, methodology & tool development
- 2003: Field data collection (~70,000 samples total) & land cover mapping
- 2004: Land cover map complete
39
[Figure] 1995 GAP at 30 m | 2003 GAP at 30 m | 1995 GAP as published at 1 km
40
Accuracy, Validation and Appropriate Uses
- Accuracy assessment when the map is completed
- Internal validation concurrently with the mapping effort
- Large-landscape (regional & national) monitoring and planning at scales of 1:100,000 to 1:250,000
41
Acknowledgements