Risk Assessment Methodology


Risk Assessment Methodology Lower Suwannee Watershed

HAZUS Level 1 vs. Level 2

Level 1:
- National datasets used
- Quick and already processed
- Used when local data is not available
- Does not require the creation of new depth grids

Level 2:
- More accurate depth grids based on intensive modeling techniques
- Building data from local communities allows for more accurate loss estimates
- Utilizes User Defined Facilities

For the Lower Suwannee Watershed, Average Annualized Loss (AAL) from the 2010 Nationwide Study was used along with "Refined" Level 2 Hazus analysis for the new or updated flood studies.

Risk Assessment Datasets

DATA FLOW THROUGH THE FLOOD RISK DATABASE (sources: UDF Study, Hazus Data, DFIRM Data, 2010 National AAL DB):
- Hazus UDF geometry is used to populate the S_UDF_Pt feature class; UDF losses are used to populate the L_UDF_Refined table.
- Hazus GBS data is used to build and populate S_CenBlk_Ar (population, replacement values, building counts) and to prepare for the L_Exposure table.
- DFIRM data is used to build and populate most of the S_FRD_Pol_Ar; NHD data is used to build and populate S_HUC_Ar (HUC code, HUC name).
- The population fields are updated from the 2010 census and area-weighted by project area; census-block exposure values are summarized by area weighting for each political area to build the final L_Exposure table.
- The 2010 National AAL Database is used to populate the L_RA_AAL table (political area, occupancy, return period, loss (bldg, cont, inv), percent damage, business disruption, damage counts).
- The L_UDF_Refined table is aggregated by census block to build the L_RA_Refined table (building and contents by occupancy type); the occupancy breakdown is added to the losses during this aggregation.
- Data from the L_RA_AAL and L_RA_Refined tables are combined to populate the L_RA_Composite table, which is then area-weighted for each political area.
- The L_UDF_Refined table is also aggregated by political area, and the results are combined for the final L_RA_Summary table.

The Flood Risk Database is a relational database. This chart illustrates the workflow and the relationships used to create the risk assessment features and tables: six tables and four feature classes.
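The "summarize exposure by area weighting for each political area" step can be sketched in Python. This is an illustrative assumption about the weighting logic only (the actual work happens in GIS tools against the FRD tables); the function name and input shapes are hypothetical:

```python
def area_weighted_exposure(census_blocks, intersections):
    """Summarize census-block exposure values by area weighting for each
    political area (the step that builds the final L_Exposure table).

    census_blocks: {block_id: exposure value}.
    intersections: (block_id, political_area_id, fraction) triples, where
    `fraction` is the share of the block's area that falls inside that
    political area.
    Returns {political_area_id: area-weighted exposure}.
    """
    totals = {}
    for block_id, area_id, fraction in intersections:
        # Each block contributes exposure in proportion to the overlap area.
        contribution = census_blocks[block_id] * fraction
        totals[area_id] = totals.get(area_id, 0.0) + contribution
    return totals
```

A block split 25/75 between two political areas would contribute a quarter of its exposure to one and three quarters to the other.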

Hazus Input

- User Defined Facilities (UDF): used to replace Hazus default values with real-life values.
- Depth Grids: flood depth values for the 10%, 4%, 2%, 1%, and 0.2% annual chance flood events.

A Level 2 Hazus analysis relies on two inputs: User Defined Facilities, or UDF points, which are used to replace Hazus default values with real-life values, and depth grids, which represent flood depths across the revised floodplains for the 10, 4…
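The annual-chance percentages above correspond directly to the return periods commonly quoted for these floods, since return period is simply the reciprocal of the annual exceedance probability:

```python
# Annual-chance flood events used as depth grid inputs, and their
# equivalent return periods (return period = 1 / annual probability).
ANNUAL_CHANCE = [0.10, 0.04, 0.02, 0.01, 0.002]
RETURN_PERIODS = [round(1 / p) for p in ANNUAL_CHANCE]
# 10%, 4%, 2%, 1%, 0.2% -> the 10-, 25-, 50-, 100-, and 500-year floods
```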

User Defined Facilities (UDF)

- One point per structure in the revised floodplain
- Parcel data correlated into the Hazus UDF table schema
- Occupancy type and facility type, as well as asset replacement values for both building and contents
- Aggregated by census block

2012 parcel data was used as the basis for placing the UDF points. Parcels that intersected the revised floodplains were selected, and a point was placed on top of structures that fell within the SFHAs. Parcels that did not have structures in the floodplain were removed from the selection. The parcel data is then massaged into Hazus-friendly datasets and imported as building inventory data. The Hazus UDF table schema is very specific, and correlating the parcel data to Hazus-equivalent values can be an extensive and time-consuming process depending on the data you start with. Florida's statewide parcel dataset has use codes, which were a great help when determining the occupancy type. AMEC also developed a custom geoprocessing tool to populate the fields required by Hazus. When no relevant information is available, Hazus default values are used. The census block is the smallest level that can be used for the Hazus analysis.
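The use-code correlation described above amounts to a crosswalk lookup with a default fallback. A minimal sketch follows; the code values and their meanings are hypothetical placeholders, not the actual Florida DOR use codes or the AMEC tool:

```python
# Hypothetical parcel-use-code -> Hazus occupancy crosswalk. The codes
# and mappings below are illustrative only; the real crosswalk depends
# on the source parcel data.
USE_CODE_TO_OCCUPANCY = {
    "001": "RES1",   # single-family residential (hypothetical code)
    "008": "RES3A",  # small multi-family (hypothetical code)
    "011": "COM1",   # retail trade (hypothetical code)
}

DEFAULT_OCCUPANCY = "RES1"  # fallback when no relevant information exists

def occupancy_for_parcel(use_code: str) -> str:
    """Correlate a parcel use code to a Hazus occupancy type, falling
    back to a default when the code is missing or unrecognized."""
    return USE_CODE_TO_OCCUPANCY.get(use_code, DEFAULT_OCCUPANCY)
```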

Hazus Regions: 5E (5 Event) Depth Grids (10%, 4%, 2%, 1%, 0.2%); Multiple Regions

Though Hazus regions were set up at the county level, varying depth grid cell sizes and return periods required multiple runs for each county. The original regions were duplicated for each of the runs so that the User Defined Facilities data did not have to be reloaded each time. The 5E runs cover the areas where depth grids for all 5 flood events were available. The 4E runs cover the redelineated portions of the Suwannee River, where depth grids were not created for the 4% annual chance flood event. It is important to keep 4 Event data and 5 Event data separate because the equation used to calculate the Average Annualized Losses (AAL) depends on the number of return periods you are working with. The depth grids are loaded into their respective regions. These datasets are rather large, so loading them took anywhere from 30 minutes to a few hours per region. Once the depth grids are loaded, Hazus uses them to delineate floodplains, which can also take a few hours per region.
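The dependence on the number of return periods can be seen in a generic loss-exceedance integration. The slides do not give the exact AAL equation used, so the trapezoidal form below is an assumption for illustration, not the project's actual formula:

```python
def average_annualized_loss(losses_by_prob):
    """Approximate AAL by trapezoidal integration of the loss-exceedance
    curve. `losses_by_prob` maps annual exceedance probability -> loss.

    Because the integration runs between adjacent modeled events, a
    4 Event input and a 5 Event input produce different equations, which
    is why 4E and 5E results must be kept separate.
    """
    # Order points from most frequent (10%) to rarest (0.2%) event.
    pts = sorted(losses_by_prob.items(), reverse=True)
    aal = 0.0
    for (p_hi, loss_hi), (p_lo, loss_lo) in zip(pts, pts[1:]):
        # Average loss over the interval times the probability width.
        aal += 0.5 * (loss_hi + loss_lo) * (p_hi - p_lo)
    return aal

# 5 Event example with hypothetical losses for the 10%..0.2% events:
five_event = {0.10: 0.0, 0.04: 50_000, 0.02: 120_000,
              0.01: 200_000, 0.002: 400_000}
```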

Hazus Run

Now we're ready to initiate the Hazus run. Hazard scenarios can be set up to run all five flood events at once or to run each flood event individually. The 5 Event regions were straightforward. To avoid having to do an individual run for each return period, a dummy depth grid was used for the 4% annual chance flood event in the 4 Event runs; results associated with the 4% return period were later removed.

Hazus Output Multiple Regions = Multiple Outputs Results from the Hazus run consist of a series of Microsoft SQL Server tables.

Customized Tools

- Extract data from Hazus
- Calculate AAL (equation based on the number of return periods)
- Populate FRD tables (L_Exposure, L_RA_AAL, L_RA_UDF_Refined, and L_RA_Composite)
- Summarize FRD tables (L_RA_Summary table)

The Average Annualized Loss (AAL) is calculated outside of Hazus. AMEC developed a suite of Python scripts and geoprocessing tools to streamline the process of getting the results out of the Hazus output tables we just saw and into the format required for the AAL calculations and the final Flood Risk Database. Although these tools automate quite a bit, there are manual processes and decisions that take place between running each tool.

The UDF Extraction tool takes the output of the Hazus run and creates data to be used in the AAL analysis, including the exposure values, census blocks, and UDF data from each region. Once the data has been extracted, the UDF points are QC'd. In some instances, a UDF point falls between census blocks, so it must be manually evaluated and assigned a census block number. This is also the point where the dummy 4% annual chance flood event values are removed from the 4 Event results.

The AAL Calculation tools perform a series of joins based on occupancy type and return period and then calculate the AAL based on the number of return periods used for the analysis. These results are then used as input to the UDF Extraction to FRD tool, which creates and populates the required fields for the L_Exposure, L_RA_UDF_Refined…. The composite table is joined to the census blocks to…

The final tool uses the political areas and the watershed boundary to weight the losses and summarize the results per community. The FRD Summary tool is unique in that it has a built-in QA/QC output.
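The census-block aggregation step (rolling L_UDF_Refined up with an occupancy breakdown) is essentially a grouped sum. A minimal sketch, with hypothetical field names:

```python
from collections import defaultdict

def aggregate_refined(udf_rows):
    """Aggregate per-structure UDF losses into census-block totals broken
    down by occupancy type. `udf_rows` is an iterable of dicts with
    hypothetical keys CensusBlock, Occupancy, BldgLoss, ContLoss.
    Returns {(census_block, occupancy): (bldg_total, cont_total)}.
    """
    totals = defaultdict(lambda: [0.0, 0.0])
    for row in udf_rows:
        key = (row["CensusBlock"], row["Occupancy"])
        totals[key][0] += row["BldgLoss"]
        totals[key][1] += row["ContLoss"]
    return {key: tuple(vals) for key, vals in totals.items()}
```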

Quality Control

- Required rounding
- Additional QC fields

Losses shown in the report are rounded to the nearest $10,000 for values under $100,000 and to the nearest $100,000 for values over $100,000. This can lead to apparently skewed numbers in the tables ($3 million plus $100,000 does not equal $3.2 million). In order to QA our results, we designed our tool to output the raw data as additional fields in the summary table. When you look at the raw total of $3,185,794, the 3.2 million makes a little more sense. These fields are removed once the values are verified so that the table meets the requirements of the Flood Risk Database. The table containing the raw values can be included as an additional dataset if requested by the community.
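The rounding rule and its QA implication can be sketched as follows; the threshold values are an assumption about the reporting spec, inferred from the description rather than quoted from it:

```python
def round_loss(value: float) -> int:
    """Round a loss to the nearest $10,000 below $100,000 and to the
    nearest $100,000 at or above $100,000 (assumed report rule)."""
    step = 10_000 if value < 100_000 else 100_000
    return int(round(value / step) * step)

# Keeping the raw value as an extra QC field explains apparent
# mismatches in the rounded table:
raw_total = 3_185_794
rounded_total = round_loss(raw_total)  # 3,200,000
```

Because each component and the total are rounded independently, the rounded components need not sum to the rounded total, which is exactly the skew the extra QC fields are there to explain.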

Final Features and Tables