Presentation on theme: "Risk Assessment Methodology"— Presentation transcript:

1 Risk Assessment Methodology
Lower Suwannee Watershed

2 HAZUS Level 1 vs. Level 2
Level 1:
National datasets used
Quick and already processed
Used when local data is not available
Does not require the creation of new depth grids
Level 2:
More accurate depth grids based on intensive modeling techniques
Building data from local communities allows for more accurate loss estimates
Utilizes User Defined Facilities
For the Lower Suwannee Watershed, Average Annualized Loss (AAL) from the 2010 Nationwide Study was used along with “Refined” Level 2 Hazus analysis for the new or updated flood studies.

3 Risk Assessment Datasets
DATA FLOW THROUGH THE FLOOD RISK DATABASE
The Flood Risk Database is a relational database; this chart illustrates the workflow and the relationships used to create the risk assessment features and tables (six tables and four feature classes). The inputs are the UDF study, Hazus data, DFIRM data, and the 2010 National AAL database.
Hazus UDF geometry is used to populate the S_UDF_Pt feature class, and UDF losses are used to populate the L_UDF_Refined table.
Hazus GBS data is used to build and populate S_CenBlk_Ar (population, replacement values, building counts) and to prepare the L_Exposure table; the population fields are updated from the 2010 census and area weighted by project area.
DFIRM data is used to build and populate most of S_FRD_Pol_Ar, and NHD data is used to build and populate S_HUC_Ar (HUC code, HUC name).
Census Block exposure values are summarized by area weighting for each Political Area to build the final L_Exposure table.
The 2010 National AAL Database is used to populate the L_RA_AAL table (Political Area, Occupancy, Return Period, Loss (Bldg, Cont, Inv), Percent Damage, Business Disruption, Damage Counts).
The L_UDF_Refined table is aggregated by Census Block, adding the occupancy breakdown to the losses during aggregation, to build the L_RA_Refined table (Bldg & Cont by Occupancy Type).
Data from the L_RA_AAL table and the L_RA_Refined table are combined to populate the L_RA_Composite table, which is area weighted for each Political Area.
The L_UDF_Refined table is also aggregated by Political Area, and the results are combined into the final L_RA_Summary table.
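As a rough illustration of the area-weighting step described above, the sketch below apportions census-block exposure and population to political areas by the fraction of each block's area that falls inside them. The field names (TOTAL_EXPOSURE, POP2010, POL_NAME) are placeholders rather than the actual Flood Risk Database schema, and a projected CRS is assumed so polygon areas are meaningful.

```python
import geopandas as gpd

# Hypothetical layers and field names; assumes both layers share a projected CRS.
cen_blk = gpd.read_file("S_CenBlk_Ar.shp")   # census blocks with exposure and population
pol_ar = gpd.read_file("S_FRD_Pol_Ar.shp")   # political areas

cen_blk["blk_area"] = cen_blk.geometry.area

# Intersect blocks with political areas; each piece keeps its parent block's attributes
pieces = gpd.overlay(cen_blk, pol_ar, how="intersection")
pieces["weight"] = pieces.geometry.area / pieces["blk_area"]

# Apportion exposure and population by the fraction of each block inside the area
pieces["exp_weighted"] = pieces["TOTAL_EXPOSURE"] * pieces["weight"]
pieces["pop_weighted"] = pieces["POP2010"] * pieces["weight"]

# Summarize per political area to build an L_Exposure-style table
l_exposure = (
    pieces.groupby("POL_NAME")[["exp_weighted", "pop_weighted"]].sum().reset_index()
)
print(l_exposure)
```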

4 Hazus Input
User Defined Facilities (UDF) - used to replace Hazus default values with real-life values.
Depth Grids - flood depth values for the 10%, 4%, 2%, 1%, and 0.2% annual chance flood events.
A Level 2 Hazus analysis relies on two inputs: User Defined Facilities, or UDF points, which are used to replace Hazus default values with real-life values, and depth grids, which represent flood depths across the revised floodplains for each of the five return periods.

5 User Defined Facilities (UDF)
One point per structure in the revised floodplain
Parcel data correlated into the Hazus UDF table schema
Occupancy type and facility type, as well as asset replacement values for both building and contents
Aggregated by Census Block
2012 parcel data was used as the basis for placing the UDF points. Parcels that intersected the revised floodplains were selected, and a point was placed on top of each structure that fell within the SFHAs. Parcels that did not have structures in the floodplain were removed from the selection. The parcel data was then converted into HAZUS-compatible datasets and imported as building inventory data. The HAZUS UDF table schema is very specific, and correlating parcel data to HAZUS-equivalent values can be an extensive and time-consuming process depending on the data you start with. Florida’s statewide parcel dataset has use codes, which were a great help when determining occupancy type. AMEC also developed a custom geoprocessing tool to populate the fields required by HAZUS. When no relevant information is available, HAZUS default values are used. The Census Block is the smallest level that can be used for the Hazus analysis.
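As an illustration of that correlation step, the sketch below maps parcel use codes to Hazus occupancy classes and fills a UDF-style record, falling back to a default when no relevant information is available. The use codes, field names, content-value ratio, and sample parcel are hypothetical placeholders, not the Florida parcel codes or the custom tool AMEC developed.

```python
# Hypothetical mapping from parcel use codes to Hazus occupancy classes;
# the codes on the left are illustrative, not actual Florida use codes.
USE_CODE_TO_OCCUPANCY = {
    "0100": "RES1",  # single-family residential
    "1100": "COM1",  # retail trade
    "4100": "IND2",  # light industrial
}

DEFAULT_OCCUPANCY = "RES1"  # assumed fallback when no relevant information is available


def build_udf_record(parcel: dict) -> dict:
    """Correlate one parcel record into a UDF-style dictionary (illustrative fields)."""
    occupancy = USE_CODE_TO_OCCUPANCY.get(parcel.get("use_code"), DEFAULT_OCCUPANCY)
    building_value = parcel.get("building_value", 0)
    return {
        "OccupancyClass": occupancy,
        "BldgReplacementCost": building_value,
        "ContentReplacementCost": building_value * 0.5,  # assumed content-to-building ratio
        "CensusBlock": parcel.get("census_block"),
    }


# Hypothetical parcel record pulled from the 2012 parcel data
sample_parcel = {"use_code": "0100", "building_value": 185000, "census_block": "120000000000000"}
print(build_udf_record(sample_parcel))
```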

6 Hazus Regions and 5E (5 Event) Depth Grids (10%, 4%, 2%, 1%, 0.2%)
Multiple Regions
Though Hazus Regions were set up at the county level, varying depth grid cell sizes and return periods required multiple runs for each county. The original regions were duplicated for each of the runs so that the User Defined Facilities data did not have to be reloaded each time. The 5E runs cover the areas where depth grids for all five flood events were available. The 4E runs cover the redelineated portions of the Suwannee River, where depth grids were not created for the 4% annual chance flood event. It is important to keep 4 Event data and 5 Event data separate because the equation used to calculate the Average Annualized Loss (AAL) depends on the number of return periods you are working with (see the formula sketch below). The depth grids are loaded into their respective regions. These datasets are rather large, so loading them took anywhere from 30 minutes to a few hours per region. Once the depth grids are loaded, Hazus uses them to delineate floodplains, which can also take a few hours per region.
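One common way to make that dependence concrete is trapezoidal integration of the loss-exceedance curve; this is a hedged sketch of that general formulation, not necessarily the exact equation or coefficients used for this study.

```latex
% AAL approximated by integrating the loss-exceedance curve with the trapezoidal rule,
% where p_i is the annual exceedance probability of event i (0.10, 0.04, 0.02, 0.01,
% 0.002 for a 5 Event run; the 0.04 term is absent for a 4 Event run) and L_i is the loss.
\mathrm{AAL} \approx \sum_{i=1}^{n-1} \frac{L_i + L_{i+1}}{2}\,(p_i - p_{i+1})
% Removing the 4% event changes n and the interval widths (p_i - p_{i+1}),
% which is why 4 Event and 5 Event results cannot share one calculation.
```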

7 Hazus Run
Now we’re ready to initiate the Hazus run. Hazard scenarios can be set up to run all five flood events at once or to run each flood event individually. The 5 Event regions were straightforward. In order to avoid having to do an individual run for each return period, a dummy depth grid was used for the 4% annual chance flood event in the 4 Event runs. Results associated with the 4% return period were later removed.

8 Hazus Output
Multiple Regions = Multiple Outputs
Results from the Hazus run consist of a series of Microsoft SQL Server tables.
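Below is a minimal sketch of pulling one of those output tables into Python for further processing; the server instance, database, and table names are placeholders, not the actual Hazus output schema.

```python
import pandas as pd
import pyodbc

# Placeholder connection details and table name; the real Hazus SQL Server
# instance, database, and output table names will differ per region.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost\\HAZUS_INSTANCE;"
    "DATABASE=LowerSuwannee_5E;"
    "Trusted_Connection=yes;"
)

# Load a hypothetical UDF loss table into a DataFrame for the extraction step
udf_losses = pd.read_sql("SELECT * FROM dbo.UDF_Losses", conn)
print(udf_losses.head())
conn.close()
```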

9 Customized Tools
Extract Data from Hazus - data is extracted for AAL analysis.
Calculate AAL - equation based on the number of return periods.
Populate FRD tables - L_Exposure, L_RA_AAL, L_RA_UDF_Refined, and L_RA_Composite tables.
Summarize FRD tables - L_RA_Summary table.
The Average Annualized Loss (AAL) is calculated outside of HAZUS. AMEC developed a suite of Python scripts and geoprocessing tools to streamline the process of getting the results out of the HAZUS output tables we just saw and into the format required for the AAL calculations and the final Flood Risk Database. Although these tools automate quite a bit, manual processes and decision making take place between running each tool. The UDF Extraction tool takes the output of the HAZUS run and creates the data used in the AAL analysis, including the exposure values, census blocks, and UDF data from each region. Once the data has been extracted, the UDF points are QA’d. In some instances, a UDF point falls between Census Blocks, so it must be manually evaluated and assigned a Census Block number. This is also the point where the dummy 4% annual chance flood event values are removed from the 4 Event results. The AAL Calculation tools perform a series of joins based on Occupancy Type and Return Period and then calculate the AAL based on the number of return periods used for the analysis (a sketch follows below). These results are then used as input into the UDF Extraction to FRD tool, which creates and populates the required fields for the L_Exposure, L_RA_AAL, L_RA_UDF_Refined, and L_RA_Composite tables; the composite table is joined to the census blocks. The final tool uses the political areas and the watershed boundary to weight the losses and summarize the results per community into the L_RA_Summary table. The FRD Summary tool is unique in that it has a built-in QA/QC output.
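A minimal sketch of the AAL calculation, assuming the trapezoidal formulation shown earlier; the function and the loss values are illustrative, and the study's actual tools may apply different coefficients or additional terms.

```python
# Annual exceedance probabilities for the two run types
FIVE_EVENT_PROBS = [0.10, 0.04, 0.02, 0.01, 0.002]  # 10%, 4%, 2%, 1%, 0.2%
FOUR_EVENT_PROBS = [0.10, 0.02, 0.01, 0.002]         # 4% event not available


def calculate_aal(losses, probs):
    """Trapezoidal integration of the loss-exceedance curve.

    losses and probs must be ordered from most frequent (10%) to rarest (0.2%)
    and must have the same length, which is why 4 Event and 5 Event results
    have to be kept separate.
    """
    if len(losses) != len(probs):
        raise ValueError("one loss value is required per return period")
    aal = 0.0
    for i in range(len(probs) - 1):
        avg_loss = (losses[i] + losses[i + 1]) / 2.0
        aal += avg_loss * (probs[i] - probs[i + 1])
    return aal


if __name__ == "__main__":
    # Hypothetical building losses for one census block, most to least frequent event
    losses_5e = [120_000, 310_000, 540_000, 820_000, 1_400_000]
    print(f"5 Event AAL: ${calculate_aal(losses_5e, FIVE_EVENT_PROBS):,.0f}")
```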

10 Quality Control
Required Rounding
Additional QC Fields
Losses shown in the report are rounded to the nearest $10,000 for values under $100,000 and to the nearest $100,000 for values over $100,000. This can lead to numbers in the tables that appear skewed ($3 million plus $100,000 does not equal $3.2 million). In order to QA our results, we designed our tool to output the raw data as additional fields in the summary table. When you look at the raw total of $3,185,794, the $3.2 million makes a little more sense. These fields are removed once the values are verified so that the table meets the requirements of the Flood Risk Database. The table containing the raw values can be included as an additional dataset if requested by the community.
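A small sketch of that rounding behavior, using the thresholds stated above; the building and contents figures are hypothetical components chosen only so that they add up to the $3,185,794 raw total from the slide.

```python
def round_loss(value):
    """Apply the report rounding rule described above (thresholds as stated)."""
    increment = 100_000 if value > 100_000 else 10_000
    return round(value / increment) * increment


# Hypothetical component losses chosen to sum to the raw total of $3,185,794
building_loss = 3_040_000  # rounds to $3.0 million
content_loss = 145_794     # rounds to $0.1 million

rounded_components = round_loss(building_loss) + round_loss(content_loss)
rounded_total = round_loss(building_loss + content_loss)

print(f"Sum of rounded components: ${rounded_components:,}")  # $3,100,000
print(f"Rounded raw total:         ${rounded_total:,}")       # $3,200,000
```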

11 Final Features and Tables

