1
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 1 Aids to the Comparison of Improvement Projects of the Virginia Department of Transportation Presented to the Governor’s Commission on Transportation Policy prepared by Center for Risk Management of Engineering Systems University of Virginia and Virginia Transportation Research Council October 2, 2000
2
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 2 Project Team Virginia Department of Transportation: Travis Bridewell Dave Dreis Connie Sorrell Thomas Hawthorne Jeffrey Hores Virginia Transportation Research Council: Wayne S. Ferguson Jack D. Jernigan Cheryl W. Lynn John S. Miller Rod E. Turochy
3
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 3 Project Team (cont.) Center for Risk Management of Engineering Systems: James H. Lambert, Research Assistant Professor of Systems Engineering, Center Associate Director Yacov Y. Haimes, Quarles Professor of Systems Engineering and Civil Engineering and Center Director Hendrik Frohwein, Ph.D. Jeff Baker, M.S. Ruth Dicdican, Ph.D. Student Bonnie Hannigan, B.S. Student Emily Liggett, B.S. Student Rebecca Worthington, B.S. Student Contact information: (804) 982-2072, (804) 924-0865 (fax) lambert@virginia.edu http://www.virginia.edu/~risk/VDOT
4
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 4 Project Team (cont.) Steering Committee: J. Lynwood Butner, Former State Traffic Administrator Robert O. Cassada, Programming and Scheduling Steve D. Edwards, Traffic Engineering Division C. Frank Gee, State Construction Administrator James S. Givens, Secondary Roads Robert A. Hanson, Virginia Transportation Research Council M. Scott Hollis, Urban Roads Jeffrey S. Hores, Culpeper District Traffic Engineer Elona Orban Kastenhofer, Northern Virginia District Kenneth E. Lantz, Jr., Transportation Planning Administrator Harry W. Lee, Fredericksburg District Location and Design/Survey Engineer Jimmy T. Mills, State Location and Design Administrator R. Robert Rasmussen, Traffic Engineering Division Daniel S. Roosevelt, Virginia Transportation Research Council Gerald A. Venable, Traffic Engineering Division Kenneth W. Wester, Northern Virginia District Operations Engineer
5
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 5 Agenda Overview of Comparison Tool Review of Other Approaches Draft Proposal for Future Effort
6
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 6 Overview of Comparison Tool
7
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 7 Motivation Tools are needed to equitably balance: –Crash reduction, –Capacity improvement, –Project cost, and –Other factors To aid decisions about which roadway projects to undertake with available funds
8
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 8 Diversity of Improvements in the Six-Year Plan Road Sections Intersections Bridges Signals Others
9
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 9 Comparison Tool Objective Our objective is to provide tools to assist the Virginia Department of Transportation in comparing potential roadway improvement projects during planning.
10
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 10 Technological Age [Diagram: seeking an optimal balance between technology management of man/machine/software systems (planning, design, operation) and risk management (uncertain benefits, uncertain costs)]
11
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 11 Risk assessment and management must be an integral part of the decisionmaking process
12
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 12 RISK: A Measure of the Probability and Severity of Adverse Effects
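One common formalization of this definition (added here for illustration; the study's own crash-risk measure is simply crashes per year, as later slides show) writes risk as probability-weighted severity:

```latex
% Illustrative formalization only -- not necessarily the exact metric used in the study.
\[
  R \;=\; \sum_{i} p_i \, s_i ,
  \qquad
  \Delta R \;=\; R_{\text{without project}} - R_{\text{with project}} ,
\]
% where p_i is the probability (e.g., annual frequency) of adverse event i,
% s_i is its severity, and \Delta R is the risk reduction credited to a project.
```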
13
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 13 RISK vs. SAFETY Measuring risk is an empirical, quantitative, scientific activity (e.g., measuring the probability and severity of harm). Judging safety is judging the acceptability of risks -- a normative, qualitative, political activity. (After William W. Lowrance, 1976)
14
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 14 Risk Assessment and Management What can go wrong? What is the likelihood that it will go wrong? What are the consequences? What can be done? What options are available and what are their associated trade-offs in terms of all costs, benefits, and risks? What are the impacts of current management decisions on future options?
15
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 15 Sample of Our Risk Program with VDOT and VTRC An aid to the comparison of highway improvement projects Risk-based management of guardrail – location selection and upgrade Risk-based management of signs, signals, and lights vulnerable to hurricane Risk-based hurricane preparedness and recovery of the highway agency
16
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 16 Accomplishments Developed a tool to effectively present the tradeoffs in planning roadway improvements; Adopted existing models for estimating the risk reduction, the performance gain, and the cost; Worked closely with two pilot VDOT Districts in the development, calibration, testing, and workshop demonstration (January 19, 1999) of the developed methodology; Provided a framework that is comprehensive, adherent to evidence, logically sound, and practical.
17
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 17 Related Studies Ohio DOT’s “Major New Capacity Program 1998 - 2005” –Value: Communicating ODOT’s agenda for major projects “Taking the Politics out of Planning” –Statewide model for prioritizing transportation (Blake et al., Thomas Jefferson Institute for Public Policy) –VTRC Review
18
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 18 Features of VDOT/VTRC/UVa Comparison Tool Three Quantified Criteria: –Crash Risk Reduction Number of Crashes Avoided per Year –Performance Gain Total Travel Time Saved per Peak-Hour –Cost Preliminary Engineering, Right-of-Way, Construction Engineering –Other factors such as aesthetics, environment, and economic development are considered implicitly by the decision maker
19
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 19 Criteria Included in Comparison Explicitly Quantified: Safety, Performance, Cost Implicitly Addressed: Aesthetic Value, Environmental Concerns, Economic Development
20
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 20 Multi-objective Trade-offs [Chart: candidate projects A, B, C, D plotted against the two objectives, maximize risk reduction vs. maximize performance gain]
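A minimal sketch of the trade-off idea behind this chart: with both objectives to be maximized, a project stays on the table if no other project is at least as good on both criteria. The numbers for projects A-D below are hypothetical; the slide gives only the labels.

```python
# Minimal sketch: identify non-dominated (Pareto-optimal) projects when both
# objectives -- risk reduction and performance gain -- are to be maximized.
# The values for projects A-D are made up; the slide shows only the labels.
projects = {
    "A": (12.0, 300.0),   # (crashes avoided per year, minutes saved per peak hour)
    "B": (4.0, 900.0),
    "C": (9.0, 600.0),
    "D": (3.0, 250.0),
}

def dominates(p, q):
    """p dominates q if p is at least as good on both objectives and differs from q."""
    return p[0] >= q[0] and p[1] >= q[1] and p != q

nondominated = [name for name, val in projects.items()
                if not any(dominates(other, val) for other in projects.values())]
print("Non-dominated projects:", nondominated)  # ['A', 'B', 'C']; D is dominated
```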
21
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 21 [Diagram: inputs to the Comparison Tool and the three resulting measures] RISK: crashes per vehicle, crashes per year, crashes avoided per vehicle, crashes avoided per year, lives lost and injuries PERFORMANCE: daily traffic, travel time saved per vehicle, total travel time saved COST: preliminary engineering, right of way, construction engineering Other inputs: options, life cycle, length of road section
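A rough sketch of how the quantities named in the diagram combine into the three measures. The slides do not give the underlying formulas, so the arithmetic below is an assumption made for illustration only.

```python
# Illustrative only: the slides name the inputs and the three outputs but not the
# exact formulas, so the arithmetic below is an assumed combination of the listed
# quantities, not the tool's documented model.
def comparison_measures(peak_hour_volume,        # vehicles in the peak hour
                        time_saved_per_vehicle,  # minutes saved per vehicle with the project
                        crashes_per_year_without,
                        crashes_per_year_with,
                        pe_cost, row_cost, ce_cost):  # $ preliminary eng., right of way, construction eng.
    performance_gain = peak_hour_volume * time_saved_per_vehicle      # total minutes saved per peak hour
    crash_risk_reduction = crashes_per_year_without - crashes_per_year_with  # crashes avoided per year
    total_cost = pe_cost + row_cost + ce_cost
    return crash_risk_reduction, performance_gain, total_cost

print(comparison_measures(1200, 0.5, 16.0, 9.0, 400e3, 900e3, 3.2e6))
# -> (7.0, 600.0, 4500000.0) with these made-up inputs
```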
22
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 22 Intuitive Graphical Representation Note: Icon area is proportional to project Cost. Crash risk reduction Performance gain
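A hedged sketch of the display described here, using matplotlib with made-up project data: performance gain and crash risk reduction on the axes, marker area proportional to cost.

```python
# Sketch of the bubble-style display described above: x = performance gain,
# y = crash risk reduction, marker area proportional to project cost.
# Data and project IDs are made up; only the plotting idea comes from the slide.
import matplotlib.pyplot as plt

names = ["104", "105", "106"]                  # hypothetical project IDs
performance_gain = [250.0, 900.0, 1800.0]      # minutes saved per peak hour
risk_reduction = [3.0, 11.0, 6.0]              # crashes avoided per year
cost = [0.8e6, 2.5e6, 4.0e6]                   # dollars

area = [c / 2e4 for c in cost]                 # scale cost to a readable marker area
plt.scatter(performance_gain, risk_reduction, s=area, alpha=0.5)
for name, x, y in zip(names, performance_gain, risk_reduction):
    plt.annotate(name, (x, y))
plt.xlabel("Performance gain (minutes saved per peak hour)")
plt.ylabel("Crash risk reduction (crashes avoided per year)")
plt.title("Icon area proportional to project cost")
plt.show()
```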
23
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 23 Precision of the Assessment
24
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 24 Sample of Database of Richmond District (from Travis Bridewell) [Scatter plot: Travel Time Saved (Minutes per Peak Hour) vs. Crashes Avoided (Crashes per Year); projects 104, 105, 106, 116, 117, 118 labeled]
25
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 25 Some Guiding Principles for Application The Comparison Tool brings evidence to the table earlier in the planning process; the process is open and available to inspection. The tool enables improved communication among managers, engineers, legislators, and the public. The scope of application is small- to medium-size projects where the balance of capacity improvement and risk reduction can be central.
26
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 26 Overview of System Prototype www.virginia.edu/~risk/VDOT (location of system prototype)
27
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 27 Features of Software Prototype User-friendly handbook available in print and electronic versions Guides users through the data requirements and computations to estimate crash risk reduction, performance gain, and cost associated with a project Results are easily checked with a calculator Reference tables and examples are provided
28
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 28
29
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 29 Sample Project The Culpeper District widening of Route 3 from 2 to 4 lanes, extending from the Orange County line on the west to east of Lignum. Additional project work would include alignment improvement.
30
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 30 Database
Site Length (mi): 2.2
Daily Vehicle Miles Traveled (DVMT) in the two years preceding construction start (02/17/93): 13,086
Average Speed during Peak-Hour Without Project (mph) (most likely): 58
Average Speed during Peak-Hour With Project (mph) (most likely): 64
Number of property-damage-only crashes within the project bounds from 2/17/91-2/16/93: 4
Number of injury/fatality crashes within the project bounds from 2/17/91-2/16/93: 12
Type of Section: Urban 2-Lane (26'-30' Pavement)
Land Designation: Rural
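For orientation, the exposure and crash rate implied by these database values can be computed directly; this is conventional arithmetic, not the handbook's own crash-reduction model, which the slides do not reproduce.

```python
# Illustrative arithmetic with the database values above; the handbook's actual
# crash-rate and crash-reduction formulas are not shown on the slide, so this
# only demonstrates how such a rate is conventionally computed.
dvmt = 13_086            # daily vehicle miles traveled
crashes_2yr = 4 + 12     # property-damage-only + injury/fatality crashes, 2/17/91-2/16/93
exposure_mvmt = dvmt * 730 / 1e6          # million vehicle miles over the 2-year period
crash_rate = crashes_2yr / exposure_mvmt  # crashes per million vehicle miles
print(round(exposure_mvmt, 2), round(crash_rate, 2))   # 9.55  1.67
```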
31
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 31 Crash Risk Reduction
32
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 32 Performance Gain
33
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 33 Cost Estimates
34
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 34 Cost Crash Risk Reduction Performance Gain
35
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 35 Clicking on “Go” will enable the hyperlink
36
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 36 Database of Richmond District (from Travis Bridewell) [Scatter plot: Travel Time Saved (Minutes per Peak Hour) vs. Crashes Avoided (Crashes per Year); project 113 labeled]
37
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 37 Hyperlinks to Graphics and Text - Additional Considerations
38
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 38 Guardrail Selection Note: Size of bubbles represents cost or value ($)
39
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 39 Interchange Comparison
40
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 40 Interchange Comparison (cont.)
41
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 41 TEA-21 Planning Factors Economic vitality Safety and security Accessibility and mobility Energy and environment conservation Integration and connectivity Efficient management and operation Preservation of existing system
42
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 42 A Vision of an Improved Planning Document [Bubble chart across multiple criteria: time savings, crash reduction, economic development, intermodal capacity, efficiency] Note: Size of bubbles represents cost or value ($)
43
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 43 Summary of Comparison Tool What we have today is a tool tested internally at VDOT; we have used it and we know it works: (1) Recognize that the tool frames some information more readily than in the past; (2) Encourage testing or deployment in other Districts; and (3) Recommend the tool for improved public understanding and public education
44
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 44 Review of Other Approaches
45
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 45 Review of Some Existing and Proposed Prioritization Methods Ohio Transportation Review Advisory Council Sacramento Transportation Programming Guide Taking the Politics Out Of Planning Percentile Ranking Successive Subsetting
46
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 46 Review of Some Existing Prioritization Methods: Features Evaluation criteria –Factors considered, weights –How objective or subjective –Flexibility for decision makers Method outputs –Scores: “stand-alone” or “relative worth” –Uncertainty/error allowances Types of projects evaluated
47
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 47 Ohio Transportation Review Advisory Council Quantified criteria evolve from policy 60% of score based on traffic measures (total traffic, truck traffic, congestion (v/c), functional classification, accident rate) 10% based on cost 30% based on economic development measures (jobs created, unemployment rate, etc.) Completely objective, data-driven Results in “stand-alone” scores Applied only to “major new capacity projects”
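A sketch of the 60/10/30 weighting described above, assuming each component has already been scored on a common 0-100 scale; TRAC's actual sub-scoring rules are not reproduced on the slide.

```python
# Sketch of a 60/10/30 weighted composite in the spirit of the Ohio TRAC scoring.
# The sub-scores are assumed to be pre-normalized to 0-100; how TRAC derives them
# from the raw traffic, cost, and economic-development data is not shown here.
def trac_style_score(traffic_score, cost_score, econ_dev_score):
    return 0.60 * traffic_score + 0.10 * cost_score + 0.30 * econ_dev_score

print(trac_style_score(traffic_score=70, cost_score=40, econ_dev_score=55))  # 62.5
```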
48
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 48 Sacramento Transportation Programming Guide Projects divided into 7 categories, each category has its own set of evaluation criteria Quantified criteria evolve from policies Ex. “Major street improvements” criteria: accident rate, v/c, economic development, cost, deliverability/readiness, volume, gap closure Results in scores that are dependent upon the set of projects evaluated (“benchmarking” against worst project in each criterion)
49
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 49 Taking the Politics Out Of Planning Criteria evolve from policies but are not quantified Criteria include: congestion relief, safety, cost effectiveness, multimodality, intermodal, economic vitality, quality of life Does not explicitly include cost Mostly subjective (in points assignment) Results in “stand-alone” scores Appears to be applicable mainly to large-scale projects (no guidelines given)
50
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 50 Percentile Ranking Criteria can be selected by user but must be quantifiable Criteria can be objective (data-driven) or subjective (e.g., pavement condition rating) Scores are assigned along each criterion based on percentile value Results in scores that are dependent upon the set of projects evaluated Applied only to a set of similar projects (e.g., county road improvements)
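A minimal sketch of percentile-based scoring on one criterion, using one common convention (share of projects at or below the value). Data are hypothetical, and criteria where lower is better, such as cost, would use the reversed comparison.

```python
# Minimal sketch of percentile ranking: each project is scored on each criterion
# by its percentile within the evaluated set, so scores depend on that set.
# Data are hypothetical; the method's exact percentile convention is assumed.
def percentile_scores(values):
    """Percentile (0-100) of each value within the list; higher value = higher score."""
    n = len(values)
    return [100.0 * sum(v <= x for v in values) / n for x in values]

crashes_avoided = [2.0, 7.5, 4.0, 7.5]      # one value per project
print(percentile_scores(crashes_avoided))   # [25.0, 100.0, 50.0, 100.0]
```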
51
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 51 Successive Subsetting Criteria can be selected by user but must be quantifiable Criteria can be objective (data-driven) or subjective (e.g., substructure condition rating) Requires ordering of criteria in importance Iteratively subsets projects along pairs of criteria Results in scores that are dependent upon the set of projects evaluated Applied only to a set of similar projects (e.g., bridge replacements)
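A rough interpretation of successive subsetting, sketched under the assumption that each pass keeps the projects that are non-dominated on a pair of criteria taken in order of importance; the slide does not spell out tie handling or a stopping rule.

```python
# Rough sketch of successive subsetting as described above: take the criteria in
# order of importance and, for each consecutive pair, keep only the projects that
# are non-dominated on that pair. This is an interpretation; details of the
# actual method are not given on the slide.
def nondominated_pair(projects, i, j):
    """Keep projects not dominated by another on criteria i and j (larger is better)."""
    keep = []
    for name, v in projects.items():
        dominated = any(o[i] >= v[i] and o[j] >= v[j] and (o[i] > v[i] or o[j] > v[j])
                        for other, o in projects.items() if other != name)
        if not dominated:
            keep.append(name)
    return {name: projects[name] for name in keep}

# hypothetical criteria, most important first: crashes avoided, minutes saved, -cost
projects = {"P1": (5.0, 400.0, -1.2e6), "P2": (2.0, 900.0, -0.8e6), "P3": (1.5, 300.0, -2.0e6)}
subset = nondominated_pair(projects, 0, 1)   # first pair of criteria
subset = nondominated_pair(subset, 1, 2)     # next pair
print(sorted(subset))                        # ['P2']
```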
52
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 52 Common Features Criteria pertaining to traffic flow/congestion, cost, and safety are most common Most methods incorporate measures of relative merit (by “benchmarking” or inherently) Most methods are applicable to a certain type of project or require sets of criteria to be developed for each project type Uncertainty is not explicitly addressed but can be incorporated (similar to “error bars”)
53
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 53 Draft Proposal for Future Effort
54
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 54 Development of an Objective Project Prioritization Instrument Purpose Scope Methods Limitations
55
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 55 Purpose To create an accurate, practical, objective, rational project prioritization instrument. Accurate: measurement of pros and cons can be estimated reliably Practical: staff can use it with available data Objective: adherent to evidence Rational: can be understood by specialists and nonspecialists
56
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 56 Scope Development of an instrument to aid decision makers (does not actually dictate priorities or rankings). Not an information system, although results of this effort may eventually be used to specify requirements for an information system. Case study sample of projects: two districts are proposed for investigation on an experimental basis
57
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 57 Approach 1. Select transportation performance measures 2. Identify methods of comparison 3. Implement prioritization method for a case study 4. Identify legislative and data constraints 5. Archive case study results 6. Conduct workshops and training
58
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 58 Task 1: Select Transportation Performance Measures Review literature: much has been done elsewhere Assess the utility of these measures. –Can they be estimated in a timely fashion? –Are such estimates reliable? –Can the results be understood? (Interview staff) –Investigate suitability of VDOT databases Examples: tons of CO per day, number of businesses displaced, average travel speeds, number of crashes eliminated, …
59
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 59 Example: Carbon Monoxide Emissions Methods include EPA’s MOBILE model as well as sketch planning tools Person-hours required to run the model for a particular project (e.g. a lane widening) Historical accuracy of the model from previous studies Data needs: travel speed distribution, vehicle type distribution, 24-hour volume Output is in “tons of carbon monoxide per day” Fits within one of the FHWA planning factors
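An illustrative conversion to the stated output units, assuming a fleet-average emission factor in grams per vehicle-mile (the form an emission model such as MOBILE produces); the factor, volume, and length below are made up.

```python
# Illustrative conversion to "tons of carbon monoxide per day": a fleet-average
# emission factor times daily volume and segment length. The factor below is
# made up; a real analysis would take it from the model run for the project's
# travel speeds and fleet mix.
GRAMS_PER_US_TON = 907_185

def co_tons_per_day(emission_factor_g_per_mi, daily_volume, length_mi):
    grams_per_day = emission_factor_g_per_mi * daily_volume * length_mi
    return grams_per_day / GRAMS_PER_US_TON

print(round(co_tons_per_day(emission_factor_g_per_mi=20.0, daily_volume=15_000, length_mi=2.2), 3))
# -> 0.728 tons of CO per day
```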
60
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 60 Task 2: Identify Methods to Compare Projects Review literature and examine practices in other states –Categorize projects by type –Categorize projects by urgency Look at other disciplines (e.g. pavement management)
61
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 61 Example of Tasks 1 and 2
62
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 62 Another Example of Tasks 1 and 2
63
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 63 Task 3: Implement the Best Prioritization Method Case study scope options: district and system specific; district specific only; system specific only; not restricted
64
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 64 Task 4: Identify Legislative and Data Constraints Predetermined funding allocations Conformity status Data that are not available Data that are not accessible Difficult computational methods
65
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 65 Task 5: Archive Case Study Results for Future Assessment Value of performance measures Suggested rank Assigned rank Whether or not project was implemented Other factors
66
Center for Risk Management of Engineering Systems University of Virginia, Charlottesville 66 Task 6: Provide Training/Workshops Training for multiple districts Regional focus Emphasis on understanding how the instrument works rather than the mechanics of using a software package Gives VDOT a voice in deciding what components should be used further and what components need modification.