NCHRP 3-94: Systems Operations and Management Guide
Steve Lockwood, Phil Tarnoff, John Conrad, and Rich Margiotta
Presentation to AASHTO SSOM, Manchester, NH, June 16, 2009
Agenda
- Introduction and Approach
- The Black Box – How It Works
- Example of Pilot Application
- Web Product Concept
- Sample Strategies for Review
Introduction
Objective: mainstream SO&M as a formal SDOT program
- Guidance for SDOT managers
- Develop existing material into an accessible, user-friendly product
- Web-based approach
Benefits of a Web-Based Approach
- Avoids lengthy paper documents with big charts
- Relationships among elements built in
- Custom-tailored to the user
- Users self-evaluate the SDOT's state of play
- Increasing levels of detail displayed on an automated basis
- Hyperlinks to supporting documents
Approach
Covers interrelated elements critical to improving SO&M performance:
- Process and scope
- Institutional elements (including organization)
Operations Capability Maturity Levels: Process Dimension Levels and Supporting Institutional Dimension Levels
- Level 1 – Ad Hoc: ad hoc operation; relationships not coordinated (where most transitioning agencies stand)
- Level 2 – Managed: processes fully documented and staff trained (a few leaders)
- Level 3 – Integrated: fully coordinated, performance-driven (goal for the future)
Each process level rests on a supporting institutional architecture.
The Process Dimension
1. Business Processes
   - Scoping/planning
   - Program development
   - Programming and budgeting
   - Procurement
   - Operations
2. Systems and Technology
   - Regional architectures
   - Project systems engineering (design) process
   - Standards/interoperability
   - Testing and validation
   - Maintenance
3. Performance
   - Measures definition
   - Data acquisition management
   - Measures utilization
The Institutional Dimension
1. Mission
   - Mission commitment
   - Operations culture
   - SDOT authorities (laws)
   - Continuous improvement acceptance (culture)
2. Leadership
   - In-reach
   - Outreach (policy-makers, stakeholders)
3. Organization
   - Status/authority (equivalence)
   - Unit relationships (consolidation)
   - Authority/responsibility allocation
4. Staffing
   - Core capacities defined and filled
   - Technical capacities (staff development)
   - Career path (incentives), recruitment and retention
5. Resources
   - Funding sources
   - Budgeting/resource allocation
6. Partnerships
   - Interagency (cooperation, co-training)
   - Intergovernmental (cooperation, coordination)
   - Participation in MPO activities
   - PPP (rationalized outsourcing)
Agenda
- Introduction and Approach
- The Black Box – How It Works
- Example of Pilot Application
- Web Product Concept
- Sample Strategies for Review
Framework: The Capability Maturity Model
- Performance-critical elements identified (process, institutional)
- Self-evaluation by user type (administrator, project manager, program manager)
- Presumes continuous improvement in critical elements
- Strategies provided for each level of user to get to the next level (different strategies for different user types and their levels)
- Each strategy has guidance, including examples and references
- Criteria for levels, strategies, and guidance live in a database invisible to the user (sketched below)
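The "invisible database" lends itself to a simple keyed lookup. Below is a minimal Python sketch of how strategies might be stored and retrieved by element, user type, and current level; the field names and structure are illustrative assumptions, not the guide's actual schema.

```python
# Minimal sketch of the guide's "invisible database": strategies keyed by
# (element, user role, current maturity level), each describing how to reach
# the next level. Names and fields here are hypothetical, not the real schema.

STRATEGIES = {
    ("regional_architectures", "top_management", 2): {
        "next_level": 3,
        "strategy": "Monitor ongoing system developments and changing needs "
                    "to ensure the architecture is both followed and updated.",
        "references": ["Regional ITS Architecture guidance documents"],
    },
    # ... one entry per (element, role, level) combination
}

def get_strategy(element, role, current_level):
    """Return the guidance for moving this element to the next level, if any."""
    return STRATEGIES.get((element, role, current_level))

print(get_strategy("regional_architectures", "top_management", 2))
```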
Agenda
- Introduction and Approach
- The Black Box – How It Works
- Example of Pilot Application
- Web Product Concept
- Sample Strategies for Review
A recent "manual" application of the guide logic
State of Old Hampshire:
- Medium size, some SO&M programs
- Some policy-based strategy applications
- Start-up performance measurement context
- Has an aggressive SO&M vision
Example below.
Program Assessment: Strengths and Weaknesses
Overall assessment in terms of maturity levels (for selected critical process elements):
- Scope
- Staff Management
- Customer Service
- Business Processes
- Technology and Systems
- Performance Measurement
- Ranking and rationale
- Action plan, starting with the lowest level
- Strategies to move up to the next level
- Investment implications
LEVELS OF OPERATIONAL MATURITY
- Level 1 – Incomplete: ad hoc processes
- Level 2 – Performed: procedures defined and tracked
- Level 3 – Managed: process is managed and measured
- Level 4 – Established: continuous analysis
CRITERIA FOR ASSESSING OPERATIONAL MATURITY
Scope
- Level 1 (Incomplete): narrow, opportunistic
- Level 2 (Performed): needs-based, standardized
- Level 3 (Managed): addresses full range of highway M&O
- Level 4 (Established): includes multi-modal & freight
Staff Management
- Level 1: on-the-job training
- Level 2: staff supervision, active training program, standardized procedures
- Level 3: performance-based management program
- Level 4: opportunities for advancement; risk-taking for new procedures encouraged
Customer Service
- Level 1: ad hoc information dissemination
- Level 2: rudimentary dissemination of roadway condition information using a website
- Level 3: call-in phone number, 511, and travel times on signs
- Level 4: call center operation, interaction with media, frequent publication of performance
Business Processes
- Level 1: informal, undocumented
- Level 2: planned, mainstreamed
- Level 3: integrated, documented
- Level 4: coordinated with external entities
Technology and Systems
- Level 1: project-oriented, no use of systems engineering
- Level 2: generalized platforms based on systems engineering analysis
- Level 3: extensive use of standards, provision for continuous validation
- Level 4: coordinated with external systems and requirements
Performance Measurement
- Level 1: outputs reported
- Level 2: outcomes reported, with emphasis on dissemination to the public
- Level 3: outcomes used
- Level 4: performance accountability
SUMMARY OF THE EVALUATION OF THE DOT'S OPERATIONAL MATURITY
- Scope: 2 – missing arterial flow monitoring and incident management
- Staff Management: 2 – absence of a performance-based management process
- Customer Service: Low 2 – website data incomplete and/or inaccurate; absence of a well-defined call-in process
- Business Processes: 2 – expanded guidelines for VMS messages required; ConOps incomplete
- Technology and Systems: 4 – careful adherence to standards and systems engineering processes
- Performance Measurement: Low 2 – absence of useful outcome measures; failure to use results to influence the program
- Overall Performance: Low 2 – based on the guideline that the agency's maturity level equals that of its lowest-level dimension (see the sketch below)
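The overall ranking rule reduces to a one-line computation. A short sketch, assuming a numeric encoding in which "Low 2" is represented as 1.5 purely for illustration (the guide itself uses qualitative rankings):

```python
# Guideline: an agency's overall maturity equals the level of its
# lowest-ranked dimension. "Low 2" is encoded as 1.5 for illustration only.

ratings = {
    "Scope": 2,
    "Staff Management": 2,
    "Customer Service": 1.5,         # "Low 2"
    "Business Processes": 2,
    "Technology and Systems": 4,
    "Performance Measurement": 1.5,  # "Low 2"
}

overall = min(ratings.values())
weakest = [dim for dim, r in ratings.items() if r == overall]
print(f"Overall maturity: {overall} (set by: {', '.join(weakest)})")
# Overall maturity: 1.5 (set by: Customer Service, Performance Measurement)
```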
Agenda
- Introduction and Approach
- The Black Box – How It Works
- Example of Pilot Application
- Web Product Concept
- Sample Strategies for Review
Step 1 – User Role Self-Identification
Indicate user's position (check one):
- Top management (CO or District)
- Program manager (CO or District)
- Project manager (CO or District)
Evaluation goes element by element.
PROCESS DIMENSION – CRITERIA DEFINING LEVEL OF MATURITY
1. Business Processes element: scoping/planning; program development; programming and budgeting; procurement; operations
2. Systems and Technology element: regional architectures; systems engineering; standards/interoperability; testing and validation
3. Performance element: measures definition; data acquisition management; measures utilization
Example criteria – Regional architectures:
- L-1: need for architectural standardization among state and regional systems is ignored
- L-2: regional architecture developed and subjected to periodic reviews
- L-3: system implementations recognize the need for future upgrades as systems installed by others offer increased functionality and expanded data interchange requirements
- L-4: regional architecture recognized as a dynamic tool requiring continuing review and updates as regional systems mature and new technologies and applications become available
Step 2: User self-evaluation of current level of capability in all key elements
Ex: Systems and Technology – Regional Architecture
- L-1: need for architectural standardization among state and regional systems is ignored
- L-2: regional architecture developed and subjected to periodic reviews
- L-3: system implementations recognize the need for future upgrades for increased functionality
- L-4: regional architecture recognized as a dynamic tool requiring continuing review and updates
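As a rough illustration of how this self-evaluation step might be driven by data, here is a sketch with abridged criteria text; the data layout is an assumption, not the web tool's actual design.

```python
# Sketch of the step-2 self-evaluation screen: each element presents four
# criteria, and the user's selection becomes that element's current level.
# Criteria text is abridged from the slide; the layout is hypothetical.

CRITERIA = {
    "Regional architectures": [
        "Need for architectural standardization is ignored",              # L-1
        "Regional architecture developed, subject to periodic reviews",   # L-2
        "Implementations anticipate future upgrades",                     # L-3
        "Architecture treated as a dynamic, continuously updated tool",   # L-4
    ],
}

def show_criteria(element):
    """Print the four maturity criteria for one element."""
    for level, text in enumerate(CRITERIA[element], start=1):
        print(f"  L-{level}: {text}")

show_criteria("Regional architectures")
user_levels = {"Regional architectures": 2}  # the user's (illustrative) pick
```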
Note: the user later repeats this step for the other seven elements, answering similar questions regarding:
1. PROCESS DIMENSION
   1. Business processes (including scope)
   2. Systems and Technology
   3. Performance
2. INSTITUTIONAL DIMENSION
   4. Mission
   5. Leadership
   6. Organization / Staffing
   7. Resources
   8. Partnerships
Step 3: Display of user's level for each element, flagging the lowest
REGIONAL ARCHITECTURES – CRITERIA TO DETERMINE EXISTING AGENCY LEVEL
- L-1: need for architectural standardization among state and regional systems is ignored
- L-2: regional architecture developed and subjected to periodic reviews
- L-3: system implementations recognize the need for future upgrades as systems installed by others offer increased functionality and expanded data interchange requirements
- L-4: regional architecture recognized as a dynamic tool requiring continuing review and updates as regional systems mature and new technologies and applications become available
Step 4: System Determines Appropriate Strategy
GENERAL STRATEGY TO ADVANCE LEVELS OF MATURITY – TOP MANAGEMENT
- L-1 to L-2: ensure that the agency is an active participant in the development and maintenance of the regional architecture.
- L-2 to L-3: monitor ongoing system developments as well as changing needs to ensure that the architecture is both followed and updated as needed.
- L-3 to L-4: coordinate architectural activities with performance measurement to assess architectural effectiveness as compared with the vision developed for relevant concepts of operations; identify areas needing improvement and develop recommended architectural enhancements.
Step 5 – Guidance for appropriate level transition displayed
(Note: other layers are stored by level and position.)
TECHNICAL PROCESSES – TOP MANAGEMENT
(A) Systems Architecture Strategy from Level 2 to Level 3
A. Upper management requires periodic reports (semi-annual is recommended) from the Architecture Lead that define the manner in which the architecture is being used to influence the design and implementation of regional systems. The reports should also include a description of the manner in which the architecture is being modified to reflect changes in regional systems and requirements.
Responsibility: upper management must be involved in the review and critique of the Architecture Lead's reports.
B. Agency commitment to the concepts of the regional architecture is demonstrated through mid-level management support of the efforts of the Architecture Lead. This requires periodic (at least every second meeting) attendance by the immediate supervisor(s) of the Architecture Lead as well as the supervisor(s) of other agency personnel participating in the development of the regional architecture. In this way, the agency is assured that its long-term requirements and operational philosophies are reflected in the ongoing evolution of the architecture.
Responsibility: supervisory personnel affiliated with the staffing of the architectural review committee.
(Note: other layers are stored by level and position.)
TECHNICAL PROCESSES – TOP MANAGEMENT
(A) Systems Architecture Strategy from Level 3 to Level 4
A. It is important to continually assess the degree to which the regional architecture meets the requirements of the region's surface transportation agencies. This can be done through the use of performance measures that (1) measure the architecture's impact on the development of regional systems and (2) measure the satisfaction of the stakeholders with its use. The first step in the performance measurement process is to identify appropriate measures. The measures should be selected by the architecture review committee. Examples include stakeholder satisfaction (a qualitative measure), the reliability of systems developed using the architecture, and the degree to which information identified by the architecture is shared among participating agencies.
Responsibility: the identification of performance measures is performed by the architecture review committee. The implementation of the performance measures is performed by the Architecture Lead working with the appropriate group within the agency responsible for performance measurement.
B. Level 4 performance is based on the continuous assessment of the effectiveness of agency policies. For this reason, the architecture itself must be continuously assessed to determine whether it is meeting stakeholder needs or whether modifications are required. This process is undertaken through the review of new stakeholder requirements, modifications or additions to stakeholder systems, and/or changes in technology that might have an impact on the architecture. At the periodic stakeholder meetings, time should be allocated to identify changes in any of these areas and to discuss their potential impact on the architecture.
Responsibility: the architecture review committee, as led by the Architecture Lead.
Step 6: Store and continue with other elements
- System shows an accounting matrix of the components checked and the next component
- User checks "next"
- System goes to the next most critical element (see the sketch below)
Steps 7 to N: system repeats steps 3-6 for each remaining element
Step N+1: on completion, a high-level summary display of change management strategies ("You were ranked as xxxxxxxxx, and we recommended the following")
Print-out functionality
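A minimal sketch of this loop, interpreting "next most critical" as the remaining element with the lowest self-assessed level; that interpretation, and all names below, are assumptions.

```python
# Sketch of steps 6-7: after each element is reviewed and stored, move on to
# the "next most critical" element, taken here to be the one at the lowest
# current level. When none remain, step N+1 shows the summary.

def next_most_critical(levels, completed):
    remaining = {e: lv for e, lv in levels.items() if e not in completed}
    if not remaining:
        return None  # step N+1: display the summary of recommended strategies
    return min(remaining, key=remaining.get)

levels = {"Scope": 2, "Staffing": 2, "Performance": 1, "Resources": 3}
done = set()
while (element := next_most_critical(levels, done)) is not None:
    print(f"Reviewing strategies for: {element} (level {levels[element]})")
    done.add(element)
```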
Agenda
- Introduction and Approach
- The Black Box – How It Works
- Example of Pilot Application
- Web Product Concept
- Other Sample Strategies for Review
Performance Measurement: General Strategies
GENERAL STRATEGY TO ADVANCE LEVELS OF MATURITY
Performance Element: (A) Measures Definition
- L-1 to L-2: develop a PM program based on output measures only
- L-2 to L-3: develop a PM program based on output measures and a few outcome measures
- L-3 to L-4: develop a PM program based on a full set of both output and outcome measures, linked to agency operations
Performance: Level 1 to Level 2
PERFORMANCE MEASUREMENT – Measures Definition Strategy from Level 1 to Level 2
A. The philosophy here is, "if you're going to manage it, you're going to measure it." For Level 1 to Level 2, the limited number of operational activities that will be monitored must first be identified. For Level 2 to 3, all operations activities engaged in will have performance measures defined for them. Output measures for both level transitions should be relatively simple and easy to collect. These are often referred to as "activity-based" measures since they monitor the extent of activities undertaken and their immediate consequences. Therefore, the measures selected will include at a minimum:
- incident duration (see the sketch below)
- incident and work zone characteristics (number, type, severity)
- operational activity (website hits, service patrol stops, messages)
- equipment locations and status
- equipment downtime
Responsibility: operations staff
Relationships: N/A
References:
- Guide to Benchmarking Operations Performance Measures, http://www.trb.org/TRBNet/ProjectDisplay.asp?ProjectID=1218
- NCHRP Web-Only Document 97, "Guide to Effective Freeway Performance Measurement: Final Report and Guidebook", http://www.trb.org/news/blurb_detail.asp?id=7477
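As an illustration of the first output measure in the list, incident duration can be computed directly from notification and clearance timestamps; the record format below is hypothetical, not a standard TMC log layout.

```python
# Sketch of one "activity-based" output measure: incident duration, derived
# from notification and clearance times. Fields and data are invented.
from datetime import datetime

incidents = [
    {"notified": datetime(2009, 6, 1, 7, 42), "cleared": datetime(2009, 6, 1, 8, 17)},
    {"notified": datetime(2009, 6, 1, 16, 5), "cleared": datetime(2009, 6, 1, 16, 55)},
]

durations = [(i["cleared"] - i["notified"]).total_seconds() / 60 for i in incidents]
print(f"Mean incident duration: {sum(durations) / len(durations):.1f} min")
```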
Performance: Level 2 to Level 3
PERFORMANCE MEASUREMENT – Measures Definition Strategy from Level 2 to Level 3
Guidance
A. Development of congestion-based outcome performance measures should begin at Level 3. These measures can be developed through available data and modeling methods, i.e., they can be "indirect measurements" of congestion. For example, a variety of methods are available for converting v/c ratios to delay-based measures (see the sketch below). The congestion measures should be based on travel time estimation. In addition, vehicle-miles of travel should be monitored. To the extent that the data allow, these measures should be developed for different time periods (e.g., peak hour, peak period, off-peak).
Responsibility: operations staff, but data may be borrowed from other units (e.g., traffic counts, capacity)
Relationships: N/A
References:
- Guide to Benchmarking Operations Performance Measures, http://www.trb.org/TRBNet/ProjectDisplay.asp?ProjectID=1218
- NCHRP Web-Only Document 97, "Guide to Effective Freeway Performance Measurement: Final Report and Guidebook", http://www.trb.org/news/blurb_detail.asp?id=7477
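One common indirect method for converting v/c ratios to delay is a volume-delay function such as the BPR curve. The sketch below uses the customary BPR coefficients (0.15 and 4) and invented inputs; it is offered as an illustration of the class of methods the slide refers to, not as the guide's prescribed procedure.

```python
# Converting a volume-to-capacity (v/c) ratio to a delay-based measure using
# the BPR volume-delay function: t = t0 * (1 + alpha * (v/c)^beta).
# alpha=0.15 and beta=4 are the customary BPR defaults; inputs are invented.

def bpr_travel_time(free_flow_minutes, v_over_c, alpha=0.15, beta=4):
    """Estimated link travel time given its volume-to-capacity ratio."""
    return free_flow_minutes * (1 + alpha * v_over_c ** beta)

free_flow = 10.0  # minutes at free flow
congested = bpr_travel_time(free_flow, v_over_c=1.1)
delay_per_vehicle = congested - free_flow
print(f"Peak travel time: {congested:.1f} min, delay: {delay_per_vehicle:.1f} min")
```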
Performance: Level 3 to Level 4
PERFORMANCE MEASUREMENT – Measures Definition Strategy from Level 3 to Level 4
A. Develop a formal Operations Strategic Plan. The Operations Strategic Plan should include vision, goals, and objectives for what operations are trying to achieve. Performance measures and targets should be crafted to monitor progress toward meeting these. For example:
Responsibility: operations staff
Relationships: N/A
References: see below
B. Additional performance measures should be added at Level 4. These include a variety of travel-time-based congestion measures, including reliability, as well as customer satisfaction. The complete "incident timeline" should be measured. Lane and shoulder blockage times should be tracked, so that lane- and shoulder-hours lost due to incidents and work zones can be developed (see the sketch below).
Responsibility: operations staff
Relationships: N/A
References:
- NCHRP Web-Only Document 97, "Guide to Effective Freeway Performance Measurement: Final Report and Guidebook", http://www.trb.org/news/blurb_detail.asp?id=7477
- Guide to Benchmarking Operations Performance Measures, http://www.trb.org/TRBNet/ProjectDisplay.asp?ProjectID=1218
C. Coordinate performance measures used for operations with those used by safety and planning. In order to establish a comprehensive and integrated performance measurement program, it is important that practitioners apply performance measures (metrics) uniformly across all types of applications. It is desirable to maintain performance measures that are used for specific applications, but a core set of measures should be used across all applications. This is particularly useful for congestion/mobility metrics, and it is not all that difficult to implement.
Responsibility: operations staff
Relationships: safety and planning personnel need to be involved
References:
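The lane-hours-lost and reliability measures mentioned above reduce to simple arithmetic once blockage and travel time records exist. A sketch with invented data; the planning time index (95th-percentile travel time divided by free-flow travel time) is one common reliability measure, used here as an example rather than the guide's mandated metric.

```python
# Sketch of two Level-4 measures: lane-hours lost to blockages, and a simple
# reliability measure (planning time index). All records are invented.
import statistics

events = [
    {"lanes_blocked": 2, "hours": 0.75},  # incident
    {"lanes_blocked": 1, "hours": 6.0},   # work zone
]
lane_hours_lost = sum(e["lanes_blocked"] * e["hours"] for e in events)

travel_times = [11.2, 10.8, 12.5, 18.0, 11.0, 25.4, 10.9, 12.1]  # minutes
free_flow = 10.0
p95 = statistics.quantiles(travel_times, n=20)[18]  # 95th-percentile cut point
planning_time_index = p95 / free_flow

print(f"Lane-hours lost: {lane_hours_lost:.2f}")
print(f"Planning time index: {planning_time_index:.2f}")
```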
Performance: Level 3 to Level 4 (continued)
Step 3: Display of user's levels and indication of lowest level
PROGRAMMING AND BUDGETING – USER'S CURRENT LEVEL
- Level 1: individual project capital costs; outside the central office formal budgeting process
- Level 2: defined "program" wish list of projects in some districts, integrated into district capital budgets
- Level 3: statewide staged SO&M multi-year program of projects (capital, operating, and maintenance requirements)
- Level 4: SO&M integrated into standard formal capital budgeting process/cycles
Step 4: Display of strategies to get to the next level for the lowest-level ("critical") component
PROGRAMMING AND BUDGETING – LEVEL TRANSITION STRATEGY
L-1 to L-2:
A. Secure agreement from top management on the need for and utilization of an SO&M budget
B. Organize an interactive central office/district task force to develop the program as the basis for the budget
C. Develop a program of "next step" projects by priority and aggregate to a total program estimate, statewide and by district, using such plan material as may be available
L-2 to L-3:
A. Organize an interactive central office/district task force to develop the program budget
B. Utilize output from the statewide SO&M plan to develop a staged SO&M program and related budget (capital, operating, maintenance, including staffing and training)
C. Determine administrative or legal adjustments necessary in state budgeting for DOTs regarding eligible use of funds and appropriate budget categories for SO&M program components
L-3 to L-4:
A. Request/receive a mandate from top management to integrate SO&M into the formal SDOT budgeting process
B. Organize an agency-wide task force to determine a process for investment trade-offs among capacity vs. operations vs. maintenance investments, as part of the regular budgeting cycle
C. Integrate the SO&M program into the standard formal statewide programming and budgeting process
D. Work with stakeholders to make necessary adjustments in state law and administrative procedures for use of state funds for SO&M
Step 5: Detailed guidance for moving the selected component to the next level
PROGRAMMING/BUDGETING – GUIDANCE FOR MOVING FROM LEVEL 2 TO LEVEL 3
Function: develop a sustainable budget for continuously improving SO&M applications
General Strategy: develop a budget estimate for statewide program next steps
Guidance
A. Organize an interactive central office/district task force to develop the program budget
- Develop a central office/district task force to develop the budget, including SO&M and budgeting unit staffs
B. Develop the budgeting process
- Review the standard budgeting process for other core programs
- Establish appropriate analogue process steps
C. Utilize output from the statewide or district SO&M plan to develop a staged SO&M program and related budget (capital, operating, maintenance, including staffing and training); see the sketch after this slide
- Analyze the updated program into a budget in terms of stages
- Establish cost components, including capital, operating, and maintenance
D. Determine administrative or legal adjustments necessary in state budgeting for DOTs regarding eligible use of funds and appropriate budget categories for SO&M program components
- Review proposed projects in terms of funding source eligibility restrictions, including capital, operating, staffing, and maintenance components
- Meet with, and develop support among, key legislative stakeholders as necessary, depending on administrative discretion
- Allocate costs as appropriate
Responsibility: top management working through a combined central office/district task force including budgeting staff
Relationships: mandate from top management to SO&M and planning staff
References: Ex: MDSHA CHART, WSDOT
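Guidance item C above amounts to aggregating project cost estimates by district and cost type into a statewide program total. A sketch with hypothetical project records; the field names and figures are invented for illustration.

```python
# Sketch of guidance item C: roll up "next step" SO&M project costs into a
# program estimate, by district and statewide, broken out by cost type.
from collections import defaultdict

projects = [
    {"district": 1, "capital": 2_500_000, "operating": 400_000, "maintenance": 150_000},
    {"district": 1, "capital": 800_000,   "operating": 120_000, "maintenance": 60_000},
    {"district": 3, "capital": 4_100_000, "operating": 650_000, "maintenance": 200_000},
]

by_district = defaultdict(lambda: defaultdict(float))
statewide = defaultdict(float)
for p in projects:
    for cost in ("capital", "operating", "maintenance"):
        by_district[p["district"]][cost] += p[cost]
        statewide[cost] += p[cost]

for d, costs in sorted(by_district.items()):
    print(f"District {d}: " + ", ".join(f"{k} ${v:,.0f}" for k, v in costs.items()))
print("Statewide:  " + ", ".join(f"{k} ${v:,.0f}" for k, v in statewide.items()))
```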
MDSHA CHART '07 Budget
Capital
- Field and IM Equipment: $3,855,000
- Network Engineering: $2,687,000
- Leased Circuit Costs: $1,000,000
- CHART System and Network Connectivity: $4,162,000
- Planning, Development, Engineering & Coordination: $3,000,000
- CHART System Integration: $3,100,000
- Overhead: $1,157,000
- Total Capital: $18,961,000
Staff and Operating Expenses
- Operations salaries, overhead, overtime, and expenses (staff of 64): $6,324,742
- Systems maintenance (emergency, preventive, and routine): $1,200,000
- Administrative (supplies, contractual salaries, and other expenses): $260,000
- Miscellaneous operating expenses (travel expenses and meals): $40,000
- Total Operating: $7,816,276
Grand Total: $26,777,000