1
TSM&O Performance Measures Plan Steering Committee Meeting #2 August 2016
2
Agenda
– Introductions & Desired Meeting Outcomes
– Project Schedule
– Review of Stakeholder Meetings & Outcomes
– Performance Measures Plan by Program Area
– TSM&O Plan Structure
– Next Steps and Action Items
3
Introductions
– Name, Title
Desired Meeting Outcomes
– Understanding of the stakeholder process
– Establish support for measures and prioritization
– Input on ownership/leads for core measures
– Input on implementation/communication of measures
4
Input Needed from Steering Committee
– Do we have the right metrics identified & prioritized?
– Thoughts on the most effective way to implement program area metrics?
– Thoughts on the most effective way to communicate metrics as actionable information?
– Insights or recommendations for TSM&O Performance Measures Plan documentation?
5
Schedule - TSMO Performance Measures Plan 5
6
TSM&O Performance Measures Plan Purpose
Identify actionable & implementable TSM&O “centric” performance measures to:
– Document costs and benefits
– Prioritize resource allocation & investment
– Discern optimum investment in operations
– Improve quality of work (QA/QC)
– Learn lessons for continual refinement
7
Stakeholder Meetings and Outcomes
Audience Focus – Actionable & Useful Plan
– Traffic Incident Management
– System Management
– Program Management
8
TSM&O Stakeholder Meetings & Lively Discussion
TIM Group: Darin Weaver, TIM Coordinator; Amy Mastraccio, ITS; Lt. Jeff Lewis, OSP; Nathaniel Price, FHWA; Brent Atkinson, TMOC; Jeremiah Griffin, D8 ADM; Jim Scholtes, D10 ADM; Ty Burns, D12 ADM; Tom Davis, R5 Ops; Craig Olson, R2 Ops
TOC Operations: Matt Badzinski, ITS/TAD; Misti Timshell, ITS/TAD; Arleigh Mooney, CTOC Manager; Brent Atkinson, TMOC Manager; Jeremiah Griffin, STOC; Julie Scruggs, STOC; Nicole Schmunk, NWTOC; Chris Wright, Traveler Info/PM Coordinator; Bill Link, TSSU; Mike Kimlinger, Traffic Manager/Asset Management
System Operations: Rich Arnold, TPAU; Doug Norval, TPAU; Joel McCarroll, R4 Traffic Manager; Dennis Mitchell, R1 Traffic/ITS; Angela Kargel, R2 Traffic Manager; Shyam Sharma, R3 Traffic Manager; Jeff Wise, R5 Traffic/Roadway; Bill Link, TSSU Manager; Mike Kimlinger, Traffic Standards/Asset Management; Doug Spencer, ITS HQ
9
Big Spreadsheet “Decision Matrix” 9
10
Decision matrix columns:
– Metric Category
– TSM&O Performance Measure Definition
– Metric Purpose
– Current TSM&O Metric?
– Data Source Requirements
– Ease of Data (& Sources)
– TSM&O Core Metric?
– Directly Related to ODOT KPM?
– TSM&O Priority (for implementation)
– Issue(s) Addressed
– Freeway vs. Arterial
– Agency Lead
– Metric Notes
– Implementation Plan Recommendation (near, medium, long-term)
– Communication Plan Recommendation
11
Methodology to Identify “Core” Measures
– Stakeholder value assessment
– Ability to influence ODOT actions
– Current metric or not?
– Ease of implementation
    – Data availability
    – Automation
12
TSM&O Core Performance Measures Plan By Program Area 12
13
13 Audience = TIM team and ODOT district staff, TOC mgmt. team, ITS unit
14
14 * ODOT Key Performance Measure
15
16
Key Stakeholder Outcomes
– Highest Priority: “On-Scene” Time Metrics
– 6 Core Performance Measures Identified
– Refine 90-Minute Clearance Goals
    – Urban vs. Rural (OSP / City Limits)
    – Severity (Minor, Injury, Truck, Fatal)
    – Geographic (District, Crew)
– Secondary Crashes Defined, but Effort Needed to Implement
17
17 Audience = TOC mgmt. team, TIM team, ITS unit
19
Key Stakeholder Outcomes
– Highest Priority: “On-Scene” Time Metrics
– 5 Core Performance Measures Identified
    – 4 are currently collected
– More Frequent Automated Timers (TIM timeline accuracy)
– Help Facilitate Data Collection from OSP, Tow Agencies, Other TIM Responders (TIM timeline accuracy)
20
20 Audience = Traffic, Planning & Policy, Transportation Data Section, ITS unit
21
22
Key Stakeholder Outcomes
– FAST Act / MAP-21 rulemaking measures to be handled by others; plan will acknowledge
– 6 Core Performance Measures Identified
    – 4 are not currently collected & actively reported
– Need statewide consistent definitions of “congested” speed, free-flow speed… pilot test?
– Emerging tools to evaluate: PeMS, UDOT SPM, etc.
23
23 Audience = ITS unit, FHWA
24
25
Key Stakeholder Outcomes
– Highest Priority: Major Event Timeliness Metrics
– 5 Core Performance Measures Identified
26
26 Audience = ITS unit, maintenance & operations, traffic, electrical
27
28
Key Stakeholder Outcomes
– Most “ad hoc” program area
– 7 Core Performance Measures Identified
    – All are inconsistent or yet to be developed
– Need a TSM&O pilot: identify software requirements, complete an electronic inventory, and conduct a condition assessment
29
29 Audience = ITS unit, maintenance & operations
30
31
TSM&O Performance Measures Plan Overview
– Plan Purpose – Audience
– Program Area & Overview
– TSM&O Performance Measures Goal(s)
– Core Measures – Existing vs. Future
– Secondary Measures
– Implementation Plan – Core Measures
– Communication Plan – Core Measures
– Appendix – “One-Pager” Reference
32
Example One-Pager Metric Overview (DDOT) 32
33
Implementation Plan Recommendations
– Validate data sources with ground truth, QA/QC
– Pilot test sources and configurations; establish baseline
– Identify gaps and needs; set “requirements” for an automated metric tool
– Set realistic targets and goals
– Develop regular metric report
– Refine & enhance regular metric report
34
Communication Plan Recommendations
– Identify desired messaging & story of visual/report
    – Audience tailored: dashboard overview vs. staff action oriented
– Explore suite of existing reporting tools
    – Accurate?
    – Resource light or intensive?
    – Identify gaps & requirements for future enhancements
35
DRAFT EXAMPLE
38
Input Needed from Steering Committee
– Do we have the right metrics identified & prioritized?
– Thoughts on the most effective way to implement program area metrics?
– Thoughts on the most effective way to communicate metrics as actionable information?
– Insights or recommendations for TSM&O Performance Measures Plan documentation?
39
TSM&O Performance Measures Plan Wrap-Up
– Recap Discussion
    – Consensus Items
    – Items to Add to Plan
– Next Meeting in October
    – Budget time prior to the meeting for draft plan review
40
Thank you. 40
41
Steering Committee Survey
– What metric are you most excited to see come through this plan (existing or new)?
– What metric do you believe will be the most difficult to accurately implement?
42
OR
43
Mobility Database Linkages & Implementation
– Metric categories: Vehicle Counts, Speed/Congestion, Travel Time/Reliability, Stops, Delay, Equipment Health
– Data sources: Travel Time / Speeds (NPMRDS), Signal High-Resolution Data, Other Traffic Count Data, Asset Mgt. Real-Time Status
44
Goal = Safely Manage Incidents & Shorten Durations
Treatment Benefit/Cost (Need Informed by Metrics):
– TIM Training
– Reconstruction Tech.
– Expand IR Program
– Facilities Closer to Incidents
45
Goal = Safely Manage Incidents & Shorten Durations
Treatment Benefit/Cost (Need Informed by Metrics):
– TOC Training
– Integrated Tech.
– Increase Staff
– Resource Optimization
46
Goal = Safely reduce delays/congestion for all modes of Oregon travelers
Treatment Benefit/Cost (Need Informed by Metrics):
– Strategic Work Zone Management
– ITS – Active Mgt.
– Increase Capacity
– Proactive Maintenance
47
Goal = Inform travelers, reduce crash risks, distribute demand
Treatment Benefit/Cost (Need Informed by Metrics):
– TripCheck Outreach
– Automated Data
– New Roadside ATIS
48
Goal = Active asset management to extend life, reduce costs
Treatment Benefit/Cost (Need Informed by Metrics):
– Proactive Maintenance
– Reactive Maintenance
– Asset Replacement
49
Goal = Active resource management to maximize DOT value
Treatment Benefit/Cost (Need Informed by Metrics):
– Proactive Management
– Reactive Management
– Staff Turnover
– Automated, Predictive Tech.
50
Group Discussion/Feedback
– Is the measure clear & understandable?
– Does ODOT have the needed data?
– How will the performance measure be actionable?
– Are performance measures missing from the list?
– Implementation? How would you prioritize them?
51
Mobility 51
52
Metric Category: Flow Rate
Examples | Freeway vs. Arterial | Importance | Data easily available? (data sources)
veh/hr = vehicles, point, 1 hour | both | high | YES (ATR, central signal system, UDOT SPM software, PORTAL)
veh/hr/lane = vehicles, lane, 1 hour | both | medium | PARTIAL (ATR, central signal system, PORTAL)
turning movement counts = vehicles, movement, 1 hour | arterial | medium | PARTIAL (TDD manual, central signal system, UDOT SPM software, GridSmart™)
% trucks in traffic stream = trucks, vehicles, point, 15 minutes | both | medium | PARTIAL (ATR, AVC, Wavetronix™, PORTAL)
bikes/hr = bikes, point, 1 hour | arterial | low-medium | NO (PORTAL, central signal system, EcoCounter™)
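A minimal sketch of how per-vehicle detector records might roll up into the hourly flow-rate examples above; the record layout (timestamp, lane, truck flag) is illustrative, not an actual ATR or central-signal-system export.

```python
from datetime import datetime

# Illustrative detector records: (timestamp, lane, is_truck). The layout is
# assumed for this sketch only.
records = [
    (datetime(2016, 8, 1, 7, 12), 1, False),
    (datetime(2016, 8, 1, 7, 15), 2, True),
    (datetime(2016, 8, 1, 7, 47), 1, False),
    (datetime(2016, 8, 1, 8, 3), 2, False),
]

def hourly_flow(records):
    """veh/hr, veh/hr/lane, and % trucks keyed by the top of each hour."""
    by_hour = {}
    for ts, lane, is_truck in records:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        slot = by_hour.setdefault(hour, {"veh": 0, "trucks": 0, "lanes": set()})
        slot["veh"] += 1
        slot["trucks"] += int(is_truck)
        slot["lanes"].add(lane)
    return {
        hour: {
            "veh_per_hr": s["veh"],
            "veh_per_hr_per_lane": s["veh"] / max(len(s["lanes"]), 1),
            "pct_trucks": 100.0 * s["trucks"] / s["veh"],
        }
        for hour, s in by_hour.items()
    }

print(hourly_flow(records))
```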
53
Metric Category: Speed
Examples | Freeway vs. Arterial | Importance | Data easily available? (data sources)
% of hour operating in congested conditions = speeds <30 mph, segment, 1 hour | both | medium-high | YES (NPMRDS, Iteris PeMS™, PORTAL)
% of hour operating in congested conditions = speeds <15 mph, segment, 1 hour | both | medium-high | YES (NPMRDS, Iteris PeMS™, PORTAL)
point speed (time mean speed) = speed, point, 1 hour | both | low-medium | YES (ATR, Wavetronix™, PORTAL)
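A small sketch of the “% of hour operating in congested conditions” examples above, assuming evenly spaced segment-speed samples (e.g., one-minute NPMRDS/PORTAL-style readings) covering a single hour.

```python
def pct_of_hour_congested(speeds_mph, threshold_mph):
    """Share of an hour spent below a congestion threshold.

    Assumes the samples are evenly spaced over one hour, so each sample is
    weighted equally.
    """
    if not speeds_mph:
        return 0.0
    below = sum(1 for s in speeds_mph if s < threshold_mph)
    return 100.0 * below / len(speeds_mph)

# Example: 40 minutes near free flow, 20 minutes stop-and-go on a segment.
samples = [55.0] * 40 + [12.0] * 20
print(pct_of_hour_congested(samples, 30))  # 33.3... using the <30 mph definition
print(pct_of_hour_congested(samples, 15))  # 33.3... using the <15 mph definition
```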
54
Metric Category: Travel Time
Examples | Freeway vs. Arterial | Importance | Data easily available? (data source)
average peak period travel time = speed, segment, peak period | both | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
travel time = minutes per mile | both | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
5th & 95th percentile travel time = minutes per mile for 5th & 95th percentile condition | both | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
55
Metric Category: Travel Time Reliability
Examples | Freeway vs. Arterial | Importance | Data easily available? (data source)
Planning Time Index = ratio of 95th percentile TT to free-flow TT by unit time | both | high | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
Planning Time = free-flow TT × Planning Time Index | both | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
Buffer Time = difference of 95th percentile TT and average TT by unit time | both | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
Travel Time Index = ratio of average TT to free-flow TT by unit time | arterial | medium | YES (NPMRDS, Iteris PeMS™, PORTAL, Bluetooth™)
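A worked sketch of the four reliability definitions above for one segment and time slice; the free-flow travel time is taken as an input, since (as noted under the stakeholder outcomes) a consistent free-flow definition is still to be settled.

```python
def percentile(sorted_values, p):
    """Linear-interpolation percentile (p in [0, 100]) of a sorted list."""
    if not sorted_values:
        raise ValueError("no observations")
    k = (len(sorted_values) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(sorted_values) - 1)
    return sorted_values[lo] + (sorted_values[hi] - sorted_values[lo]) * (k - lo)

def reliability_indices(travel_times_min, free_flow_min):
    """Reliability measures from the table, for one segment and time slice."""
    tt = sorted(travel_times_min)
    avg = sum(tt) / len(tt)
    p95 = percentile(tt, 95)
    return {
        "planning_time_index": p95 / free_flow_min,  # 95th pct TT / free-flow TT
        "planning_time_min": p95,                    # free-flow TT x PTI
        "buffer_time_min": p95 - avg,                # 95th pct TT - average TT
        "travel_time_index": avg / free_flow_min,    # average TT / free-flow TT
    }

# Example: weekday peak-period runs on a segment with a 10-minute free-flow time.
runs = [11.0, 12.5, 13.0, 12.0, 18.0, 25.0, 11.5, 14.0]
print(reliability_indices(runs, free_flow_min=10.0))
```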
56
Metric Category: Stops
Examples | Freeway vs. Arterial | Importance | Data easily available? (data source)
percent arrivals on red = proportion of vehicle detection actuations on red signal indication over time; related to percent arrivals on green & Purdue Coordination Diagram | arterial | medium | PARTIAL (ATC controllers, central signal system, UDOT SPM software)
stop frequency per node, segment, or user | arterial | low | PARTIAL (ATC controllers, central signal system, UDOT SPM software)
stop rate per node, segment, or user | arterial | low | PARTIAL (ATC controllers, central signal system, UDOT SPM software)
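A sketch of the percent-arrivals-on-red counting logic above, assuming advance-detector arrival times and red-interval windows have already been extracted from high-resolution controller logs; that extraction step, and the real ATC/central-system export formats, are outside the sketch.

```python
from bisect import bisect_right

def percent_arrivals_on_red(arrival_times_s, red_intervals_s):
    """Share of detector arrivals that occur during a red indication.

    arrival_times_s: advance-detector actuation times (seconds).
    red_intervals_s: list of (red_start, red_end) for the phase serving those
    arrivals. Both are assumed pre-extracted from high-resolution logs.
    """
    if not arrival_times_s:
        return 0.0, 0
    intervals = sorted(red_intervals_s)
    starts = [start for start, _ in intervals]
    on_red = 0
    for t in arrival_times_s:
        i = bisect_right(starts, t) - 1  # latest red interval starting at or before t
        if i >= 0 and t < intervals[i][1]:
            on_red += 1
    return 100.0 * on_red / len(arrival_times_s), len(arrival_times_s)

# Example: arrivals over one 90-second cycle with red from 0-55 s and 90-145 s.
arrivals = [5, 20, 40, 60, 70, 85, 100, 130]
pct, n = percent_arrivals_on_red(arrivals, [(0, 55), (90, 145)])
print(f"{pct:.0f}% arrivals on red over {n} observations")  # 62% over 8
```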
57
Metric Category: Delay
Examples | Freeway vs. Arterial | Importance | Data easily available? (data source)
hours of vehicle delay (delay magnitude) per node, segment, or user | both | medium-high | PARTIAL (Iteris PeMS™, Bluetooth™)
phase/split failures by movement or signal phase (instances where queued vehicles must wait through more than one green indication to proceed); related to Split Monitoring, Purdue Phase Termination | arterial | medium | YES (2070 controllers, central signal system, UDOT SPM software)
average intersection delay = delay of users averaged over time | arterial | medium | PARTIAL/NO (Iteris PeMS™, Bluetooth™)
average delay by movement or signal phase = delay of users by movement/phase over time | arterial | medium | PARTIAL/NO (Iteris PeMS™, Bluetooth™)
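A minimal sketch of the “hours of vehicle delay” magnitude above, assuming 15-minute volume/average-speed bins for one segment and a fixed free-flow reference speed; both the binning and the reference speed are assumptions (the stakeholder outcomes note that a consistent free-flow definition is still needed).

```python
def vehicle_hours_of_delay(bins, segment_miles, free_flow_mph):
    """Sum vehicle-hours of delay over (volume, avg_speed_mph) bins.

    Delay per vehicle in a bin = travel time at the bin's average speed minus
    travel time at the free-flow reference speed (never negative).
    """
    free_flow_tt_hr = segment_miles / free_flow_mph
    total = 0.0
    for volume, avg_speed_mph in bins:
        actual_tt_hr = segment_miles / max(avg_speed_mph, 1e-6)
        total += volume * max(actual_tt_hr - free_flow_tt_hr, 0.0)
    return total

# Example: four 15-minute bins on a 2-mile segment with a 55 mph reference.
bins = [(400, 55), (450, 40), (500, 22), (420, 50)]  # (vehicles, avg mph)
print(round(vehicle_hours_of_delay(bins, segment_miles=2.0, free_flow_mph=55), 1))
```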
58
FHWA Draft FAST/MAP-21 Mobility Rule-Making
Examples | Freeway vs. Arterial | Importance | Data easily available?
% of the Interstate System mileage uncongested (freight) | freeway | required | YES (NPMRDS, Iteris PeMS™)
% of the Interstate System providing for reliable travel times (target is LOTTR < 1.5) | freeway | required | YES (NPMRDS, Iteris PeMS™)
% of the non-Interstate NHS providing for reliable travel times (target is LOTTR < 1.5) | both | required | YES (NPMRDS, Iteris PeMS™)
% of the Interstate System where peak hour travel times meet expectations (target PHTTR < 1.5) | freeway | required | YES (NPMRDS, Iteris PeMS™)
% of the non-Interstate NHS where peak hour travel times meet expectations (target PHTTR < 1.5) | both | required | YES (NPMRDS, Iteris PeMS™)
% of the Interstate System mileage providing for reliable truck travel times | freeway | required | YES (NPMRDS, Iteris PeMS™)
annual hours of excessive delay per capita | both | required | PARTIAL (NPMRDS, Iteris PeMS™)
total emission reductions | both | required | NO
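A hedged sketch of the reliability rows above, using the common LOTTR construction (80th percentile travel time divided by the 50th percentile, by segment and analysis period, with a segment counted as reliable when every period stays under the 1.5 target shown); the period breakdown and mileage weighting here are illustrative, not the rule text.

```python
def lottr(travel_times_s):
    """Level of Travel Time Reliability: 80th / 50th percentile travel time."""
    tt = sorted(travel_times_s)
    def pct(p):
        k = (len(tt) - 1) * p / 100.0
        lo, hi = int(k), min(int(k) + 1, len(tt) - 1)
        return tt[lo] + (tt[hi] - tt[lo]) * (k - lo)
    return pct(80) / pct(50)

def pct_system_reliable(segments, threshold=1.5):
    """Mileage-weighted share of segments reliable in every analysis period.

    segments: list of (miles, {period_name: [travel times in seconds]}).
    Period definitions are an assumption for this sketch.
    """
    total = sum(miles for miles, _ in segments)
    reliable = sum(
        miles for miles, periods in segments
        if all(lottr(tt) < threshold for tt in periods.values())
    )
    return 100.0 * reliable / total

seg_a = (3.2, {"am_peak": [110, 115, 120, 130, 140], "pm_peak": [118, 125, 135, 150, 160]})
seg_b = (1.8, {"am_peak": [100, 105, 150, 240, 300], "pm_peak": [102, 108, 112, 118, 125]})
print(round(pct_system_reliable([seg_a, seg_b]), 1))  # 64.0 -> only seg_a is reliable
```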
59
Mobility Database Linkages & Implementation
– Metric categories: Vehicle Counts, Speed/Congestion, Travel Time/Reliability, Stops, Delay, Equipment Health
– Data sources: Travel Time / Speeds (NPMRDS), Signal High-Resolution Data, Other Traffic Count Data, Asset Mgt. Real-Time Status
60
Highest Ranking Mobility Performance Measures – Important & Easy to Access
– Hourly modal flow rate (vehicles, peds)
– % of hour/day operating in congested conditions (speed)
– Percentile travel times by minute/hour/day (travel time)
    – 5th ~ free flow; 50th (median) ~ average; 95th ~ longest
– Planning time index (reliability)
– Percent arrivals on red (stops) – arterials
    – Report # of observations for normalization
– Hours of vehicle delay (delay)
61
Group Discussion/Feedback
– Is the measure clear & understandable?
– Does ODOT have the needed data?
– How will the performance measure be actionable?
– Are performance measures missing from the list?
– Implementation? How would you prioritize them?
62
Asset Management 62
63
Recommended Asset Mgmt. Condition Rating
– Score each individual device on each performance measure, normalized to 0-100
    – Lowest = worst, highest = best
– Performance measures:
    – % of design life remaining
    – Resources spent on asset (hard & soft costs): % proactive vs. % reactive
    – % of asset downtime
– Score the device, its communication, and its supporting system (e.g., detection)
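A minimal sketch of the 0-100 condition rating described above, using three illustrative inputs per device (% of design life remaining, proactive share of spend, % uptime) and an equal-weight average; the weights and field names are placeholders, not an adopted ODOT formula.

```python
def device_condition_score(pct_design_life_remaining, pct_proactive_spend, pct_uptime,
                           weights=(1 / 3, 1 / 3, 1 / 3)):
    """0-100 composite score (0 = worst, 100 = best) for one device.

    The three inputs follow the performance measures listed above; the equal
    weighting is an assumption for illustration.
    """
    clamp = lambda v: max(0.0, min(100.0, v))
    parts = (clamp(pct_design_life_remaining), clamp(pct_proactive_spend), clamp(pct_uptime))
    return sum(w * p for w, p in zip(weights, parts))

# Example: a VMS near end of life, mostly reactive maintenance, good uptime.
print(round(device_condition_score(15, 40, 97), 1))  # ~50.7

# Site-level view: score the device, its communication link, and the
# supporting system (e.g., detection), then report the weakest link.
site = {
    "device": device_condition_score(15, 40, 97),
    "communication": device_condition_score(60, 70, 99),
    "supporting_system": device_condition_score(35, 50, 92),
}
print(min(site, key=site.get), round(min(site.values()), 1))
```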
64
Define Targets for Asset Management
– Expected service/design life
    – Target of full design life per device/system
    – No device exceeding 100% of design life
– Expected annual O&M cost by device type
    – 70% of O&M costs are proactive
– 100% uptime target (0% downtime target)
    – What’s realistic per device/system?
65
Metric Category: All TSM&O Assets
Examples | Importance | Data easily available?
asset condition & site rating | high | PARTIAL (depending on asset)
% asset beyond service life | high | PARTIAL (depending on asset)
66
Metric Category: VMS & Drum Signs
Examples | Priority | Data easily available?
% proactive maintenance | high | yes
staff resources spent per asset per year | high | moderate
% asset uptime | high | moderate
asset condition | high | yes
% detection/communication failure | high | moderate
67
Metric Category: Signs
Examples | Importance | Data easily available?
% of signs beyond service life | medium | moderate
% of signs meeting retroreflectivity goals | medium | moderate
68
Metric Category: Signals
Examples | Importance | Data easily available?
% proactive maintenance | high | moderate
% asset uptime | high | moderate
asset condition | high | moderate
% detection/communication failure | high | moderate
69
Metric Category: Illumination
Examples | Importance | Data easily available?
% illumination beyond service life | medium | no
70
Metric Category: Traffic Structures
Examples | Importance | Data easily available?
structure rating | high | yes
71
TIM 71
72
Metric Category: Incident Timeline
Proposed Metrics | From TIM Strategic Plan | Priority | Data easily available? (data sources)
detection time | yes | low | NO (OnStar)
verification time | yes | high | YES (CCTV, CAD)
dispatch time (shown in TOC metrics as well) | yes | high | YES (CAD, 911)
response time (various TIM disciplines) | yes | medium-high | YES (ODOT and OSP data)
roadway clearance time | yes | high | YES (incident report)
incident clearance time | yes | high | PARTIAL (incident report)
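A sketch of the timeline intervals above computed from one reconciled incident record; the field names and the exact start/end anchors (first awareness for the two clearance times) are assumptions for illustration, not the ODOT incident-report or CAD schema.

```python
from datetime import datetime

# Illustrative, pre-reconciled incident record; field names are assumed.
incident = {
    "estimated_occurrence":    datetime(2016, 8, 1, 16, 2),  # often unknown (low data availability)
    "first_awareness":         datetime(2016, 8, 1, 16, 5),  # detection / first notification
    "verified":                datetime(2016, 8, 1, 16, 9),
    "dispatched":              datetime(2016, 8, 1, 16, 11),
    "first_on_scene":          datetime(2016, 8, 1, 16, 24),
    "all_lanes_open":          datetime(2016, 8, 1, 17, 10),
    "last_responder_departed": datetime(2016, 8, 1, 17, 35),
}

def minutes(start, end):
    return (end - start).total_seconds() / 60.0

def incident_timeline(rec):
    """Timeline metrics from the table above, in minutes for one incident."""
    return {
        "detection_time": minutes(rec["estimated_occurrence"], rec["first_awareness"]),
        "verification_time": minutes(rec["first_awareness"], rec["verified"]),
        "dispatch_time": minutes(rec["verified"], rec["dispatched"]),
        "response_time": minutes(rec["dispatched"], rec["first_on_scene"]),
        "roadway_clearance_time": minutes(rec["first_awareness"], rec["all_lanes_open"]),
        "incident_clearance_time": minutes(rec["first_awareness"], rec["last_responder_departed"]),
    }

print(incident_timeline(incident))
```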
73
Draft Refined Roadway Clearance Analysis
MUTCD Temporary Traffic Control (Section 6I.01) incident duration classes:
– Major = more than 2 hours
– Intermediate = 30 minutes to 2 hours
– Minor = less than 30 minutes
74
Metric Category: 2nd Order Incident Metrics
Proposed Metrics | From TIM Strategic Plan | Importance | Data easily available? (data sources)
roadway closure duration | yes | high | YES (incident report)
% of crash incidents meeting goal for roadway clearance time | no | high | YES (incident report)
roadway clearance time median | no | high | YES (incident report)
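A small sketch of the “% meeting goal” and median rows above, with the 90-minute roadway clearance goal carried over from the earlier stakeholder-outcomes slide; per-severity or urban/rural goals are left out since the deck flags that refinement as future work.

```python
from statistics import median

def clearance_summary(clearance_minutes, goal_minutes=90):
    """% of incidents meeting the roadway-clearance goal, plus the median."""
    met = sum(1 for m in clearance_minutes if m <= goal_minutes)
    return {
        "n_incidents": len(clearance_minutes),
        "pct_meeting_goal": 100.0 * met / len(clearance_minutes),
        "median_clearance_min": median(clearance_minutes),
    }

# Example: roadway clearance times (minutes) pulled from incident reports.
print(clearance_summary([25, 42, 55, 61, 88, 95, 130, 47]))  # 75% meet goal, median 58
```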
75
Metric Category: Incident Status – Tow Related
Proposed Metrics | From TIM Strategic Plan | Importance | Data easily available? (data sources)
tow arrival time | no | high | NO
time tow was called | no | high | NO
length of time to complete tow tasks | no | high | NO
% of incidents where tow companies arrive without correct equipment | yes | low | NO
76
Incident TIM staffing/resource attributes
911-ODOT Dispatch Integration (OIS) to help with event reconciliation (TOCS vs. Crash Analysis Unit)
Proposed Metrics | From TIM Strategic Plan | Importance | Data easily available? (data sources)
location | no | medium | YES (incident report)
incident density | no | medium | YES (incident report)
incident type | no | medium | YES (incident report)
77
Metric Category: Secondary Crashes
Proposed Metrics | From TIM Strategic Plan | Priority | Data easily available? (data sources)
secondary crash = within the queue of the existing incident | yes | medium | NO (only in text)
ratio of secondary crashes to primary crashes | no | low | NO (only in text)
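The table defines a secondary crash as one occurring within the queue of the existing incident, with the data currently available only in narrative text. Below is a hedged sketch of a proxy screen (an assumed upstream distance and time window standing in for the actual queue extent) that could shortlist candidates for manual review; thresholds and field names are illustrative.

```python
from datetime import datetime, timedelta

def flag_possible_secondary(crashes, primaries, max_upstream_miles=2.0,
                            window=timedelta(hours=2)):
    """Screen crashes that may be secondary to an earlier incident.

    Records are dicts with 'time', 'route', 'direction' ('inc'/'dec' milepost
    travel), and 'milepost' keys (illustrative schema). Returns (crash,
    primary) pairs flagged for manual confirmation against narrative text.
    """
    flagged = []
    for c in crashes:
        for p in primaries:
            if c["route"] != p["route"] or c["direction"] != p["direction"]:
                continue
            gap = (p["milepost"] - c["milepost"]) if p["direction"] == "inc" \
                else (c["milepost"] - p["milepost"])  # queue forms upstream of the incident
            if 0.0 <= gap <= max_upstream_miles and p["time"] <= c["time"] <= p["time"] + window:
                flagged.append((c, p))
                break
    return flagged

primary = {"time": datetime(2016, 8, 1, 16, 0), "route": "I-5", "direction": "inc", "milepost": 290.0}
crash = {"time": datetime(2016, 8, 1, 16, 25), "route": "I-5", "direction": "inc", "milepost": 288.8}
print(len(flag_possible_secondary([crash], [primary])))  # 1 -> flag for manual review
```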
78
Metric Category: Incident Responder Risk
Proposed Metrics | From TIM Strategic Plan | Priority | Data easily available? (data source)
average time on scene per incident type (on-scene time) | no | high | PARTIAL (ODOT and OSP data)
incident responder struck-bys | yes | medium | NO
incident responder fatalities | yes | medium | NO
79
Metric Category: Staff Training
Proposed Metrics | From TIM Strategic Plan | Importance | Data easily available? (data source)
number of responders trained in National TIM training classes | yes | high | NO
number of responders trained in specific discipline | no | high | NO
80
TOC Operations 80
81
Metric Category: Efficiency/Scheduling of Staff Time
Proposed Metrics | Importance | Data easily available? (data source)
TOC total workload by hour of day and day of week | high | PARTIAL (TOCS, Pantel)
employee total workload by hour of day and day of week or month (seasonality) | high | PARTIAL (TOCS, Pantel)
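A sketch of the workload-by-hour-and-day measure above, assuming TOCS-style event timestamps and using event counts as the workload proxy; a real implementation might weight by handling time instead.

```python
from collections import Counter
from datetime import datetime

def workload_grid(event_times):
    """Counter keyed by (day_of_week, hour) -> number of events handled."""
    return Counter((t.strftime("%a"), t.hour) for t in event_times)

# Illustrative event timestamps (e.g., TOCS entries); counts stand in for workload.
events = [
    datetime(2016, 8, 1, 7, 15), datetime(2016, 8, 1, 7, 40),
    datetime(2016, 8, 1, 17, 5), datetime(2016, 8, 2, 7, 55),
]
grid = workload_grid(events)
print(grid[("Mon", 7)])  # 2 events in the Monday 07:00 hour
```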
82
Metric Category: Dispatch Time – ODOT-only resources vs. all events (911 Dispatch)
Proposed Metrics | Importance | Data easily available? (data source)
dispatch time | high | YES (TOCS)
83
Metric Category: SOG Compliance/QA
Proposed Metrics | Importance | Data easily available? (data source)
lane blocking crashes with no ATIS | medium | YES (TOCS)
average time to notification | medium | YES (TOCS)
ATIS posting in less than 10 minutes | medium | YES (TOCS)
events with no notification | medium | YES (TOCS)
timer usage by dispatcher (timeline reporting) | medium | PARTIAL (TOCS)
traveler info updates | medium-high | YES (TOCS)
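A sketch of two of the compliance checks above (“lane blocking crashes with no ATIS” and “ATIS posting in less than 10 minutes”) against TOCS-style records; the 10-minute window comes from the table, while the record fields are assumptions.

```python
from datetime import datetime

# Illustrative TOCS-style event records (field names assumed).
events = [
    {"id": 1, "lane_blocking": True, "created": datetime(2016, 8, 1, 8, 0),
     "atis_posted": datetime(2016, 8, 1, 8, 7)},
    {"id": 2, "lane_blocking": True, "created": datetime(2016, 8, 1, 9, 0),
     "atis_posted": None},
    {"id": 3, "lane_blocking": True, "created": datetime(2016, 8, 1, 10, 0),
     "atis_posted": datetime(2016, 8, 1, 10, 18)},
]

def sog_compliance(events, window_min=10):
    """Count lane-blocking events with no ATIS and the share posted in time."""
    blocking = [e for e in events if e["lane_blocking"]]
    no_atis = [e for e in blocking if e["atis_posted"] is None]
    within = [
        e for e in blocking
        if e["atis_posted"] is not None
        and (e["atis_posted"] - e["created"]).total_seconds() / 60.0 < window_min
    ]
    return {
        "lane_blocking_events": len(blocking),
        "no_atis": len(no_atis),
        "pct_posted_within_window": 100.0 * len(within) / len(blocking),
    }

print(sog_compliance(events))  # 1 of 3 posted within 10 minutes; 1 with no ATIS
```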