An exploration of ‘Big Data’ sources to inform best practice travel time studies: Lessons learned from Metro Vancouver
Fearghal King (a), Mohamed Mahmoud (a), Clark Lim (b)
(a) TransLink; (b) Acuere Consulting (Vancouver, British Columbia, Canada)
Phased Approach
Phase 1: Understand and Define Congestion
- Literature Review
- Definition of Congestion
- Identify Preferred Measures
Phase 2: Measure and Report on Congestion
- Recommend data sources, locations, timing
- Confirm data availability, quality, validation
- Develop and conduct congestion analysis plan
- Generate results and maps
- Produce Regional Congestion Report
Presentation Outline
- Phase 2 Data Sources
- Congestion Analysis Plan
- Field Validation
- Travel Time Metrics
- Preliminary Output
- Lessons Learned and Next Steps
Phase 2 Data Sources
Phase 2 Data Source
Data source: probe data (smartphone GPS and probe vehicles, including commercial vehicles)
Limitations:
- Historical data only, so field validation is not an option
- Data aggregated annually/monthly, and by day of week for a given month: each 15-minute record contains samples from the 4 or 5 occurrences of that weekday in the month, with no flexibility to review or analyse a specific date
- Limited road coverage: road segments are ‘set’, with no flexibility
- Network link lengths: many significant roadways are represented by relatively long links (Fig 1: long segments in downtown)
- Network links with missing data: in January 2014, 45% of segments had no data (table 2.15), though mostly outside peak periods (Fig 2: aggregation of samples by link with no samples, 7:30-8:30AM Wednesdays, January 2014, 5 hour total duration)
- Network completeness
- In-depth analysis revealed several inconsistencies in speed measures: a baseline review of average link speeds found a minimum of 3 mph and a maximum of 185 mph
Overall, the results were not consistent with our expectations, the 2003 Travel Time Study results, or Google Maps data. A report on this analysis is available.
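The weekday aggregation limitation above can be illustrated with a short sketch; the timestamps, speeds and bin keys below are hypothetical examples, not vendor data.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical per-vehicle speed samples (timestamp, mph) on one link,
# falling on three different Wednesdays of January 2014.
samples = [
    (datetime(2014, 1, 1, 7, 40), 32.0),
    (datetime(2014, 1, 8, 7, 42), 18.0),
    (datetime(2014, 1, 15, 7, 35), 25.0),
]

# Aggregate into (weekday, 15-minute bin) cells, as the probe vendor does.
bins = defaultdict(list)
for ts, mph in samples:
    quarter = ts.replace(minute=ts.minute // 15 * 15, second=0, microsecond=0)
    bins[(ts.strftime("%A"), quarter.time())].append(mph)

aggregated = {key: sum(v) / len(v) for key, v in bins.items()}
# All three Wednesdays collapse into a single 7:30 cell; the individual
# dates can no longer be recovered from the aggregated record.
```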
Phase 2.2 Data Source
Data source: GPS data from smartphones (Google Maps)
Pros:
- Data parameters can be determined by the analyst
- Flexibility to collect and analyse by date/time/OD pattern/route segment
- Data is collected in real time, so it can be validated
Limitations:
- No historical data (historic data is used in the absence of real-time data)
- Heavy data processing, management and cleaning required
- ‘Black box’
The main advantage of Google Maps data, besides being valid, is the great flexibility it provides to develop the work plan.
Phase 2.2 Data Source
Inputs:
- Origin/destination (or route segment start/end points)
- Route or waypoints (optional)
Outputs:
- Geocoded origin/destination
- Distance
- Duration (typical)
- Duration in traffic (real-time)
- Route
- Timestamp
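These outputs can be flattened into per-query analysis records. The field names `distance`, `duration` and `duration_in_traffic` follow the Google Maps API, but the payload shape below is a simplified assumption for illustration, not the full schema; a real client would call the HTTP API and handle errors.

```python
# Simplified Directions-style response for one route leg (assumed shape).
response = {
    "legs": [{
        "start_address": "Coquitlam, BC",
        "end_address": "UBC, Vancouver, BC",
        "distance": {"value": 38500},            # metres
        "duration": {"value": 2400},             # typical, seconds
        "duration_in_traffic": {"value": 3100},  # real-time, seconds
    }]
}

def extract_record(resp, timestamp):
    """Flatten the first route leg into one analysis record."""
    leg = resp["legs"][0]
    return {
        "origin": leg["start_address"],
        "destination": leg["end_address"],
        "distance_km": leg["distance"]["value"] / 1000,
        "duration_typical_min": leg["duration"]["value"] / 60,
        "duration_in_traffic_min": leg["duration_in_traffic"]["value"] / 60,
        "timestamp": timestamp,
    }

record = extract_record(response, "2016-11-02T08:00:00-08:00")
```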
Congestion Analysis Plan
- Sub-Centre Analysis: 14 sub-centres. Added value: ability to compare output with the 2003 Travel Time Study.
- Sub-Regional Analysis: 14 sub-regions, with 5-6 ODs each. Added value: ability to analyse and report on localised travel times.
- Road Network Analysis: Google Road Analysis Network (gRAN), ~1,500 one-way road segments (~1 km each). Added value: ability to analyse and report on travel times by road segment.
1. Sub-Centre Analysis (14 x 14 ODs)
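The 14 x 14 sub-centre OD matrix can be enumerated directly; the `SC01`…`SC14` labels below are placeholders, not the study's actual sub-centre names.

```python
from itertools import product

# Placeholder labels for the 14 Metro Vancouver sub-centres.
sub_centres = [f"SC{i:02d}" for i in range(1, 15)]

# All ordered OD cells of the 14 x 14 matrix. Whether the diagonal
# (same-centre) cells were actually queried is not stated on the slide;
# drop (o, o) pairs if they are not needed.
od_pairs = list(product(sub_centres, repeat=2))
```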
2. Sub-Regional Analysis (14 sub-regions; 5-6 ODs)
3. Road Network Analysis (~1,500 road segments)
Field Validation
Field Validation
‘Ground-truthing’ exercise (Nov 2016):
- 3 OD pairs; 6 routes; 6 drivers; 4 time periods; 6 days
- Drivers departed start points at 10-minute intervals, to spread data over the peak period
- GPS data collected every second; Google Maps data collected every minute
- Data was matched spatially and temporally
- Total of 4,016 samples (representing road segments)
- Routes have predefined segments, about 30 per route (not all on the gRAN), defined on the most likely shortest path to be comparable to the API activity-centres data
- The three routes cover different road types (urban/rural) and classes (collectors/arterials/highways)
OD pairs: Coquitlam - UBC; Downtown Vancouver - Surrey; Langley - Richmond
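The temporal part of the matching step above can be sketched as averaging the roughly 60 one-second GPS readings inside each one-minute Google window; the speed values below are hypothetical, and the real study also matched spatially to route segments.

```python
from statistics import mean

# Hypothetical second-by-second GPS speeds (km/h), keyed by elapsed second,
# and minute-level Google samples for the same segment.
gps = {s: 40 + (s % 3) for s in range(0, 120)}  # seconds 0..119
google = {0: 41.2, 60: 40.8}                    # one sample per minute mark

# Temporal matching: average the GPS readings inside each Google minute.
matched = []
for minute_start, g_speed in google.items():
    window = [v for s, v in gps.items() if minute_start <= s < minute_start + 60]
    matched.append((minute_start, mean(window), g_speed))
```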
Field Validation
Regression of Google against GPS travel times, fitted as Google = * GPS (the coefficient appears only in the slide graphic)
Field Validation
Data Verification (temporal)
Network-wide average speeds, Nov 2016 (chart)
Data Verification (spatial)
Average speed, working day, PM peak, Nov 2016 (map)
Travel Time Metrics
Standard Deviation of Travel Time
- Measures variability of travel times around the average travel time (the norm)
- Useful for understanding the reliability of travel time
Travel Time Index
- Measures the ratio between congested and reference travel times
- A well-known congestion index, easily understood
- Widely used (TomTom and INRIX reports)
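Both metrics are straightforward to compute from a set of observed travel times; the values below are hypothetical.

```python
from statistics import mean, stdev

# Hypothetical observed travel times (minutes) for one OD pair in the
# PM peak, plus an assumed free-flow reference travel time for the route.
observed = [28.0, 30.0, 33.0, 27.0, 32.0]
reference = 20.0  # minutes, e.g. derived from the 24-hour maximum speed

tt_sd = stdev(observed)           # reliability: spread around the norm
tti = mean(observed) / reference  # congestion: congested vs. reference time
```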
The importance of choosing the right reference speed

Travel Time Index = Actual travel time / Reference travel time

With an actual travel time of 30 minutes:
- Reference of 20 mins: TTI = 30/20 = 1.50
- Reference of 23 mins: TTI = 30/23 ≈ 1.30
- Reference of 28 mins: TTI = 30/28 ≈ 1.07
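The slide's example can be reproduced directly, showing how sensitive the index is to the reference choice:

```python
# TTI for the same 30-minute actual trip under the three candidate
# reference travel times from the example above.
actual = 30.0
ttis = {ref: round(actual / ref, 2) for ref in (20.0, 23.0, 28.0)}
# -> {20.0: 1.5, 23.0: 1.3, 28.0: 1.07}
# A generous reference flattens the index and understates congestion.
```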
Preliminary Output
TTI: Early Morning (12am - 6am). Ref. speed: 24hr max. (Map: Downtown)
TTI: AM Peak (6am - 9am). Ref. speed: 24hr max. (Map: Downtown)
TTI: Mid-day (9am - 3pm). Ref. speed: 24hr max. (Map: Downtown)
TTI: PM Peak (3pm - 7pm). Ref. speed: 24hr max. (Map: Downtown)
TTI: Evening Period (7pm - 12am). Ref. speed: 24hr max. (Map: Downtown)
Lessons Learned and Next Steps
- Passive (big) data sources can offer a cost-effective means to conduct travel time analyses
- Data validation is necessary and important
- The true value lies in the flexibility of setting parameters and the granularity of data analysis and output (across space and time)
- Developing a road network for analysis is complex and time-consuming
- Traffic volume data is still required to develop congestion metrics
EXTRA SLIDES
Why is Congestion Measurement important?
Policy cycle (diagram): Monitor Baseline Conditions; Report on Trends; Contribute towards fact-based dialogue ("We are here"); Set Goals, Targets, & Objectives; Implement New Policies; Monitor Improvements
Speaker notes: the first three bubbles cover performance-based management, accountability and goal-setting; the next two cover stakeholder engagement, structured decision-making, policy options and implementation