
1 DEVELOPING A CONTINUOUS IMPROVEMENT PROGRAM FOR TRAFFIC SAFETY INFORMATION SYSTEMS
Allen Parrish
Traffic Records Forum 2016
Baltimore, MD – August 8, 2016

2 NHTSA ASSESSMENTS
Six areas:
– Crash
– Roadway
– Citation/Adjudication
– Driver
– Vehicle
– EMS
Six metrics:
– Timeliness
– Accuracy
– Completeness
– Uniformity
– Integration
– Accessibility

3 NHTSA ASSESSMENTS
6 areas × 6 metrics = 36 combinations
Split citation and adjudication:
– Consistent with the assessments
– 7 areas × 6 metrics = 42 combinations
Ignore multiple metrics in EMS:
– Makes the illustration too complicated
Illustrate with 42 metrics

4 A CI PROGRAM
Metrics:
– Some seem obvious
– Others are not as obvious
A lot of states start with a few obvious metrics and fade out relatively quickly
A comprehensive program:
– Needs to start with a comprehensive view

5 TOP DOWN – STOP LIGHT CHART
[Stop light chart: rows Crash, Roadway, Citations, Adjudication, Driver, Vehicle, EMS; columns Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility]

6 CONCEPTUAL FRAMEWORK
Continuous Improvement:
– Identify metrics (e.g., Crash Record Entry Delay)
– Establish benchmarks (e.g., “within 48 hours”)
– Measure performance
– Report results to the TRCC and stakeholders
– Work with stakeholders to make improvements
Assumption: You can’t go from 0 to 42 instantaneously
– Improving continuous improvement is itself continuous improvement.
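
As a concrete illustration of the identify/benchmark/measure steps above, here is a minimal Python sketch using the slide's Crash Record Entry Delay example and its 48-hour benchmark. The records and the function name are invented for illustration; only the metric and benchmark come from the slide.

```python
from datetime import datetime, timedelta

# Benchmark from the slide's example: records entered "within 48 hours".
BENCHMARK = timedelta(hours=48)

# Hypothetical records: (crash date/time, date/time the record was entered).
records = [
    (datetime(2016, 7, 1, 14, 30), datetime(2016, 7, 2, 9, 0)),   # met
    (datetime(2016, 7, 3, 22, 15), datetime(2016, 7, 9, 8, 45)),  # missed
]

def benchmark_attainment(records, benchmark=BENCHMARK):
    """Fraction of records whose entry delay is within the benchmark."""
    met = sum(1 for crash, entered in records if entered - crash <= benchmark)
    return met / len(records)

# The resulting figure is what gets reported to the TRCC and stakeholders.
print(f"{benchmark_attainment(records):.0%} of records met the 48-hour benchmark")
```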

7 CONTINUOUS IMPROVEMENT
Determine what to measure
Measure
Be transparent about the outcome
– Reporting metric results should be standard at TRCC meetings
Involve the stakeholders
Let the metrics (help) drive the plan

8 CATEGORIZE METRICS
Who/what does this metric reflect?
– Infrastructure
– Processes
– People
What is the potential for a measure to change with changing conditions?
– Metric has growth potential
– Metric can’t change much or at all
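
One way to make the two categorization questions explicit is to record them alongside each metric. A minimal sketch; the class, field names, and example values are hypothetical, not from the presentation.

```python
from dataclasses import dataclass
from enum import Enum

class Reflects(Enum):
    INFRASTRUCTURE = "infrastructure"
    PROCESSES = "processes"
    PEOPLE = "people"

@dataclass
class Metric:
    name: str
    area: str           # one of the seven areas, e.g., "Crash"
    dimension: str      # one of the six metrics, e.g., "Timeliness"
    reflects: Reflects  # who/what the metric reflects
    has_growth_potential: bool  # can it change with changing conditions?

# Hypothetical entry for the Crash x Timeliness box.
crash_delay = Metric("Crash Record Entry Delay", "Crash", "Timeliness",
                     Reflects.PROCESSES, has_growth_potential=True)
```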

9 WHO/WHAT IS REFLECTED?
Scenario #1:
– Paper crash forms
– Centralized data entry in state capital
Metric:
– What is the delay between the date/time of the crash and the point at which the record is available for analysis?
– Delay = 15 months
What is this delay measuring?
– Could be slow completion of the form (people or process)
– Could be slow transmission of the form (people or process)
– Could be a bottleneck in data entry (might be people or process, definitely technology)
Dominant issue is probably the technology!

10 WHO/WHAT IS REFLECTED?
Scenario #2:
– Electronic system at roadside
– Local approval process
– Electronic submission of the data, once approved, to a centralized repository
Metric:
– What is the delay between the date/time of the crash and the point at which the record is available for analysis?
– Delay = 2 weeks
What is this delay measuring?
– Could be slow completion of the form (people or process)
– Could be slow transmission of the form (people or process)
– Less likely to be the technology
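
The delay metric in both scenarios is computed the same way regardless of which factor dominates. A sketch that reports median delay per submitting agency; the data and the grouping choice are invented for illustration.

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

# Hypothetical records: (agency, crash date/time, date/time available for analysis).
records = [
    ("Agency A", datetime(2016, 5, 1, 8, 0),   datetime(2016, 5, 14, 9, 0)),
    ("Agency A", datetime(2016, 5, 2, 19, 30), datetime(2016, 5, 17, 10, 0)),
    ("Agency B", datetime(2016, 5, 3, 12, 0),  datetime(2016, 6, 30, 15, 0)),
]

delays_by_agency = defaultdict(list)
for agency, crash, available in records:
    delays_by_agency[agency].append((available - crash).days)

for agency, days in sorted(delays_by_agency.items()):
    print(f"{agency}: median delay {median(days)} days")
```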

11 WHAT IS THE POTENTIAL FOR CHANGE?
Some metrics can’t change much
Others have great potential to go substantially up or down
Possibilities:
– Static – This item can’t change
– Diminishing upside and/or downside – Diminishing returns have made a particular thing pretty static
  Example: Crash records 99.6% complete after 48 hours
  Can’t change much – it’s virtually static
– High potential value – A metric that you really want to measure because it has substantial change potential. These are the “good metrics”
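
A rough sketch of how the "virtually static" judgment might be automated for percentage-based metrics. The one-point headroom cutoff is an invented rule of thumb, not from the presentation.

```python
def change_potential(current_pct: float, static_margin: float = 1.0) -> str:
    """Classify a percentage metric by its remaining headroom.
    The 1-point margin is an illustrative threshold only."""
    headroom = 100.0 - current_pct
    return "virtually static" if headroom <= static_margin else "growth potential"

# Slide example: crash records 99.6% complete after 48 hours.
print(change_potential(99.6))  # virtually static
print(change_potential(72.0))  # growth potential
```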

12 PUTTING THESE TOGETHER
“Static” metrics:
– Mostly useless for continuous improvement
– Sometimes they are the only metric for a particular box
– Need to supply those for your assessments, but they don’t have practical value
Infrastructure versus people/process:
– Hopefully your measure will touch at least one
– Sometimes there may be a metric for both
– 42 metrics could expand to 84

13 STOP LIGHT CHARTS
2 charts:
– One for infrastructure
– One for people/process
Red/yellow/green:
– How well the metric meets the benchmark
Black:
– No metric in that category
– Static metrics
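
A minimal sketch of how a cell's color might be assigned. Treating attainment as a 0.0-1.0 ratio and the 0.9/0.7 cutoffs are assumptions for illustration; only the red/yellow/green/black meanings come from the slide.

```python
from typing import Optional

def cell_color(attainment: Optional[float], is_static: bool = False) -> str:
    """Map benchmark attainment (0.0-1.0) to a stop-light color.
    The 0.9 and 0.7 cutoffs are illustrative assumptions."""
    if attainment is None or is_static:
        return "black"  # no metric in this category, or a static metric
    if attainment >= 0.9:
        return "green"
    if attainment >= 0.7:
        return "yellow"
    return "red"

print(cell_color(0.95))        # green
print(cell_color(0.50))        # red
print(cell_color(None))        # black: no metric
print(cell_color(1.0, True))   # black: static metric
```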

14 BIG PICTURE
[Two stop light charts, one labeled People/Process and one labeled Tech Infrastructure, each with rows Crash, Roadway, Citations, Adjudication, Driver, Vehicle, EMS and columns Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility]

15 CRASH LINES IN CHARTS
[Crash rows from the People/Process and Infrastructure stop light charts, with columns Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility]
T = Crash to final record by agency [operational]
A = Number of inaccurate plate numbers [operational]
C = Number of missing records by agency [operational]
U = MMUCC potential realized [infrastructure]
I = No metric
Ac = Percentage of stakeholders with access to CARE [static 100%]

16 CRASH LINES IN CHARTS
[Crash rows from the People/Process and Infrastructure stop light charts, with columns Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility]
T = Crash to final record by agency [operational]
A = Number of inaccurate plate numbers [operational]
C = Number of missing records by agency [operational]
U = MMUCC potential realized [infrastructure]
I = DEFERRED – CAN’T YET ADDRESS
Ac = Percentage of stakeholders with access to CARE [static, infrastructure, 100%]

17 SUMMARY
Try to find at least one metric that fits every box
Try to find some way each box can grow (i.e., the metric isn’t static)
Red boxes indicate that something needs to be fixed:
– Use the red boxes to motivate projects

18 THANK YOU Questions and Discussion

