DEVELOPING A CONTINUOUS IMPROVEMENT PROGRAM FOR TRAFFIC SAFETY INFORMATION SYSTEMS Baltimore, MD August 8, 2016 Allen Parrish Traffic Records Forum 2016.


NHTSA ASSESSMENTS

Six Areas:
–Crash
–Roadway
–Citation/Adjudication
–Driver
–Vehicle
–EMS

Six Metrics:
–Timeliness
–Accuracy
–Completeness
–Uniformity
–Integration
–Accessibility

NHTSA ASSESSMENTS

6 areas x 6 metrics = 36 combinations
–Split citation and adjudication into separate areas (consistent with the assessments): 7 areas x 6 metrics = 42 combinations
–Ignore the multiple metrics within EMS – it makes the illustration too complicated
–Illustrate with 42 metrics
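The 42-cell grid above is just a cross product of areas and metrics; a minimal sketch (list names are illustrative, the area and metric labels come from the slides):

```python
# Enumerate the 7 areas x 6 metrics = 42 combinations tracked
# by the stop light charts.
areas = ["Crash", "Roadway", "Citation", "Adjudication",
         "Driver", "Vehicle", "EMS"]
metrics = ["Timeliness", "Accuracy", "Completeness",
           "Uniformity", "Integration", "Accessibility"]

combinations = [(a, m) for a in areas for m in metrics]
print(len(combinations))  # 42
```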

A CI PROGRAM

Metrics:
–Some seem obvious
–Others are not as obvious

A lot of states start with a few obvious metrics and fade out relatively quickly.

A comprehensive program:
–Needs to start with a comprehensive view

TOP DOWN – STOP LIGHT CHART

             | Timeliness | Accuracy | Completeness | Uniformity | Integration | Accessibility
Crash        |            |          |              |            |             |
Roadway      |            |          |              |            |             |
Citations    |            |          |              |            |             |
Adjudication |            |          |              |            |             |
Driver       |            |          |              |            |             |
Vehicle      |            |          |              |            |             |
EMS          |            |          |              |            |             |

[Cells are color-coded in the original slide.]

CONCEPTUAL FRAMEWORK

Continuous Improvement:
–Identify metrics (e.g., Crash Record Entry Delay)
–Establish benchmarks (e.g., "within 48 hours")
–Measure performance
–Report results to the TRCC and stakeholders
–Work with stakeholders to make improvements

Assumption: You can't go from 0 to 42 instantaneously.
–Improving continuous improvement is itself continuous improvement.
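The identify–benchmark–measure cycle above can be sketched in a few lines. "Crash Record Entry Delay" and the 48-hour benchmark are the slide's own example; the function names and sample timestamps are illustrative:

```python
from datetime import datetime, timedelta

# "Within 48 hours" benchmark from the slide's example.
BENCHMARK = timedelta(hours=48)

def entry_delay(crash_time: datetime, available_time: datetime) -> timedelta:
    """Delay between the crash and the record being available for analysis."""
    return available_time - crash_time

def meets_benchmark(delay: timedelta) -> bool:
    return delay <= BENCHMARK

# A record entered 30 hours after the crash meets the 48-hour benchmark.
delay = entry_delay(datetime(2016, 8, 1, 9, 0), datetime(2016, 8, 2, 15, 0))
print(meets_benchmark(delay))  # True
```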

CONTINUOUS IMPROVEMENT

Determine what to measure
Measure
Be transparent as to the outcome
–Reporting metric results should be standard at the TRCC meetings
Involve the stakeholders
Let the metrics (help) drive the plan

CATEGORIZE METRICS

Who/what does this metric reflect?
–Infrastructure
–Processes
–People

What is the potential for a measure to change with changing conditions?
–Metric has growth potential
–Metric can't change much or at all

WHO/WHAT IS REFLECTED?

Scenario #1:
–Paper crash forms
–Centralized data entry in the state capital

Metric:
–What is the delay between the date/time of the crash and the point at which the record is available for analysis?
–Delay = 15 months

What is this delay measuring?
–Could be slow completion of the form (people or process)
–Could be slow transmission of the form (people or process)
–Could be a bottleneck in data entry (might be people or process, definitely technology)

The dominant issue is probably the technology!

WHO/WHAT IS REFLECTED?

Scenario #2:
–Electronic system at roadside
–Local approval process
–Electronic submission of the data, once approved, to a centralized repository

Metric:
–What is the delay between the date/time of the crash and the point at which the record is available for analysis?
–Delay = 2 weeks

What is this delay measuring?
–Could be slow completion of the form (people or process)
–Could be slow transmission of the form (people or process)
–Less likely to be the technology

WHAT IS THE POTENTIAL FOR CHANGE?

Some metrics can't change much; others have great potential to go substantially up or down.

Possibilities:
–Static – this item can't change
–Diminishing upside and/or downside – diminishing returns have made a particular thing pretty static (e.g., crash records 99.6% complete after 48 hours can't change much – it's virtually static)
–High potential value – a metric that you really want to measure because it has substantial change potential; these are the "good metrics"
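For percentage-based metrics, the three categories above can be flagged by remaining headroom. The 99.6% completeness figure is the slide's example; the 5-point cutoff and function name are assumptions, not from the slides:

```python
# Hypothetical sketch: classify a percentage-based metric by headroom.
def change_potential(current_pct: float, best_pct: float = 100.0) -> str:
    headroom = best_pct - current_pct
    if headroom <= 0:
        return "static"            # already at its ceiling
    if headroom < 5.0:             # assumed cutoff for "virtually static"
        return "diminishing"
    return "high potential"        # a "good metric"

print(change_potential(99.6))  # diminishing
print(change_potential(60.0))  # high potential
```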

PUTTING THESE TOGETHER

"Static" metrics:
–Mostly useless for continuous improvement
–Sometimes they are the only metric for a particular box
–Need to supply those for your assessments, but they don't have practical value

Infrastructure versus people/process:
–Hopefully your measure will touch at least one
–Sometimes there may be a metric for both
–42 metrics could expand to 84

STOP LIGHT CHARTS

2 charts:
–One for infrastructure
–One for people/process

Red/yellow/green:
–How well the metric meets the benchmark

Black:
–No metric in that category
–Static metrics
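The color coding could be assigned per cell along these lines. The red/yellow/green/black meanings come from the slide; the 0.9 and 0.6 ratio cut points are assumptions for illustration:

```python
from typing import Optional

def cell_color(performance: Optional[float], benchmark: float) -> str:
    """Color one chart cell; performance is scaled so higher is better.

    None means no (non-static) metric exists for the cell -> black.
    """
    if performance is None:
        return "black"
    ratio = performance / benchmark
    if ratio >= 0.9:      # assumed cutoff: meets the benchmark
        return "green"
    if ratio >= 0.6:      # assumed cutoff: partially meets it
        return "yellow"
    return "red"

print(cell_color(None, 100))  # black
print(cell_color(95, 100))    # green
print(cell_color(70, 100))    # yellow
print(cell_color(30, 100))    # red
```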

BIG PICTURE

Two stop light charts, each with rows Crash, Roadway, Citations, Adjudication, Driver, Vehicle, EMS and columns Timeliness, Accuracy, Completeness, Uniformity, Integration, Accessibility:
–People/Process
–Tech Infrastructure

[Cell colors are shown in the original slide.]

CRASH LINES IN CHARTS

Crash row, shown in both the People/Process and Infrastructure charts:
–T (Timeliness) = Crash to final record by agency [operational]
–A (Accuracy) = Number of inaccurate plate numbers [operational]
–C (Completeness) = Number of missing records by agency [operational]
–U (Uniformity) = MMUCC potential realized [infrastructure]
–I (Integration) = No metric
–Ac (Accessibility) = Percentage of stakeholders with access to CARE [static, 100%]

CRASH LINES IN CHARTS

Crash row, shown in both the People/Process and Infrastructure charts:
–T (Timeliness) = Crash to final record by agency [operational]
–A (Accuracy) = Number of inaccurate plate numbers [operational]
–C (Completeness) = Number of missing records by agency [operational]
–U (Uniformity) = MMUCC potential realized [infrastructure]
–I (Integration) = DEFERRED – CAN'T YET ADDRESS
–Ac (Accessibility) = Percentage of stakeholders with access to CARE [static, infrastructure, 100%]

SUMMARY

Try to find at least one metric that fits every box.
Try to find some area where the box can grow (i.e., the metric isn't static).
Red boxes indicate that something needs to be fixed:
–Use the red boxes to motivate projects

THANK YOU Questions and Discussion