Office of Research and Information Technology. CVSA and FMCSA SSDQ Performance Measures: Crash Timeliness and Inspection Timeliness. CVSA Spring Workshop, April 21–22, 2013.
Office of Research and Information Technology 2 Introduction. Candy Brown, Presenter: Timeliness Performance Measures and Reports. Kevin Berry, Presenter: Improvement Strategies.
Office of Research and Information Technology 3 Agenda: Overview of Timeliness Performance Measures; Why Timeliness Matters; Training Objectives and Expected Outcomes; How State Ratings Are Determined; How to Interpret Data Quality Reports; When and How to Improve Data Quality.
Office of Research and Information Technology 4 Overview of Timeliness Performance Measures. Crash Timeliness: the percentage of fatal and non-fatal crash records submitted to the Motor Carrier Management Information System (MCMIS) within 90 days of the crash event, over a 12-month period. Inspection Timeliness: the percentage of inspection records submitted to MCMIS within 21 days of the inspection event, over a 12-month period.
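For illustration, a minimal sketch of the percentage behind each measure, computed from event and submission dates. The 90- and 21-day deadlines come from the definitions above; the record structure and field names are assumptions, not the MCMIS schema.

```python
from datetime import date

# Hypothetical records: (event_date, date_submitted_to_MCMIS)
crash_records = [
    (date(2012, 3, 10), date(2012, 4, 15)),   # 36 days: on time
    (date(2012, 6, 1),  date(2012, 10, 1)),   # 122 days: late
]

def percent_timely(records, deadline_days):
    """Percent of records submitted within `deadline_days` of the event."""
    on_time = sum(1 for event, submitted in records
                  if (submitted - event).days <= deadline_days)
    return 100.0 * on_time / len(records)

print(percent_timely(crash_records, 90))   # Crash Timeliness uses a 90-day deadline
# Inspection Timeliness would apply deadline_days=21 to inspection records.
```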
Office of Research and Information Technology 5 State Safety Data Quality (SSDQ) Measures (diagram). Crash measures: Timeliness, Accuracy, Record Completeness, Non-Fatal Completeness, Fatal Completeness, and Consistency (Overriding Indicator). Inspection measures: Timeliness, Accuracy, Record Completeness, VIN Accuracy, Driver Identification Evaluation, and Vehicle Identification Evaluation. Together these feed the Overall State Rating.
Office of Research and Information Technology 6 Why Timeliness Matters. The Safety Measurement System (SMS) Behavior Analysis and Safety Improvement Categories (BASICs) need timely data to target the right carriers for safety performance interventions. The SMS weighs recent events more heavily; missing events mean that carriers' SMS BASICs could be better or worse than they should be. Carriers may not be targeted for investigations. Inconsistent timeliness among States could skew SMS results in favor of carriers operating in States that report late. SMS weights crashes by time from the event date: most recent 6 months, weight 3; 7–12 months, weight 2; 12–24 months, weight 1.
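As a rough sketch of the time weighting shown above: the weight tiers are taken from the slide, but the function and the month arithmetic are illustrative assumptions, not the SMS implementation.

```python
def sms_time_weight(months_since_event):
    """Illustrative SMS-style time weight: recent events count more."""
    if months_since_event <= 6:
        return 3        # most recent 6 months
    elif months_since_event <= 12:
        return 2        # 7-12 months
    elif months_since_event <= 24:
        return 1        # 12-24 months
    return 0            # older events fall out of the calculation

# A crash reported too late to appear in an SMS run effectively carries no weight,
# which is why late reporting can make a carrier's BASICs look better than they are.
```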
Office of Research and Information Technology 7 Training Objectives: Explain the Crash and Inspection Timeliness performance measures; explore Timeliness reports; show how data collection and processing errors can affect Timeliness ratings; identify FMCSA resources for improving data quality.
Office of Research and Information Technology 8 Expected Outcomes: Understand Timeliness performance measure methodology; interpret Timeliness rating results; interpret Timeliness State Data Analysis Reports (SDAR) and custom reports/SAFETYNET queries; identify potential sources of collection and reporting issues; identify FMCSA resources for improving data quality.
Office of Research and Information Technology 9 Training: How State Ratings Are Determined
Office of Research and Information Technology 10 Methodology. Crash Timeliness: determines a crash rating (Good, Fair, Poor) based on the percent of fatal and non-fatal crash records reported to MCMIS within 90 days of the crash event, over a 12-month time span. Inspection Timeliness: determines an inspection rating (Good, Fair, Poor) based on the percent of inspection records reported to MCMIS within 21 days of the inspection event, over a 12-month time span.
Office of Research and Information Technology 11 Evaluation Period = Event Date Range: 12 months of MCMIS data; based on event date, not upload date; a "rolling" 12-month period that excludes the most recent 3 months.
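A minimal sketch of how that rolling window could be derived from a snapshot date, consistent with the example on the Monthly Rating Results slide (a March 22, 2013 snapshot yields an event date range of 1/1/2012 – 12/31/2012). The helper below is an assumption for illustration, not FMCSA's actual cut-off logic.

```python
from calendar import monthrange
from datetime import date

def evaluation_window(snapshot: date):
    """Rolling 12-month event date range: skip the snapshot month and the two
    months before it, then take the 12 calendar months ending there."""
    idx = snapshot.year * 12 + (snapshot.month - 1)   # month index
    end_idx = idx - 3                                 # last evaluated month
    start_idx = end_idx - 11                          # first evaluated month
    end_y, end_m = divmod(end_idx, 12)
    start_y, start_m = divmod(start_idx, 12)
    start = date(start_y, start_m + 1, 1)
    end = date(end_y, end_m + 1, monthrange(end_y, end_m + 1)[1])
    return start, end

print(evaluation_window(date(2013, 3, 22)))
# -> (datetime.date(2012, 1, 1), datetime.date(2012, 12, 31))
```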
Office of Research and Information Technology 12 Ratings. Timeliness ratings are calculated every month and the results are posted on the A&I Data Quality Website. Percent of Timely Records = Number of Records Reported On Time / Number of Total Records Evaluated.
Rating | Crash Criteria
Good | Percentage reported within 90 days is ≥ 90%
Fair | Percentage reported within 90 days is 65–89%
Poor | Percentage reported within 90 days is < 65%
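A short sketch mapping the timely-record percentage to a rating, using the crash thresholds in the table above; the inspection thresholds are not listed on this slide, so only the crash criteria are shown, and the function itself is illustrative.

```python
def crash_timeliness_rating(percent_timely):
    """Good / Fair / Poor per the crash criteria above."""
    if percent_timely >= 90:
        return "Good"
    elif percent_timely >= 65:
        return "Fair"      # 65-89%
    return "Poor"          # below 65%

print(crash_timeliness_rating(77))   # -> "Fair"
```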
Office of Research and Information Technology 13 Training: How to Interpret Data Quality Reports
Office of Research and Information Technology 14 How to Use Data Quality Reports. Three types of reports: (1) Rating Results, (2) State Data Analysis Reports, (3) Custom Reports & SAFETYNET Queries. What you can do with them: spot trends in reporting; identify how late records are when reported; identify late records by jurisdiction; monitor upload frequency to MCMIS.
Office of Research and Information Technology 15 Monthly Rating Results (example; not actual data). MCMIS snapshot taken March 22, 2013; 12 months of MCMIS data; event date range is 1/1/2012 – 12/31/2012.
Office of Research and Information Technology 16 Crash Timeliness Ratings. How to interpret: the report displays the last 13 ratings in a bar chart and a table; each rating is based on the percentage of timely records in MCMIS; compare current and previous results to identify trends. When to act: an unusual or significant change in the percent or number of timely records, or a slow decline in the rating, even when the rating is Good.
Office of Research and Information Technology 17 State Data Analysis Reports (SDAR): Crash Record Timeliness − Monthly Analysis. How to interpret: details of the evaluation period; timeliness of records by month of event; trends in timeliness and counts. When to act: a downward trend in timeliness, or a change in counts between months.
Office of Research and Information Technology 18 SDAR (cont.): Crash Record Timeliness − Number of Days Between Record Uploads to MCMIS. How to interpret: three days or more between uploads to MCMIS; trends in timeliness and volume. When to act: frequent instances of uploads more than three days apart, or a significant change in volume.
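A minimal sketch of the "days between uploads" check. The upload dates are hypothetical; the three-day threshold comes from the slide.

```python
from datetime import date

# Hypothetical MCMIS upload dates for one State
uploads = [date(2013, 1, 2), date(2013, 1, 3), date(2013, 1, 9), date(2013, 1, 10)]

def upload_gaps(upload_dates, max_gap_days=3):
    """Flag gaps between consecutive uploads longer than `max_gap_days`."""
    dates = sorted(upload_dates)
    return [(a, b, (b - a).days)
            for a, b in zip(dates, dates[1:])
            if (b - a).days > max_gap_days]

print(upload_gaps(uploads))   # one flagged gap: Jan 3 to Jan 9, 6 days
```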
Office of Research and Information Technology 19 SDAR (cont.): Inspection Timeliness − Records Reported by Inspector (inspector numbers hidden from view). How to interpret: sort by inspector ID, by the number or percentage of on-time and late records, or by total evaluated records. When to act: inspectors with high numbers or percentages of late records, or a widespread distribution of late records.
Office of Research and Information Technology 20 Custom Reports and SAFETYNET Queries. Explore specific data quality issues when there are late records or a change in counts: crash event date by upload date; the timeliness of records in each upload to MCMIS; comparison of timeliness by agency. Create a SAFETYNET query, including a list of records input late to SAFETYNET.
Office of Research and Information Technology 21 Custom Reports: Crash Event Date by Upload Date (chart plotting crash event dates against upload dates).
Office of Research and Information Technology 22 Custom Reports: Comparison of Timeliness by Agency
Agency | Crash Records Submitted On Time | % On Time | Crash Records Submitted Late | % Late | Total Crash Records
State Police | 480 | 89% | 57 | 11% | 537
City #1 Police Dept | 75 | 76% | 24 | 24% | 99
City #2 Police Dept | 25 | 33% | 51 | 67% | 76
Other Local Agencies | 214 | 73% | 81 | 27% | 295
(blank) | 145 | 67% | 73 | 33% | 218
TOTAL | 939 | 77% | 286 | 23% | 1,225
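A sketch of how such an agency comparison could be assembled from exported crash records; the record fields, example values, and the 90-day test are assumptions for illustration, not an FMCSA or SAFETYNET export format.

```python
from collections import defaultdict

# Hypothetical export: (reporting_agency, days_from_crash_to_MCMIS)
records = [("State Police", 42), ("State Police", 120), ("City #1 Police Dept", 15)]

def timeliness_by_agency(records, deadline_days=90):
    """Count on-time vs. late crash records per reporting agency."""
    counts = defaultdict(lambda: {"on_time": 0, "late": 0})
    for agency, days in records:
        key = "on_time" if days <= deadline_days else "late"
        counts[agency][key] += 1
    return dict(counts)

for agency, c in timeliness_by_agency(records).items():
    total = c["on_time"] + c["late"]
    print(f"{agency}: {c['on_time']}/{total} on time ({100 * c['on_time'] / total:.0f}%)")
```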
Office of Research and Information Technology 23 SAFETYNET Queries: Days from Event Date to SAFETYNET
Report # | Crash Date | Reporting Agency | Officer Badge # | Import Date | Days to SAFETYNET
XXXXXXXXXXX | 1/6/2013 | City Police | 9999 | 3/28/2013 | 81
XXXXXXXXXXX | 1/6/2013 | State Patrol | 9999 | 3/28/2013 | 81
XXXXXXXXXXX | 1/6/2013 | Local Police | 9999 | 3/28/2013 | 81
XXXXXXXXXXX | 1/6/2013 | City Police | 9999 | 3/28/2013 | 81
XXXXXXXXXXX | 1/5/2013 | Dept. of Public Safety | 9999 | 3/28/2013 | 82
XXXXXXXXXXX | 1/5/2013 | City Police | 9999 | 3/28/2013 | 82
XXXXXXXXXXX | 1/5/2013 | Local Police | 9999 | 3/28/2013 | 82
XXXXXXXXXXX | 1/1/2013 | Dept. of Public Safety | 9999 | 3/28/2013 | 86
XXXXXXXXXXX | 1/1/2013 | State Patrol | 9999 | 3/28/2013 | 86
XXXXXXXXXXX | 1/1/2013 | City Police | 9999 | 3/28/2013 | 86
XXXXXXXXXXX | 1/1/2013 | Local Police | 9999 | 3/28/2013 | 86
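The "Days to SAFETYNET" column is simply the difference between the import date and the crash date; a quick check against the first row (dates taken from the table above):

```python
from datetime import date

print((date(2013, 3, 28) - date(2013, 1, 6)).days)   # 81, matching the report
```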
Office of Research and Information Technology 24 Training: When and How to Improve Data Quality
Office of Research and Information Technology 25 Data Collection and Reporting Process. Key to improving Crash and Inspection Timeliness: understand your State's collection and reporting process. All crashes meeting FMCSA criteria must be reported on time. Process diagram: Collect (Law Enforcement) → Select (State Organization) → Report (MCSAP Office).
Office of Research and Information Technology 26 Crash Data Collection and Reporting Process
Office of Research and Information Technology 27 Possible Actions by Law Enforcement: Collect and Transfer Crash Reports Promptly (Collect Data at Scene → Review/Correct → Transfer). Ensure officers understand the importance of timeliness to FMCSA, through formal training and feedback to individual officers and/or agencies; ensure reports are transferred promptly; prioritize FMCSA crash reports; pay attention to system updates or changes in electronic collection.
Office of Research and Information Technology 28 Possible Actions at the State Crash Repository: Process and Transfer Crash Reports Promptly (Receive → Review/Input → ID FMCSA Reportables → Forward Report → Transfer). Assess applicable procedures to ensure records are reviewed and promptly transferred to the MCSAP Office; prioritize FMCSA crash reports for processing and for transfer to the MCSAP Office; track crash reports sent back to officers for correction; pay attention to system updates or changes in electronic transfer to ensure records are not delayed.
Office of Research and Information Technology 29 Possible Actions in the MCSAP Office: Process and Upload Crash Reports Promptly (Forward Report → Receive → ID FMCSA Reportables → Review/Input to SAFETYNET → Upload to MCMIS). Identify and implement improvements for processing and uploading reports; validate the number of reports received from the State crash repository; track crash reports sent back to officers for correction; consider SMS weightings and Timeliness cut-offs when prioritizing backlogs; address backlogs by adding or reassigning staff; upload to MCMIS daily; check activity logs daily for rejected records.
Office of Research and Information Technology 30 What to Do Next. Interagency Coordination: How Does It Work in Your State? Parties involved: Local Law Enforcement Agencies, State Police, State Crash Agency, Other State Agencies, MCSAP Office.
Office of Research and Information Technology 31 Contacts. Candy Brown, SSDQ Measure Development and Analysis, Candace.Brown@dot.gov, 617-494-3856. Kevin Berry, Technical Analyst, Kevin.Berry@dot.gov, 617-494-2857.
Office of Research and Information Technology 32 Training Recap. I am now able to: understand Timeliness performance measure methodology; interpret Timeliness rating results; interpret Timeliness SDAR; identify potential sources of collection and reporting issues; identify FMCSA resources for improving data quality.