Office of Research and Information Technology CVSA and FMCSA SSDQ Performance Measures: Crash Timeliness and Inspection Timeliness. April 21–22, 2013, CVSA Spring Workshop.

Presentation transcript:

Office of Research and Information Technology CVSA and FMCSA SSDQ Performance Measures: Crash Timeliness and Inspection Timeliness. April 21–22, 2013, CVSA Spring Workshop

Office of Research and Information Technology 2 Candy Brown Presenter, Timeliness Performance Measures and Reports Kevin Berry Presenter, Improvement Strategies Introduction

Office of Research and Information Technology 3  Overview of Timeliness Performance Measures  Why Timeliness Matters  Training Objectives and Expected Outcomes  How State Ratings Are Determined  How to Interpret Data Quality Reports  When and How to Improve Data Quality Agenda

Office of Research and Information Technology 4 Overview of Timeliness Performance Measures  Crash Timeliness: the percentage of fatal and non-fatal crash records submitted to the Motor Carrier Management Information System (MCMIS) within 90 days of the crash event, over a 12-month period  Inspection Timeliness: the percentage of inspection records submitted to MCMIS within 21 days of the inspection event, over a 12-month period
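As an illustration of how these two percentages are computed, here is a minimal Python sketch. The record layout (a list of event-date/upload-date pairs) is hypothetical and is not the MCMIS file format; only the 90-day and 21-day cut-offs come from the definitions above.

```python
from datetime import date

def percent_timely(records, deadline_days):
    """Percent of records whose MCMIS submission came within
    `deadline_days` of the event date (90 for crashes, 21 for inspections)."""
    if not records:
        return 0.0
    on_time = sum(1 for event_date, upload_date in records
                  if (upload_date - event_date).days <= deadline_days)
    return 100.0 * on_time / len(records)

# Example: one crash reported after 45 days (on time), one after 120 days (late).
crashes = [(date(2012, 3, 1), date(2012, 4, 15)),
           (date(2012, 3, 1), date(2012, 6, 29))]
print(percent_timely(crashes, deadline_days=90))  # 50.0
```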

Office of Research and Information Technology 5 State Safety Data Quality (SSDQ) Measures [Diagram: the Overall State Rating combines Crash measures (Timeliness, Accuracy, Record Completeness, Non-Fatal Completeness, Fatal Completeness) and Inspection measures (Timeliness, Accuracy, Record Completeness, Driver Identification Evaluation, Vehicle Identification Evaluation, VIN Accuracy), with Consistency as an overriding indicator.]

Office of Research and Information Technology 6 Why Timeliness Matters The Safety Measurement System (SMS) Behavior Analysis and Safety Improvement Categories (BASICs) need timely data to target the right carriers for safety performance interventions.  The SMS weighs recent events more heavily; missing events mean that carriers’ SMS BASICs could be better or worse than they should be  Carriers may not be targeted for investigations  Inconsistent timeliness among States could skew the SMS results in favor of carriers operating in States that report late [Chart: SMS Weights Crashes by Time. Events in the most recent 6 months receive weight 3, events 7–12 months old receive weight 2, and older events receive weight 1.]
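A minimal sketch of the time weighting shown in the chart, using whole-month age calculations; this illustrates the slide's weighting bands only and is not the actual SMS formula.

```python
from datetime import date

def crash_time_weight(event_date, as_of):
    """Weighting bands from the chart: most recent 6 months -> 3,
    7-12 months -> 2, older events -> 1 (whole-month simplification)."""
    months_old = (as_of.year - event_date.year) * 12 + (as_of.month - event_date.month)
    if months_old <= 6:
        return 3
    if months_old <= 12:
        return 2
    return 1

print(crash_time_weight(date(2012, 11, 15), as_of=date(2013, 3, 22)))  # 3
print(crash_time_weight(date(2012, 5, 15), as_of=date(2013, 3, 22)))   # 2
```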

Office of Research and Information Technology 7  Explain the Crash and Inspection Timeliness performance measures  Explore Timeliness reports  Show how data collection and processing errors can affect Timeliness ratings  Identify FMCSA resources for improving data quality Training Objectives

Office of Research and Information Technology 8  Understand Timeliness performance measure methodology  Interpret Timeliness rating results  Interpret Timeliness State Data Analysis Reports (SDAR) and custom reports/SAFETYNET Queries  Identify potential sources of collection and reporting issues  Identify FMCSA resources for improving data quality Expected Outcomes

Office of Research and Information Technology 9 Training: How State Ratings Are Determined

Office of Research and Information Technology 10 Crash Timeliness  Determines a crash rating (Good, Fair, Poor) based on the percent of records reported to MCMIS within 90 days of the crash event  12-month time span  Evaluates fatal and non-fatal crash records Inspection Timeliness  Determines an inspection rating (Good, Fair, Poor) based on the percent of inspection records reported to MCMIS within 21 days of the inspection event  12-month time span  Evaluates inspection records Methodology

Office of Research and Information Technology 11 Evaluation Period = Event Date Range 12 Months of MCMIS Data  Based on event date, not upload date  “Rolling” 12-month period  Excludes the most recent 3 months
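The rolling window can be illustrated with a short sketch. It assumes whole calendar months and is not FMCSA code; it simply reproduces the rule above: skip the 3 most recent months, then take 12 months of event dates.

```python
from datetime import date

def evaluation_window(snapshot):
    """12 calendar months of event dates, skipping the 3 most recent
    months before the snapshot date (whole-month simplification)."""
    snap_idx = snapshot.year * 12 + (snapshot.month - 1)  # index months on a single scale
    end_idx = snap_idx - 3      # last month included in the evaluation
    start_idx = end_idx - 11    # 12 months in total

    def to_year_month(i):
        return i // 12, i % 12 + 1

    return to_year_month(start_idx), to_year_month(end_idx)

# A March 2013 snapshot evaluates events from January through December 2012.
print(evaluation_window(date(2013, 3, 22)))  # ((2012, 1), (2012, 12))
```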

Office of Research and Information Technology 12 Ratings  Timeliness ratings calculated every month  Results posted on the A&I Data Quality Website
Number of Records Reported On Time ÷ Number of Total Records Evaluated = Percent of Timely Records

Rating | Crash Criteria
Good   | Percentage reported within 90 days is ≥ 90%
Fair   | Percentage reported within 90 days is ≥ 65% and < 90%
Poor   | Percentage reported within 90 days is < 65%
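A small sketch of the rating bands in the crash criteria table above; the Fair band is everything between the Good and Poor thresholds.

```python
def crash_timeliness_rating(percent_timely):
    """Rating bands from the crash criteria table: Good at or above 90%,
    Poor below 65%, Fair in between."""
    if percent_timely >= 90.0:
        return "Good"
    if percent_timely < 65.0:
        return "Poor"
    return "Fair"

print(crash_timeliness_rating(92.5))  # Good
print(crash_timeliness_rating(77.0))  # Fair
print(crash_timeliness_rating(60.0))  # Poor
```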

Office of Research and Information Technology 13 Training: How to Interpret Data Quality Reports

Office of Research and Information Technology 14 How to Use Data Quality Reports Three types of reports: 1. Rating Results 2. State Data Analysis Reports 3. Custom Reports & SAFETYNET Queries What you can do with them: spot trends in reporting; identify how late records are when reported; identify late records by jurisdiction; monitor upload frequency to MCMIS

Office of Research and Information Technology 15 Monthly Rating Results. MCMIS snapshot was taken March 22, 2013. 12 months of MCMIS data. Event Date Range is 1/1/2012 – 12/31/2012. (Not Actual Data)

Office of Research and Information Technology 16 Crash Timeliness Ratings How to Interpret  Report displays the last 13 ratings in a bar chart and a table  Each rating based on the percentage of timely records in MCMIS  Compares current and previous results to identify trends When to Act  Unusual or significant change in percent or number of timely records  Slow decline in rating  Even when the rating is Good

Office of Research and Information Technology 17 State Data Analysis Reports (SDAR) How to Interpret  Details of evaluation period  Timeliness of records by month of event  Trends in timeliness and counts When to Act  Downward trend in timeliness  Change in counts between months Crash Record Timeliness − Monthly Analysis

Office of Research and Information Technology 18 SDAR (cont.) How to Interpret  Three days or more between uploads to MCMIS  Trends in timeliness and volume When to Act  Frequent instances of uploads occurring more than three days apart  Significant change in volume Crash Record Timeliness − Number of Days Between Record Uploads to MCMIS
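To illustrate the "three days or more between uploads" check, here is a hypothetical sketch that flags gaps between consecutive MCMIS upload dates; the input list is illustrative only.

```python
from datetime import date

def upload_gaps(upload_dates, min_gap_days=3):
    """Flag stretches of `min_gap_days` or more between consecutive uploads."""
    ordered = sorted(upload_dates)
    return [(prev, curr, (curr - prev).days)
            for prev, curr in zip(ordered, ordered[1:])
            if (curr - prev).days >= min_gap_days]

uploads = [date(2013, 1, 2), date(2013, 1, 3), date(2013, 1, 8), date(2013, 1, 9)]
print(upload_gaps(uploads))  # [(datetime.date(2013, 1, 3), datetime.date(2013, 1, 8), 5)]
```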

Office of Research and Information Technology 19 SDAR (cont.) How to Interpret  Sort by:  Inspector ID  Number or percentage of on-time and late records  Total evaluated records When to Act  Inspectors with a high number or percentage of late records  Widespread distribution of late records Inspection Timeliness − Records Reported by Inspector (Inspector #’s hidden from view)

Office of Research and Information Technology 20  Explore specific data quality issues:  Crash event date by upload date  The timeliness of records in each upload to MCMIS  When there are late records or a change in counts  Comparison of timeliness by agency  Create a SAFETYNET query, including:  List of records input late to SAFETYNET Custom Reports and SAFETYNET Queries

Office of Research and Information Technology 21 Custom Reports: Crash Event Date by Upload Date [chart; axis labeled Event Dates]

Office of Research and Information Technology 22 Custom Reports: Comparison of Timeliness by Agency

Agency               | Crash Records Submitted On Time | % On Time | Crash Records Submitted Late | % Late | Total Crash Records
State Police         |                             480 |       89% |                           57 |    11% |                 537
City #1 Police Dept  |                              75 |       76% |                           24 |    24% |                  99
City #2 Police Dept  |                              25 |       33% |                           51 |    67% |                  76
Other Local Agencies |                             214 |       73% |                           81 |    27% |                 295
(blank)              |                             145 |       67% |                           73 |    33% |                 218
TOTAL                |                             939 |       77% |                          286 |    23% |               1,225
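A comparable agency breakdown could be produced from raw records with a short script. The following sketch assumes a simple list of (agency, days-to-MCMIS) pairs, which is not an FMCSA export format, and applies the 90-day crash cut-off.

```python
from collections import defaultdict

def timeliness_by_agency(records, deadline_days=90):
    """On-time/late breakdown per reporting agency, similar in spirit to
    the custom report above. `records` is a hypothetical list of
    (agency, days_to_mcmis) pairs."""
    counts = defaultdict(lambda: {"on_time": 0, "late": 0})
    for agency, days_to_mcmis in records:
        key = agency or "(blank)"
        bucket = "on_time" if days_to_mcmis <= deadline_days else "late"
        counts[key][bucket] += 1
    for agency, c in sorted(counts.items()):
        total = c["on_time"] + c["late"]
        pct = 100 * c["on_time"] / total
        print(f"{agency:22s} {c['on_time']:4d} on time ({pct:.0f}%)  "
              f"{c['late']:4d} late  {total:4d} total")

timeliness_by_agency([("State Police", 45), ("State Police", 120), ("", 30)])
```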

Office of Research and Information Technology 23 SAFETYNET Queries

Report #    | Crash Date | Reporting Agency       | Officer Badge # | Import Date | Days to SAFETYNET
XXXXXXXXXXX | 1/6/2013   | City Police            | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/6/2013   | State Patrol           | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/6/2013   | Local Police           | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/6/2013   | City Police            | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/5/2013   | Dept. of Public Safety | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/5/2013   | City Police            | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/5/2013   | Local Police           | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/1/2013   | Dept. of Public Safety | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/1/2013   | State Patrol           | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/1/2013   | City Police            | 9999            | 3/28/…      | …
XXXXXXXXXXX | 1/1/2013   | Local Police           | 9999            | 3/28/…      | …
Days from Event Date to SAFETYNET
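The "Days to SAFETYNET" figures in a query like this are simply the difference between the crash date and the SAFETYNET import date. A minimal sketch, assuming a hypothetical list of (report ID, crash date, import date) tuples rather than any actual SAFETYNET export; the sample dates and threshold are illustrative.

```python
from datetime import date

def late_to_safetynet(records, threshold_days):
    """Return (report_id, crash_date, import_date, days) rows whose lag
    from crash date to SAFETYNET import meets or exceeds the threshold."""
    rows = [(report_id, crash_date, import_date, (import_date - crash_date).days)
            for report_id, crash_date, import_date in records]
    return [row for row in rows if row[3] >= threshold_days]

sample = [("XXXXXXXXXXX", date(2013, 1, 6), date(2013, 3, 28)),
          ("XXXXXXXXXXX", date(2013, 3, 20), date(2013, 3, 28))]
print(late_to_safetynet(sample, threshold_days=60))
# [('XXXXXXXXXXX', datetime.date(2013, 1, 6), datetime.date(2013, 3, 28), 81)]
```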

Office of Research and Information Technology 24 Training: When and How to Improve Data Quality

Office of Research and Information Technology 25 Data Collection and Reporting Process Key to improving Crash and Inspection Timeliness: understand your State’s collection and reporting process. All crashes meeting FMCSA criteria must be reported on time. [Process diagram: Collect, Select, Report, spanning Law Enforcement, the State Organization, and the MCSAP Office]

Office of Research and Information Technology 26 Crash Data Collection and Reporting Process

Office of Research and Information Technology 27 Possible Actions by Law Enforcement  Ensure officers understand the importance of timeliness to FMCSA  Formal training  Feedback to individual officers and/or agencies  Ensure reports are transferred promptly  Prioritize FMCSA crash reports  Pay attention to system updates or changes in electronic collection Collect and Transfer Crash Reports Promptly [Process steps: Collect Data at Scene, Review/Correct, Transfer]

Office of Research and Information Technology 28 Possible Actions at State Crash Repository  Assess applicable procedures to ensure records are reviewed and promptly transferred to the MCSAP Office  Prioritize FMCSA crash reports for processing  Prioritize FMCSA crash reports for transfer to MCSAP Office  Track crash reports sent back to officer for correction  Pay attention to system updates or changes in electronic transfer to ensure records are not delayed Process and Transfer Crash Reports Promptly [Process steps: Transfer, Receive, Review/Input, ID FMCSA Reportables, Forward Report]

Office of Research and Information Technology 29 Possible Actions in the MCSAP Office  Identify and implement improvements for processing and uploading reports  Validate number of reports received from the State crash repository  Track crash reports sent back to officer for correction  Consider SMS weightings and Timeliness cut-offs when prioritizing backlogs  Address backlogs by adding/reassigning staff  Upload to MCMIS daily  Check activity logs daily for rejected records Process and Upload Crash Reports Promptly [Process steps: ID FMCSA Reportables, Forward Report, Receive, Review/Input to SAFETYNET, Upload to MCMIS]

Office of Research and Information Technology 30 What to Do Next Interagency Coordination: How Does It Work in Your State? [Diagram of participating agencies: Local Law Enforcement Agencies, State Police, State Crash Agency, Other State Agencies, MCSAP Office]

Office of Research and Information Technology 31 Candy Brown SSDQ Measure Development and Analysis Kevin Berry Technical Analyst Contacts

Office of Research and Information Technology 32 I am now able to: Understand Timeliness performance measure methodology Interpret Timeliness rating results Interpret Timeliness SDAR Identify potential sources of collection and reporting issues Identify FMCSA resources for improving data quality Training Recap