Traffic Records Assessment Overview and Insights Luke Johnson | John Siegler Traffic Records Forum August 8, 2016.


NEW ASSESSMENT: WHY & HOW

Relaunched in 2012, based on feedback from States, GAO, and assessors.

All-new methodology developed with:
- OMB PART
- SMEs from GHSA, ATSIP, et al.

Key changes:
- NHTSA underwrites the assessment
- Uniform question set
- Data collection & analysis conducted online
- Double the number of assessors
- High-level recommendations & detailed "considerations"
- Archives answers & evidence for the next assessment

ASSESSMENT SCOPE

- TRCC Management
- Strategic Planning
- Crash
- Driver
- Roadway
- Vehicle
- Citation/Adjudication
- Injury Surveillance
- Data Use & Integration

Basis of the Assessment

Since 2012, NHTSA has assessed State traffic records systems using a standard set of questions and criteria. The questions and criteria are based upon the ideal traffic records system as defined by a broad group of SMEs in the Traffic Records Program Assessment Advisory.

Comparing States to the Ideal System

Assessment questions allow assessors to:
- Identify strengths and challenge areas
- Rank questions to help prioritize investment
- Supply brief recommendations for improvement

Scoring

Question Rating      Question Weight
Meets            3   Very Important      3
Partially Meets  2   Somewhat Important  2
Does Not Meet    1   Less Important      1

Possible Points = Question Weight x 3 (Meets)
Question Score = Actual Points / Possible Points

The traffic records assessment is based on OMB's Program Assessment Rating Tool (PART), which requires respondents to provide evidence for each question.
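The scoring arithmetic above can be sketched in a few lines of Python. The rating and weight values come from the slide; the function name and example question are illustrative, not part of the assessment tool itself.

```python
# Ratings and weights as defined in the scoring slide.
RATINGS = {"meets": 3, "partially meets": 2, "does not meet": 1}
WEIGHTS = {"very important": 3, "somewhat important": 2, "less important": 1}

def question_score(rating: str, weight: str) -> float:
    """Score one question as actual points divided by possible points."""
    actual = RATINGS[rating] * WEIGHTS[weight]
    possible = WEIGHTS[weight] * 3  # best possible rating is "Meets" (3)
    return actual / possible

# Example: a "very important" question that only partially meets the criteria
# scores 2/3 of its possible points.
print(question_score("partially meets", "very important"))
```

Because the weight appears in both the numerator and denominator, a single question's score reduces to rating/3; weights matter when scores are aggregated across questions of differing importance.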

FIVE-YEAR ASSESSMENT CYCLE

IMPROVEMENTS FOR THE NEXT CYCLE

Updated Advisory (2017):
- Rework, replace, or eliminate problematic questions
- Clarify & expand the suggested evidence
- Reorganize the Injury Surveillance section

Updated STRAP online assessment tool:
- Leverage data collected from prior assessments
- Update old responses
- Identify prior respondents for each question
- Track changes made since the last assessment

Average Level of Effort for Assessments

States: 17 respondents, 151 hours
NHTSA: 14 assessors, 264 hours

Module Scores

Score Distribution by Assessment Module

Assessment Scores

Overall national average: 66.3%

System Module Component Scores

Description and Contents

Using the advisory criteria, States are assessed by their peers on:
1. How they describe the purpose and function of each system module,
2. the data that they collect, and
3. the ownership and administration of each system.

Description and Contents: Strengths and Opportunities

More than 75% of the States assessed:
- Use data to identify crash risk factors, prioritize law enforcement resources, and evaluate programs
- Have criteria for PDO crash reports

Less than 25% of the States assessed:
- Include rehabilitation data in the Injury Surveillance System

Applicable Guidelines

Using the advisory criteria, States are assessed by their peers on the data standards and guidelines for the different components of their traffic records systems.

Applicable Guidelines: Strengths and Opportunities

More than 75% of the States assessed:
- Use MMUCC to identify crash data elements and attributes to collect
- Have data on vehicle records recommended by AAMVA and/or received through NMVTIS
- Have driver data that interacts with the national driver registers, PDPS and CDLIS
- Are NEMSIS-compliant

Less than 25% of the States assessed:
- Do not derive AIS and ISS scores from the State emergency department and hospital discharge data for motor vehicle crash patients

Data Dictionary

Using the advisory criteria, States are assessed by their peers on the content and use of data dictionaries for each component system.

Data Dictionary: Strengths

More than 75% of the States assessed:
- Have data dictionaries for EMS, hospital discharge, and trauma registry systems.

Procedures and Process Flows

Using the advisory criteria, States are assessed by their peers on the ideal procedures for collecting and managing the data for each system module.

Procedures and Process Flows: Strengths

More than 75% of the States assessed:
- Have established procedures for identifying internal and external driver license fraud and CDL fraud
- Have a single entity that collects and compiles local EMS and hospital discharge data
- Have separate procedures for paper and electronic filing of EMS patient care reports
- Allow outside parties to access aggregate vital records data for analytic purposes

Interface with Other Systems

Using the advisory criteria, States are assessed by their peers on the ideal real-time relationships between data sets that need to be connected and accessible at all times.

Interface with Other Systems: Strengths and Opportunities

More than 75% of the States assessed:
- Retrieve vehicle records using VIN, title number, and license plate number
- Provide authorized law enforcement and court personnel with access to driver information

Less than 25% of the States assessed share data between:
- Crash and citation/adjudication
- Crash and injury surveillance
- EMS and (1) emergency department, (2) hospital discharge, and (3) trauma registry

Data Quality Control

Using the advisory criteria, States are assessed by their peers on the ideal practices for data quality management for each component system.

Data Quality: Strengths

More than 75% of the States assessed:
- Authorize staff to amend obvious errors and omissions in the vehicle database

Data Quality: Opportunities

Less than 25% of the States assessed:
- Conduct independent sample-based audits of crash reports and related database content
- Produce data quality reports for their vehicle and driver databases

Less than 25% of the 22 States assessed had performance measures in the following areas (out of six: Timeliness, Accuracy, Completeness, Uniformity, Integration, and Accessibility):

- Crash: 2 of the 6 areas
- Vehicle: 5 of the 6 areas
- Driver: 5 of the 6 areas
- Roadway: 5 of the 6 areas
- Citation & Adjudication: 1 of the 6 areas

Less than 25% of the 22 States assessed had performance measures in the following areas (out of six: Timeliness, Accuracy, Completeness, Uniformity, Integration, and Accessibility):

- EMS: 3 of the 6 areas
- Emergency Room: 5 of the 6 areas
- Trauma Registry: 3 of the 6 areas
- Hospital Discharge: none of the 6 areas
- Vital Records: all 6 areas

Traffic Records Assessments

In comparing a State's traffic records system to the ideal outlined in the Advisory, assessments:
- Identify strengths and challenge areas
- Rank questions to help prioritize investment
- Supply recommendations & considerations for improvement

How do we move forward?

Programs to Improve Data Quality

- Go Team Training & Technical Assistance
- Crash Data Improvement Program (CDIP)
- Roadway Data Improvement Program (RDIP)
- State-To-State (S2S) Project
- Vehicle Product Information Catalog (VPIC)

QUESTIONS?