Traffic Records Assessment Assessor Training October 2015

What & Where are the Criteria?
The Traffic Records Program Assessment Advisory:
- Provides guidance on the necessary contents, capabilities, and data quality measurements for a comprehensive traffic records system,
- Describes an ideal traffic records system, one that supports high-quality decisions that enable cost-effective improvements to highway and traffic safety, and
- Poses a uniform set of questions that reveals the performance of the State traffic records system relative to the ideal.

Scope of the Assessment
- TRCC Management
- Strategic Planning
- Data Use & Integration
- Six Core Data Systems: Crash, Driver, Roadway, Vehicle, Citation/Adjudication, Injury Surveillance

Assessment Questions
TRCC Management: 19
Strategic Planning: 16
Crash: 44
Driver: 45
Vehicle: 39
Roadway: 38
Citation / Adjudication: 54
Injury Surveillance: 123*
Data Use & Integration: 13
Total: 391
* Injury Surveillance now includes sub-sections on EMS, Emergency Room, Hospital Discharge, Trauma Registry, and Vital Records.
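For reference, the breakdown above can be sanity-checked with a few lines of code (purely illustrative; the dictionary simply restates the table):

```python
# Question counts per module, restated from the table above.
# The Injury Surveillance figure includes its five sub-sections.
QUESTION_COUNTS = {
    "TRCC Management": 19,
    "Strategic Planning": 16,
    "Crash": 44,
    "Driver": 45,
    "Vehicle": 39,
    "Roadway": 38,
    "Citation / Adjudication": 54,
    "Injury Surveillance": 123,
    "Data Use & Integration": 13,
}

assert sum(QUESTION_COUNTS.values()) == 391  # matches the stated total
```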

Standards of Evidence
The Advisory supplies a standard of evidence for each question. Each standard:
- Describes the information needed to support State assertions that the traffic records system possesses the specific capability referred to in the question, and
- Was determined by subject matter expert panels with State, Federal, academic, and other representation.

Assessment Ratings
Upon review, assessors will make a determination for each question that reflects how the State's traffic records systems are performing relative to the ideal detailed in the Advisory:
- MEETS the description of the ideal traffic records system
- PARTIALLY MEETS the description of the ideal traffic records system
- DOES NOT MEET the description of the ideal traffic records system
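As a minimal sketch, the three-level scale could be modeled as an enumeration (illustrative only; this deck does not describe STRAP's internal representation):

```python
from enum import Enum

class Rating(Enum):
    """Three possible assessor determinations for each question."""
    MEETS = "Meets the description of the ideal traffic records system"
    PARTIALLY_MEETS = "Partially meets the description of the ideal"
    DOES_NOT_MEET = "Does not meet the description of the ideal"
```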

Assessment Schedule (finalized)
- November: STRAP training; kickoff meeting; Round 1 data collection
- December: Round 1 analysis; Round 2 data collection
- January: Round 2 analysis; Round 3 (final) data collection
- February: Round 3 final analysis; facilitator finalizes report
- March: final report submitted; report out

Time and Knowledge Needed
Expect to spend no more than 40 hours over the course of an assessment for an average module (40-50 questions).
What qualifies someone as an expert? "Expertise is a broad knowledge of how a system or process works; skill, training, and experience can give a person a level of expertise, but the ability to analyze various aspects of a program, solve problems, and offer solutions is central to being a subject matter expert. This doesn't mean you have to know all there is to know about a subject, more that you are aware of the broader issues, such as management principles that foster improvement and excellence."

Average Assessor Time in STRAP

Average Hours Per Phase

STRAP Process Flow
1. State GR submits a written request to the NHTSA Region.
2. NHTSA Region forwards the written request to the NHTSA TR Team.
3. NHTSA TR Team confirms the State request and schedules calls.
4. NHTSA TR Team hosts the initial call (~4 months prior to kickoff).
5. STRAP Tech Support hosts the State Coordinator training webinar and sends the State Coordinator STRAP tokens.
6. State Coordinator sends Respondent info and assigns questions.
7. Facilitator hosts a call (1 month prior to kickoff).
8. STRAP Tech Support sends Respondent and Assessor STRAP tokens.
9. Facilitator leads the ASSESSMENT KICKOFF MEETING.
10. STRAP Tech Support launches the Data Collection and Analysis phases.
11. State Respondents answer assigned questions and supply evidence.
12. Assessors review answers and evidence, then provide findings and ratings.
13. Steps 11-12 repeat for three rounds; answers that meet the ideal are not returned to respondents in subsequent rounds.
14. Assessors confirm final findings, ratings, and summaries.
15. Facilitator reviews and forwards to the NHTSA TR Team.
16. NHTSA TR Team generates the Final Report and sends it to the State.
17. Facilitator leads the ASSESSMENT REPORT OUT WEBINAR.
Actors: NHTSA TR Team, Facilitator, State Leadership, State Respondents, NHTSA Region, Assessors, STRAP Support.
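The three review rounds, with the rule that answers rated as meeting the ideal drop out of later rounds, can be sketched as a simple loop. Everything here is hypothetical: `collect_answers` and `review` stand in for the respondent and assessor steps, and the string ratings mirror the scale above.

```python
def run_assessment_rounds(questions, collect_answers, review, max_rounds=3):
    """Sketch of STRAP's round-based cycle (names are illustrative).

    collect_answers(open_questions, round_number) -> {question: answer}
    review(answers) -> {question: rating}, where rating is one of
        "MEETS", "PARTIALLY_MEETS", "DOES_NOT_MEET"
    """
    open_questions = list(questions)
    ratings = {}
    for round_number in range(1, max_rounds + 1):
        answers = collect_answers(open_questions, round_number)
        ratings.update(review(answers))
        # Answers that meet the ideal are not returned to respondents
        # in subsequent rounds; only unresolved questions go back out.
        open_questions = [q for q in open_questions
                          if ratings.get(q) != "MEETS"]
        if not open_questions:
            break
    return ratings
```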

State Traffic Records Assessment Process (STRAP): Overview

STRAP Overview – Module Leader

STRAP Overview – Respondent View

Sample Q & A
Module: TRCC
Question 1: Does the State have both an executive and a technical TRCC?
Evidence Requirement: Provide a charter and/or MOU. Also provide a roster with all members' names, affiliations, and titles for both the executive and technical TRCC.
State Response: Yes. The State's TRCC is organized in three tiers: the top tier is the executive level, the second tier is the technical level, and the third consists of working groups created to address current projects or challenges at a working level. See the attached TRCC charter and executive-level roster.
Rating: Meets the Advisory Ideal. Both an executive TRCC and a technical TRCC are in place, and detailed rosters illustrating the structure at each level have been provided.

Advice
- DO communicate with your co-assessor and/or Module Leader. (Printouts)
- DON'T use outside information for ratings/findings; use only information provided by respondents.
- DO use outside information for clarification requests: ask the respondent to confirm or corroborate potential outside information.
- DO use outside information for Module Summaries: when applicable, note that what was provided in-system did not match external knowledge, but that ratings had to be based on what was actually submitted.

Advice
- DON'T restate the question or evidence requirement in the ballot and/or finding.
- DO provide enough information so someone reading the report at a later date will understand how and why you reached your conclusion.
- DON'T write "State says no." Instead write: "The State's crash and driver systems are not linked."

Final Report
The result of the assessment is a report with the final question ratings and findings. Each Module Leader also creates a Module Summary, similar to the previous assessment's, that lays out the opportunities and strengths for each module.

Final Report
Recommendations are now automatically generated based on the score each module receives. Scores are calculated using a combination of each question's importance and its rating.
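The deck does not give the exact formula, but an importance-weighted module score might look like the following sketch; the point values and weights are assumptions, not NHTSA's published method.

```python
RATING_POINTS = {"MEETS": 1.0, "PARTIALLY_MEETS": 0.5, "DOES_NOT_MEET": 0.0}

def module_score(questions):
    """Importance-weighted average rating for one module.

    `questions` is an iterable of (importance, rating) pairs; importance
    is a positive weight, and rating is a key of RATING_POINTS. All
    values here are illustrative only.
    """
    total_weight = sum(importance for importance, _ in questions)
    earned = sum(importance * RATING_POINTS[rating]
                 for importance, rating in questions)
    return earned / total_weight if total_weight else 0.0

# Example: one high-importance question partially meets the ideal,
# two lower-importance questions fully meet it.
print(module_score([(3, "PARTIALLY_MEETS"), (1, "MEETS"), (1, "MEETS")]))  # 0.7
```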

National Ratings

Becoming an Assessor
If you would like to be considered as an assessor for future assessments of other States' traffic records systems, please fill out the form provided at the back of the room. The form and other information are also available at:
All names are sent to NHTSA for vetting and then put into the Subject Matter Expert Pool for selection.

Questions?