Bill Stone, Research Administrator, Missouri Department of Transportation
Presented at the 2014 AASHTO Research Advisory Committee Meeting, Madison, WI, July 24, 2014

- Identified very broadly defined performance measures used by state DOTs at the time
- Measures taken from public sources such as accountability reports, state DOT websites, and other web-based resources
- Out of 40 states, 10 had published measures and 30 had no public information
- Survey lists metric, source, and URL

- The DOT State Stats report is a synthesis of facts, figures, statistics, and metrics pulled from accountability reports, online performance measurement dashboards, and fact books.
- 2009: a need existed to easily find and access local statistics
- Editions available:

- Annual figures compiled in an easy-to-use synthesis report online; Excel dataset also available online
- Created and launched by the previous MoDOT Librarian; assisted by members of the Midwest Transportation Knowledge Network (MTKN) and hosted on their website
- The latest edition included metrics from 42 states
- The publication is used by the transportation community at large

- Searching on “research” retrieves only 4 results out of 1,580 measures
- Synthesis report: no index
- Dataset: searchable but not categorized
- No longer current: not updated since 2012

- Obtain input from research administrators on their actual or anticipated use of research performance measures to document the progress and success of their research programs
- Measures with performance targets or metrics that track activity
- Measures that were program-specific, not project-specific
- Measures from all research program areas (pure research projects, library, and/or technology transfer programs)

- Have future plans (5)
- High interest (1)
- Have potential measure in mind (1)
- Discussed it (1)
- Track but not as a measure (1)
- No comment (1)

- Track activity: 56%
- Have performance targets: 28%

20% of states collect or report statistics for more than one time period

CURRENT USE
- Customer satisfaction (AZ)
- Monitor progress toward strategic goals (DC); adjust performance (TX)
- Make a difference in organizational business processes (MO)
- Determine program effectiveness (IA)
- Identify strengths and weaknesses (IN); what’s working or not working (MD)
- Tied to individual performance evaluations (LA)

CURRENT USE (CONT’D)
- Assess researcher performance and the usefulness of completed research (NC)
- Select new projects (TX)
- Ensure that projects move forward according to set budget and schedule (UT, WY)
- Develop and monitor the SPR Part 2 program (WI)
FUTURE USE
- Will use to direct funding (FL)
- Will use to gauge and improve performance (IL)

- Applied 31 unique categories to the 103 individual measures submitted
- Some measures have multiple categories applied (see the sketch below)
Measures by status:
- Current: 69 measures
- Planned: 21 measures
- Considering: 10 measures
- Past: 2 measures
- In progress: 1 measure

TOP CURRENT: Completion (14), Implementation (9), Library Utilization/Processing (8), Training (7), Satisfaction (6)
TOP IN PROGRESS/PLANNED/POTENTIAL: Implementation (6), Completion (3), Cost Savings (3), Timeliness (3), Within Budget (3)
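Because a single measure can carry more than one category tag, category tallies like the ones above can sum to more than the number of measures submitted. Below is a minimal Python sketch, using invented example data rather than the survey's actual dataset, of how status and category counts of this kind could be tabulated:

```python
from collections import Counter

# Hypothetical example data, invented for illustration only: each submitted
# measure carries one status plus one or more category tags.
measures = [
    {"status": "Current", "categories": ["Completion"]},
    {"status": "Current", "categories": ["Implementation", "Training"]},
    {"status": "Planned", "categories": ["Cost Savings", "Within Budget"]},
]

# Each measure counts once toward its status ...
status_counts = Counter(m["status"] for m in measures)

# ... but once for every category tag it carries, so category totals can
# add up to more than the number of measures submitted.
category_counts = Counter(tag for m in measures for tag in m["categories"])

print(status_counts)    # Counter({'Current': 2, 'Planned': 1})
print(category_counts)  # Counter({'Completion': 1, 'Implementation': 1, ...})
```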

Related to number:
- No. of completed research projects
Related to on-time or on-budget:
- % of research projects meeting original completion date
- No. of projects completed in the FY on schedule
- % of projects completed according to the initial budget
- % of projects completed on time/on budget
- % of studies completed within the approved schedule of the Work Program

Related to satisfaction:
- % of completed studies deemed satisfactory by the project sponsor
- Customer satisfaction survey for completed research
Related to implementation:
- In the past 5 years, 75% of completed research projects provide recommendations for implementation of results endorsed by the Project Review Committee

Related to number:
- No. of research results and best practices implemented
- No. of NCHRP and other external research program results implemented
- % of projects implemented (fully, partially, later, or cannot/should not be implemented) within 2 years of the final research report, using 5 years of data
- % of projects actively utilized in the field

Related to number (cont’d):
- % of projects that have resulted in a spec, policy, procedure, manual, requirement, or material change
- In the past 5 years, 75% of completed research projects provide recommendations for implementation of results endorsed by the Project Review Committee
- Multi-year tracking of implementation work
Related to funding:
- Amount of funding for implementation activities

- CA and MN: Reviewed
- DC and LA: Used as basis for developing PMs
- IL: Used to develop a structure for functional PMs
- IN: Exploring how to move measures to RPM
- UT: Use to upload measures
- ID, LA, MI, MO, PA, and UT: Use for HVR (High Value Research) submittals

PROS
- “RPM Web was extremely useful in helping us grasp the different types of performance measures and developing a structure for functional performance measures.”
- “Need to re-visit the site now that our program is more developed to see how it might help.”
CONS
- “Adapting to RPM Web would require a lot of work.”
- “Have mixed feelings about the usefulness.”
- “Did not find the tools very beneficial, because the benefits need to be customized for individual states …”
- “Information may be duplicated elsewhere … Would need to build the measures into projects prior to start … Input not required/hasn’t been a priority.”

- Who bears the cost of developing measures for each individual project? The researcher? The program director?
- What is the benefit/cost difference of applying measures to all projects versus a sample?
- What is the right balance between qualitative and quantitative data to demonstrate program effectiveness?
- Issues/challenges with what administration wants or needs (prediction versus real cost or savings)

Bill Stone, PE
Construction, Materials and Research
1617 Missouri Blvd., P.O. Box 270
Jefferson City, Missouri
Research Ahead