Good Practice Performance Indicators Engineering Practices Working Group Performance Indicator Sub-Group Engineering Performance Metrics D. C. Lowe October 21, 2003

2 Performance Indicator Sub-Group Participants
Dave Lowe, CH2M HILL Hanford – Chair
Herb Mumford, Bechtel, INEEL
Barb Quivey, LLNL
Tom Monahon, Westinghouse Savannah River
Harry Gullet, Sandia National Laboratories

3 Objectives
Identify a set of “good practice” example PIs from commercial nuclear and DOE sites, grouped into general categories applicable to engineering, that could be adopted or modified, as appropriate, for use by engineering groups within the DOE complex.
Demonstrate that engineering groups are practicing “good engineering.”
Identify where engineering should focus attention to satisfy customer needs.
Identify trends in equipment or system performance to focus resources correctly.
Monitor engineering costs and efficiency.

4 Approach
Gather input/examples from INPO, commercial nuclear, and DOE sites.
Identify general categories that engineering groups typically monitor.
Evaluate input from participating sites/plants corresponding to the general categories.
Provide good practice examples in each category.

5 PI Example Contributors
DOE Sites
– Hanford Site (Tank Farms)
– Savannah River Site
– INEEL
Commercial Nuclear Plants
– Columbia Generating Station
– Davis-Besse
– McGuire Nuclear Station
– Watts Bar Nuclear Plant
– Wolf Creek Nuclear

6 Engineering PI Categories
A. Product Quality/Technical Rigor
B. Safety System Performance
C. Configuration Control
D. Production/Productivity
E. Continuous Improvement
F. Training and Qualification
G. Engineering Cost

7 Product Quality/Technical Rigor Good Practice Examples
1. Document Quality
– Change package quality
– Engineering technical adequacy
2. Human Performance
– Organizational quality clock
– Plant engineering personnel error rate
3. Rework
– Unplanned change package revisions due to design errors

8 Change Package Quality (chart)

9 Engineering Technical Adequacy

10 Project Management Section Clock

11 Plant Engineering Personnel Error Rate
The number of significant errors recorded per 10,000 hours worked, 6-month rolling average.
Notes:
Responsible Mgr: PJ Inserra
Data Provided By: T Neidhold (C Leon)
Definition: Any event that resets our department clock.
There were no Human Performance Errors in the department for July.

12 Unplanned Change Package Revisions Due to Design Errors

13 Path Forward
Finalize the “Safety System Performance” category.
Sub-group review of additional categories and selection of good practice examples.
Compile final report of good practice examples.
Target completion: March 1, 2004

14 Backup Slides

15 Document Quality Change Package Quality
This PI tracks the quality of change packages as determined from a review performed by the responsible supervisor.
Strengths:
– All the reviews and scoring are performed to a set of predetermined criteria.
– The PI includes descriptive information on what is measured, how it is measured, and what is considered good performance.
– An action level is indicated on the chart.
– The change package customer rates the quality of the product.

16 Document Quality Change Package Quality
Suggested Improvements:
– The PI would benefit from a clear visual indication of the direction of increasing or decreasing performance.
– The total population of work packages reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites.
– Information should be provided on how the one-year weighted average plot data is calculated.
– Major rating criteria should be stated.

17 Document Quality Engineering Technical Adequacy
This PI tracks the quality of engineering documentation as determined by a quality review group composed of senior-level engineers.
Strengths:
– All the reviews and scoring are performed to a set of predetermined criteria.
– The PI includes a sub-chart that shows what attributes are reviewed and an average score for each attribute.
– A goal is indicated on the chart.
– A three-month rolling average is included so that trends are not masked by an individual month’s data.
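A trailing rolling average like the three-month one noted above is a simple calculation; a minimal sketch might look like the following (the monthly scores are illustrative, not data from the actual PI):

```python
def rolling_average(scores, window=3):
    """Trailing rolling mean of `scores` over `window` points.

    Early months, where fewer than `window` points exist yet, average
    whatever data is available so the plotted line has no gaps.
    """
    out = []
    for i in range(len(scores)):
        chunk = scores[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Illustrative monthly document-quality scores (0-100 scale assumed):
monthly = [88, 92, 79, 85, 90, 94]
print(rolling_average(monthly))
```

The window smooths single-month spikes, which is exactly why the PI includes it: one bad month moves the plotted value by only a third of its raw deviation.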

18 Document Quality Engineering Technical Adequacy
Suggested Improvements:
– The PI would benefit from a clear visual indication of the direction of increasing or decreasing performance.
– The total population of documents reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites.
– Information should be provided on how the average document rating is calculated.
– Major rating criteria should be stated.

19 Human Performance Project Management Section Clock
This PI tracks the number of resets of the event-free clock for a specific engineering group on a monthly basis.
Strengths:
– The direction of increasing or decreasing quality is indicated on the chart.
– The PI includes descriptive information on what is measured and a color rating scale.
– The performance goal is stated.
– The monthly score (color) is indicated directly on the charted data.

20 Human Performance Project Management Section Clock
Suggested Improvements:
– More definitive criteria should be included so that performance can be compared objectively to other sites. This should include a reporting threshold and examples of events that would be counted.
– A rolling average could be included so that trends are not masked by an individual month’s data.
– If the indicator included information on the number of hours worked in a given month, data could be compared to other sites using the error-rate method of tracking human performance.
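The monthly color score on a clock-style PI amounts to a threshold mapping from reset counts to a rating. A sketch is below; the cut points are hypothetical, since the slide shows a color scale but does not state the actual criteria:

```python
def clock_color(resets_in_month, green_max=0, yellow_max=2):
    """Map a month's event-free-clock resets to a color rating.

    Thresholds are hypothetical: zero resets rates green, up to
    `yellow_max` rates yellow, and anything beyond rates red.
    """
    if resets_in_month <= green_max:
        return "green"
    if resets_in_month <= yellow_max:
        return "yellow"
    return "red"

# One color per month of illustrative reset counts:
print([clock_color(n) for n in [0, 1, 3]])
```

Publishing the thresholds alongside the chart is what makes the color objective enough to compare across sites, which is the improvement the slide is asking for.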

21 Human Performance Plant Engineering Personnel Error Rate
This chart plots the rolling-average personnel error rate per 10,000 hours worked for the specific engineering department.
Strengths:
– The direction of increasing or decreasing quality is clearly indicated by the color scale on the chart.
– The performance goal (“green”) is shown on the chart.
– Data are normalized (i.e., errors per 10,000 hours worked) so that information can be benchmarked among different sites.
– A rolling average is used so that trends are not masked by an individual month’s data.
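The normalized, rolling error rate described here (errors per 10,000 hours worked over a trailing window) can be sketched as follows; the monthly counts are illustrative, not actual plant data:

```python
def rolling_error_rate(errors, hours, window=6):
    """Rolling error rate per 10,000 hours worked.

    `errors` and `hours` are parallel monthly series. Each point is
    the total errors over the trailing window divided by total hours
    over the same window, scaled to a per-10,000-hour rate, so
    low-volume months do not dominate.
    """
    rates = []
    for i in range(len(errors)):
        lo = max(0, i - window + 1)
        total_hours = sum(hours[lo:i + 1])
        total_errors = sum(errors[lo:i + 1])
        rates.append(10000 * total_errors / total_hours if total_hours else 0.0)
    return rates

# Illustrative data: 2 errors in the first month, none afterward.
errors = [2, 0, 0, 0, 0, 0, 0]
hours = [5000] * 7
print(rolling_error_rate(errors, hours))
```

Because both the numerator and denominator are summed over the window, a single error decays gradually out of the plotted rate rather than dropping off a cliff after one month.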

22 Human Performance Plant Engineering Personnel Error Rate
Suggested Improvements:
– The chart should include descriptive information on the threshold for determining whether an error is significant. This was provided verbally as any entry into the plant corrective action system that had human error as a cause code and was attributed to the Plant Engineering Department.
– The chart should include descriptive information on what is measured, why it is measured, and how it is measured, as well as recovery actions when performance goals are not met.

23 Rework Unplanned Change Package Revisions Due to Design Errors
This PI tracks the percentage of work packages that must be revised because design errors were detected during the review performed by the responsible supervisor.
Strengths:
– The data are presented as a percentage vs. raw numbers so that information can be benchmarked among different sites.
– The minimum performance goal is shown on the chart.
– A rolling average is used so that trends are not masked by an individual month’s data.
– The PI includes descriptive information on what is measured, how it is measured, and what is considered good performance.

24 Rework Unplanned Change Package Revisions Due to Design Errors
Suggested Improvements:
– The PI would benefit from a clear visual indication of the direction of increasing or decreasing performance.
– The total population of work packages reviewed vs. the percentage meeting quality expectations would provide information that could be used to benchmark among different sites.
– Criteria should be provided for work package grading, and for determining when a design error is significant enough to require a package revision versus an inconsequential change.
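A rework percentage of this kind, reported both per month and as a population-weighted rolling value, can be sketched as follows (the counts are illustrative, not actual PI data):

```python
def rework_percentages(revised, reviewed, window=3):
    """Monthly and rolling rework percentages.

    `revised` and `reviewed` are parallel monthly counts of change
    packages revised for design errors and packages reviewed. The
    rolling value weights by population (total revised / total
    reviewed over the window), so a month with few reviews does not
    swing the trend disproportionately.
    """
    monthly, rolling = [], []
    for i in range(len(reviewed)):
        monthly.append(100 * revised[i] / reviewed[i] if reviewed[i] else 0.0)
        lo = max(0, i - window + 1)
        total = sum(reviewed[lo:i + 1])
        rolling.append(100 * sum(revised[lo:i + 1]) / total if total else 0.0)
    return monthly, rolling

# Illustrative counts: revisions and reviews over three months.
m, r = rework_percentages([1, 0, 2], [20, 25, 40])
print(m)  # per-month percentages
print(r)  # population-weighted rolling percentages
```

Reporting the reviewed-population total alongside the percentage, as the slide suggests, is what makes the number comparable between a site reviewing 20 packages a month and one reviewing 200.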