Quality Assurance Improvement Plan and Software Quality Assurance Implementation Plan
February 23, 2004
Defense Nuclear Facilities Safety Board
Frank Russo, Deputy Assistant Secretary for Corporate Performance Assessment
Accomplishments
Quality Assurance Improvement Plan
- EM: 12 of 16 commitments completed
- NNSA: 7 of 8 commitments completed
SQA Implementation Plan
- EH: 14 of 26 commitments completed
QA Program Improvements
- Established DOE Office of QA Programs
- Established EFCOG QA Group; tasks in development
- Quality Workshop conducted December 2003
- QA Directive revisions in progress
QA Directive Status
- QA Order 414.1B nearing issuance; awaiting Secretarial Office concurrences
- Suspect/Counterfeit Items Guide 414.1-3 draft ready for February release
- QA Guide 414.1-2A revision initiated, with a target release date of April 2004
Software Quality Assurance Implementation Plan
February 23, 2004
Defense Nuclear Facilities Safety Board
Chip Lagdon, Director, Office of Quality Assurance Programs
Overview
- Recent Accomplishments
- Toolbox Code Status
- Design Code Survey
- Assessment Schedules
- SQA Knowledge Portal
- SQA Directives Status
- Summary and Path Forward
2002-1 IP Accomplishments

Commitment | Description | Responsibility | Status
4.1.1 | Define SQA roles and responsibilities | EH | Completed
4.1.2 | Develop SQA qualification standards for Federal personnel | Federal Technical Capability Panel | Completed
4.1.3 | Identify Federal SQA positions | EM, NNSA | Completed
4.1.4 | Qualify Federal personnel | EM, NNSA | Open (9/04)
4.1.5 | Revise Functions, Responsibilities and Authorities Manual (FRAM) | EH | Completed
4.1.6 | Revise FRA documents | EM, NNSA | Open (4/04)
2002-1 IP Accomplishments (cont.)

Commitment | Description | Responsibility | Status
4.2.1.1 | Identify safety analysis codes for "toolbox" | EH | Completed
4.2.1.2 | Establish SQA criteria for toolbox codes | EH | Completed
4.2.1.3 | Perform SQA gap analyses on toolbox codes | EH | Initial Report Completed
4.2.1.4 | Develop safety analysis code guidance reports | EH | Completed
4.2.1.5 | Perform design code survey | EH | Initial Report Completed
2002-1 IP Accomplishments (cont.)

Commitment | Description | Responsibility | Status
4.2.2 | Establish Central Registry for toolbox codes | EH | Completed
4.2.3.1, 4.2.4.1 | Develop SQA Criteria and Approach Document (CRAD) | EH | Completed
4.2.3.2, 4.2.4.2 | Establish site assessment schedule | EM, NNSA | Completed
4.2.3.3, 4.2.4.3 | Conduct site assessments | EM, NNSA | Open (per schedule)
4.3.1 | Review industry or Federal SQA standards | EH | Completed
4.3.2.1 | Establish schedule on SQA Directives | EH | Completed
2002-1 IP Accomplishments (cont.)

Commitment | Description | Responsibility | Status
4.3.2.2 | Issue SQA Directives | EH | Open (12/04)
4.3.3 | Review SQA Directives | NA, NNSA, etc. | Open (per issuance)
4.4.1 | Establish corporate SQA function within EH | EH | Open (3/04)
4.4.2 | Identify methods for capturing SQA lessons learned | EH | Completed
4.4.3 | Establish relationships with outside SQA organizations | EH | Completed
5.2.1 | Conduct periodic briefings to DNFSB | EH | Ongoing
Overview of Code Commitments

2002-1 Implementation Plan (IP) commitments and deliverables, all complete, with ongoing review via the Central Registry (http://tis.eh.doe.gov/techstds/toolsframe.html):

Commitment | Due | Deliverable
4.2.1.1 Identify safety analysis codes | Mar 03 | Safety Analysis Survey Report
4.2.1.2 Establish SQA criteria | Sep 03 | SQA Plan and Criteria Report
4.2.1.3 Perform gap analysis of each code | Jan 04 | Gap Analysis Reports
4.2.1.4 Develop guidance report for each code | Sep 03 | Code Guidance Reports
4.2.1.5 Survey design codes | Dec 03 | Design Code Survey Report
Gap Analysis of Six Toolbox Codes
- Issued interim gap analysis reports for the ALOHA, MACCS2, EPIcode, MELCOR, CFAST, and GENII toolbox codes
- Reports available: http://tis.eh.doe.gov/techstds/toolsframe.html
- No evidence of software-induced errors in the codes that would have led to non-conservatisms at defense nuclear facilities
- SQA improvements identified for safety analysis software
Toolbox Code Gap Analysis Summary

[Table: each toolbox code (MACCS2, ALOHA, EPIcode, MELCOR, GENII 1.485, GENII 2.0, CFAST) rated Yes or Partial/No against ten SQA criteria: Software Classification, Procedures/Plan, Requirements, Design, Implementation, Testing, Instructions, Acceptance Test, Configuration Control, and Error Impact, with an estimated upgrade resource (in FTE-years) for each code]
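The Yes / Partial-No ratings in the matrix above can be summarized as a simple per-code count, as in this illustrative sketch. The counts are taken from the per-code gap analysis slides later in this deck; the dictionary and function names are hypothetical, not part of any DOE tooling.

```python
# Satisfactory SQA areas per toolbox code, out of ten criteria total
# (counts taken from the per-code gap analysis slides in this deck).
SATISFACTORY_AREAS = {
    "MACCS2":      2,   # two of ten developer SQA areas satisfactory
    "ALOHA":       2,   # Software Classification, User Instructions
    "EPIcode":     2,   # Software Classification, User Instructions
    "CFAST":       2,   # Software Classification, Configuration Control
    "MELCOR":      5,   # Classification, Implementation, Instructions,
                        # Acceptance Testing, Configuration Control
    "GENII 1.485": 9,   # all areas except Error Impact
    "GENII 2.0":   2,
}

TOTAL_CRITERIA = 10

def areas_needing_improvement(code: str) -> int:
    """Number of SQA criteria rated Partial/No for a given toolbox code."""
    return TOTAL_CRITERIA - SATISFACTORY_AREAS[code]

print(areas_needing_improvement("MACCS2"))  # 8, matching the MACCS2 slide
```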
Design Code Survey
- Conducted a survey of design codes currently in use to determine whether any should be included as part of the toolbox
- Survey period: September to December 2003
- 14 organizations at 13 DOE sites responded
- Issued Initial Report in December 2003
Multiple-Use Design Software Characteristics
- Proprietary
- Extensive worldwide user groups
- Website well-supported in many cases
- Various approaches to SQA:
  - 10 CFR 50 Appendix B
  - NQA-1
  - ISO 9000/9001
Software Assessment Schedules

Site | Date | Site Team Lead | EH Rep
Y-12 | 1/20-23/04 | Sherry Hardgrave | Pranab Guha
ORP | 2/16-20/04 | Dave Brown | Subir Sen
LANL | 2/9-13/04 | Chris Murnane | Pranab Guha
SRS (NNSA) | 2/16-20/04 | Gregg Nelson | Chip Lagdon
SRS (EM) | 2/16-20/04 | Bill Roland | Chip Lagdon
Pantex | 3/22-26/04 | Al MacDougall | Debra Sparkman
SNL | 3/16-19/04 | Mark Hamilton | Subir Sen
ID | April or June | Bob Blyth | Pranab Guha
NTS | 4/26-30/04 | Tim Henderson | TBD
RL | March | Shiv Seth | Subir Sen
LLNL | TBD (no later than 8/15/04) | Adeliza Cordis | TBD
SQA Knowledge Portal
- Incorporates functions of the Central Registry and SQA list server
- Repository for SQA knowledge: toolbox code information, reports and standards, training courses, procedures
- Collaboration space for the SQA community
- SQA SME locator
- SQA discussion forum
SQA Directives Status: Software Categorization
- Grading work paper developed
- First SME panel review complete:
  - Additions to software type/approach table
  - Definitions clarified for software types and their application
  - Concern: grading definitions go beyond SQA IP scope
SQA Directives Status: Order and Guide Development
- Order and Guide development March through September 2004, with SME input throughout
- DOE O 414.1C on REVCOM by September 2004
- Consider extending DOE N 411.1 until DOE O 414.1C is issued
- DOE O 414.1B contains SQA responsibilities; clarify that safety system software is within scope
2002-1 IP Commitment Summary and Path Forward
- Developed SQA Knowledge Portal with Central Registry
- Finalize safety analysis code guidance and gap analysis reports by April 2004 and develop path forward
- Issue SQA directives by end of 2004
- Participate in NNSA and EM SQA site assessments
- Partner with NNSA and EM to complete site assessments
- NNSA and EM to update FRA documents
- Host SQA training in May 2004
- Partner with NNSA and EM in monitoring cross-cutting SQA issues
Possible Backup Slides
(May also be incorporated into speaker's notes)
SQA Evaluation Process
1. Review documentation: software developer reports, previous evaluations, journal and conference documents, primary SQA criteria, implementation criteria
2. Process information template
3. Assess Software Quality Assurance Plan
4. Assess software engineering documentation: Software Requirements Document, Software Design Document, test case description/report, software configuration and control, user's instructions, error notification
4a. Assess training and identify software development plans (SQA, software modifications)
5. Document in Gap Analysis Report: compliant areas, areas for improvement, recommendations from DOE users, estimate of resources
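The evaluation steps above can be sketched as a small data structure: each criterion is assessed and the result is rolled up into the Gap Analysis Report's "compliant areas" and "areas for improvement" buckets. This is a minimal illustrative sketch; the class, the Yes / Partial-No labels, and the MELCOR ratings are drawn from this deck, but the code itself is hypothetical, not DOE's actual assessment tooling.

```python
from dataclasses import dataclass, field

# The ten SQA criteria used in the toolbox code gap analyses.
SQA_CRITERIA = [
    "Software Classification", "Procedures/Plan", "Requirements",
    "Design", "Implementation", "Testing", "Instructions",
    "Acceptance Test", "Configuration Control", "Error Impact",
]

@dataclass
class GapAnalysis:
    code_name: str
    findings: dict = field(default_factory=dict)  # criterion -> "Yes" / "Partial/No"

    def assess(self, criterion: str, satisfactory: bool) -> None:
        # Steps 1-4: review documentation and rate each criterion.
        if criterion not in SQA_CRITERIA:
            raise ValueError(f"Unknown criterion: {criterion}")
        self.findings[criterion] = "Yes" if satisfactory else "Partial/No"

    def areas_for_improvement(self) -> list:
        # Step 5: separate compliant areas from areas needing improvement.
        return [c for c, v in self.findings.items() if v == "Partial/No"]

# Example using the MELCOR ratings from the backup slides.
report = GapAnalysis("MELCOR")
for criterion in SQA_CRITERIA:
    report.assess(criterion, satisfactory=criterion in {
        "Software Classification", "Implementation", "Instructions",
        "Acceptance Test", "Configuration Control",
    })
print(len(report.areas_for_improvement()))  # 5, matching the MELCOR slide
```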
MACCS2 Gap Analysis
- Two of ten code developer SQA areas satisfactory; improvement recommended in the eight other areas
- Key areas of improvement:
  - Update model description and finalize University of New Mexico verification document
  - Add prototypic problems and error diagnostics to documentation
  - Enhance user feedback and technical information exchange
- Version 1.13 to be released mid-2004; fixes multiple plume segment and emergency preparedness models
- New version and guidance report address Tech-25 issues: fire plume phenomenology, code errors, end-user QA problem (dose conversion factors), and documentation quality
- Conservative estimate to upgrade: 2 FTE-years
ALOHA Gap Analysis
- Software Classification and User Instructions SQA areas are satisfactory; improvements recommended in the eight other SQA areas
- ALOHA areas of improvement (new version in progress):
  - Correct IDLH isopleth footprint
  - Write-protect chemical library
  - Release duration/distance
  - Allow multiple receptors in one run
  - Vapor pressure, pool and evaporation models, others
  - Complete NOAA Theoretical Description Memorandum
- Conservative estimate to upgrade: ~16 FTE-months
EPIcode Gap Analysis
- Software Classification and User Instructions SQA areas are satisfactory; improvements recommended in the eight other SQA areas
- EPIcode areas of improvement:
  - Add dense-gas model capability
  - Hourly input of meteorology
  - Surface roughness adjustment
- Conservative estimate to upgrade: ~16 FTE-months
CFAST Gap Analysis
- Software Classification and Configuration Control SQA areas are satisfactory; improvements recommended in the eight other SQA areas
- CFAST areas of improvement:
  - Provide comprehensive output description
  - Include training requirements
  - Document acceptance test for users
  - Implement formal error notification and corrective action process
  - Graphical user interface for Version 5.0.1 should be released
- Conservative estimate to upgrade: 0.5 FTE-year
MELCOR Gap Analysis
- Software Classification, Implementation, User Instructions, Acceptance Testing, and Configuration Control SQA areas are satisfactory; improvements recommended in the five other SQA areas
- MELCOR areas of improvement:
  - Include sample problems relevant to leak path factor analysis
  - Improve user control of output
- Conservative estimate to upgrade: 1-2 FTE-years
GENII Gap Analysis
- Older version (1.485): SQA satisfactory in nine of ten areas; the exception is Error Impact
- Newer version (2.0): SQA satisfactory in two of ten areas; improvements recommended in eight areas
- GENII 1.485 is applicable to DSAs but no longer in development by PNNL
- GENII 2.0 is still in testing and not finalized, so it is not applicable to DSAs
- Conservative estimate to upgrade: ~10 FTE-months
Strategy for Design Software
- Different from safety analysis software: proprietary, commercially competitive, frequently updated
- Define a value-added course of action:
  - Current protocol of development, verification, testing, and control measures is adequate
  - Facilitate reporting and disposition of errors and deficiencies
  - Lessons learned among DOE users
  - Software training
  - Develop web-based information systems, arranged by design software category
  - Re-survey as appropriate