GLAST LAT Project, 5 April 2006
LAT System Engineering: LAT Verification Process
Gamma-ray Large Area Space Telescope (GLAST) Large Area Telescope (LAT)


LAT Level Test Verification Process: "The Run for the Record"
Rich Baun (650)

Agenda
- Test case run review
- Test case run buy-off
- Requirement verification

Test Case/Requirement Buy-Off Overview

Test Case Run Review
- Test cases are run by I&T in virtually any order
  - Systems/Commissioner should review plans to ensure all test case interdependencies are considered
- After each test case is run, I&T reviews the data and notifies Systems whether the run was valid
  - A valid run is one with no script, database, test set, test software, configuration, or operator errors that invalidate the test
  - This review must occur within 24 hours of test execution
  - Any issues determined to be anomalies must be reported through the appropriate system (JIRA or NCR)
- Test script, software, and configuration file errors must be documented
- I&T must establish and maintain a Test Case Review Log containing, at minimum: test case name and number, script, start time, run number, LAT configuration discrepancies, I&T reviewer, and the numbers of any NCRs or JIRAs written during the test case
  - The log should allow space for expert/Systems sign-off
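The minimum log contents and the 24-hour review rule above can be sketched as a small data structure. This is an illustration only, assuming hypothetical field and class names; it is not drawn from any actual I&T tool.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class ReviewLogEntry:
    """One Test Case Review Log entry (field names are illustrative)."""
    test_case_name: str
    test_case_number: str
    script: str
    start_time: datetime
    run_number: int
    lat_config_discrepancies: list = field(default_factory=list)
    it_reviewer: str = ""                      # I&T reviewer who judged validity
    ncr_jira_numbers: list = field(default_factory=list)
    systems_signoff: str = ""                  # space for expert/Systems sign-off

    def review_due(self) -> datetime:
        # I&T must review the run data within 24 hours of test execution
        return self.start_time + timedelta(hours=24)

entry = ReviewLogEntry("LAT Baseline Functional", "TC-042", "run_baseline.py",
                       datetime(2006, 4, 5, 9, 0), run_number=1)
print(entry.review_due())  # 24 hours after the run started
```

A real log would live in a controlled database or signed hardcopy; the point here is only that every required field has a definite slot, so a missing entry is visible at review time.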

Test Case Run Buy-Off
- Once I&T has indicated a test run is valid, Systems, the Commissioner, and/or assigned experts will review the run data
- Buy-off process:
  - Each test case will have at least one assigned reviewer
    - Systems Engineering will produce a matrix of who is responsible for reviewing each test case's and requirement's data
    - Requirement owners will also review the data collected in each test case that is pertinent to their requirements
    - Reviewers and requirement owners will sign off on the Test Case Review Log when their review is complete
  - If a reviewer finds the run sufficient, they mark the test case "complete" on the Test Case Review Log
    - Test cases with more than one reviewer are not sold until all reviewers have signed off
  - If a reviewer detects an anomaly, or finds the run insufficient, they report it through the appropriate documentation (JIRA)
    - A JIRA is needed to justify modification or a re-run of the test
  - Buy-off must be complete within one week of test execution or before test exit, whichever occurs first
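The two mechanical rules in the buy-off process, all assigned reviewers must sign before a run is sold, and buy-off closes at one week or test exit, whichever comes first, can be expressed as a minimal sketch. Function and variable names are assumptions for illustration, not part of any LAT system.

```python
from datetime import datetime, timedelta

def run_is_sold(assigned_reviewers: set, signoffs: set) -> bool:
    # A test case with more than one reviewer is not sold
    # until every assigned reviewer has signed off
    return assigned_reviewers <= signoffs

def buyoff_deadline(run_time: datetime, test_exit: datetime) -> datetime:
    # Buy-off within one week of execution or before test exit,
    # whichever occurs first
    return min(run_time + timedelta(weeks=1), test_exit)

reviewers = {"systems", "commissioner"}
print(run_is_sold(reviewers, {"systems"}))                  # False: one sign-off missing
print(run_is_sold(reviewers, {"systems", "commissioner"}))  # True
```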

Requirement Sell-Off (1 of 2)
- High-level approach
  - Only functional requirements will be sold in the Baseline LAT Test (or at the earliest opportunity)
  - Performance requirements will not be sold until all environments have been completed, i.e., following thermal vacuum (TV)
    - Performance requirement owners will be expected to examine performance data from all test phases and perform an "early" assessment
  - Environmental requirements will be sold only after testing in the appropriate environment has been performed
    - For example, vibration-related requirements will not be sold until after sine vibration
- Concept of operations
  - Nominally, each requirement owner will have one week after test execution to review the test data
    - Owners requiring longer will need to negotiate with Systems
  - For most requirements, executing the test case and certifying by review that the test was successful is sufficient for sell-off
    - For each requirement, the reviewer names, review dates, test case numbers, scripts, run times, and run numbers will be logged in the VCRM
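The gating logic above, functional requirements sell at baseline, performance only after thermal vacuum, environmental only after the matching environment, can be sketched as a simple predicate. The phase labels and function name are illustrative assumptions; only the ordering rules come from the slide.

```python
def may_sell(req_type: str, completed_phases: set) -> bool:
    """Return True if a requirement of this type may be sold now (sketch)."""
    if req_type == "functional":
        # Functional requirements sell in the Baseline LAT Test
        return "baseline" in completed_phases
    if req_type == "performance":
        # Performance requirements sell only after all environments,
        # thermal vacuum being the last
        return "thermal_vacuum" in completed_phases
    if req_type == "environmental:vibe":
        # Environment-specific: vibe requirements need sine vibration done
        return "sine_vibe" in completed_phases
    return False

print(may_sell("performance", {"baseline", "sine_vibe"}))       # False
print(may_sell("performance", {"baseline", "thermal_vacuum"}))  # True
```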

Requirement Sell-Off (2 of 2)
- Concept of operations (continued)
  - For requirements needing post-test analysis:
    - The requirement owner will have two weeks after test execution to produce a Requirement Verification Report (RVR)
      - Owners requiring longer will need to negotiate with Systems
    - Once an RVR is written, it will be reviewed and approved by Systems and submitted to the customer
    - The requirements in this category will be agreed to in advance with the customer and the Commissioner
  - A weekly Running Sell meeting will be held (as required)
    - The customer, the Commissioner, Systems, and the requirement owners who are presenting will attend
    - Its purpose is to review the test case runs whose successful completion sold requirements, in order to:
      - Obtain the customer's concurrence on each RVR and test case completed that week
      - Review the updated VCRM

GLAST LAT Project5 April 2006 LAT System EngineeringLAT Verification Process 8 Requirements Verification Reports  For an agreed to subset of LAT requirements only, not all reqts  Must have an assigned LAT Docs number  Minimum RVR contents –Requirement owner’s name –Requirement source, text, and number –Verification Plan text and number –Supporting test case name and number (if applicable) –Script name(s) used to run test –Date, time, and run number when supporting test case(s) were run If data was collected from multiple runs, each run must be noted –Compliance statement, i.e. the report must state definitively the LAT complies with the requirement Margin to the requirement must be stated where appropriate –Supporting documentation of test data, analysis, and results –Deviations from the verification plan and the reasons for the deviations must be stated –Any NCRs or JIRAs associated with the requirement must be listed