Technical Validation

The Technical Validation is a testing framework for the AUGER Offline that monitors the code development process. It is not a validation of the physics quality of the Offline: it monitors that a subset of physical quantities (variables) does not change during development. The Technical Validation environment uses BuildBot as an automated testing framework to rebuild and test the tree automatically and to run the builds on a variety of platforms. The validation procedure is based on a reference ROOT file containing a library of persistent objects. These reference validation objects contain the information one wants to monitor in order to identify that a certain Stage or Module has changed between releases. Documentation: GAP2009_132 + Twiki.
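As a rough illustration of the per-quantity check, a minimal PyROOT sketch follows. It assumes each monitored quantity is stored as a histogram under its own key in the reference file; the framework's actual persistent-object library (the SValidStore module) is richer, and all file and object names here are invented.

# Illustrative sketch only: compare monitored quantities in a reference
# ROOT file with those produced by the current build. Assumes the stored
# objects are TH1 histograms, which is a simplification.
import ROOT

def compare_files(ref_path, cur_path, rel_tol=1e-9):
    ref = ROOT.TFile.Open(ref_path)
    cur = ROOT.TFile.Open(cur_path)
    changed = []
    for key in ref.GetListOfKeys():
        name = key.GetName()
        h_ref, h_cur = key.ReadObj(), cur.Get(name)
        if not h_cur:
            changed.append((name, "missing in current build"))
            continue
        # Compare bin-by-bin; any drift flags a change between releases.
        for i in range(1, h_ref.GetNbinsX() + 1):
            a, b = h_ref.GetBinContent(i), h_cur.GetBinContent(i)
            if abs(a - b) > rel_tol * max(abs(a), abs(b), 1.0):
                changed.append((name, "bin %d: %g -> %g" % (i, a, b)))
                break
    return changed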

ValidationTests
The BackCompatibility test started from one main idea: we need to check that new releases of Offline can read files produced with older versions. How to approach this: trigger the BuildBot build on an EventIO change (see the sketch below). As input:
- a list of reference Events written with different versions;
- a script running a read test;
- a script running the hybrid Simulation+Reconstruction, writing the Event, plus the reco/sim test.
[Slide diagram: the validation tests exercise the I/O of each tagged release (TAG 1, TAG 2, ..., TAG N-1) and of the development version (DEV), i.e. Code 1 through Code DEV, against the reference Sim/Rec events.]
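A minimal sketch of how the trigger could look in a BuildBot master.cfg fragment (0.8-style API). The repository URL, the EventIO path filter, and the scheduler/builder names are placeholders, not the project's actual configuration; `c` is the usual BuildmasterConfig dictionary defined earlier in master.cfg.

# Hypothetical master.cfg fragment: rebuild and test only on EventIO changes.
from buildbot.changes.svnpoller import SVNPoller
from buildbot.schedulers.basic import SingleBranchScheduler

def touches_eventio(change):
    # Only commits touching EventIO should trigger the back-compat build.
    return any(f.startswith("EventIO/") for f in change.files)

c['change_source'] = SVNPoller(
    svnurl="https://example.org/svn/offline/trunk",  # placeholder URL
    pollinterval=600)

c['schedulers'] = [SingleBranchScheduler(
    name="backcompat-on-eventio",
    fileIsImportant=touches_eventio,
    onlyImportant=True,          # ignore commits that don't touch EventIO
    treeStableTimer=300,
    builderNames=["backcompat"])]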

ValidationTest BackCompatibility(I) (in place; possible modifications were shown in italics on the original slide)
The goal: check that new releases of Offline can read files produced with older versions. The problem arises mainly from EventIO changes, so the BuildBot build and tests are triggered by an EventIO change. The script for the BackCompatibility test contains:
A Running-Executing part, where it:
1. builds StandardApplications/HdSimulationReconstruction;
2. runs the StandardApplications using Corsika as input;
3. writes the output in Offline format to a directory whose name contains the date of the run (a different naming scheme can be agreed, e.g. the name could contain the svn VersionNumber).
A Reading part, where it (see the sketch below):
1. iteratively accesses all the directories containing Offline files produced in point 3 of the Running-Executing part;
2. builds and executes a StandardApplications/HdMCReconstruction, saving the output in a log (we can stream only the StreamerCheck errors out of it).
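The Reading part could look roughly like the Python sketch below. The directory layout, the executable name, and its command-line flags are assumptions for illustration; the real script builds HdMCReconstruction through the Offline build system.

# Sketch of the Reading part: iterate over the dated output directories,
# rerun the reconstruction on each Offline file, keep StreamerCheck errors.
import glob
import subprocess

def read_test(output_root="backcompat"):
    failures = []
    # Each run of the Running-Executing part leaves a dated directory
    # (e.g. backcompat/2009-09-17) of Offline-format event files.
    for run_dir in sorted(glob.glob(output_root + "/*")):
        for event_file in sorted(glob.glob(run_dir + "/*.root")):
            proc = subprocess.run(
                ["./HdMCReconstruction", "-i", event_file],  # hypothetical CLI
                capture_output=True, text=True)
            # Stream only the StreamerCheck errors, as suggested on the slide.
            errors = [l for l in proc.stdout.splitlines()
                      if "StreamerCheck" in l]
            if errors or proc.returncode != 0:
                failures.append((event_file, errors))
    return failures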

ValidationTest BackCompatibility(II)
TO DO list:
Add to the script the svn commit of the outputs produced in point 3 of the Running-Executing part (automatic or semi-automatic, and under what conditions?).
A less trivial reading test: we tried a more specific reading test, in order to avoid the too-trivial streaming out of StreamerCheck errors proposed in point 2 of the Reading part. What we did:
1. built an ad hoc HdReconstruction, whose ModuleSequence contains all of the Reconstruction part of HdSimulationReconstruction and which uses as input the Offline files produced in point 3 of the Running-Executing part;
2. the reconstruction should give an equal result within machine precision (see the sketch below). The problem is that the results are different! Moreover, if during the Running-Executing part we output FdRaw, the FdCalibratorOG module does not allow the reconstruction at all.
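For point 2, "equal within machine precision" could be checked along these lines; the input dictionaries and the tolerance of a few epsilons are choices made for this sketch, not the framework's.

# Sketch: flag reconstructed quantities that differ by more than a few
# machine epsilons. The {name: value} inputs are hypothetical.
import sys

EPS = sys.float_info.epsilon  # ~2.2e-16 for IEEE doubles

def rel_diff(a, b):
    # Relative difference, guarded against values near zero.
    return abs(a - b) / max(abs(a), abs(b), 1.0)

def changed_quantities(reco_old, reco_new, n_eps=8):
    tol = n_eps * EPS
    return {name: (reco_old[name], reco_new[name])
            for name in reco_old
            if rel_diff(reco_old[name], reco_new[name]) > tol}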

ValidationTest BackCompatibility(III)
How to stage the work done up to now?
STEP 1 (immediate): everything in BackCompatibility(I) can be released. We need to define/agree on the naming and on what to stream out from the reading test.
STEP 2: commit to the svn repository. Include the (automatic?) filling of the svn repository in the script (a sketch follows).
STEP 3: the less trivial reading test (still to be understood).
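For STEP 2, the (semi-)automatic filling of the svn repository could be gated on the read test, roughly as below; the repository layout, the commit message, and the "commit only on success" condition are assumptions.

# Sketch: commit the outputs of the Running-Executing part to svn only
# when the read test passed. Paths and the message are placeholders.
import subprocess

def commit_outputs(run_dir, version, read_test_passed):
    if not read_test_passed:
        print("read test failed; not committing %s" % run_dir)
        return
    subprocess.check_call(["svn", "add", run_dir])
    subprocess.check_call([
        "svn", "commit", "-m",
        "BackCompatibility reference events for %s" % version,
        run_dir])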