A Technical Validation Module for the Offline
Auger-Lecce, 17 September 2009
Outline: Design, The SValidStore Module, Example, Scripting, Status


Technical Validation

What we call technical validation is a testing framework for the AUGER Offline that monitors the code development process.

What Technical Validation is not: it is not a validation of the physics quality of the Offline. It does not assert "this result makes sense" or "this result does not make sense."

What Technical Validation is: it monitors that a physical quantity (variable) does not change during the development process; if it changes, an alert message is sent to the developers responsible for the change, asking for information. The report appears on the BuildBot waterfall page.

Massive production/reconstruction validation does not fall under this kind of validation, and neither do user debugging utilities and tools.

Design

The validation procedure is based on a reference ROOT file containing a library of persistent objects. The reference validation objects, one for each Stage, contain the information one would like to monitor in order to detect that a certain Stage or Module has changed between releases.

The whole process is managed by a single module (SValidStore), which is responsible for access to the reference ROOT file and for invoking the validation object (ValidationObj). Each ValidationObj knows what kind of information is needed and how to fill the corresponding persistent object (TValidationObj) to be stored in the reference file.

(Diagram: the Sequence/Module feeds SValidStore, which writes a ValidationObj/TValidationObj pair to the reference ROOT file, or checks the current pair against it.)
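The write-or-check flow described above can be sketched as follows. This is a minimal illustration, not the Offline implementation: all class and method names are simplified stand-ins, and the reference is kept in memory instead of a ROOT file.

```cpp
#include <cmath>
#include <map>
#include <string>

// Hypothetical sketch: a validation object holds the monitored
// quantities; the store either writes them as the reference or checks
// the current values against a previously stored reference.
struct TValidationSketch {
  std::map<std::string, double> values;  // e.g. "Xmax", "Etot", ...
};

enum class Mode { Recreate, Update, Check };

class ValidStoreSketch {
 public:
  explicit ValidStoreSketch(Mode mode) : fMode(mode) {}

  // In the real module the reference lives in a ROOT file; here it is
  // kept in memory to illustrate the write-or-check logic only.
  bool Process(const TValidationSketch& current, double tolerance = 1e-6) {
    if (fMode != Mode::Check) {  // recreate/update: store as reference
      fReference = current;
      return true;
    }
    for (const auto& kv : current.values) {  // check: compare quantities
      auto it = fReference.values.find(kv.first);
      if (it == fReference.values.end() ||
          std::fabs(it->second - kv.second) > tolerance)
        return false;  // here the real module would alert the developers
    }
    return true;
  }

  void SetMode(Mode m) { fMode = m; }

 private:
  Mode fMode;
  TValidationSketch fReference;
};
```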

The SValidStore module (1)

The SValidStore module is the manager of the validation procedure. Its main task is to create the validation object and to perform actions on it, such as storing or checking. A module configuration XML file defines the parameters characterizing the algorithm. In the present version of the code this module has three parameters:

Filename: name of the file where the reference results are stored.
StoreObj: name of the validation object to be invoked in the current job.
Mode: mode of operation. Three modes are possible:
  recreate: write the reference file; if the file is not present a new file is created, otherwise the file is overwritten.
  update: add a new object to the reference file.
  check: compare the results obtained by the object in memory with the information previously stored in the reference file.
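A configuration with these three parameters might look like the following sketch. The element names here are assumptions for illustration; the actual Offline XML schema may use different tags.

```xml
<!-- Hypothetical sketch of an SValidStore configuration -->
<SValidStore>
  <Filename> FIOValidationRef.root </Filename>
  <StoreObj> FIOValidationObj </StoreObj>
  <Mode> check </Mode>  <!-- recreate | update | check -->
</SValidStore>
```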

The SValidStore module (2)

ValidationObj is the base class of the validation object. This class is responsible for filling the persistent information, which is kept in the corresponding persistent object TValidationObj. Since the filling depends strongly on the kind of object, all member functions are virtual and the implementation is deferred to the derived class.

TValidationObj is the base class of the persistent object. This class encapsulates all the information needed to check (validate) the reconstruction stage.

(Diagram: SValidStore holds a ValidationObj, which fills a TValidationObj that is written to, or checked against, the reference file.)
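The two base classes and a trivial derived pair can be sketched as below. The signatures are deliberately simplified (a plain double stands in for evt::Event, and the class names carry a "Sketch" suffix); the real interfaces live in the Offline framework and may differ.

```cpp
#include <cmath>

// Sketch of the transient/persistent base-class pair. All member
// functions are virtual so that each derived validation object can
// decide what to fill and how to compare.
class TValidationObjSketch {
 public:
  virtual ~TValidationObjSketch() {}
  // Compare this persistent object with one read from the reference
  // file; derived classes implement the actual criteria.
  virtual bool Validate(const TValidationObjSketch* ref) const = 0;
};

class ValidationObjSketch {
 public:
  virtual ~ValidationObjSketch() {}
  // Fill the persistent information from the current event (here a
  // plain double stands in for evt::Event).
  virtual bool Fill(double eventQuantity) = 0;
};

// A trivial derived pair monitoring a single quantity with a tolerance.
class TSingleQuantityObj : public TValidationObjSketch {
 public:
  double fValue = 0.0;
  bool Validate(const TValidationObjSketch* ref) const override {
    const auto* r = static_cast<const TSingleQuantityObj*>(ref);
    return std::fabs(fValue - r->fValue) < 1e-6;
  }
};

class SingleQuantityObj : public ValidationObjSketch {
 public:
  TSingleQuantityObj fPersistent;
  bool Fill(double eventQuantity) override {
    fPersistent.fValue = eventQuantity;  // store the monitored quantity
    return true;
  }
};
```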

Example: FIOValidation, how to… (1)

Step 1: Define a sequence to validate. A sequence for FD monocular reconstruction contains:
1. Event Extractor
2. Signal Analyzer
3. Pulse Finder
4. SDP Finder
5. Axis Finder
6. Profile Reconstructor
7. Energy Finder
Write the Module Sequence you want to validate, inserting the module SValidStore as the last one.

Step 2: Identify a set of significant variables to monitor:
SDPtheta: Shower Detector Plane normal, zenith
SDPphi: Shower Detector Plane normal, azimuth
Axistheta: shower axis theta
Axisphi: shower axis phi
Xmax: depth of the shower maximum
Eem: electromagnetic energy
Etot: total energy
Try to use variables covering the various Modules in order to better track down problems.

Example: FIOValidation, how to… (2)

Step 3: Implement the Fill() and Validate() methods.

FIOValidationObj.h:

```cpp
class FIOValidationObj : public ValidationObj {
  ...
  virtual bool Fill(const evt::Event& event);
  ...
};
```

TFIOValidationObj.h:

```cpp
class TFIOValidationObj : public TValidationObj {
  ...
  virtual bool Validate(TValidationObj* e);
  ...
};
```

The agreement between the sets of variables must be within the tolerances for the test to pass; note that the tolerances and the matching live in the Validate() implementation.

Step 4: Create the dictionary for the persistent object:

```shell
$ROOTSYS/bin/rootcint -f <fileDict.cc> -c <file.h>
```
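A sketch of what the Validate() comparison could look like for the variables from Step 2. The member names, the relative-agreement criterion, and the tolerance value are assumptions for illustration, not the actual Offline code.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical persistent object holding the seven monitored variables.
struct TFIOValidationSketch {
  double SDPtheta, SDPphi, Axistheta, Axisphi, Xmax, Eem, Etot;

  // Relative agreement test: every variable must match the reference
  // within the given relative tolerance for the validation to pass.
  bool Validate(const TFIOValidationSketch& ref, double relTol) const {
    const double cur[] = {SDPtheta, SDPphi, Axistheta, Axisphi,
                          Xmax, Eem, Etot};
    const double old[] = {ref.SDPtheta, ref.SDPphi, ref.Axistheta,
                          ref.Axisphi, ref.Xmax, ref.Eem, ref.Etot};
    for (int i = 0; i < 7; ++i) {
      // Guard against division by zero for reference values near 0.
      const double scale = std::max(std::fabs(old[i]), 1e-12);
      if (std::fabs(cur[i] - old[i]) / scale > relTol)
        return false;
    }
    return true;
  }
};
```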

Example: FIOValidation, how to… (3)

Step 5: A few modifications to SValidStore.cc to allow instantiation of the new object:

```cpp
#include "FIOValidationObj.h"
...
ValidationObj* SValidStore::Fill(const string& objName)
{
  ...
  if (objName == "FIOValidation" || objName == "FIOValidationObj")
    return new FIOValidationObj();
  ...
}
```

Step 6: Write the SValidStore.xml configuration file.

Step 7: Compile and run…

Scripting

A set of scripts has been written for the Validation. They allow one to run the validation, assuming that all the code is installed in the default way under the AUGER_BASE directory, and that the validation code and scripts are installed in a ValidationTests sub-directory of AUGER_BASE. All the scripts are written to run in the BuildBot environment.

The scripts:
1. create all the XML files needed to run the Offline with the validation module (ModuleSequence, bootstrap, EventFileReader, SValidStore, etc.);
2. compile and run the Offline and check the exit status. If the status is not equal to 0, a test failure is sent to BuildBot and reported on the waterfall web page.

Status

The ValidationObj implemented so far refer to:
- FReconstruction ModuleSequence (old, non-standard)
- SReconstruction ModuleSequence

The tests perform nightly checks of:
- FRec and SRec stability
- I/O (write the data in a different format than the one read, then reconstruct)

Automatic creation and use of logfile.xml is in place.

Documentation:
- GAP note: first draft in place
- Twiki page: first draft in place, mainly a cut-and-paste from the GAP

To do (?!)

Reconstruction side:
- Use of standard Module Sequences
- Reco for Hybrid
- Reco for FD
- Stereo reconstruction
- Introduction of histograms

Simulation side:
How to compare a reference simulation with a new one? It is not possible to define a few quantities that monitor a simulation chain (as is done for reconstruction): many different simulation modules are used for the same task, so implementing an FSimulation and an SSimulation ValidationObj is not enough. Moreover, irrelevant changes in the random sequence can produce relevant changes in the simulated showers. It is therefore necessary to implement a statistical comparison between the reference and the under-development simulation chains/modules: many showers must be generated under the same conditions to build the distributions of values to compare. But this implies a significant amount of CPU time for each test…
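The statistical comparison proposed above could, as a minimal sketch, compare the distributions of a monitored quantity (e.g. Xmax) obtained from many showers generated with each chain. Here a simple two-sample test on the means is shown; this helper is hypothetical and a real test might instead use a Kolmogorov-Smirnov or chi-square comparison.

```cpp
#include <cmath>
#include <vector>

// Mean and standard error of the mean of a sample (size >= 2).
static void MeanAndErr(const std::vector<double>& x,
                       double& mean, double& err) {
  mean = 0.0;
  for (double v : x) mean += v;
  mean /= x.size();
  double var = 0.0;
  for (double v : x) var += (v - mean) * (v - mean);
  var /= (x.size() - 1);            // sample variance
  err = std::sqrt(var / x.size());  // standard error of the mean
}

// True if the two samples' means agree within nSigma combined errors;
// a stand-in for the reference-vs-development shower comparison.
bool DistributionsCompatible(const std::vector<double>& ref,
                             const std::vector<double>& dev,
                             double nSigma = 3.0) {
  double mRef, eRef, mDev, eDev;
  MeanAndErr(ref, mRef, eRef);
  MeanAndErr(dev, mDev, eDev);
  const double sigma = std::sqrt(eRef * eRef + eDev * eDev);
  return std::fabs(mRef - mDev) <= nSigma * sigma;
}
```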

To do (?!)

Trigger:
- Fix some ad-hoc input tracks and check that the modules are reproducible.

I/O (from Tom Paul): check that new releases of the Offline can read files produced with older versions. How to approach this:
- Trigger the BuildBot build on EventIO changes.
- Input: a list of reference Events produced with different versions.
- A script running a read test.
- A script running the hybrid Simulation+Reconstruction and writing the Event, chained with some reco/sim tests.

Input file selection for specific Module Sequences?

Hybrid Sim Validatrix… (the architecture behind it is not fully understood; to be iterated again with Darko and Tom)

To do (?!)

Do we want to set up a BuildBot slave? If yes, it would be reasonable to have two configurations in place, corresponding to the two kinds of software suites we have on the farm (32- and 64-bit machines). Are different operating systems of interest? If yes, can these be put in place via virtual machines?

Documentation: the idea is to release the GAP note within one week, so that any following work has a clear reference.