1 Technical Validation
The Technical Validation is a testing framework of the AUGER Offline used to monitor the code development process. It is not a validation of the quality of the Offline itself: it monitors that a subset of physical quantities (variables) does not change during the development process. The Technical Validation environment uses BuildBot as an automated testing framework to rebuild and test the tree automatically and to run the builds on a variety of platforms. The validation procedure is based on a reference ROOT file containing a library of persistent objects. The reference validation objects contain the information one wants to monitor in order to detect whether a certain Stage or Module has changed between releases. Documentation: GAP2009_132 + Twiki.
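
As a hedged sketch of how such a monitor could compare one quantity against the reference ROOT file, using PyROOT (the object name "energyEstimate", the file names, and the tolerance are illustrative assumptions, not the actual Offline conventions):

```python
import math

import ROOT  # PyROOT; requires a local ROOT installation


def quantity_unchanged(ref_path, cur_path, name="energyEstimate", rel_tol=1e-9):
    """Return True if the monitored object's mean is unchanged, within
    rel_tol, between the reference file and the current build's file."""
    ref = ROOT.TFile.Open(ref_path)
    cur = ROOT.TFile.Open(cur_path)
    h_ref = ref.Get(name)
    h_cur = cur.Get(name)
    if not h_ref or not h_cur:
        raise RuntimeError(f"monitored object '{name}' is missing")
    return math.isclose(h_ref.GetMean(), h_cur.GetMean(), rel_tol=rel_tol)


# Example use (hypothetical file names):
# print(quantity_unchanged("reference_validation.root", "current_build.root"))
```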

2 ValidationTests
The BackCompatibility test started from this main idea: we need to check that new releases of Offline can read files produced with older versions. How to approach this:
- Trigger the BuildBot build on EventIO changes (a sketch of such a change filter follows below).
- As input, a list of reference Events written with different versions.
- A script running a read test.
- A script running the hybrid Simulation+Reconstruction, writing the Event, plus a reco/sim test.
(Slide diagram: I/O files written by Code 1 ... Code N-1 and Code DEV for TAG 1 ... TAG N-1 and DEV, read back through the Sim/Rec chain against the reference.)
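
A minimal sketch of the kind of change filter that could restrict the trigger to commits touching EventIO (the path marker and the wiring into BuildBot are assumptions; the real repository layout and BuildBot configuration may differ):

```python
def touches_event_io(changed_files, marker="EventIO/"):
    """Return True if any file in a commit lies under the EventIO area,
    i.e. the commit should trigger the BackCompatibility build."""
    return any(marker in path for path in changed_files)


# Example: a commit touching only documentation would not trigger the build.
# touches_event_io(["doc/README"])               -> False
# touches_event_io(["src/evt/EventIO/File.cc"])  -> True
#
# Hypothetical wiring: BuildBot schedulers accept a fileIsImportant callable,
# e.g. lambda change: touches_event_io(change.files).
```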

3 ValidationTest BackCompatibility(I)
The goal: check that new releases of Offline can read files produced with older versions. The problem arises mainly in relation to EventIO changes, so the BuildBot build and tests are triggered by EventIO changes.
The script for the BackCompatibility test contains (a sketch of such a script follows below):
A Running-Executing part, where it:
1. Builds StandardApplications/HdSimulationReconstruction.
2. Runs the StandardApplication using Corsika files as input.
3. Writes the output in Offline format into a directory whose name contains the date of the run. (A different naming scheme can be agreed on; e.g. the name could contain the svn version number.)
A Reading part, where it:
1. Iteratively accesses all the directories containing Offline files produced in point 3 of the Running-Executing part.
2. Builds and executes a StandardApplications/HdMCReconstruction, saving the output in a log (we can stream only the StreamerCheck errors out of it).
(On the original slide, parts already in place and possible modifications were distinguished; the modifications were shown in italic.)
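
A hedged sketch of how such a driver script could be structured. The build and run commands, option names, and paths are placeholders standing in for the real StandardApplications build and run steps:

```python
#!/usr/bin/env python3
"""Sketch of the BackCompatibility driver: Running-Executing part + Reading part."""
import datetime
import glob
import os
import subprocess

WORKDIR = "backcompat"  # hypothetical working area


def running_executing_part():
    """Build HdSimulationReconstruction, run it on a Corsika file, and write
    the Offline output into a directory named after today's date."""
    outdir = os.path.join(WORKDIR, datetime.date.today().isoformat())
    os.makedirs(outdir, exist_ok=True)
    # Hypothetical build and run commands; the real ones come from the
    # StandardApplications build system and module configuration.
    subprocess.run(
        ["make", "-C", "StandardApplications/HdSimulationReconstruction"],
        check=True)
    subprocess.run(
        ["./HdSimulationReconstruction", "--input", "corsika/DAT000001",
         "--output", os.path.join(outdir, "event.root")],
        check=True)


def reading_part():
    """Run HdMCReconstruction over every dated output directory and keep only
    the StreamerCheck errors from the log."""
    for outdir in sorted(glob.glob(os.path.join(WORKDIR, "*"))):
        for event_file in glob.glob(os.path.join(outdir, "*.root")):
            result = subprocess.run(
                ["./HdMCReconstruction", "--input", event_file],
                capture_output=True, text=True)
            for line in result.stdout.splitlines() + result.stderr.splitlines():
                if "StreamerCheck" in line:
                    print(f"{event_file}: {line}")


if __name__ == "__main__":
    running_executing_part()
    reading_part()
```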

4 ValidationTest BackCompatibility(II)
TO DO list:
- Add to the script the svn commit of the outputs produced in the Running-Executing part (3). (Automatic or semi-automatic, and under what conditions?)
- A less trivial reading test. We tried a more specific reading test in order to avoid the too-trivial streaming out of StreamerCheck errors proposed in the Reading part (2). What we did:
1. Built an ad hoc HdReconstruction whose ModuleSequence contains the whole reconstruction part of HdSimulationReconstruction and uses as input the Offline files produced in the Running-Executing part (3).
2. The reconstruction should give an equal result within machine precision (see the comparison sketch below). The problem is that the results are different! Moreover, if we output FdRaw during the Running-Executing part (3), the FdCalibratorOG module does not allow the reconstruction at all.
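
As an illustration of the "equal within machine precision" check in item 2, an event-by-event comparison could look like the following sketch; the tree name, branch name, and tolerance are hypothetical, not the actual Offline output schema:

```python
import math

import ROOT  # PyROOT


def reconstruction_unchanged(ref_path, new_path, tree_name="recData",
                             branch="energy", rel_tol=1e-12):
    """Compare one reconstructed quantity event by event between two Offline
    output files; rel_tol approximates double machine precision."""
    ref_file = ROOT.TFile.Open(ref_path)
    new_file = ROOT.TFile.Open(new_path)
    ref_tree = ref_file.Get(tree_name)
    new_tree = new_file.Get(tree_name)
    mismatches = 0
    for i in range(min(ref_tree.GetEntries(), new_tree.GetEntries())):
        ref_tree.GetEntry(i)
        new_tree.GetEntry(i)
        a = getattr(ref_tree, branch)
        b = getattr(new_tree, branch)
        if not math.isclose(a, b, rel_tol=rel_tol):
            mismatches += 1
            print(f"event {i}: {branch} differs ({a} vs {b})")
    return mismatches == 0
```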

5 ValidationTest BackCompatibility(III)
How to stage the work done up to now?
STEP 1 (immediate): Everything in BackCompatibility(I) can be released. We need to define/agree on the naming and on what to stream out from the reading test.
STEP 2: Commit to the svn repository. Include the automatic(?) filling of the svn repository in the script (a sketch follows below).
STEP 3: The less trivial reading test (still to be understood).
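
For STEP 2, a minimal sketch of how the script could add and commit the dated output directory to svn; the guard conditions, repository layout, and example paths are assumptions still to be agreed on:

```python
import subprocess


def commit_outputs(outdir, message):
    """Add the dated output directory to svn and commit it; assumes the
    working area is already a checkout of the agreed reference repository."""
    subprocess.run(["svn", "add", "--force", outdir], check=True)
    subprocess.run(["svn", "commit", "-m", message, outdir], check=True)


# Example (hypothetical path and message):
# commit_outputs("backcompat/2010-03-15", "BackCompatibility outputs, current tag")
```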

