1 SRB @ CC-IN2P3. Jean-Yves Nief, CC-IN2P3. KEK-CCIN2P3 meeting on Grids, September 11th-12th, 2006.

2 Overview.
3 SRB servers:
–1 Sun V440, 1 Sun V480 (UltraSPARC III), 1 Sun V20z (AMD Opteron).
–OS: Solaris 9 and 10.
–Total disk space: ~ 8 TB.
–HPSS driver (non-DCE): 2003. Using HPSS 5.1.
MCAT:
–Oracle 10g.
Environment with multiple OSes for clients or other SRB servers:
–Linux: RedHat, Scientific Linux, Debian.
–Solaris.
–Windows.
–Mac OS.
Interfaces:
–Scommands invoked from the shell (scripts built on them; see the sketch below).
–Java APIs.
–Perl APIs.
–Web interface mySRB.
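
Not part of the original slide: a minimal sketch of how a shell-driven workflow built on the Scommands (Sinit, Sput, Sls, Sexit) could be wrapped. It assumes an SRB client installation with an already configured ~/.srb environment; the file name is made up.

    # Hypothetical wrapper around the SRB Scommands; assumes Sinit can find
    # a configured ~/.srb environment pointing at the MCAT.
    import subprocess

    def srb(cmd, *args):
        """Run one Scommand and fail loudly if it returns non-zero."""
        result = subprocess.run([cmd, *args], capture_output=True, text=True)
        if result.returncode != 0:
            raise RuntimeError(f"{cmd} failed: {result.stderr.strip()}")
        return result.stdout

    srb("Sinit")                                # open an SRB session
    srb("Sput", "run001.data", "run001.data")   # upload into the current collection
    print(srb("Sls"))                           # list the current collection
    srb("Sexit")                                # close the session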

3 Who is using SRB @ CC-IN2P3 ? (In green on the original slide = pre-production.)
High Energy Physics:
–BaBar (SLAC, Stanford).
–CMOS (International Linear Collider R&D).
–Calice (International Linear Collider R&D).
Astroparticle:
–Edelweiss (Modane, France).
–Pierre Auger Observatory (Argentina).
Astrophysics:
–SuperNovae Factory (Hawaii).
Biomedical applications:
–Neuroscience research.
–Mammography project.
–Cardiology research.

4 BaBar, SLAC & CC-IN2P3.
BaBar: High Energy Physics experiment located close to Stanford (California).
SLAC and CC-IN2P3 were the first sites opened to the BaBar collaborators for data analysis.
Both held complete copies of the data (Objectivity). Now only SLAC holds a complete copy of the data.
Natural candidates for testing and deployment of grid middleware.
Data should be available within 24 to 48 hours.
SRB: chosen for the distribution of hundreds of TB of data.

5 SRB BaBar architecture.
[Diagram: two SRB zones (SLAC + Lyon). At SLAC (Stanford, CA), SRB servers and an MCAT sit in front of HPSS/SLAC; at CC-IN2P3 (Lyon), SRB servers and an MCAT sit in front of HPSS/Lyon. Data flows between the sites in steps (1), (2), (3).]

6 Extra details (BaBar).
Hardware:
–SUN servers (Solaris 5.8, 5.9, 5.10): NetraT, V240, V440, V480, V20z.
Software:
–Oracle 10g for the SLAC and Lyon MCATs.
MCATs synchronization: only users and physical resources.
Comparison of the MCATs' contents to decide which data to transfer (sketched below).
Steps (1), (2), (3) multithreaded under client control: very little latency.
Advantage:
–An external client can pick up data from SLAC or Lyon without interacting with the other site.
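
An illustrative sketch (not the actual BaBar tools) of the catalogue-comparison step, assuming both zones' collections are visible to the client and that Sls/Scp behave as plain list/copy commands; the collection paths are made up.

    # Illustrative sketch: ship to Lyon whatever the SLAC catalogue lists
    # but the Lyon catalogue does not. Not the production BaBar procedure.
    import subprocess

    def srb_listing(collection):
        """Set of names that Sls reports for one SRB collection."""
        out = subprocess.run(["Sls", collection], capture_output=True,
                             text=True, check=True).stdout
        return {line.strip() for line in out.splitlines() if line.strip()}

    slac = srb_listing("/slac.zone/babar/runs")   # made-up collection paths
    lyon = srb_listing("/lyon.zone/babar/runs")

    for name in sorted(slac - lyon):
        # In production, steps (1)-(3) (stage from HPSS/SLAC, wide-area transfer,
        # store into HPSS/Lyon) are multithreaded; a single Scp stands in here.
        subprocess.run(["Scp", f"/slac.zone/babar/runs/{name}",
                        f"/lyon.zone/babar/runs/{name}"], check=True)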

7 Overall assessment for BaBar.
A lot of time saved in developing applications thanks to SRB.
Transparent access to data:
–Very useful in a hybrid environment (disk, tape).
–Easy to scale the service (adding/removing servers on the fly).
–Client applications are not affected by changes of physical location.
Fully automated procedure on both sides.
Easy for SLAC to recover corrupted data.
300 TB (530,000 files) shipped to Lyon.
Up to 3 TB/day from tape to tape (minimum latency); now going up to 5 TB/day (see the back-of-envelope rates below).
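
For scale (derived from the figures above, not stated on the original slide), the quoted volumes and rates correspond roughly to:
\[
\frac{300\ \text{TB}}{530{,}000\ \text{files}} \approx 570\ \text{MB per file}, \qquad
\frac{3\ \text{TB}}{86{,}400\ \text{s}} \approx 35\ \text{MB/s}, \qquad
\frac{5\ \text{TB}}{86{,}400\ \text{s}} \approx 58\ \text{MB/s sustained}.
\]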

8 ESnet traffic with one server on both sides (April 2004).
[Chart of top ESnet flows shown on the slide:]
–Fermilab (US) → CERN.
–SLAC (US) → IN2P3 (FR): 1 Terabyte/day.
–SLAC (US) → INFN Padova (IT).
–Fermilab (US) → U. Chicago (US).
–CEBAF (US) → IN2P3 (FR).
–INFN Padova (IT) → SLAC (US).
–U. Toronto (CA) → Fermilab (US).
–Helmholtz-Karlsruhe (DE) → SLAC (US).
–DOE Lab → DOE Lab.
–SLAC (US) → JANET (UK).
–Fermilab (US) → JANET (UK).
–Argonne (US) → Level3 (US).
–Argonne → SURFnet (NL).
–IN2P3 (FR) → SLAC (US).
–Fermilab (US) → INFN Padova (IT).

9 CMOS, Calice: ILC.
[Diagram: CMOS data (5 to 10 TB/year) flows from IReS (Strasbourg) to SRB at CC-IN2P3 (2 TB of disk) backed by HPSS/Lyon; Calice data (2 to 5 TB/year) flows from user PCs to SRB at CC-IN2P3, also backed by HPSS/Lyon.]

10 SuperNovae Factory.
Telescope data stored into SRB, processed in Lyon (almost online).
Collaborative tool + backup (files exchanged between French and US users).
[Diagram: the Hawaii telescope sends a few GB/day to SRB at CC-IN2P3, backed by HPSS/Lyon; a second SRB backed by HPSS/NERSC at Berkeley is a project.]
SRB needed for the « online »!

11 Neuroscience research.
[Diagram of the acquisition chain: a Siemens MAGNETOM Sonata Maestro Class 1.5 T MRI scanner and its Siemens Celsius Xeon console (Windows NT); DICOM acquisition and export to a Dell PowerEdge 800 PC; data then leaves as DICOM via FTP, file sharing, etc.]

12 Neuroscience research (II).
Goal: make SRB invisible to the end user.
More than 500,000 files registered.
Data pushed from the Lyon and Strasbourg hospitals:
–Automated procedure including anonymization (sketched below).
Now interfaced within the MATLAB environment.
~ 1.5 FTE for 6 months…
Next step:
–Ever-growing community (a few TB / year).
Goal:
–Join the BIRN network (US biomedical network).
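
The slide only states that the push is automated and includes anonymization; below is a hypothetical sketch of that shape. The anonymization step is a placeholder (a real tool rewrites the DICOM patient tags), and the SRB collection path is made up.

    # Hypothetical "anonymize then register" pipeline, in the spirit of the slide.
    import pathlib, shutil, subprocess

    def anonymize(src: pathlib.Path, dst: pathlib.Path) -> None:
        """Placeholder: copy the file; a real tool would blank PatientName etc."""
        shutil.copyfile(src, dst)

    incoming = pathlib.Path("incoming")      # files exported by the hospital PC
    staged = pathlib.Path("staged")
    staged.mkdir(exist_ok=True)

    for dicom_file in incoming.glob("*.dcm"):
        clean = staged / dicom_file.name
        anonymize(dicom_file, clean)
        # Register the anonymized copy into SRB (collection path is made up).
        subprocess.run(["Sput", str(clean), f"/neuro/raw/{clean.name}"], check=True)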

13 Mammography.
Database of X-ray pictures (Florida) stored into SRB:
–Reference pictures of various types of breast cancers.
To analyze an X-ray picture of a breast:
–Submit a job within the EGEE framework.
Compare it with the pictures in the reference database:
–Picked up from SRB (see the job-payload sketch below).
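
An illustrative payload for such an analysis job (not the actual EGEE job or the real image analysis): pull the reference mammograms out of SRB, then score the patient image against each of them. The collection path, file names, and the similarity metric are all made up.

    # Illustrative job payload: fetch references from SRB, compare, report.
    import pathlib, subprocess

    def similarity(a: bytes, b: bytes) -> float:
        """Placeholder metric; the real analysis is proper image processing."""
        matches = sum(x == y for x, y in zip(a, b))
        return matches / max(len(a), len(b), 1)

    # Collection path is made up; -r (recursive get) is assumed to be available.
    subprocess.run(["Sget", "-r", "/mammo/reference"], check=True)

    patient = pathlib.Path("patient.img").read_bytes()
    scores = {ref.name: similarity(patient, ref.read_bytes())
              for ref in pathlib.Path("reference").iterdir()}
    print("closest reference:", max(scores, key=scores.get))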

14 Cardiology.
PACS (the hospital's internal information system): invisible from the outside world.
Being interfaced with SRB at the hospital using the SRB/DICOM driver (thanks to CR4I, Italy!).
PACS data published into SRB, anonymized on the fly.
Possibility to exchange data in a secure way.
[Diagram: SRB at the Lyon hospital, in front of the PACS, connected to CC-IN2P3.]
Deployed, but needs more testing.

15 GGF Data Grid Interoperability Demonstration.
Goals:
–Demonstrate federation of 14 SRB data grids (shared name spaces).
–Demonstrate authentication, authorization, shared collections, remote data access.
–CC-IN2P3 is part of it.
Organizers:
–Erwin Laure (Erwin.Laure@cern.ch)
–Reagan Moore (moore@sdsc.edu)
–Arun Jagatheesan (arun@sdsc.edu)
–Sheau-Yen Chen (sheauc@sdsc.edu)

16 GGF Data Grid Interoperability Demonstration (II). A few tests with KEK, RAL, IB (UK + New Zealand).

17 Summary.
Lightweight administration for the entire system.
Fully automated monitoring of the system health (a sketch follows this slide).
For each project:
–Training of the project's administrator(s).
–Proposing the architecture.
–User support and « consulting » on SRB.
Different projects = different needs; various aspects of SRB are used.
Over 1 million files in some catalogs very soon.
More projects coming to SRB:
–Auger: CC-IN2P3 as Tier 0, import from Argentina, distribution of real data and simulation.
–1 MegaStar project (Eros, astro): usage of the HDF5 driver?
–BioEmergence.
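
The slide does not say how the automated monitoring works; below is a minimal hypothetical health check in that spirit, probing the catalogue with a cheap Scommand. The probe collection and the alerting are made up.

    # Hypothetical health check: probe SRB with Sls and report failures.
    import subprocess, time

    def srb_alive(probe_collection="/home/srbAdmin"):
        """True if Sls answers for a known collection within 30 s (path made up)."""
        try:
            subprocess.run(["Sls", probe_collection], capture_output=True,
                           timeout=30, check=True)
            return True
        except (subprocess.TimeoutExpired, subprocess.CalledProcessError):
            return False

    while True:
        if not srb_alive():
            print(time.strftime("%Y-%m-%d %H:%M:%S"), "SRB probe failed, alert the admin")
        time.sleep(300)   # probe every five minutes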

18 What's next ?
Monitoring tools of the SRB systems are needed for the users (like Adil and Roger Downing did for CCLRC).
Build with Adil some kind of European forum on SRB:
–Already contacts in Italy, the Netherlands, Germany.
–Gather everybody's experience with SRB.
–Pool the tools and scripts developed.
–Adil will host the first meeting in the UK.
–Big party in his new apartment: everybody welcome!
SRB-DSI.

19 Involvement in iRODS.
Many possibilities, some examples:
–Interface with MSS: HPSS driver. Improvement of the compound resources (rules for migration, etc.). Mixing compound and logical resources. Containers (see Adil @ CCLRC).
–Optimization of the transfer protocol on long-distance networks with respect to SRB (?).
–Database performance (RCAT, DAI).
–Improvement of data encryption services.
–Web interface (PHP?).

