24th May 2002
PARTICIPATION OF IFIC
Coordinated Project: Plan Nacional de Altas Energías y Grandes Aceleradores
IFIC (Instituto de Física Corpuscular), Universitat de València-CSIC

Contents
I.- Main Areas of Interest
II.- Infrastructure in Valencia
III.- GRID Activities
IV.- ATLAS Computing

I.- Main Areas of Interest
Fabric management and operation
– Automation
– Monitoring (a minimal monitoring sketch follows this slide)
Experiment-oriented use of the farm
– Data Challenges in ATLAS (Monte Carlo production)
– Data Analysis
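The slides list monitoring as an area of interest but do not name the tools used on the farm. The snippet below is only an illustrative sketch, assuming worker nodes are reachable over SSH and that the 1-minute load average can be read from /proc/loadavg; hostnames such as wn001 are invented placeholders, not the real naming scheme.

```python
#!/usr/bin/env python3
"""Toy farm-monitoring sketch: poll the 1-minute load average of worker nodes over SSH."""
from typing import Optional
import subprocess

# Hypothetical worker-node hostnames; the real farm naming scheme is not given in the slides.
NODES = ["wn%03d" % i for i in range(1, 23)]  # e.g. the 22 PCs of one rack
LOAD_WARNING = 2.0                            # arbitrary alarm threshold


def load_of(node: str) -> Optional[float]:
    """Return the 1-minute load average of *node*, or None if it cannot be read."""
    try:
        out = subprocess.run(
            ["ssh", "-o", "ConnectTimeout=5", node, "cat", "/proc/loadavg"],
            capture_output=True, text=True, timeout=15, check=True,
        ).stdout
        return float(out.split()[0])
    except (subprocess.SubprocessError, ValueError, IndexError):
        return None


if __name__ == "__main__":
    for node in NODES:
        load = load_of(node)
        if load is None:
            print(f"{node}: unreachable")
        elif load > LOAD_WARNING:
            print(f"{node}: WARNING, load = {load:.2f}")
        else:
            print(f"{node}: ok, load = {load:.2f}")
```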

II.- Infrastructure at IFIC
GoG (Grupo de Ordenadores para el GRID) farm: Athlon PCs (134 IFIC + 58 ICMOL)
– Motherboard: VIA KT133A & KT266A based
– CPU: AMD Athlon 1.2 & 1.4 GHz
– RAM: 2x512 Mbytes (SDRAM & DDR SDRAM)
– HD: 40 Gbytes
– NIC: 3COM 905CX & RealTek RTL8139 (Fast Ethernet + PXE; a network-boot sketch follows this slide)
– 2U chassis
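The NICs support PXE, which in a typical setup means the nodes network-boot via DHCP and TFTP for automated installation. The slides do not describe the actual installation procedure at IFIC; the snippet below is a hypothetical sketch that generates ISC dhcpd host entries for such a setup, with hostnames, MAC addresses and IP values that are all invented placeholders.

```python
# Hypothetical generator of ISC dhcpd host entries for PXE-booting worker nodes.
# Hostnames, MAC addresses and the IP range are placeholders, not the real farm values.

def dhcp_entry(hostname: str, mac: str, ip: str) -> str:
    """Return one dhcpd.conf host stanza enabling PXE boot for a node."""
    return (
        f'host {hostname} {{\n'
        f'  hardware ethernet {mac};\n'
        f'  fixed-address {ip};\n'
        f'  filename "pxelinux.0";\n'  # boot loader served over TFTP
        f'}}\n'
    )


if __name__ == "__main__":
    # Example: three nodes of a 22-PC rack with made-up MACs in a private network.
    nodes = [
        ("wn001", "00:11:22:33:44:01", "10.0.0.1"),
        ("wn002", "00:11:22:33:44:02", "10.0.0.2"),
        ("wn003", "00:11:22:33:44:03", "10.0.0.3"),
    ]
    for name, mac, ip in nodes:
        print(dhcp_entry(name, mac, ip))
```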

GoG Farm
9 racks (800x600 mm)
22 PCs + 1 network switch per rack
2U chassis with 3 fans
– pros: less space required
– cons: adds approx. 240 euros per PC; heat concentration (a rough cost estimate follows this slide)
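For rough scale, the per-PC surcharge can be totalled over the farm. Applying the 240-euro figure to all 134 + 58 PCs is an assumption; the slides do not give a total.

```python
# Back-of-the-envelope extra cost of the 2U chassis choice. The 240-euro surcharge and the
# PC counts come from the slides; applying the surcharge to every PC is an assumption.
IFIC_PCS = 134
ICMOL_PCS = 58
SURCHARGE_EUR = 240  # approximate extra cost per 2U-chassis PC

total_pcs = IFIC_PCS + ICMOL_PCS
print(f"{total_pcs} PCs x {SURCHARGE_EUR} EUR ~= {total_pcs * SURCHARGE_EUR} EUR extra")
# -> 192 PCs x 240 EUR ~= 46080 EUR extra
```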

GoG Farm (photo)

GoG Farm (front view)

GoG Farm (back view)

Local Network (diagram)

Local Network
All worker nodes have private IP addresses in a /22 network
– secure environment
– administrative independence from the University network management staff
Communication equipment has public IP addresses in a /26 network
– can be monitored and upgraded from the University's centralized service
(an addressing sketch follows this slide)
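The slides give only the /22 and /26 masks, not the actual network prefixes. The sketch below uses placeholder addresses purely to illustrate how many hosts each mask accommodates.

```python
import ipaddress

# Placeholder prefixes: the real IFIC and University prefixes are not shown in the slides.
worker_net = ipaddress.ip_network("10.0.0.0/22")   # private addressing for worker nodes
switch_net = ipaddress.ip_network("192.0.2.0/26")  # public addressing for network equipment

# A /22 leaves room for up to 1022 hosts, a /26 for up to 62 hosts.
print(worker_net.num_addresses - 2)  # 1022
print(switch_net.num_addresses - 2)  # 62
```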

Infrastructure in Valencia: Human Resources
5 staff members + 1 postdoc (EDP = 3.5 full-time equivalents)
Support from the Computing Section of IFIC
2 new contracts in the next two months

III.- GRID Activities
In DataGrid: involved in WP6 (Testbed and Demonstrators)
In CrossGrid:
– HEP Data Analysis (CrossGrid Application Development)
– New GRID Services and Tools (Resource Management and Integration)
– International Testbed Organisation

IV.- ATLAS Computing
Data Challenges, involved in Phase 1: full chain implemented and tested at IFIC
– Chain: event generator → detector simulation → reconstruction → analysis
– Files: dijets and Z+jets (10 kevts)
Computing resources available from June to December: 100 PCs (4 kSpecInt95)
Massive production (10^7 events for the whole ATLAS Collaboration): with EDG 1.2 by the end of June
(a rough throughput estimate is sketched below)
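To put the numbers in perspective, the sketch below estimates how long a campaign of this size would take on the quoted 100-PC farm. The per-event simulation time is an assumption chosen only for illustration; the slides give no timing figures, and IFIC would in any case produce only a share of the full 10^7 events.

```python
# Rough production-time estimate (illustrative only; the per-event time is an assumption).
TOTAL_EVENTS = 10**7   # full ATLAS Data Challenge sample (all sites combined)
FARM_CPUS = 100        # PCs quoted as available at IFIC from June to December
SEC_PER_EVENT = 100    # assumed full-simulation time per event per CPU (not from the slides)

cpu_seconds = TOTAL_EVENTS * SEC_PER_EVENT
wall_days = cpu_seconds / FARM_CPUS / 86400
print(f"{cpu_seconds:.2e} CPU-seconds ~= {wall_days:.0f} days of wall time on {FARM_CPUS} CPUs")
# -> 1.00e+09 CPU-seconds ~= 116 days of wall time on 100 CPUs
```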