ATLAS Pre-packaged
Kaushik De, University of Texas at Arlington
UTA Grid Workshop, April 2002

Need for a Testbed Application
- Need an ATLAS application for a real test of the Grid computing model
- Should be easy to deploy on thousands of machines
- Should be useful for physicists
- Should test the virtual data model
- Explored various packages currently available
- Explored making our own package

CERNLib Package
- Necessary for analysis or builds
- The ATLAS Kit for the GRID Testbed recommends the RPM package for CERNLib: /common/RPMS/cern i386.rpm
  - 289 MB file, easy to install
  - but it does not work! Environment variables have to be set by hand
- Or use cernlib i386.rpm, found at linux/dist/redhat/6.2/i386/RedHat/RPMS/cernlib
- Or install 6 separate packages from comps=/redhat/6.2/i386/RedHat/base/comps&group=cernlib
- Or get it from the CERN or BNL AFS cell: afs/cern.ch/asis/RPMS/i386_redhat61/CERN.LIB_*
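The slide notes that after installing the large RPM, the environment has to be set by hand. A minimal sketch of what that setup typically looks like, assuming the conventional CERNLIB variable names; the /cern prefix and the 2001 level are assumptions, so adjust them to wherever the RPM actually installs:

```shell
#!/bin/sh
# Hand-set the CERNLIB environment after installing the RPM.
# /cern and the "2001" level are assumed paths, not taken from the slide.
export CERN=/cern
export CERN_LEVEL=2001
export CERN_ROOT=$CERN/$CERN_LEVEL
export PATH=$CERN_ROOT/bin:$PATH
export LD_LIBRARY_PATH=$CERN_ROOT/lib:$LD_LIBRARY_PATH

echo "CERN_ROOT=$CERN_ROOT"
```

Sourcing a script like this in each login shell (or from the job wrapper) stands in for the manual setup the broken RPM omits.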

ATLAS_1.3.0_kit.tar.gz
- Advertised as the ATLAS application for generic GRID use
- Not a binary release!
- Contains SRT, CLHEP and libraries
- "main Atlas Applications: DiceMain, DicePytMain, AtreconMain"
- Tried under RH 7.1: failed miserably
  - SRT cannot even determine the machine type
  - Gave up after debugging at least a dozen problems; the build still fails
- Tried under RH 6.1
  - Only produced the DiceMain executable; the other two builds crashed horribly
  - Gave up after many hours of debugging

Other Possibilities
- EDG Release
  - Withdrawn in anticipation of a new version release!
  - Many activities in the various WP groups; we need to identify a person from our group to monitor them so that we do not duplicate effort
- Julian Phillips' release
  - Attempted to put it in an isolated cage
  - Reduced the library list from ~290 to <100
  - Works under RH; had to get one more library
- My package
  - Closed cage, cleaned up
  - Wrote Globus scripts
  - Here's a demo!
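The Globus scripts themselves are not shown on the slide. As a hedged sketch of what a Globus 2.x-era submission wrapper might look like, assuming the standard `globus-job-run` client; the gatekeeper host and executable path below are hypothetical placeholders, not the actual testbed values:

```shell
#!/bin/sh
# Hypothetical wrapper around globus-job-run (Globus 2.x style).
# GATEKEEPER and EXE are placeholder values, not from the slide.
GATEKEEPER=${GATEKEEPER:-"heppc1.uta.edu/jobmanager-fork"}
EXE=${EXE:-"/share/atlas/bin/DiceMain"}

# Build the command line; echo it rather than running it here,
# since real submission first requires a valid proxy (grid-proxy-init).
CMD="globus-job-run $GATEKEEPER $EXE"
echo "$CMD"
```

Swapping `jobmanager-fork` for `jobmanager-condor` would hand the job to a local Condor pool instead of running it directly on the gatekeeper node.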

What Next!
- Job parameter handling
- Condor issues
  - Choose the best node to submit the job to
- File cataloguing
- Discussion tomorrow
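On the Condor side, "choose best node" is normally expressed through the `requirements` and `rank` expressions of a submit description file. A sketch of a vanilla-universe submit file, assuming standard Condor machine ClassAd attributes; the executable path is a hypothetical placeholder:

```
# Hypothetical Condor submit file; the executable path is a placeholder.
universe     = vanilla
executable   = /share/atlas/bin/DiceMain
output       = dice.$(Process).out
error        = dice.$(Process).err
log          = dice.log
# Only match Linux/Intel machines.
requirements = (OpSys == "LINUX") && (Arch == "INTEL")
# Among matching machines, prefer the fastest (highest rank wins).
rank         = KFlops
queue 1
```

The `rank` expression is where the "best node" policy lives: Condor sorts the matching machines by it and sends the job to the highest-ranked one.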