Atlas Grid Status - part 1 Jennifer Schopf ANL U.S. ATLAS Physics and Computing Advisory Panel Review Argonne National Laboratory Oct 30, 2001.



Globus/ATLAS Interactions
[Diagram relating ATLAS efforts to Globus components: GRAPPA (Gardner), Grid Data Access (Malon), Magda (Wenaus), GridView and other monitoring (De, Schopf, Yu), and PacMan packaging (Youssef), built on Condor(-G), GRAM, GSI, MDS/GIIS/GRIS, GridFTP, the Replica Catalog, and the Replica Manager.]

Globus Toolkit
- Core protocols and services
  - Grid Security Infrastructure (GSI)
  - Grid Resource Access & Management (GRAM)
  - MDS information service & monitoring
  - GridFTP data access & transfer
- Data Grid technologies
  - Replica catalog, replica management service
  - Reliable file transfer
- De facto standard for Grid projects: GriPhyN, PPDG, NEES, EU DataGrid, ESG, Fusion Collaboratory, DISCOM, NASA IPG, NSF TeraGrid, DOE Science Grid, UK Grid Center, U.S. GRIDS Center, Access Grid, GridPort, MPICH-G2, Condor-G, GrADS, and others

Globus Status
- New release
  - In alpha4 now, beta before SC '01 (Nov 15), 4Q01 release
  - New packaging: enables modular binary and source distributions
  - GRAM 1.5 (job submission): enhanced robustness
  - MDS 2.1 (information service): security, better performance, etc.
  - GridFTP: secure large-file transfer
  - Replica Management: data management, catalogs for replicas
- Continuing work
  - Community Authorization Service
  - Reliable file transfer
  - Java and other "Commodity Grid" toolkits

ATLAS GriPhyN/iVDGL
- GriPhyN funded Fall 2000: $11.9M over 5 years
- iVDGL funded Fall 2001: $13.65M over 5 years
- Both involve ATLAS, CMS, LIGO, and SDSS
- ATLAS support: 2.5 FTE at IU, 1.5 FTE at BU, $331K/5yr IU hardware
- R. Gardner is the ATLAS lead; J. Schopf is the CS liaison
- The GriPhyN proposal emphasizes virtual-data requirements and collaboration between experiments
- The iVDGL proposal emphasizes testbed and infrastructure issues across experiments

GriPhyN
- Principal ATLAS GriPhyN/iVDGL deliverables:
  - 2001: Testbed with GriPhyN VDT 1.0, packaged with PacMan
  - 2002: Serving DC1 data
  - 2003: Dataset re-creation / data signature
- Additional efforts:
  - Monitoring: Dantong Yu (BNL) and J. Schopf (ANL), joint with PPDG
  - GRAPPA: Rob Gardner and Randy Bramley (IU)

GriPhyN Testbed Issues
- GriPhyN is defining VDT 1.0
  - Software install for GriPhyN/PPDG, compatible with EDG as well
  - Globus 2.0 beta when it is released (10/30)
    - GSI, GridFTP, MDS, replica catalog, etc. (GRAM 1.5)
  - GDMP 2.0 (supports flat files)
  - Condor (also Condor-G, DAGMan)
- Extra tools for ATLAS
  - Objectivity 6.1
  - Magda
- Still need to resolve the CA issue between EDG and US test sites

GriPhyN ATLAS Goal 1: Serving DC1 Data (July-Dec 2002)
- Limited reconstruction analysis job using a grid job-submission interface
- Serving the data results from DC1
  - As part of DC1, data must be tagged with metadata for ease of access
    - Minimal keywords would be sufficient
    - Magda already implements portions of this
- Job submission with minimal smarts
  - Extend GRAPPA work
  - Move compute resources to data sites
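The keyword tagging mentioned above can be sketched as a small catalog mapping logical file names to keyword sets. This is purely illustrative: the class and field names are hypothetical, not Magda's actual interface or the DC1 metadata schema.

```python
# Minimal sketch of keyword-based metadata tagging for dataset files.
# Hypothetical names throughout; not Magda's real API.

class MetadataCatalog:
    def __init__(self):
        self._tags = {}  # logical file name -> set of keywords

    def tag(self, lfn, *keywords):
        self._tags.setdefault(lfn, set()).update(keywords)

    def find(self, *keywords):
        """Return logical file names carrying all of the given keywords."""
        wanted = set(keywords)
        return sorted(lfn for lfn, tags in self._tags.items()
                      if wanted <= tags)

catalog = MetadataCatalog()
catalog.tag("dc1.0001.evgen.root", "dc1", "evgen", "higgs")
catalog.tag("dc1.0002.recon.root", "dc1", "recon", "higgs")
print(catalog.find("dc1", "higgs"))
```

Even a flat keyword scheme like this supports the "minimal keywords would be sufficient" goal: queries narrow by intersecting tag sets.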

GriPhyN ATLAS Goal 2: Dataset Re-creation (Jan-Sept 2003)
- Goal: be able to re-create a data file
- Need to evaluate which parameters must be tracked
- Need to evaluate the data needed for a full data signature
- Need to develop a metric for evaluating success: what is good enough?
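One way to picture a "data signature" is a digest over everything needed to re-create a file: input files, transformation parameters, and code version. The fields chosen below are hypothetical, not the signature GriPhyN actually defined.

```python
import hashlib
import json

# Illustrative data-signature sketch: hash the full provenance record.
# The field names are assumptions for illustration only.

def data_signature(inputs, parameters, code_version):
    record = {
        "inputs": sorted(inputs),      # logical names of input files
        "parameters": parameters,      # transformation parameters
        "code_version": code_version,  # e.g. software release tag
    }
    blob = json.dumps(record, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

sig1 = data_signature(["dc1.0001.evgen.root"], {"seed": 42}, "atlas-3.0.1")
sig2 = data_signature(["dc1.0001.evgen.root"], {"seed": 43}, "atlas-3.0.1")
print(sig1 != sig2)  # changing any tracked parameter changes the signature
```

The open question on the slide maps directly onto this sketch: deciding which keys belong in the record is exactly the "what parameters must be tracked" problem.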

GRAPPA: Grid Access Portal for Physics Applications
- Provide a point of access to ATLAS grid resources
  - IU (Physics, CS), Northwestern (ECE), ANL (CS), BU (Physics)
- Provide a simple interface for physicists to submit and monitor jobs on the Grid
  - Web-based as well as script-based
  - Ability to "replay"
  - Compatible with both ATLSim and the Athena architecture
  - Adaptable and/or "extensible" to new developments in Grid software, Athena, etc.

GRAPPA Components
- User interface
- Job submission
- Monitoring
- Bookkeeping
- Resource selection

Current Status
- Simple prototype that lets users submit an Athena job from a web interface to the Condor pool on the ATLAS IU cluster (via Globus)
- Next steps
  - Add more Athena functionality to the interface (e.g., user-defined libraries)
  - Experiment with other job-launch mechanisms
    - Condor-G and the DAGMan description language
    - Web Services Flow Language as a more general workflow description
  - Explore interfaces for multiple metadata and replica catalog systems
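Submission through Globus GRAM is driven by an RSL string, which a portal like GRAPPA could assemble behind its web form. A minimal sketch follows; the executable path and attribute values are hypothetical placeholders, not GRAPPA's actual job description.

```python
# Sketch of building a Globus GRAM RSL string for a batch job.
# RSL attribute syntax is "&(key=value)(key=value)...".
# The executable and values below are hypothetical examples.

def make_rsl(executable, arguments=(), count=1, stdout=None):
    attrs = [("executable", executable)]
    if arguments:
        attrs.append(("arguments", " ".join(arguments)))
    attrs.append(("count", str(count)))
    if stdout:
        attrs.append(("stdout", stdout))
    return "&" + "".join(f"({k}={v})" for k, v in attrs)

rsl = make_rsl("/opt/atlas/bin/athena", ["jobOptions.txt"],
               count=1, stdout="athena.out")
print(rsl)
```

The resulting string would then be handed to a GRAM client (or to Condor-G, which speaks GRAM on the user's behalf).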

Grid-Enabled Data Access in Athena
- David Malon, ANL
- Integrate Grid data access techniques (Globus replica catalog and/or GDMP) into the Athena event-selection module
- When a file is needed, check whether it is local
  - If not, use Grid data access tools to make it local
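The check-then-stage-in logic on this slide can be sketched as follows. The replica catalog and fetch function here are hypothetical stand-ins for the Globus replica catalog / GDMP machinery, not the actual Athena integration.

```python
import os

# Sketch of "check local first, otherwise stage in via the Grid".
# `replica_catalog` maps logical file names to replica URLs;
# `fetch` performs the transfer (e.g. a gsiftp copy). Both are
# assumed interfaces for illustration.

def ensure_local(lfn, local_dir, replica_catalog, fetch):
    """Return a local path for `lfn`, staging it in if necessary."""
    local_path = os.path.join(local_dir, lfn)
    if os.path.exists(local_path):
        return local_path                  # already local: no transfer
    replicas = replica_catalog.get(lfn, [])
    if not replicas:
        raise FileNotFoundError(f"no replica registered for {lfn}")
    fetch(replicas[0], local_path)         # e.g. gsiftp transfer
    return local_path
```

The second call for the same file returns immediately from the local copy, which is the behavior the event selector relies on.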

Grid-Enabled Data Access in Athena
- Athena and Globus
  - Search the Globus replica catalog and select a replica
  - Transfer the file using the protocol associated with the location object in the catalog (gsiftp, https/globus-url-copy from a remote gass_server, ...)
- Athena and GDMP
  - Work to date uses GDMP 1.2.2, which had not yet incorporated the Globus replica catalog
  - Uses the GDMP import/export catalogs
  - Supports certain subscription-based approaches
  - Automatically updates the Objectivity/DB internal catalog when Objectivity database files are transferred
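Selecting a transfer mechanism from the replica location's protocol, as described above, might look like this. globus-url-copy handles both gsiftp:// and https:// sources; the URLs are hypothetical examples, and this is a sketch rather than the Athena code.

```python
# Sketch: map a replica URL's protocol to a globus-url-copy invocation.
# Local destinations are expressed as file:// URLs.

def transfer_command(source_url, dest_path):
    scheme = source_url.split("://", 1)[0]
    if scheme not in ("gsiftp", "https"):
        raise ValueError(f"unsupported replica protocol: {scheme}")
    return ["globus-url-copy", source_url, f"file://{dest_path}"]

print(transfer_command("gsiftp://atlas01.example.org/data/evt.root",
                       "/scratch/evt.root"))
```

In practice the returned argument list would be handed to a process launcher once a replica had been chosen from the catalog.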

Status
- Paper presented at CHEP
- Prototype up and running on ANL systems
- GDMP-based prototype work done between CERN and Milan
- Next steps:
  - GDMP will use the Globus data replication work
  - Use GDMP to copy files
  - Requirements and design work for interfaces between Athena data producers and metadata catalogs and replica-management services (e.g., Magda)
  - Metadata work: extending the Athena Event Selector properties to allow for data signature / virtual data

Monitoring
- Joint working group set up between PPDG and GriPhyN to investigate monitoring issues
  - Led by J. Schopf and D. Yu
- Monitoring is currently defined very broadly:
  - Is this router configured correctly?
  - Has the application finished using that file yet?
  - What information do I need to determine where to run my application?

GridView (Kaushik De)
- Tool to show the status of the 8 testbed machines on the web
- Uses Globus GRAM to query sites every 30 minutes
- Reports hostname, uptime, idle time, number of users, and load average
- Next steps include
  - Integration with the Globus information service (MDS)
  - Visualization as part of the cross-PPDG/GriPhyN monitoring work
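Turning raw remote `uptime` output into the fields GridView displays can be sketched with a small parser. The collection mechanism (a GRAM-launched command per site) is assumed here; this is not GridView's actual code.

```python
import re

# Sketch: parse one line of `uptime` output into the users and load
# fields shown on a status page. Assumes the common Linux format.

def parse_uptime(line):
    users = int(re.search(r"(\d+)\s+users?", line).group(1))
    load = [float(x) for x in
            re.search(r"load averages?:\s*(.*)$", line).group(1)
            .replace(",", " ").split()]
    return {"users": users, "load_1m": load[0]}

sample = " 14:02:03 up 10 days,  3:44,  5 users,  load average: 0.12, 0.08, 0.01"
print(parse_uptime(sample))
```

A poller would run this against each testbed host on the 30-minute cycle and publish the results, which is also where the planned MDS integration would slot in.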


Application-Level Monitoring (Taylor)
- Initial testbed to incorporate Grid monitoring capabilities into Athena
- Collaboration with Valerie Taylor, David Quarrie, and others
- Very long ramp-up due to the difficulty of "outsiders" running an Athena application
- Working on developing an Auditor for Athena

Current Status
- Mailing list has been set up
- Defining usage cases
  - Sensors
  - Predictors
  - Archiving
- Will gather requirements and look at extending Globus MDS as a common framework to meet them

Summary
- Globus
- GriPhyN/iVDGL
- GRAPPA
- Grid-Enabled Data Access
- Monitoring and Visualization