U.S. ATLAS Grid Testbed Workshop at UTA
Introduction and Goals
Kaushik De, University of Texas at Arlington
April 2002

ATLAS Grid
 A new paradigm for computing in HEP
 Enables collaborative research and resource sharing in a large international experiment using distributed computing techniques
 Could be the foundation for the next revolution in world-wide computing

U.S. ATLAS Grid Testbed
 Lawrence Berkeley National Laboratory
 Brookhaven National Laboratory
 Indiana University
 Boston University
 Argonne National Laboratory
 University of Michigan
 University of Texas at Arlington
 Oklahoma University
 U.S. ATLAS testbed launched February 2001

Current Status
 Eight participating sites have been running Globus 1.1.x for more than a year
 Inter-operability tested with "Hello World" jobs (see the sketch after this list)
 New system tools developed and deployed: GridView, Gripe, Pacman, Magda
 Need to move forward:
   No user applications running; testbed machines are ~90% idle
   No user analysis tools
   Need a coordinated plan to upgrade testbed software
   Need an organized effort to develop user applications and tools
   Need a deployment schedule
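For reference, a minimal sketch of the kind of "Hello World" inter-operability check described above, written as a small Python wrapper around the globus-job-run client of the Globus Toolkit. This is illustrative only: the gatekeeper host names are placeholders, and the exact client options available on the 2002-era 1.1.x installations may have differed.

```python
#!/usr/bin/env python3
"""Illustrative sketch: run a trivial job through each testbed gatekeeper.

Assumptions: gatekeeper host names below are placeholders, and
globus-job-run is taken as the job-submission client; adjust both
for the actual testbed configuration.
"""
import subprocess

# Placeholder gatekeeper contacts for the eight testbed sites.
GATEKEEPERS = [
    "gatekeeper.lbl.example.edu",
    "gatekeeper.bnl.example.edu",
    "gatekeeper.uta.example.edu",
    # ... remaining testbed sites ...
]

def hello_world(contact, timeout=60):
    """Submit /bin/echo through the site's gatekeeper and check the output."""
    cmd = ["globus-job-run", contact, "/bin/echo", "Hello World"]
    try:
        result = subprocess.run(cmd, capture_output=True, text=True,
                                timeout=timeout)
    except (OSError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0 and "Hello World" in result.stdout

if __name__ == "__main__":
    for contact in GATEKEEPERS:
        status = "OK" if hello_world(contact) else "FAILED"
        print("%-45s %s" % (contact, status))
```

A periodic run of a check like this over all sites is essentially what a status-monitoring tool such as GridView reports in tabular form.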


Workshop Goals
 General goals for the workshop:
   Uniform deployment of VDT 1.0 at testbed sites (a deployment sketch follows this list)
   Core software deployment at testbed sites (AFS, local, ...)
   Participation in DC1 data generation (Tier 1, testbed, TeraGrid, other facilities)
   Plans for scheduling, monitoring, site policies, etc.
   Status of U.S. ATLAS tools such as Pacman, Magda, Grappa, GridView, Gripe, ...
   Other user-level grid tools for DC1 data analysis
   Other software issues and requirements for testbed sites
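Below is a minimal sketch of what the "uniform VDT 1.0 deployment" step might look like when driven by Pacman, the packaging tool listed above. The install directory and the cache:package specification are assumptions for illustration; the real cache and package names should be taken from the VDT 1.0 distribution.

```python
#!/usr/bin/env python3
"""Illustrative sketch: install a VDT bundle into a common location with Pacman.

Assumptions: INSTALL_DIR and the cache:package string are placeholders,
not taken from the slides or the actual VDT 1.0 cache.
"""
import os
import subprocess
import sys

INSTALL_DIR = "/opt/vdt"        # hypothetical common install root for all sites
PACKAGE = "VDT:VDT-Server"      # hypothetical cache:package name

def install_vdt():
    """Fetch and install the VDT bundle into INSTALL_DIR using pacman."""
    os.makedirs(INSTALL_DIR, exist_ok=True)
    # 'pacman' here is the BU/ATLAS packaging tool (not the Arch Linux package
    # manager); it installs packages into the directory it is run from.
    return subprocess.call(["pacman", "-get", PACKAGE], cwd=INSTALL_DIR)

if __name__ == "__main__":
    sys.exit(install_vdt())
```

Running the same script (or the equivalent one-line pacman command) at every site is what makes the deployment uniform: each site ends up with the same toolkit version in the same location.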

Workshop Plans
 Assess available resources and develop a plan for testbed hardware
   Thursday morning session
 Identify ATLAS application(s) for testbed deployment
   Thursday after-lunch session
 Plan software (toolkit) upgrade and deployment on the testbed
   Thursday late-afternoon session
 Status of system tools
   Friday morning, before the coffee break
 Plan for the summer 2002 demo
   Friday morning, after the coffee break
 Final planning and talks on other grid experiences
   Friday afternoon session
 Hands-on tutorial
   Saturday morning session