Southwest Tier 2 Center Status Report


Southwest Tier 2 Center Status Report
U.S. ATLAS Tier 2 Workshop - UTA
Mark Sosebee for the SWT2 Center
December 8, 2006

Overview
The Southwest Tier 2 Center is a collaboration between the University of Texas at Arlington (UTA) and the University of Oklahoma (OU).
Personnel:
- UTA: Kaushik De, Patrick McGuigan, Victor Reece, Mark Sosebee
- OU: Karthik Arunachalam, Horst Severini, Pat Skubic, Joel Snow (LU)

UTA CC Hardware Configuration
- 160 compute nodes: dual Xeon EM64T, 3.2 GHz, 4 GB RAM, 160 GB disk
- 8 front-end nodes: dual Xeon EM64T, 8 GB RAM, 73 GB SCSI RAID 1
- 16 TB SAN storage (IBRIX): 80 x 250 GB SATA disks; 6 I/O servers, 1 management server
- 16 TB potential in compute nodes
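As a sanity check on the capacity figures above, a short back-of-the-envelope sketch (the disk counts and sizes are from the slides; the gap between raw and quoted usable capacity is RAID and filesystem overhead, whose exact ratio is not stated here):

```python
# Raw-capacity check for the SWT2 storage figures quoted in the slides.
# Disk counts/sizes come from the slides; decimal (1000-based) units assumed.

def raw_capacity_tb(disks: int, size_gb: int) -> float:
    """Raw capacity in decimal TB for `disks` drives of `size_gb` GB each."""
    return disks * size_gb / 1000

uta_san = raw_capacity_tb(80, 250)     # UTA CC IBRIX SAN: 20.0 TB raw (16 TB quoted usable)
node_disks = raw_capacity_tb(160, 160) # UTA CC compute-node disks: 25.6 TB raw

print(uta_san, node_disks)
```

The same arithmetic applies to the OCHEP SAN below (20 x 250 GB gives 5 TB raw for the quoted 4 TB).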

UTA CC (photo)

UTA DPCC Hardware Configuration
- Shared resource with the CSE department
- 75 compute nodes: dual Xeon, 2.4-2.6 GHz, 2 GB RAM, 60-80 GB local disks
- 45 TB among 10 NFS servers (IDE RAID)
- Typically ~100 ATLAS production queue slots

OCHEP Hardware Configuration
- 40 compute nodes: dual Xeon EM64T, 3.2 GHz, 4 GB RAM, 160 GB disk
- 2 front-end nodes: dual Xeon EM64T, 8 GB RAM, 73 GB SCSI RAID 1
- 4 TB SAN storage (IBRIX): 20 x 250 GB SATA disks; 2 I/O servers, 1 management server
- 4 TB potential in compute nodes

Additional OU Resources
- Old OSCER cluster, boomer: 135 dual Xeon nodes, 2 GHz; 5 TB storage; used for DØ MC production & data processing
- New OSCER cluster, topdawg: 512 dual Xeon EM64T nodes, 3.2 GHz; 10 TB storage; used for ATLAS Tier 2 & DØ computing as available

Network Connectivity
UTA:
- Gigabit link to the North Texas Gigapop
- OC12 from NTG to the Houston peering site (Internet2)
- Future: LEARN / NLR
OU:
- Campus backbone: 10 Gbps
- Connection to NLR via OneNet
- OU -> OneNet and OneNet -> NLR are 10 Gbps capable; currently set up for 2 Gbps
- Traffic to/from most HEP end points already routed through NLR
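A rough sense of what these link speeds mean for dataset movement (line rates only; this sketch ignores protocol overhead and competing traffic, so real throughput is lower):

```python
# Back-of-the-envelope transfer-time estimates for the links above.
# OC12 line rate is the standard 622.08 Mbit/s; 2 Gbps is OU's current
# OneNet/NLR configuration as quoted in the slides.

OC12_BPS = 622.08e6   # OC12 line rate, bits per second
OU_NLR_BPS = 2e9      # OU's current 2 Gbps configuration

def hours_to_transfer(terabytes: float, bps: float) -> float:
    """Hours to move `terabytes` (decimal TB) at a given line rate."""
    bits = terabytes * 1e12 * 8
    return bits / bps / 3600

print(round(hours_to_transfer(1, OC12_BPS), 1))   # ~3.6 h per TB over OC12
print(round(hours_to_transfer(1, OU_NLR_BPS), 1)) # ~1.1 h per TB at 2 Gbps
```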

UTA CC / IBRIX
- Scaling issues were observed with IBRIX when the number of running jobs exceeded ~150: lost files, and one segment server becoming a "hot spot"
- IBRIX tech support recommended upgrading the software to v2 and reconfiguring storage as one large filesystem rather than two
- Software upgraded at the end of August
- Performance much improved; all CPUs now in production

Analysis for Regional Users
- Two workshops have been held at UTA (March & May) to promote physics analysis by ATLAS groups in the SWT2 region
- Participants: UTA, OU, UNM, Langston, SMU, UT Dallas, LTU
- Bi-weekly video / phone meetings
- See: http://indico.cern.ch/categoryDisplay.py?categId=885

Other Activities / Future Plans at SWT2
- Ongoing effort devoted to DQ2 deployment, testing, and upgrades
- SC4 dCache setup: it worked, with a couple of caveats; learned much about the system for potential future deployments
- Start utilizing the new OSCER cluster (topdawg)
- ATLAS remote DDM operations successful; awaiting production status
- DØ SAMGrid jobs already certified & running
- Discussions underway with hardware vendors for the next phase of the cluster, weighted toward large storage (~50 TB, ~100 CPUs)

Machine Room: UTA CPB (photo)

Conclusion
- SWT2 is operational and performing well
- Next hardware purchase early 2007 (Chemistry & Physics Building at UTA)
- The large topdawg cluster at OU will be used as available for ATLAS computing
- Supporting regional analysis on a "best effort" basis
- http://www.atlas-swt2.org/twiki/bin/view/SWT2/WebHome