UTA Site Report
7th DOSAR Workshop, Louisiana State University, Apr. 2 – 3, 2009
Jae Yu, Univ. of Texas, Arlington


Introduction
– UTA is a partner of the ATLAS SWT2
  – Actively participating in ATLAS production
    – Kaushik De is co-leading Panda development
    – Phase I implementation at UTACC completed and running
    – Phase II implementation at the Physics and Chemistry Research Building ongoing
  – DDM monitoring project completed
  – JY co-leading ATLAS Operations Support
– HEP group working with other disciplines on shared use of existing computing resources
  – Interacting with the campus HPC community
– Working with HiPCAT, the Texas grid community
– CoLinux Condor cluster setup essentially stopped, but will be picked back up soon
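As a hypothetical illustration of the kind of job description a Condor cluster like the CoLinux setup would accept, a minimal submit file might look like the following (the executable name, arguments, and job count are placeholders, not actual UTA workloads):

```
# Hypothetical HTCondor submit file; run_simulation.sh and its
# arguments are made-up placeholders for illustration only.
universe       = vanilla
executable     = run_simulation.sh
arguments      = --events 1000
output         = sim.$(Cluster).$(Process).out
error          = sim.$(Cluster).$(Process).err
log            = sim.log
request_memory = 1024
queue 10
```

`queue 10` would enqueue ten instances of the job, which Condor then matches to available cluster slots.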

UTA DPCC – The 2003 Solution
– UTA HEP–CSE + UTSW Medical joint project through NSF MRI
– Primary equipment for DØ re-reconstruction and MC production up to 2005
– Now primarily participating in ATLAS MC production and reprocessing as part of SWT2 resources
– Other disciplines also use this facility, but at a minimal level
  – Biology, Geology, UTSW Medical, etc.
  – Simulation for detector development
– Hardware capacity
  – PC-based Linux system assisted by some 70 TB of IDE disk storage
  – 3 IBM PS157 Series shared-memory systems

UTA – DPCC
– HEP–CSE joint project: DØ + ATLAS, CSE research
– 100 P4 Xeon 2.6 GHz CPUs = 260 GHz; 64 TB IDE RAID + 4 TB internal; NFS file system
– 84 P4 Xeon 2.4 GHz CPUs = 202 GHz; 5 TB FBC + 3.2 TB IDE internal; GFS file system
– Total CPU: 462 GHz
– Total disk: 76.2 TB
– Total memory: 168 GB
– Network bandwidth: 68 Gb/s
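The aggregate figures on this slide follow from the per-rack numbers; a quick Python check, using only the values quoted above (the small gap in CPU totals is the rounding of 461.6 GHz to 462 GHz in the original):

```python
# Sanity-check the quoted DPCC aggregate figures from the per-rack numbers.
racks = {
    "P4 Xeon 2.6 GHz": {"cpus": 100, "ghz_per_cpu": 2.6, "disk_tb": 64 + 4},
    "P4 Xeon 2.4 GHz": {"cpus": 84, "ghz_per_cpu": 2.4, "disk_tb": 5 + 3.2},
}

total_ghz = sum(r["cpus"] * r["ghz_per_cpu"] for r in racks.values())
total_disk = sum(r["disk_tb"] for r in racks.values())

print(f"Total CPU:  {total_ghz:.1f} GHz")   # slide quotes 462 GHz (rounded)
print(f"Total disk: {total_disk:.1f} TB")   # slide quotes 76.2 TB
```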

Southwest Tier 2 Center
– The Southwest Tier 2 Center is a collaboration between the University of Texas at Arlington (UTA) and the University of Oklahoma (OU)
– Personnel:
  – UTA: Kaushik De, Patrick McGuigan, Victor Reece, Mark Sosebee
  – OU: Karthik Arunachalam, Horst Severini, Pat Skubic, Joel Snow (LU)

Configuration of Phase II at CPB
– 183 compute nodes (732 cores):
  – Mix of Opteron 2216/2220 CPUs, dual core, 2 GB RAM / core
– Front-end nodes, monitoring hosts, etc.:
  – Mix of Opteron and Xeon systems providing cluster gateways (Globus, storage, etc.) and administration
– 225 TB storage (usable):
  – Based on Dell MD1000 systems
  – xrootd is used to provide a unified file namespace
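As a sketch of how XRootD presents a single namespace over many data servers, a minimal redirector-plus-servers configuration might look like this (the hostname, port, and export path are hypothetical, not the actual SWT2 settings):

```
# Hypothetical minimal XRootD cluster config (NOT the SWT2 production one):
# one redirector plus data servers exporting a single /atlas tree.
all.export /atlas
all.manager redirector.example.org:1213
if redirector.example.org
  all.role manager
else
  all.role server
fi
xrd.port 1094
```

Clients contact the redirector, which locates the data server actually holding a given file, so all servers appear as one file system.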

Configuration of Phase I at ARDC
– 160 compute nodes:
  – Dual Xeon EM64T, 3.2 GHz, 4 GB RAM, 160 GB disk
– 8 front-end nodes:
  – Dual Xeon EM64T, 8 GB RAM, 73 GB SCSI RAID 1
– 16 TB SAN storage (IBRIX):
  – 80 × 250 GB SATA disks
  – 6 I/O servers, 1 management server
– 16 TB potential in compute nodes

Upcoming Expansion at UTA T2
– The most pressing resource issue is storage capacity
  – The number of CPUs will of course grow over time, but ensuring adequate storage is critical
– Process underway for the next purchase:
  – Approximately 600 TB (raw) in the same Dell MD1000 disk arrays, plus storage servers utilizing PERC5 controller cards
  – A cluster dedicated to user analysis: of order 100 CPUs (cores) and 100 TB of disk
– We hope this equipment will arrive by late April or early May
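The usable fraction of the planned ~600 TB raw purchase depends on the RAID layout. A rough estimate under assumed parameters (1 TB drives, 15-drive MD1000 shelves, one RAID-5 set per shelf — none of which is stated on the slide):

```python
# Rough usable-capacity estimate for a ~600 TB (raw) purchase.
# Assumptions (hypothetical, not from the slide): 1 TB drives,
# 15-drive MD1000 shelves, one RAID-5 set (one parity drive) per shelf.
drive_tb = 1.0
drives_per_shelf = 15
raw_tb = 600

shelves = int(raw_tb / (drive_tb * drives_per_shelf))
usable_tb = shelves * (drives_per_shelf - 1) * drive_tb  # RAID-5: n-1 data drives

print(f"{shelves} shelves -> ~{usable_tb:.0f} TB usable")
```

RAID-6 or hot spares would reduce the usable figure further; the point is only that "raw" and "usable" differ by the parity overhead.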

CSE Student Exchange Program
– Joint effort between HEP and CSE
– A total of 10 CSE MS students have worked on the SAM-Grid team
  – Five generations of students
  – Many of them playing leading roles in the grid community
    – Abishek Rana at UCSD
    – Parag Mashilka at FNAL
– The program with the BNL Panda project is now mature
  – Three students completed their tenure: two obtained an MS and one is working on a Ph.D.
  – One Ph.D. student at UTA working with the BNL team
  – Participating in the ATLAS Panda monitoring project

Conclusions
– Actively engaged in preparing for collisions at the LHC
– SWT2 is taking shape as a well-established facility
  – DPCC is now called "old" but is still being used
– Involved in ATLAS Computing Operations Support
  – Activities will pick up speed
– Working closely with HiPCAT on state-wide grid activities
– CoLinux Condor cluster setup activities are at a halt at the moment but will need to pick up speed soon
– CSE student exchange program still ongoing, now with Ph.D. students