Slide 1: LTU Site Report
Dick Greenwood, Louisiana Tech University
DOSAR II Workshop at UT-Arlington, March 30-31, 2005

Slide 2: LTU & CCT Computing Resources

    Site          CPU (GHz)          Storage (TB)   People
    LTU           33 GHz             0.8            1F + 1PD + 1GA + 1UG
    CCT (Helix)   128 * 2 * 2 GHz    0.1            --

People:
- Dick Greenwood: Institutional Representative.
- Joe Steele: researching LHC Grid analysis tools (ADA, ARDA, etc.); D0SAR representative in the data reprocessing group; certifying data file merging for all of D0's remote reprocessing farms; updating SamGrid documentation.
- Chandra: High Availability Computing research (with Box); system manager.
- Sunitha Polam: CS graduate student (since graduated).
- Michael Bryant: former system manager; will help run the farm.

Slide 3: SAM-Grid Execution Site Components
[Diagram of a SAM-Grid execution site: grid gateway, grid/fabric interface, head node cluster, worker node cluster, SAM station, data server, and durable storage, with site network boundaries marked. The key distinguishes machine types, machine services, software services, and bi-directional vs. mono-directional network communication.]

Slide 4: SamGrid Execution Site: ltu-cct
- SAM Station: ltu.cct.lsu.edu
- Operating as a SamGrid site since December.
- Installed 200 GB of additional storage on ltu-cct.
- Plagued with timeout problems:
  - Limited to 24 simultaneous batch jobs.
  - Installed a 1 Gbit ethernet card, but it is not working yet (a quick diagnostic is sketched below).
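A quick way to check whether the new card actually negotiated gigabit is ethtool (a generic diagnostic sketch, not from the original slides; the interface name eth0 is an assumption):

    # Show the negotiated link speed and duplex of the NIC
    # (assumes the interface is named eth0; adjust as needed)
    ethtool eth0 | grep -E 'Speed|Duplex'
    # A card stuck at "Speed: 100Mb/s" usually points to autonegotiation
    # or cabling problems rather than to the card itself.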

Slide 5: Personnel Using the LaTech IAC (CAPS) Linux Cluster
At present:
- D0 group: Lee Sawyer, Dick Greenwood (IAC Rep.), Joe Steele, Julie Kalk.
- Jefferson Lab group (G0, Qweak experiments): Kathleen Johnston, Neven Simicevic, Steve Wells, Tony Forest.
- Other users: Box Leangsuksun, LaTech CS.

Slide 6: CAPS Cluster
[Image-only slide.]

Slide 7: Status of CAPS Site
- Caps10, our local SAMGrid head node and SAM station, is slowly being upgraded with the latest products and configured accordingly.
- Caps10 has been somewhat neglected because the focus has been on the more powerful SamGrid facility, ltu-cct. It is now being configured to match ltu-cct's configuration, since it will be much easier to maintain two identical SAMGrid head nodes. We hope to have caps10 operational within the next month or so.
- Our four Windy worker nodes (dual Xeon 2.2/2.8 GHz) have been added to an OSCAR 4.2 cluster running CentOS-3 (i386). The Windy cluster is a 12-gigaflop compute cluster with 9 GB of total RAM. While small, it is still quite useful for small D0 requests and local research.
- Condor is being used as the cluster batch scheduler instead of Torque/PBS (a minimal submit file is sketched below). We plan to move to HA-OSCAR in the near future in order to provide highly available computing resources.
- Along with the Windy cluster, we also have two AMD Opteron 2.0 GHz testing nodes that are not part of the cluster. After trying both i386 and x86_64 versions of Debian and CentOS-3/4, we found that only CentOS-3 x86_64 would install on these two nodes, so we decided to make them standalone testing nodes. We would eventually like to see these nodes become part of the Windy cluster.
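For illustration, a minimal Condor submit description for the kind of job the Windy cluster handles (the script name and arguments are hypothetical, not from the slides):

    # d0sim.sub -- hypothetical submit file for a small D0-style job
    universe   = vanilla
    executable = run_d0sim.sh          # placeholder script
    arguments  = --events 1000         # placeholder arguments
    output     = d0sim.$(Cluster).$(Process).out
    error      = d0sim.$(Cluster).$(Process).err
    log        = d0sim.$(Cluster).log
    should_transfer_files   = YES
    when_to_transfer_output = ON_EXIT
    queue 4                            # one job per Windy worker node

    # submit with: condor_submit d0sim.sub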

Slide 8: Current Status of HA Research by Box Leangsuksun on the CAPS Cluster
Details can be obtained from

Slide 9: Grid-enabled Beowulf Cluster with Fault-Tolerant Job Queue
- Ease of installation and an HA-enabled cluster, based on HA-OSCAR.
- Grid-enabled package, VDT based.
- HA-OSCAR from Box's group at LaTech:
  - HA and HPC clustering techniques to enable critical HPC infrastructure.
  - Self-configuring, multi-head Beowulf system.
  - HA-enabled HPC services: active/hot standby (sketched below).
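To make the active/hot-standby idea concrete, here is a generic failover loop of the kind a standby head node could run (an illustrative sketch only, not HA-OSCAR's actual mechanism; the host name and takeover scripts are hypothetical):

    #!/bin/sh
    # Runs on the standby head node: poll the primary and take over
    # its services if it stops responding. Illustrative only.
    PRIMARY=head1.caps.latech.edu        # hypothetical primary head node
    while true; do
        if ! ping -c 3 -W 2 "$PRIMARY" > /dev/null 2>&1; then
            # Primary unreachable: claim its service IP and start the
            # batch scheduler locally (both scripts are placeholders).
            /usr/local/sbin/takeover-service-ip.sh
            /usr/local/sbin/start-head-services.sh
            break
        fi
        sleep 10
    done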

Slide 10: Cluster-aware Grid Package #1
Integrated within the OSCAR package installer.

Slide 11: Cluster-aware Grid Package #2

Slide 12: Cluster-aware Grid Package #3

Slide 13: Current Status
- Supports OSCAR and HA-OSCAR 1.1:
  - Grid-enabled package: demo-ready.
  - Fault-tolerant job queue, based on Torque/OpenPBS and LAM/MPI.
- Self-healing, with automatic checkpoint/restart of both queued and running jobs (see the sketch below).
- Details can be obtained from
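On the Torque side, the checkpoint/restart idea can be sketched as follows (hedged: the exact -c options vary with the Torque version and assume a checkpoint system such as BLCR is configured; the job script name is hypothetical):

    # Submit with periodic checkpointing enabled (every 10 minutes)
    qsub -c enabled,periodic,interval=10 run_mc.sh

    # Checkpoint and requeue a running job, then release it so it
    # resumes from the saved checkpoint
    qhold <jobid>
    qrls  <jobid>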

Slide 14: LTU Plans
- As a "small" site, continue taking MC production requests.
- Participate in the development of the Louisiana Optical Network Initiative (LONI), with its 10 Gb/sec connectivity to the National Lambda Rail:
  - The NLR POP in Baton Rouge has not been affected (the route runs from Baton Rouge to Houston).
  - The awarding of the local LONI loops has been a bottleneck, with all of the purchasing/bid requirements; this is resolved at LTU.
- Commence collaborating with the Southwest Tier 2 on ATLAS analysis.