Summary, Action Items and Milestones
1st HiPCAT THEGrid Workshop
July 8 – 9, 2004, Univ. of Texas at Arlington
Contact Jae Yu or Alan Sill

Summary So Far
- High energy physics computing is going on, and is of interest, at practically every major institution in Texas
- 39 people registered, 23 attended here
- Good tour of local facilities
- What's next? This discussion...

What does each institution have?
UH:
- HPC facility shared by many disciplines throughout the university
- Leonard Johnson has his own facility
  - 256 dual-node Itanium
  - Supports ALICE DC
  - Small clusters (tens of machines) for benchmarking
- Local desktop clustering done by the department
  - Physics has about 4 IT people
- Can identify dedicated machines and put them together as a cluster for THEGrid

What does each institution have? (cont'd)
SMU:
- 30 Intel machines under complete control of HEP, plus 6 TB storage → ATLAS
- 4 faculty + 4 postdocs + students on ATLAS
A&M:
- No current plans for grid
- Possible CDF CAF installation in the near future

How much human resource does an institution need to start?
- A graduate student (preferably knowledgeable in computing)
- A part-time postdoc
- Try to utilize interdisciplinary collaboration with CSE to obtain valuable manpower
  - Need a physics faculty member to drive the effort
  - UH has a CS PhD student for reconstruction

Suggested Action Items
- Build and distribute a small system (one machine per campus minimum) to implement HEP grid standards (desktop cluster paradigm) as a starting point.
- Form a steering committee (one person per institution, 3-person executive committee), with subgroups, with the challenge to expand this greatly.
  - Charge: identify cluster resources, software to be installed, the THEGrid environment, tools, projects, etc.
- Hardware platforms? (Specifically investigate 64-bit options and hardware alternatives for CPU and storage.)

Communication Action Items
- Mailing list(s): HiPCAT has started one, but it is centrally administered → by July 12
  - TTU will host a Mailman-based list on its server (highenergy.phys.ttu.edu), self-service
- Web site → by Aug. 9
  - Host a page on the HiPCAT site; TTU to design this page and the initial info
- Regular THEGrid meetings
  - Contact: Alan Sill
  - Weekly: first meeting on July 23
  - Time: Fridays at 9:30 am
  - Teleconference: ESNet teleconferencing; dial in and enter id (82tgrid)#

Communication Action Items, cont'd
- Regular executive committee meeting → to coincide with the THEGrid weekly meeting
  - Members: Larry Pinsky, Alan Sill, Jae Yu
- Regular workshop
  - 2nd workshop at UH (Thanks, Larry!) on October 7th and 8th; includes a hands-on session to finish off SC2004 prep
  - Interval: to be determined at the 2nd meeting

Tools & Environment
- OS: start with RH 7.3 (e.g. Fermi) Linux; aim to support all grid-compatible versions of Unix. Minimum: the CERN/Fermilab versions.
  - See the manual for installing yum, etc., if not starting from the Fermi distribution.
- Everyone should get a DOE Grid personal certificate. Machines will need them also...
  - Use HiPCAT's? Patrick McGuigan to follow up with Phil about this in our context.
- Ganglia to be used to aggregate monitoring, MonALISA where possible. (Alan to add links to the installation info page.)
  - UTA to host the central Ganglia site (see the sketch after this list).
- Bandwidth monitoring: add one node each to FNAL WAN bandwidth monitoring.
  - Alan will request that a THEGrid subset be made once enough nodes are established.
- Groupware through biogrid
  - Ask for a presentation by Paul Medley of UTA at the next HiPCAT meeting.
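
As a concrete illustration of the aggregation step, the following minimal Python sketch polls a gmond daemon and lists the hosts it reports. It relies only on Ganglia's stock behavior (gmond dumps its cluster state as XML to any client that connects on its default TCP port 8649); the host name ganglia.uta.edu is a hypothetical placeholder, not a confirmed THEGrid machine.

    import socket
    import xml.etree.ElementTree as ET

    def fetch_gmond_xml(host, port=8649, timeout=10):
        """Connect to gmond; it dumps its cluster state as XML and closes."""
        chunks = []
        with socket.create_connection((host, port), timeout=timeout) as sock:
            while True:
                data = sock.recv(4096)
                if not data:  # gmond closes the connection when the dump is done
                    break
                chunks.append(data)
        return b"".join(chunks).decode("utf-8", errors="replace")

    def list_hosts(xml_text):
        """Return (cluster, host, metric count) tuples from a gmond XML dump."""
        root = ET.fromstring(xml_text)
        rows = []
        for cluster in root.iter("CLUSTER"):
            for host in cluster.iter("HOST"):
                rows.append((cluster.get("NAME", "?"), host.get("NAME", "?"),
                             len(host.findall("METRIC"))))
        return rows

    if __name__ == "__main__":
        xml_text = fetch_gmond_xml("ganglia.uta.edu")  # hypothetical host name
        for cluster, host, nmetrics in list_hosts(xml_text):
            print(f"{cluster:20s} {host:30s} {nmetrics:4d} metrics")

Run against each institution's head node, this gives a quick command-line view of the same data the central UTA Ganglia page would aggregate.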

SC2004 Demo Plan
- Primary demo: display Ganglia resource monitoring of THEGrid
- Preparation
  - Set up a cluster: a desktop cluster if none already exists, an established cluster where one does
  - Install Ganglia at each institution, including the web front end and server
  - Central display service provided by UTA, putting together the THEGrid Ganglia page
- Tasks and milestones (SC2004: Nov. 6 – 12); a readiness check is sketched after this list
  - Distribute the Ganglia installation guideline (TTU): July 12
  - Complete installation of Ganglia at the institutions: Sept. 1
  - Complete the central display service (UTA): Oct. 1
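
To help track the Sept. 1 installation milestone, a script along these lines could confirm that each site's gmond endpoint is reachable before the central display is assembled. This is a sketch only; every host name below is a hypothetical placeholder for the real cluster head node at that institution.

    import socket

    # Hypothetical gmond endpoints, one per participating institution
    SITES = {
        "UTA": "ganglia.uta.edu",
        "UH":  "ganglia.uh.edu",
        "TTU": "ganglia.ttu.edu",
        "SMU": "ganglia.smu.edu",
    }
    GMOND_PORT = 8649  # stock Ganglia gmond XML port

    def site_is_up(host, port=GMOND_PORT, timeout=5):
        """Return True if a TCP connection to the gmond port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    if __name__ == "__main__":
        for site, host in sorted(SITES.items()):
            status = "OK" if site_is_up(host) else "UNREACHABLE"
            print(f"{site:4s} {host:25s} {status}")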

Discussion on Funding
- Complete the THEGrid proposal document
  - By when: Aug. 31
  - Distribute the document to the distribution list: July 13 → Alan Sill
  - Revise to include the list of Texas institutions with HEP, NP and AP along with their projects: Larry Pinsky
    - Circulate the first draft: Aug. 9
  - Where do we submit this document? To HiPCAT
- Raise awareness of THEGrid and HiPCAT with local university administrations and volunteer to go with them to persuade the state legislature
  - Write a short primer for people to use at local universities and post it: Larry Pinsky
  - What are we asking for? Bandwidth and infrastructure (network and people)
    - Pry out the allocated money
    - Look for state matching toward federal funds
- What are the sources?
  - State? Maybe a bit harder due to possible conflict with HiPCAT; should be part of HiPCAT for state funding
  - Federal? The executive committee should be aware of available programs in various federal agencies

Action Items and Completion Dates

Name            Action Item                                                            Date
Sill+Smith      Form a mail distribution list                                          7/12/04
Yu              Distribute web link to Ganglia installation                            7/12/04
Sill            Distribute THEGrid proposal                                            7/13/04
All             Bring the names of institutional reps                                  7/23/04
Sill            First THEGrid weekly meeting                                           7/23/04
Sill+Smith      Design and activate THEGrid web page & link from HiPCAT project page   8/9/04
Pinsky          Compile and distribute the HEP, NP and AP list in the state of Texas   8/9/04
Sill+Yu+all     Complete THEGrid proposal                                              8/31/04
All             Complete forming a small cluster and install Ganglia for SC2004 demo   9/1/04
Yu              Complete THEGrid central Ganglia service                               10/1/04
Pinsky          2nd THEGrid workshop at UH                                             10/7 – 8/04
All via HiPCAT  SC2004 Demo                                                            11/6 – 12/04