Slide 1
Overview of TeraGrid Resources and Services
Sergiu Sanielevici, TeraGrid Area Director for User Services Coordination
Pittsburgh Supercomputing Center, sergiu@psc.edu
June 2006
Slide 2
TeraGrid: Integrating NSF Cyberinfrastructure
[Map of resource provider sites: SDSC, NCAR (mid-2006), TACC, UC/ANL, NCSA, ORNL, PU, IU, PSC]
Slide 3
NSF TeraGrid in Context
[Timeline, 1/85 through 1/10: NSF Centers Program, then PACI, then TeraGrid Construction followed by Operation & Enhancement]
Slide 4
The TeraGrid Facility
Grid Infrastructure Group (GIG)
–University of Chicago
–TeraGrid integration, planning, management, coordination
Resource Providers (RP)
–Currently NCSA, SDSC, PSC, Indiana, Purdue, ORNL, TACC, UC/ANL; additional RPs in discussion
–Systems (resources, services) support, user support
–Provide access to resources via policies, software, and mechanisms coordinated by and provided through the GIG
The Facility
–An integrated set of HPC resources providing NSF scientists with access to resources and collections of resources through unified user support, coordinated software and services, and extensive documentation and training.
The Federation
–Interdependent partners working together under the direction of an overall project director, the GIG PI.
Slide 5
It's All in These URLs
TG home page: www.teragrid.org
TG User Portal: https://portal.teragrid.org
Slide 6
TeraGrid Resources: 100+ TF across 8 distinct architectures, ~3 PB of online disk, >100 data collections
(Visualization modes: RI = remote interactive, RB = remote batch, RC = remote interactive/collaborative)

ANL/UC: Compute: Itanium 2 (0.5 TF), IA-32 (0.5 TF). Online storage: 20 TB. Network: 30 Gb/s to CHI hub. Visualization: RI, RC, RB on IA-32 with 96 GeForce 6600GT cards.

IU: Compute: Itanium2 (0.2 TF), IA-32 (2.0 TF). Online storage: 32 TB. Mass storage: 1.2 PB. Network: 10 Gb/s to CHI hub. Data collections: 5, >3.7 TB total, via URL/DB/GridFTP. Instruments: proteomics, X-ray crystallography. Visualization: RB on SGI Prism (32 graphics pipes) and IA-32.

NCSA: Compute: Itanium2 (10.7 TF), SGI SMP (7.0 TF), Dell Xeon (17.2 TF), IBM p690 (2 TF), Condor flock (1.1 TF). Online storage: 1140 TB. Mass storage: 5 PB. Network: 30 Gb/s to CHI hub. Data collections: >30, via URL/SRB/DB/GridFTP. Visualization: RI, RB on IA-32 + Quadro4 980 XGL.

ORNL: Compute: IA-32 (0.3 TF). Online storage: 1 TB. Network: 10 Gb/s to ATL hub. Data collections: 4, 7 TB total, via SRB/Portal/OPeNDAP. Instruments: SNS and HFIR facilities.

PSC: Compute: XT3 (10 TF), TCS (6 TF), Marvel SMP (0.3 TF). Online storage: 300 TB. Mass storage: 2.4 PB. Network: 30 Gb/s to CHI hub. Visualization: RB on IA-32, 48 nodes.

Purdue: Compute: heterogeneous (1.7 TF), IA-32 (11 TF), opportunistic. Online storage: 26 TB. Mass storage: 1.3 PB. Network: 10 Gb/s to CHI hub. Visualization: RB.

SDSC: Compute: Itanium2 (4.4 TF), Power4+ (15.6 TF), Blue Gene (5.7 TF). Online storage: 1400 TB. Mass storage: 6 PB. Network: 40 Gb/s to LA hub. Data collections: >70, >1 PB total, via GFS/SRB/DB/GridFTP.

TACC: Compute: IA-32 (6.3 TF). Online storage: 50 TB. Mass storage: 2 PB. Network: 10 Gb/s to CHI hub. Data collections: 4, 2.35 TB total, via SRB/Web Services/URL. Visualization: RI, RC, RB on UltraSPARC IV, 512 GB SMP, 16 graphics cards.

Updated by Kelly Gaither (gaither@tacc.utexas.edu)
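Several of the collections above are served over GridFTP. As a minimal sketch of what a scripted pull might look like (assuming the Globus Toolkit client globus-url-copy is installed and a valid proxy certificate exists; the endpoint host and paths here are hypothetical placeholders, not real TeraGrid addresses):

```python
# Minimal sketch: pulling a file from a data collection over GridFTP.
# Assumes the Globus Toolkit client globus-url-copy is on PATH and a
# proxy certificate has already been created (grid-proxy-init).
# The host and paths below are hypothetical, not real endpoints.
import subprocess

SRC = "gsiftp://gridftp.example-rp.teragrid.org/collections/dataset.tar"
DST = "file:///home/username/dataset.tar"

# -p 4: use four parallel TCP streams; -vb: report transfer performance.
result = subprocess.run(
    ["globus-url-copy", "-p", "4", "-vb", SRC, DST],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    raise RuntimeError(f"GridFTP transfer failed: {result.stderr.strip()}")
print(result.stdout)
```

Collections exposed via SRB or OPeNDAP use their own clients instead, so check each collection's documentation for the supported access method.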
Slide 7
TeraGrid Facility Today
Heterogeneous resources at autonomous Resource Provider sites, each adding a local value-added user environment on top of the Common TeraGrid Computing Environment:
–A single point of contact for help
–Integrated documentation and training
–A common allocation process
–A common baseline user environment
–Services to assist users in harnessing the right TeraGrid platforms for each part of their work
–Enhancements driven by users
–Science Gateways to engage broader communities
Slide 8
Current Menu of Compute Resources
Cross-site IA-64 cluster (DTF)
–IBM Itanium-2/Myrinet at NCSA, SDSC, ANL: ~15.6 TF, 5.2 TB memory combined
Single-site IA-32 and IA-64 clusters
–NCSA, TACC, Purdue, IU, ANL, ORNL: ~32 TF in all
Tightly coupled MPP systems
–PSC XT3 (10 TF) + TCS (6 TF); SDSC Blue Gene (5.7 TF)
SMP systems
–SDSC Power4+ (15.6 TF); NCSA Altix (7 TF) + p690 (2 TF); PSC Marvel (0.3 TF)
Mix and match with data and visualization resources
Slide 9
Exploring the TeraGrid
Get started with a development grant: TG-DAC, up to 30K SU, Roaming*
–Experiment with your code(s) and task(s) on the various resources for ~1 year
–Find the best mapping of your research task flow to the resources: the scenarios we'll present today may suggest possible answers
–Document this mapping to write a production proposal: define a science goal and discuss how many SUs you will need on which systems to accomplish it over 1 or 2 calendar years (a worked estimate follows below)
*Roaming means never having to say which system
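Turning DAC experiments into the numbers a production proposal needs is mostly simple arithmetic. A back-of-the-envelope sketch, assuming the usual normalization of 1 SU = 1 CPU-hour (every input below is illustrative, not a benchmark from any real application):

```python
# Back-of-the-envelope SU estimate for a production request.
# Assumes 1 SU = 1 CPU-hour; all numbers below are illustrative.

def su_estimate(cpus: int, hours_per_run: float, runs: int) -> float:
    """SUs = CPUs x wall-clock hours per run x number of runs."""
    return cpus * hours_per_run * runs

# A hypothetical 128-CPU job taking 12 hours, over 50 parameter sets:
base = su_estimate(cpus=128, hours_per_run=12.0, runs=50)  # 76,800 SUs
request = base * 1.2  # ~20% headroom for reruns and post-processing
print(f"Request roughly {request:,.0f} SUs")               # ~92,160 SUs
```

At roughly 92K SUs, this example would fall in the Medium range; past 200K SUs it becomes a Large request, as the next slide describes.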
Slide 10
Peer-Reviewed Production Grants
Large (>200K SU; starts 4/1 or 10/1) or Medium (starts 1/1, 4/1, 7/1, or 10/1): supersize it!
Specific or Roaming? It depends on the outcome of your task-flow mapping:
–If a task works best on a specific system, ask for it by name. Extrapolate DAC benchmarks to justify your request.
–You can ask for specific allocations on several systems.
–But if it's best for you to use a large number of TG systems (e.g., any/all clusters/SMPs/MPPs), a roaming allocation will free you from the need to predict which task you will run on which machine.
–Roaming jobs may get lower priority on machines that have been assigned many specific allocations.
Slide 11
We're Here to Work with You!
Personal consultant contact upon receiving a production grant
Documentation at www.teragrid.org/userinfo/index.php
help@teragrid.org for any problems or questions
ASTA Program: intensive help from our consultants in a focused effort to optimize your application's use of TeraGrid resources and achieve a scientific breakthrough. Tough to get into, but worth it! Talk to us: sergiu@psc.edu
Science Gateways: enable entire communities of users associated with a common scientific goal to use TeraGrid resources through a common interface. Contact: gateway-info@teragrid.org
Slide 12
And now: a word about safety and security on the TeraGrid! Over to Jim, our much-feared Chief of Security.
Thank you, and please enjoy this tutorial and this first annual TeraGrid conference!
sergiu@psc.edu