TeraGrid Overview
John-Paul “JP” Navarro, TeraGrid Area Co-Director for Software Integration
University of Chicago / Argonne National Laboratory
March 25, 2007

What is the TeraGrid? Technology + Support = Science

TeraGrid’s 3-pronged strategy to further science
–DEEP Science: Enabling Terascale Science – make science more productive through an integrated set of very-high-capability resources (ASTA projects)
–WIDE Impact: Empowering Communities – bring TeraGrid capabilities to the broad science community (Science Gateways)
–OPEN Infrastructure, OPEN Partnership – provide a coordinated, general-purpose, reliable set of services and resources (grid interoperability working group)

TeraGrid Science

NSF-Funded Research
–NSF-funded program offering high-end compute, data, and visualization resources to the nation’s academic researchers
–Proposal-based: researchers can use resources at no cost
–Serves a wide variety of disciplines

TeraGrid PIs by Institution (as of May 2006)
[Map of TeraGrid PI locations – blue: 10 or more PIs; red: 5-9 PIs; yellow: 2-4 PIs; green: 1 PI]

TeraGrid Technology

TeraGrid Hardware Components
–High-end compute hardware: Intel/Linux clusters, Alpha SMP clusters, IBM POWER3 and POWER4 clusters, SGI Altix SMPs, Sun visualization systems, Cray XT3, IBM Blue Gene/L
–Large-scale storage systems: hundreds of terabytes of secondary storage
–Visualization hardware
–Very high-speed network backbone (40 Gb/s): bandwidth for rich interaction and tight coupling

TeraGrid Resources: 100+ TF across 8 distinct architectures, 3 PB of online disk, and >100 data collections (a quick arithmetic check follows the list)
Computational resources by site:
–UC/ANL: Itanium2 (0.5 TF), IA-32 (0.5 TF)
–IU: Itanium2 (0.2 TF), IA-32 (2.0 TF)
–NCSA: Itanium2 (10.7 TF), SGI SMP (7.0 TF), Dell Xeon (17.2 TF), IBM p690 (2 TF), Condor Flock (1.1 TF)
–ORNL: IA-32 (0.3 TF)
–PSC: XT3 (10 TF), TCS (6 TF), Marvel SMP (0.3 TF)
–Purdue: heterogeneous (1.7 TF), IA-32 (11 TF), opportunistic
–SDSC: Itanium2 (4.4 TF), Power4+ (15.6 TF), Blue Gene (5.7 TF)
–TACC: IA-32 (6.3 TF)
Online storage (same site order): 20 TB, 32 TB, 1140 TB, 1 TB, 300 TB, 26 TB, 1400 TB, 50 TB
Mass storage: 1.2 PB, 5 PB, 2.4 PB, 1.3 PB, 6 PB, 2 PB (not listed for every site)
Network (Gb/s, hub; same site order): 30 CHI, 10 CHI, 30 CHI, 10 ATL, 30 CHI, 10 CHI, 10 LA, 10 CHI
Data collections: 5 collections (>3.7 TB; URL/DB/GridFTP), >30 collections (URL/SRB/DB/GridFTP), 4 collections (7 TB; SRB/Portal/OPeNDAP), >70 collections (>1 PB; GFS/SRB/DB/GridFTP), 4 collections (SRB/Web Services/URL)
Instruments: proteomics X-ray crystallography; SNS and HFIR facilities
Visualization resources (RI: remote interactive, RB: remote batch, RC: remote interactive/collaborative): IA-32 with 96 GeForce 6600GT (RI, RC, RB); SGI Prism with 32 graphics pipes plus IA-32 (RB); IA-32 with Quadro4 980 XGL (RI, RB); IA-32, 48 nodes (RB); UltraSPARC IV, 512 GB SMP, 16 graphics cards (RI, RC, RB)
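As a quick consistency check, the per-site figures above do sum to roughly the headline totals quoted on the slide. A minimal sketch of the arithmetic, using only numbers taken from the table:

# Consistency check on the slide's headline figures ("100+ TF", "3 PB Online Disk").
compute_tf = [
    0.5, 0.5,                    # UC/ANL
    0.2, 2.0,                    # IU
    10.7, 7.0, 17.2, 2.0, 1.1,   # NCSA
    0.3,                         # ORNL
    10.0, 6.0, 0.3,              # PSC
    1.7, 11.0,                   # Purdue
    4.4, 15.6, 5.7,              # SDSC
    6.3,                         # TACC
]
online_tb = [20, 32, 1140, 1, 300, 26, 1400, 50]

print("Total peak compute: %.1f TF" % sum(compute_tf))      # ~102.5 TF -> "100+ TF"
print("Total online disk:  %.2f PB" % (sum(online_tb) / 1000.0))  # ~2.97 PB -> "3 PB"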

TeraGrid Software Components
–Coordinated TeraGrid Software and Services (“CTSS”): grid services and supporting software
–Community Owned Software Areas (“CSA”)
–Advanced applications

Coordinated TeraGrid Software & Services 4 (CTSS 4): Core Integration Capability
–Authorization/accounting/security: supports a coordinated authorization and allocation process
–Policy
–Software deployment
–Validation and verification (Inca)
–Information services: resource registration and capability (software and service) registration

Coordinated TeraGrid Software & Services 4: Capability Kits
–Remote Compute Capability Kit: grid job submission via the new Web Services (“WS”) GRAM and the legacy method (pre-WS GRAM)
–Data Movement and Management Capability Kit: grid data movement (Globus GridFTP), Storage Resource Broker (“SRB”), GPFS-WAN
–Remote Login Capability Kit: GSI ssh (in addition to passwords and ssh keys)
–Local Parallel Programming Capability Kit: MPI (see the sketch below)
–Grid Parallel Programming Capability Kit: MPICH-G2
–Application Development and Runtime: compilers, BLAS libraries, HDF4 and HDF5
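To make the parallel-programming kits concrete, a minimal MPI program might look like the sketch below. It uses the mpi4py Python bindings purely so the examples in this transcript stay in one language (an assumption of the sketch, not part of CTSS); on TeraGrid systems the kit actually provided site-tuned MPI libraries that users typically called from compiled C or Fortran codes built with the kit’s compilers.

# Minimal MPI "hello" sketch (mpi4py assumed for illustration only).
from mpi4py import MPI

comm = MPI.COMM_WORLD        # communicator covering every process in the MPI job
rank = comm.Get_rank()       # this process's index within the communicator
size = comm.Get_size()       # total number of processes launched

print("Hello from rank %d of %d" % (rank, size))

# A simple collective operation: gather each rank's value onto rank 0.
values = comm.gather(rank * rank, root=0)
if rank == 0:
    print("Gathered:", values)

Launched with something like mpirun -np 4, each process prints its rank; the Grid Parallel Programming kit provided MPICH-G2 for the analogous case of a single MPI job spanning multiple TeraGrid sites.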

Science Gateways: a new initiative for the TeraGrid
–Communities are making increasing investments in their own cyberinfrastructure, but it is heterogeneous: resources, users (from experts to K-12), software stacks, and policies
–Science Gateways provide “TeraGrid Inside” capabilities and leverage community investment
–Three common forms:
  –Web-based portals
  –Application programs running on users’ machines but accessing services in TeraGrid (illustrated in the sketch below)
  –Coordinated access points enabling users to move seamlessly between TeraGrid and other grids
[Figure: workflow composer]
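A minimal sketch of the second form, assuming a gateway backend that hands work to a TeraGrid resource through the legacy pre-WS GRAM command-line client (globus-job-submit). The resource contact string, executable path, and input file below are hypothetical placeholders, and a real gateway would first map the web user onto a community account or credential.

# Hedged sketch: a gateway backend submitting a user's job to a TeraGrid resource.
import subprocess

def submit_gateway_job(resource_contact, executable, *args):
    """Submit a job on a TeraGrid resource and return the GRAM job handle."""
    # globus-job-submit prints a job-contact URL on stdout when submission succeeds.
    cmd = ["globus-job-submit", resource_contact, executable] + list(args)
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout.strip()

# Hypothetical contact string and application path, for illustration only.
handle = submit_gateway_job(
    "tg-login.example.teragrid.org/jobmanager-pbs",
    "/usr/local/apps/myapp/myapp",
    "input.conf",
)
print("Submitted job:", handle)

A web-based portal (the first form) would wrap the same call behind its web interface, typically submitting under a community credential rather than each end user’s individual certificate.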

Gateways are growing in number
–10 initial projects as part of the TeraGrid proposal; >20 gateway projects today
–No limit on how many gateways can use TeraGrid resources: services and documentation are being prepared so developers can work independently
–Current projects: Open Science Grid (OSG), Special PRiority and Urgent Computing Environment (SPRUCE), National Virtual Observatory (NVO), Linked Environments for Atmospheric Discovery (LEAD), Computational Chemistry Grid (GridChem), Computational Science and Engineering Online (CSE-Online), GEON (GEOsciences Network), Network for Earthquake Engineering Simulation (NEES), SCEC Earthworks Project, Network for Computational Nanotechnology and nanoHUB, GIScience Gateway (GISolve), Biology and Biomedicine Science Gateway, Open Life Sciences Gateway, The Telescience Project, Grid Analysis Environment (GAE), Neutron Science Instrument Gateway, TeraGrid Visualization Gateway (ANL), BIRN, Gridblast Bioinformatics Gateway, Earth Systems Grid, Astrophysical Data Repository (Cornell)
–Many others are interested, including SID Grid and HASTAC

TeraGrid Support

The TeraGrid Facility
–Grid Infrastructure Group (GIG), based at the University of Chicago: TeraGrid integration, planning, management, and coordination, organized into areas (User Services, Operations, Gateways, Data/Visualization/Scheduling, Education Outreach & Training, Software Integration)
–Resource Providers (RPs), currently NCSA, SDSC, PSC, Indiana, Purdue, ORNL, TACC, and UC/ANL: systems (resources and services) support and user support; provide access to resources via policies, software, and mechanisms coordinated by and provided through the GIG

TeraGrid Facility Today
–Heterogeneous resources at autonomous Resource Provider sites
–Local value-added user environment
–Common TeraGrid computing environment: a single point of contact for help, integrated documentation and training, a common allocation process, coordinated software and services, and a common baseline user environment

Useful links
–TeraGrid website
–Policies and procedures
–TeraGrid user information overview
–Summary of TeraGrid resources
–Summary of machines, with links to site-specific user guides (click on the name of each site)

Grid Resources in the US

The OSG
–Origins: national grid projects (iVDGL, GriPhyN, PPDG) and the LHC software and computing projects
–Current compute resources: 61 Open Science Grid sites connected via Internet2 and NLR at 622 Mbps to 10 Gbps
–Compute and storage elements; all are Linux clusters, and most are shared with campus grids and local non-grid users
–More than 10,000 CPUs with a lot of opportunistic usage; total computing capacity is difficult to estimate, and the same holds for storage

The TeraGrid
–Origins: national supercomputing centers funded by the National Science Foundation
–Current compute resources: 9 TeraGrid sites connected via dedicated multi-Gbps links
–Mix of architectures: ia64 and ia32 Linux, Cray XT3, Alpha (Tru64), SGI SMPs
–Resources are dedicated, but grid users share them with local users
–1000s of CPUs (>40 teraflops) and 100s of terabytes of storage