National Computational Science Alliance: The Alliance Distributed Supercomputing Facilities. Opening Talk to the Alliance User Advisory Council, held at Supercomputing '98.


National Computational Science Alliance The Alliance Distributed Supercomputing Facilities Opening Talk to the Alliance User Advisory Council Held at Supercomputing '98 in Orlando, Florida, December 5, 1998

National Computational Science Alliance The National PACI Program - Partners and Supercomputer Users
–850 Projects in 280 Universities
–60 Partner Universities

National Computational Science Alliance PACI - The NSF Partnerships for Advanced Computational Infrastructure
The Two Partnerships (NPACI & Alliance) Each Have:
–Leading-Edge Site - the site with the very large scale computing systems
–Mid-Level Resource Sites - partners with alternative or experimental computer architectures, data stores, visualization capabilities, etc.
–Applications Technologies - computational science partners involved in development, testing and evaluation of infrastructure
–Enabling Technologies - computer science partners developing tools and software infrastructure driven by application partners
–Education, Outreach, and Training Partners
Network Infrastructure Is Critical.

National Computational Science Alliance NCSA is Combining Shared Memory Programming with Massive Parallelism - Doubling Every Nine Months! (Challenge, Power Challenge, Origin, SN1)

National Computational Science Alliance NCSA Users by System, Sep 1994 - Sep 1998
[Chart: number of users per system over time - SGI Power Challenge Array (PCA, retired 7/98), CM-5 (retired 1/97), Convex C3880 (retired 10/95), Convex Exemplar SPP-1200/SPP-2000, Cray Y-MP (retired 12/94), Origin; one further system marked retired 5/98]

National Computational Science Alliance Millions of NUs Used at NCSA FY93 to FY98

National Computational Science Alliance NCSA Supplies Cycles to a Logarithmic Demand Function of Projects
[Chart: FY98 usage by project size class - Super, Large, Medium, Small, Tiny. Labeled projects: Chen (125), Solomon (82), Knight (67), Goodrich (59), Goddard (41), Karniadakis (31), Droegemier (24), Kollman (10), Hawley (6), Suen (4), Sugar (2). FY98 usage < 10 NUs: Evans, Ghoniem, Jacobs, Long, York]

National Computational Science Alliance Evolution of the NCSA Project Distribution Function FY93 to FY98

National Computational Science Alliance Rapid Increase in Large projects at NCSA FY93-98

National Computational Science Alliance Breakout in Supporting Super Projects at NCSA in the Last Year

National Computational Science Alliance Migration of NCSA User Distribution Toward the High End
[Chart: change in number of projects by size class - growth at the high end (+400%, +350%, +114%) and decline at the low end (-27%, -79%)]

National Computational Science Alliance Alliance LES Chose 27 Large PSC Projects to Track Out of 100 Targeted Projects
Includes: Droegemeir, Freeman, Karplus, Kollman, Schulten, Sugar
[Chart: 1QFY97 - 3QFY98; bars show NUs used at NCSA per quarter and the number of the 27 projects computing at NCSA]

National Computational Science Alliance Disciplines Represented in the Large Academic Projects at the Alliance LES (NCSA, 6/1/97 to 5/31/98)
Over 5,000 NUs Annually Per Project; >100 Projects; Over 3.2 Million NUs
Note Mapping to AT Teams: Nanomaterials, Cosmology, Chemical Engineering, Molecular Biology, Environmental Hydrology, Astro and Bio Instruments

National Computational Science Alliance Application Performance Scaling on a 128-Processor Origin. Conclusion: the 128-Processor Origin is a 15 GF Machine (20-25% of Peak).
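The sustained-versus-peak claim on this slide can be sanity-checked with a little arithmetic. A minimal sketch, assuming a peak of roughly 500 MFLOPS per Origin processor (an assumption, not stated on the slide):

```python
# Sanity check of "15 GF sustained is 20-25% of peak" on a 128-processor Origin.
# Assumption (not from the slide): each processor peaks at ~500 MFLOPS.
processors = 128
peak_per_proc_gf = 0.5          # assumed peak, GFLOPS per processor
sustained_gf = 15.0             # sustained rate quoted on the slide

peak_gf = processors * peak_per_proc_gf      # 64 GF aggregate peak
efficiency = sustained_gf / peak_gf          # fraction of peak actually achieved

print(f"Aggregate peak: {peak_gf:.0f} GF")   # 64 GF
print(f"Efficiency: {efficiency:.1%}")       # ~23%, inside the quoted 20-25% band
```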

National Computational Science Alliance Origin Brings Shared Memory to MPP Scalability

National Computational Science Alliance Let’s Blow This Up! The Growth Rate of the National Capacity is Slowing Down Again Source: Quantum Research

National Computational Science Alliance The Drop in High End Capacity Available to National Academic Researchers Quantum Research FY96-98

National Computational Science Alliance Major Gap Has Developed in National Usage at NSF Supercomputer Centers Projection

National Computational Science Alliance Allocated Capacity for Meritorious NSF Large National Projects Doubled Data from NSF Metacenter and NRAC Reviews

National Computational Science Alliance Clustered Shared Memory Computers are Today's High End
–NCSA has 6 x 128 Origin processors
–ASC has 4 x 128
–ARL has 3 x 128
–CEWES has 1 x 128
–NAVO has 1 x 128
–Los Alamos ASCI Blue will have 48 x 128!
–Livermore ASCI Blue has a 1536 x 4 IBM SP
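The configurations above multiply out to a notable coincidence: the two ASCI Blue machines reach the same processor count by opposite routes, few fat nodes versus many thin ones. A quick check, using only the counts from the slide:

```python
# Total processor counts for the clustered SMP systems listed on the slide,
# as (number of boxes, processors per box).
systems = {
    "NCSA Origin": (6, 128),
    "ASC": (4, 128),
    "ARL": (3, 128),
    "CEWES": (1, 128),
    "NAVO": (1, 128),
    "LANL ASCI Blue (planned)": (48, 128),
    "LLNL ASCI Blue (IBM SP)": (1536, 4),
}
totals = {name: boxes * procs for name, (boxes, procs) in systems.items()}

print(totals["LANL ASCI Blue (planned)"])   # 6144
print(totals["LLNL ASCI Blue (IBM SP)"])    # 6144
```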

National Computational Science Alliance High-End Computing Enables High Resolution of Flow Details
A 1024 x 1024 x 1024 (billion-zone) computation of compressible turbulence. This simulation was run on the Los Alamos SGI Origin array; a U. Minn. SGI visual supercomputer renders the images (vorticity shown). LCSE, Univ. of Minnesota

National Computational Science Alliance Cycles Used by NSF Community at the NSF Supercomputer Centers by Vendor (June 1, 1997 through May 31, 1998; CTC, NCSA, PSC, SDSC; 1019 projects using 100% of the cycles)
[Chart: cycles by architecture - T3D/E, Origin/PC, C/T90]
SGI SN1 is the Natural Upgrade for 84% of Cycles!

National Computational Science Alliance Peak Teraflops in Aggressive Upgrade Plan

National Computational Science Alliance Deputy Director Bordogna on NSF Leadership in Information Technologies
Three Important Priorities for NSF in the Area of IT for the Future:
–The first area is fundamental and high-risk IT research, including advanced computation research.
–The second priority area for NSF is competitive access and use of high-end computing and networking.
–The third priority is investing in IT education at all levels.

National Computational Science Alliance President’s Information Technology Advisory Committee Interim Report (Congressional Testimony 10/6/98)
More Long Term IT Research Needed
–Fundamental Research in Software Development
–R&D and Testbeds for Scalable Infrastructure
–Increase Support for High End Computing
Socio-Economic and Workforce Impacts
–Address the Shortage of High-Tech Workers
–Study Social and Economic Impacts of IT Adoption
Modes and Management of Federal IT Research
–Fund Projects of Broader Scope and Longer Duration
–Virtual Centers for Expeditions into the 21st Century
–NSF as Lead Agency for Coordinating IT Research

National Computational Science Alliance PITAC Draft Refinement of High-End Acquisition Recommendation
Fund the acquisition of the most powerful high-end computing systems to support long term basic research in science and engineering.
Access for (highest priority):
–ALL Academic Researchers
–ALL Disciplines
–ALL Universities
Access for (second priority):
–Government Researchers
–Industrial Researchers

National Computational Science Alliance Harnessing the Unused Cycles of Networks of Workstations (Condor Cycles)
–University of Kansas is installing Condor
–Alliance Nanotechnologies team used the Univ. of Wisconsin Condor cluster - burned 1 CPU-year in two weeks!
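"One CPU-year in two weeks" translates directly into how many workstations, on average, were computing for the team at once. A sketch of the arithmetic, assuming round-the-clock harvesting:

```python
# "Burned 1 CPU-year in two weeks": how many workstations were busy on average?
# Assumes the cycles were harvested around the clock for the full two weeks.
cpu_years = 1.0
elapsed_weeks = 2.0
weeks_per_year = 52.0

avg_concurrent_machines = cpu_years * weeks_per_year / elapsed_weeks
print(avg_concurrent_machines)   # 26.0 - roughly two dozen idle workstations at a time
```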

National Computational Science Alliance NT Workstation Shipments Rapidly Surpassing UNIX Source: IDC, Wall Street Journal, 3/6/98

National Computational Science Alliance 128 Hewlett Packard 300 MHz 64 Compaq 333 MHz Andrew Chien, CS UIUC-->UCSD Rob Pennington, NCSA Reagan Moore, SDSC Plan to Link UCSD & UIUC Clusters “Supercomputer performance at mail-order prices”-- Jim Gray, Microsoft PACI Fostering Commodity Computing Various Applications Sustain 7 GF on 128 Processors

National Computational Science Alliance Performance Analysis is Key Computer Science Research Enabling Computational Science

                         MFLOPS/Proc   Flops/Byte   Flops/Network RT
  Cray T3E                   1200          ~2             ~2,500
  SGI Origin                               ~0.5           ~1,000
  HPVM NT Supercluster        300          ~3.2           ~6,000
  IBM SP2                     550          ~3.7          ~38,000
  Berkeley NOW II             320          ~8.0           ~6,400
  Beowulf (100 Mbit)          300         ~25           ~500,000
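The Flops/Byte column is machine balance: sustained floating-point rate divided by network bandwidth, so a larger number means communication is relatively more expensive. A sketch recomputing the Beowulf row, whose 100 Mbit/s Ethernet link is the one bandwidth the table states:

```python
# Machine balance (flops per byte communicated) for the Beowulf row:
# sustained MFLOPS divided by network bandwidth in bytes per second.
mflops_per_proc = 300                       # from the table
link_mbit_per_s = 100                       # 100 Mbit Ethernet, from the table
bytes_per_s = link_mbit_per_s * 1e6 / 8     # 12.5 MB/s

flops_per_byte = mflops_per_proc * 1e6 / bytes_per_s
print(flops_per_byte)                       # 24.0, matching the table's "~25"
```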

National Computational Science Alliance Performance of Scalable Systems Shows the Promise of Local Clustered PCs Danesh Tafti, Rob Pennington, NCSA; Andrew Chien (UIUC, UCSD) Solving 2D Navier-Stokes Kernel

National Computational Science Alliance Near Perfect Scaling of Cactus - 3D Dynamic Solver for the Einstein GR Equations
Ratio of GFLOPs: Origin = 2.5x NT SC
Paul Walker, John Shalf, Rob Pennington, Andrew Chien, NCSA. Cactus was developed by Paul Walker, MPI-Potsdam, UIUC, NCSA.

National Computational Science Alliance QCD Performance on Various Machines Doug Toussaint and Kostas Orginos, University of Arizona

National Computational Science Alliance The Road to Intel’s Merced The Convergence of Scientific and Commercial Computing IA-64 Co-Developed by Intel and Hewlett-Packard

National Computational Science Alliance The NCSA Information Workbench - An Architecture for Web-Based Computing (NCSA Computational Biology Group)
[Diagram: the user's web browser sends input and queries to a workbench server - a format translator, query engine and program driver - which issues instructions to application programs (which may have varying interfaces and be written in different languages) and queries to information sources (which may be of varying formats), then returns results and output to the user]
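The pattern in the diagram - browser in front, a server that translates requests and drives heterogeneous back-end programs - can be sketched in a few lines. This is a hypothetical illustration, not the Workbench's actual code; all names (`run_program`, `WorkbenchHandler`, the toy program table) are invented here:

```python
# A minimal sketch of the workbench pattern: a web server that translates a
# browser query, dispatches it to a back-end "application program", and
# returns the result. All names are hypothetical illustrations.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

def run_program(name, query):
    """Program driver: dispatch a query to one of several back-end programs,
    which could each be in a different language behind a common interface."""
    programs = {
        "echo": lambda q: f"you said: {q}",
        "upper": lambda q: q.upper(),
    }
    return programs.get(name, lambda q: "unknown program")(query)

class WorkbenchHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Format translator / query engine: pull program name and query
        # out of the request URL, e.g. /run?prog=upper&q=actg
        params = parse_qs(urlparse(self.path).query)
        result = run_program(params.get("prog", [""])[0],
                             params.get("q", [""])[0])
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.encode())

# To serve: HTTPServer(("localhost", 8000), WorkbenchHandler).serve_forever()
```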

National Computational Science Alliance Using a Web Browser to Run Programs and Analyze Data Worldwide
NCSA Biology Workbench Has Over 6,000 Users From Over 20 Countries
[Diagram: biology scales served - Genomes, Gene Products, Structure & Function, Pathways & Physiology, Populations & Evolution, Ecosystems]