The State of TeraGrid: A National Production Cyberinfrastructure Facility. Charlie Catlett, TeraGrid Director. June 2006.


1 The State of TeraGrid: A National Production Cyberinfrastructure Facility
Charlie Catlett, TeraGrid Director
University of Chicago and Argonne National Laboratory
cec@uchicago.edu, www.teragrid.org
© University of Chicago. These slides may be freely used provided that the TeraGrid logo remains on the slides and that the science groups are acknowledged where scientific images are used (see slide notes for contact information).

2 TeraGrid: Integrating NSF Cyberinfrastructure
TeraGrid is a facility that integrates computational, information, and analysis resources at the San Diego Supercomputer Center (SDSC), the Texas Advanced Computing Center (TACC), the University of Chicago / Argonne National Laboratory (UC/ANL), the National Center for Supercomputing Applications (NCSA), Purdue University (PU), Indiana University (IU), Oak Ridge National Laboratory (ORNL), the Pittsburgh Supercomputing Center (PSC), and the National Center for Atmospheric Research (NCAR).
[Map labels: Caltech, USC-ISI, Utah, Iowa, Cornell, Buffalo, UNC-RENCI, Wisc.]

3 TeraGrid Vision
TeraGrid will create integrated, persistent, and pioneering computational resources that will significantly improve our nation’s ability and capacity to gain new insights into our most challenging research questions and societal problems.
– Our vision requires an integrated approach to the scientific workflow, including obtaining access, application development and execution, data analysis, collaboration, and data management.

4 TeraGrid Objectives
DEEP Science: Enabling Petascale Science
– Make science more productive through an integrated set of very-high-capability resources; address key challenges prioritized by users.
WIDE Impact: Empowering Communities
– Bring TeraGrid capabilities to the broad science community; partner with science community leaders ("Science Gateways").
OPEN Infrastructure, OPEN Partnership
– Provide a coordinated, general-purpose, reliable set of services and resources; partner with campuses and facilities.

5 TeraGrid DEEP Objectives
DEEP Science: Enabling Petascale Science – make science more productive through an integrated set of very-high-capability resources, addressing key challenges prioritized by users.
– Ease of Use: TeraGrid User Portal. Significant and deep documentation and training improvements; addresses user tasks related to allocations and accounts.
– Breakthroughs: Advanced Support for TeraGrid Applications (ASTA). Hands-on, "embedded" consultants help teams bridge a gap: seven user teams have been helped, eight user teams are currently receiving assistance, and five projects with new user teams have been proposed.
– New capabilities driven by the 2004 and 2005 user surveys: WAN parallel file system for remote I/O (move data only once!); enhanced workflow tools (added GridShell, VDS).

6 TeraGrid Usage
[Chart: usage of TeraGrid and predecessor PACI systems, showing 33% annual growth.]

7 TeraGrid PIs by Institution (as of May 2006)
[Map legend: Blue: 10 or more PIs; Red: 5-9 PIs; Yellow: 2-4 PIs; Green: 1 PI.]

8 TeraGrid User Community
160 DAC proposals eight months into FY06 continue the strong growth in new users investigating the use of TeraGrid for their science.

9 Ease of Use: TeraGrid User Portal
Account Management – manage my allocation(s), my credentials, and my project users.
Information Services – TeraGrid resources and attributes, job queues, load and status information.
Documentation – user information documentation and contextual help for interfaces.
Consulting Services – help desk information and a portal feedback channel.
Allocation Services – how to apply for allocations; allocation request/renewal.
Eric Roberts (ericrobe@tacc.utexas.edu)

10 Advanced Support for TeraGrid Applications

11 Magnetic Nanocomposites – Wang (PSC)
● LSMS (locally self-consistent multiple scattering) is a linearly scaling ab initio electronic structure method (Gordon Bell prize winner).
● Achieves as high as 81% of peak performance on the Cray XT3.
Wang (PSC), Stocks, Rusanu, Nicholson, Eisenbach (ORNL), Faulkner (FAU)
Direct quantum mechanical simulation on the Cray XT3. Goal: nano-structured material with potential applications in high-density data storage (1 particle/bit).
– Need to understand the influence of these nanoparticles on each other. A petaflop machine would enable realistic simulations for nanostructures of ~50 nm (~5M atoms).

12 VORTONICS – Boghosian (Tufts)
Homogeneous turbulence driven by forcing of Arnold-Beltrami-Childress (ABC) form.
Physical challenges: reconnection and dynamos
– Vortical reconnection governs establishment of steady state in Navier-Stokes turbulence.
– Magnetic reconnection governs heating of the solar corona.
– The astrophysical dynamo problem: the exact mechanism and space/time scales are unknown and represent important theoretical challenges.
Computational challenges: enormous problem sizes, memory requirements, and long run times
– Requires relaxation on a space-time lattice of 5-15 terabytes.
– Requires geographically distributed domain decomposition (GD3): DTF, TCS, Lonestar.
Real-time visualization at UC/ANL.
– Insley (UC/ANL), O’Neal (PSC), Guiang (TACC)
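The geographically distributed domain decomposition (GD3) mentioned above can be sketched as a slab decomposition with periodic halo neighbors. This is a minimal illustration under assumed details, not the VORTONICS code; only the site names (DTF, TCS, Lonestar) come from the slide.

```python
def decompose(lattice_size, sites):
    """Assign each site a contiguous slab of a periodic 1-D lattice,
    plus the neighboring sites that supply its boundary (halo) planes."""
    n = len(sites)
    base, extra = divmod(lattice_size, n)
    domains, start = {}, 0
    for i, site in enumerate(sites):
        size = base + (1 if i < extra else 0)
        domains[site] = {
            "slab": (start, start + size),     # half-open index range
            "halo_left": sites[(i - 1) % n],   # periodic neighbors exchange
            "halo_right": sites[(i + 1) % n],  # boundary data each step
        }
        start += size
    return domains

domains = decompose(1000, ["DTF", "TCS", "Lonestar"])
```

In a real GD3 run the halo exchange crosses wide-area links between sites, which is why problem sizes of terabytes make the decomposition strategy critical.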

13 TeraShake / CyberShake – Olsen (SDSU), Okaya (USC)
Largest and most detailed earthquake simulation of the southern San Andreas Fault: simulation of magnitude 7.7 seismic wave propagation, producing a 47 TB data set.
First calculation of physics-based probabilistic hazard curves for Southern California using full waveform modeling rather than traditional attenuation relationships.
Computation and data analysis at multiple TeraGrid sites. Workflow tools enable work at a scale previously unattainable by automating the very large number of programs and files that must be managed.
TeraGrid staff: Cui (SDSC), Reddy (GIG/PSC)
[Timeline: major earthquakes on the San Andreas Fault, 1680 to present: 1680 (M 7.7), 1857, 1906 (M 7.8).]

14 Searching for New Crystal Structures – Deem (Rice)
Searching for new 3-D zeolite crystal structures in crystallographic space requires tens of thousands of serial jobs run through TeraGrid.
Using MyCluster/GridShell to aggregate computational capacity across the TeraGrid to accelerate the search.
TeraGrid staff: Walker (TACC) and Cheeseman (Purdue)
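The pattern behind this search is many independent serial tasks that a tool like MyCluster/GridShell (or any batch scheduler) can farm out across sites. A hedged sketch of generating such a task sweep follows; the command name `score_zeolite` and the file layout are hypothetical, not the project's actual tooling.

```python
def make_task_list(num_candidates, batch_size=1000):
    """Build one command line per independent candidate structure, then
    group the commands into batches sized for bulk submission."""
    tasks = [
        f"score_zeolite --candidate {i} --out results/{i}.json"  # hypothetical command
        for i in range(num_candidates)
    ]
    # Each batch could become one array job or one MyCluster task file.
    return [tasks[i:i + batch_size] for i in range(0, len(tasks), batch_size)]

batches = make_task_list(10_000)
```

Because every task is independent, the aggregate throughput scales with however much capacity the scheduler can harvest, which is exactly what aggregating TeraGrid capacity buys for this kind of search.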

15 TeraGrid WIDE Objectives
WIDE Impact: Empowering Communities – bring TeraGrid capabilities to the broad science community; partner with science community leaders ("Science Gateways").
– Science Gateways Program: originally ten partners, now 21 and growing; reaching over 100 Gateway partner institutions (PIs); anticipating an order-of-magnitude increase in users via Gateways.
– Education, Outreach, and Training: initiated joint programs integrating TeraGrid partner offerings.

16 Science Gateway Partners
– Open Science Grid (OSG)
– Special PRiority and Urgent Computing Environment (SPRUCE, UChicago)
– National Virtual Observatory (NVO, Caltech)
– Linked Environments for Atmospheric Discovery (LEAD, Indiana)
– Computational Chemistry Grid (GridChem, NCSA)
– Computational Science and Engineering Online (CSE-Online, Utah)
– GEOsciences Network (GEON, SDSC)
– Network for Earthquake Engineering Simulation (NEES, SDSC)
– SCEC Earthworks Project (USC)
– Astrophysical Data Repository (Cornell)
– CCR ACDC Portal (Buffalo)
– Network for Computational Nanotechnology and nanoHUB (Purdue)
– GIScience Gateway (GISolve, Iowa)
– Biology and Biomedicine Science Gateway (UNC RENCI)
– Open Life Sciences Gateway (OLSG, UChicago)
– The Telescience Project (UCSD)
– Grid Analysis Environment (GAE, Caltech)
– Neutron Science Instrument Gateway (ORNL)
– TeraGrid Visualization Gateway (ANL)
– BIRN (UCSD)
– Gridblast Bioinformatics Gateway (NCSA)
– Earth Systems Grid (NCAR)
– SID Grid (UChicago)

17 TeraGrid Science Gateway Partner Sites
21 Science Gateway Partners (and growing), with over 100 partner institutions.

18 TeraGrid Science Gateways Initiative: Community Interfaces to Grids
– Common web portal or application interfaces (database access, computation, workflow, etc.).
– "Back-end" use of TeraGrid computation, information management, visualization, or other services.
– Standard approaches so that science gateways may readily access resources in any cooperating Grid without technical modification.
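The "standard approaches" idea on this slide amounts to coding a gateway against one abstract submission interface, with each cooperating grid supplying an adapter behind it, so the gateway needs no modification to target a new grid. The class and method names below are illustrative, not a real TeraGrid API.

```python
from abc import ABC, abstractmethod

class GridBackend(ABC):
    """Abstract interface every cooperating grid adapter implements."""
    @abstractmethod
    def submit(self, job_description: dict) -> str:
        """Submit a job; return a backend-specific job id."""

class FakeTeraGridBackend(GridBackend):
    """Stand-in adapter used only to demonstrate the pattern."""
    def __init__(self):
        self._next_id = 0
    def submit(self, job_description):
        self._next_id += 1
        return f"tg-{self._next_id}"

def gateway_submit(backend: GridBackend, app, inputs):
    # The gateway knows only the abstract interface, never a specific grid.
    return backend.submit({"application": app, "inputs": inputs})

job_id = gateway_submit(FakeTeraGridBackend(), "blast", ["seq.fasta"])
```

Swapping `FakeTeraGridBackend` for an adapter wrapping another grid's middleware changes nothing in the gateway code, which is the portability property the slide describes.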

19 TeraGrid EOT (Education, Outreach, and Training)
Our mission is to engage larger and more diverse communities of researchers, educators, and students in discovering, using, and contributing to applications of cyberinfrastructure to advance scientific discovery.

20 TeraGrid ‘06 Student Competitions
CI Impact – perspectives of CI impact on our world
– Bryan Bemley, Bowie State University, Maryland
– August Knecht, University of Illinois, Illinois
– Joel Poloney, University of Illinois, Illinois
– Daniela Rosner, University of California, Berkeley
Research Posters – grid computing applications in research
– Ivan Beschastnikh, University of Chicago, Illinois*
– Diego Donzis, Georgia Tech, Georgia
– Alexander Gondarenko, Cornell University, New York
– Raymond Hansen, Purdue University, Indiana
– Wenli He, University of Iowa, Iowa
– Gregory Koenig, University of Illinois, Illinois
– Alex Lemann, Earlham College, Indiana
– Zhengqiang (Sean) Liang, Wayne State University, Michigan
– Diglio Simoni, Syracuse University, New York
– Rishi Verma, Indiana University, Indiana
*Grand Prize Winner

21 TeraGrid OPEN Objectives
OPEN Infrastructure, OPEN Partnership – provide a coordinated, general-purpose, reliable set of services and resources; partner with campuses and facilities.
– Streamlined software integration: evolved the architecture to leverage standards and web services.
– Campus partnership programs: user access, physical and digital asset federation, outreach.

22 TeraGrid "Open" Initiatives
Working with campuses: toward integrated cyberinfrastructure
– Access for users: authentication and authorization.
– Additional capacity: integrated resources.
– Additional services: integrated data collections.
– Broadening participation: outreach beyond R1 institutions.
Technology foundations
– Security and Auth*/Acctg.
– Service-based software architecture.

23 Lower Integration Barriers; Improved Scaling
Initial integration: implementation-based
– Coordinated TeraGrid Software and Services (CTSS): provide software for heterogeneous systems and leverage specific implementations to achieve interoperation; an evolving understanding of the "minimum" required software set for users.
Emerging architecture: services-based
– Core services, the capabilities that define a "TeraGrid Resource": authentication and authorization capability; information service; auditing/accounting/usage-reporting capability; verification and validation mechanism.
– Significantly smaller than the current set of required components.
– Provides a foundation for value-added services: each Resource Provider selects one or more added services, or "kits"; core and individual kits can evolve incrementally, in parallel.
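The services-based model above can be expressed as a membership test: a resource qualifies as a "TeraGrid Resource" if it implements the small core service set, and anything beyond that is an optional kit. The service names follow the slide; the registry format below is a hypothetical sketch, not an actual TeraGrid mechanism.

```python
# Core capabilities that define a "TeraGrid Resource" (names from the slide).
CORE_SERVICES = {
    "authentication_authorization",
    "information_service",
    "accounting_usage_reporting",
    "verification_validation",
}

def is_teragrid_resource(advertised_services):
    """Return (qualifies, kits): the resource must advertise every core
    service; anything extra is treated as a value-added kit."""
    advertised = set(advertised_services)
    kits = advertised - CORE_SERVICES
    return CORE_SERVICES <= advertised, sorted(kits)

ok, kits = is_teragrid_resource(CORE_SERVICES | {"data_movement", "visualization"})
```

Keeping the core set small is what lets each Resource Provider add kits independently, since the qualification check never grows as new kits appear.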

24 Example Value-Added Service Kits
– Job execution
– Application development
– Science gateway hosting
– Application hosting (dynamic service deployment)
– Data movement
– Data management
– Science workflow support
– Visualization

25 Lower User Barriers; Increase Security
[Diagram: a user authenticates with a UID/password; certificate authorities (CAs) and a credential database mediate authentication to the resource, where the job executes.]

26 Cyberinfrastructure User Advisory Committee
– PK Yeung, Georgia Institute of Technology
– Gerhard Klimeck, Purdue University
– Thomas Cheatham, University of Utah
– Gwen Jacobs, Montana State University
– Luis Lehner, Louisiana State University
– Philip Maechling, University of Southern California
– Roy Pea, Stanford University
– Alex Ramirez, Hispanic Association of Colleges and Universities
– Nora Sabelli, Center for Innovative Learning Technologies
– Patricia Teller, University of Texas - El Paso
– Cathy Wu, Georgetown University
– Bennett Bertenthal, University of Chicago

