The State of TeraGrid: A National Production Cyberinfrastructure Facility. Charlie Catlett, TeraGrid Director. June 2006.

Presentation transcript:

The State of TeraGrid: A National Production Cyberinfrastructure Facility
Charlie Catlett, TeraGrid Director, University of Chicago and Argonne National Laboratory. June 2006.
© University of Chicago. These slides may be freely used providing that the TeraGrid logo remains on the slides, and that the science groups are acknowledged in cases where scientific images are used. (See slide notes for contact information.)

TeraGrid: Integrating NSF Cyberinfrastructure
TeraGrid is a facility that integrates computational, information, and analysis resources at the San Diego Supercomputer Center, the Texas Advanced Computing Center, the University of Chicago / Argonne National Laboratory, the National Center for Supercomputing Applications, Purdue University, Indiana University, Oak Ridge National Laboratory, the Pittsburgh Supercomputing Center, and the National Center for Atmospheric Research.
[Map: resource provider and partner sites, including SDSC, TACC, UC/ANL, NCSA, ORNL, PU, IU, PSC, NCAR, Caltech, USC-ISI, Utah, Iowa, Cornell, Buffalo, UNC-RENCI, and Wisconsin.]

TeraGrid Vision
TeraGrid will create integrated, persistent, and pioneering computational resources that will significantly improve our nation's ability and capacity to gain new insights into our most challenging research questions and societal problems.
– Our vision requires an integrated approach to the scientific workflow, including obtaining access, application development and execution, data analysis, collaboration, and data management.

TeraGrid Objectives
DEEP Science: Enabling Petascale Science
– Make science more productive through an integrated set of very-high-capability resources
– Address key challenges prioritized by users
WIDE Impact: Empowering Communities
– Bring TeraGrid capabilities to the broad science community
– Partner with science community leaders ("Science Gateways")
OPEN Infrastructure, OPEN Partnership
– Provide a coordinated, general-purpose, reliable set of services and resources
– Partner with campuses and facilities

TeraGrid DEEP Objectives
DEEP Science: Enabling Petascale Science
– Make science more productive through an integrated set of very-high-capability resources; address key challenges prioritized by users
– Ease of Use: TeraGrid User Portal
  – Significant and deep documentation and training improvements
  – Addresses user tasks related to allocations and accounts
– Breakthroughs: Advanced Support for TeraGrid Applications (ASTA)
  – Hands-on, "embedded" consultants help teams bridge a gap
  – Seven user teams have been helped; eight user teams are currently receiving assistance; five projects with new user teams have been proposed
– New capabilities driven by the 2004 and 2005 user surveys
  – WAN parallel file system for remote I/O (move data only once!)
  – Enhanced workflow tools (added GridShell, VDS)

TeraGrid Usage: 33% Annual Growth
[Chart: usage of TeraGrid and PACI systems over time.]

TeraGrid PIs by Institution (as of May 2006)
[Map legend: blue, 10 or more PIs; red, 5-9 PIs; yellow, 2-4 PIs; green, 1 PI.]

TeraGrid User Community
160 DAC proposals eight months into FY06 continue the strong growth in new users investigating the use of TeraGrid for their science.

Ease of Use: TeraGrid User Portal
Account Management
– Manage my allocation(s)
– Manage my credentials
– Manage my project users
Information Services
– TeraGrid resources & attributes
– Job queues
– Load and status information
Documentation
– User info documentation
– Contextual help for interfaces
Consulting Services
– Help desk information
– Portal feedback channel
Allocation Services
– How to apply for allocations
– Allocation request/renewal
(Eric Roberts)

Advanced Support for TeraGrid Applications

Magnetic Nanocomposites, Wang (PSC)
Direct quantum mechanical simulation on the Cray XT3. Goal: nano-structured material with potential applications in high-density data storage, at 1 particle/bit.
– Need to understand the influence of these nanoparticles on each other. A petaflop machine would enable realistic simulations for nanostructures of ~50 nm (~5M atoms).
● LSMS, the locally self-consistent multiple scattering method, is a linear-scaling ab initio electronic structure method (Gordon Bell prize winner).
● Achieves as high as 81% of peak performance on the Cray XT3.
Wang (PSC), Stocks, Rusanu, Nicholson, Eisenbach (ORNL), Faulkner (FAU)

VORTONICS, Boghosian (Tufts)
Homogeneous turbulence driven by forcing of Arnold-Beltrami-Childress (ABC) form.
Physical challenges: reconnection and dynamos
– Vortical reconnection governs establishment of the steady state in Navier-Stokes turbulence
– Magnetic reconnection governs heating of the solar corona
– The astrophysical dynamo problem: the exact mechanism and space/time scales are unknown and represent important theoretical challenges
Computational challenges: enormous problem sizes, memory requirements, and long run times
– Requires relaxation on a space-time lattice of 5-15 terabytes
– Requires geographically distributed domain decomposition (GD3) across DTF, TCS, and Lonestar
– Real-time visualization at UC/ANL
Insley (UC/ANL), O'Neal (PSC), Guiang (TACC)
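
The geographically distributed domain decomposition described above splits one very large lattice across sites, with each partition exchanging boundary ("halo") data with its neighbors between relaxation sweeps. The following is a minimal, hypothetical sketch of that pattern, not the VORTONICS code: a 1-D periodic decomposition with halo exchange written with mpi4py and NumPy (both assumptions, not tools named in the slide).

# Hypothetical illustration of 1-D domain decomposition with halo exchange.
# Not the VORTONICS implementation; it only sketches the communication pattern.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_global = 1024                     # total lattice points (illustrative; assumes size divides it)
n_local = n_global // size          # points owned by this rank
u = np.zeros(n_local + 2)           # local array with one ghost cell on each side
u[1:-1] = rank                      # placeholder initial condition

left = (rank - 1) % size            # periodic neighbors
right = (rank + 1) % size

for step in range(10):
    # Exchange halo cells with neighbors before updating the interior.
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Simple relaxation (neighbor averaging) on interior points.
    u[1:-1] = 0.5 * (u[:-2] + u[2:])

Run with an MPI launcher (for example, mpiexec -n 4 python halo.py); the same idea generalizes to 3-D lattices and, as in GD3, to partitions hosted at different sites.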

TeraShake / CyberShake, Olsen (SDSU), Okaya (USC)
Largest and most detailed earthquake simulation of the southern San Andreas Fault. First calculation of physics-based probabilistic hazard curves for Southern California using full waveform modeling rather than traditional attenuation relationships.
Computation and data analysis at multiple TeraGrid sites. Workflow tools enable work at a scale previously unattainable by automating the very large number of programs and files that must be managed. TeraGrid staff: Cui (SDSC), Reddy (GIG/PSC).
Simulation of magnitude 7.7 seismic wave propagation on the San Andreas Fault; 47 TB data set.
[Figure: major earthquakes on the San Andreas Fault, 1680 to present.]
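
The workflow automation mentioned above boils down to tracking dependencies among many programs and files and running each step only when its inputs exist and its outputs do not. The sketch below is a hypothetical, standard-library Python illustration of that idea; the stage names, file names, and placeholder commands are invented and do not correspond to the actual TeraShake/CyberShake workflow or to a particular tool such as VDS.

# Hypothetical sketch of dependency-driven workflow automation.
# Stage names, file names, and commands are illustrative placeholders.
import os
import subprocess
import sys

def touch(path):
    """Placeholder command that just creates its output file."""
    return [sys.executable, "-c", f"open('{path}', 'w').close()"]

# Each stage: (inputs it needs, outputs it produces, command to run).
stages = [
    ([],                ["mesh.bin"],      touch("mesh.bin")),
    (["mesh.bin"],      ["wavefield.bin"], touch("wavefield.bin")),
    (["wavefield.bin"], ["hazard.csv"],    touch("hazard.csv")),
]

def run_workflow(stages):
    """Run each stage whose outputs are missing, once its inputs exist."""
    for inputs, outputs, command in stages:
        if all(os.path.exists(p) for p in outputs):
            continue                                   # already done; simple restart support
        missing = [p for p in inputs if not os.path.exists(p)]
        if missing:
            raise RuntimeError(f"missing inputs: {missing}")
        subprocess.run(command, check=True)

run_workflow(stages)

Scaling this pattern to thousands of stages and files, with submission to remote resources instead of local commands, is exactly the bookkeeping the workflow tools take off the scientists' hands.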

Searching for New Crystal Structures, Deem (Rice)
Searching for new 3-D zeolite crystal structures in crystallographic space requires tens of thousands of serial jobs on TeraGrid. MyCluster/GridShell is used to aggregate all of the computational capacity on the TeraGrid to accelerate the search. TeraGrid staff: Walker (TACC) and Cheeseman (Purdue).
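
A search like this is embarrassingly parallel: each candidate structure is an independent serial job, so the engineering problem is farming thousands of tasks out to whatever capacity is available. The sketch below shows that pattern with Python's standard library on a single machine; it is a hypothetical illustration of the job-farming idea only, not the MyCluster/GridShell mechanism, which aggregated batch queues across TeraGrid sites. The evaluate_candidate function and its score are invented.

# Hypothetical illustration of farming out many independent serial tasks.
# A real campaign would submit these to batch schedulers across sites;
# here a local process pool stands in for that aggregated capacity.
from concurrent.futures import ProcessPoolExecutor
import random

def evaluate_candidate(seed):
    """Stand-in for one serial job: score a randomly generated structure."""
    rng = random.Random(seed)
    return seed, rng.random()              # (candidate id, fictitious energy score)

if __name__ == "__main__":
    candidates = range(10_000)             # tens of thousands of independent jobs
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(evaluate_candidate, candidates, chunksize=100))
    best = min(results, key=lambda r: r[1])
    print(f"best candidate: {best[0]} with score {best[1]:.4f}")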

TeraGrid WIDE Objectives
WIDE Impact: Empowering Communities
– Bring TeraGrid capabilities to the broad science community; partner with science community leaders ("Science Gateways")
– Science Gateways Program: originally ten partners, now 21 and growing
  – Reaching over 100 Gateway partner institutions (PIs)
  – Anticipating an order-of-magnitude increase in users via Gateways
– Education, Outreach, and Training: initiated joint programs integrating TeraGrid partner offerings

Science Gateway Partners
– Open Science Grid (OSG)
– Special PRiority and Urgent Computing Environment (SPRUCE, UChicago)
– National Virtual Observatory (NVO, Caltech)
– Linked Environments for Atmospheric Discovery (LEAD, Indiana)
– Computational Chemistry Grid (GridChem, NCSA)
– Computational Science and Engineering Online (CSE-Online, Utah)
– GEON (GEOsciences Network) (GEON, SDSC)
– Network for Earthquake Engineering Simulation (NEES, SDSC)
– SCEC Earthworks Project (USC)
– Astrophysical Data Repository (Cornell)
– CCR ACDC Portal (Buffalo)
– Network for Computational Nanotechnology and nanoHUB (Purdue)
– GIScience Gateway (GISolve, Iowa)
– Biology and Biomedicine Science Gateway (UNC RENCI)
– Open Life Sciences Gateway (OLSG, UChicago)
– The Telescience Project (UCSD)
– Grid Analysis Environment (GAE, Caltech)
– Neutron Science Instrument Gateway (ORNL)
– TeraGrid Visualization Gateway (ANL)
– BIRN (UCSD)
– Gridblast Bioinformatics Gateway (NCSA)
– Earth Systems Grid (NCAR)
– SID Grid (UChicago)

TeraGrid Science Gateway Partner Sites
[Map: 21 Science Gateway partners (and growing), over 100 partner institutions.]

TeraGrid Science Gateways Initiative: Community Interface to Grids
– Common web portal or application interfaces (database access, computation, workflow, etc.)
– "Back-end" use of TeraGrid computation, information management, visualization, or other services
– Standard approaches so that science gateways may readily access resources in any cooperating Grid without technical modification
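
In this model a gateway is a thin, community-facing front end that accepts a domain-specific request and hands it to back-end grid resources. The sketch below is a hypothetical, minimal Python illustration of that split: a web handler validates a request and builds a generic job description, and a stubbed back-end submitter stands in for forwarding it to a TeraGrid resource. The endpoint, field names, and submit_to_grid stub are invented and do not describe any actual gateway's interface.

# Hypothetical sketch of the gateway pattern: a community web interface in
# front, grid job submission behind. All names and fields are illustrative.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def submit_to_grid(job):
    """Stub back end: a real gateway would submit this job description to a
    TeraGrid resource under a community or user credential."""
    print("would submit:", job)
    return "job-0001"                              # fictitious job identifier

class GatewayHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # Translate the domain-specific request into a generic job description.
        job = {"application": "structure_search",          # invented application name
               "parameters": request.get("parameters", {})}
        body = json.dumps({"job_id": submit_to_grid(job)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), GatewayHandler).serve_forever()

The point of the standard-approaches bullet is that the back-end half of this split can target any cooperating Grid without the community-facing half changing.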

TeraGrid EOT
Our mission is to engage larger and more diverse communities of researchers, educators, and students in discovering, using, and contributing to applications of cyberinfrastructure to advance scientific discovery.

TeraGrid '06 Student Competitions
CI Impact: perspectives of CI impact on our world
– Bryan Bemley, Bowie State University, Maryland
– August Knecht, University of Illinois, Illinois
– Joel Poloney, University of Illinois, Illinois
– Daniela Rosner, University of California, Berkeley
Research Posters: grid computing applications in research
– Ivan Beschastnikh, University of Chicago, Illinois*
– Diego Donzis, Georgia Tech, Georgia
– Alexander Gondarenko, Cornell University, New York
– Raymond Hansen, Purdue University, Indiana
– Wenli He, University of Iowa, Iowa
– Gregory Koenig, University of Illinois, Illinois
– Alex Lemann, Earlham College, Indiana
– Zhengqiang (Sean) Liang, Wayne State University, Michigan
– Diglio Simoni, Syracuse University, New York
– Rishi Verma, Indiana University, Indiana
*Grand Prize Winner

TeraGrid OPEN Objectives
OPEN Infrastructure, OPEN Partnership
– Provide a coordinated, general-purpose, reliable set of services and resources; partner with campuses and facilities
– Streamlined software integration: evolved architecture to leverage standards and web services
– Campus partnership programs: user access, physical and digital asset federation, outreach

TeraGrid "Open" Initiatives
Working with campuses: toward integrated cyberinfrastructure
– Access for users: authentication and authorization
– Additional capacity: integrated resources
– Additional services: integrated data collections
– Broadening participation: outreach beyond R1 institutions
Technology foundations
– Security and Auth*/accounting
– Service-based software architecture

Lower Integration Barriers; Improved Scaling
Initial integration: implementation-based
– Coordinated TeraGrid Software and Services (CTSS): provide software for heterogeneous systems, leveraging specific implementations to achieve interoperation
– Evolving understanding of the "minimum" required software set for users
Emerging architecture: services-based
– Core services: capabilities that define a "TeraGrid Resource"
  – Authentication & authorization capability
  – Information service
  – Auditing/accounting/usage reporting capability
  – Verification & validation mechanism
– Significantly smaller than the current set of required components
– Provides a foundation for value-added services: each Resource Provider selects one or more added services, or "kits"; the core and individual kits can evolve incrementally, in parallel
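
One way to read this architecture is as a small, checkable contract: a system counts as a "TeraGrid Resource" if it runs the four core services, and anything beyond that is an optional kit. The sketch below is a hypothetical Python illustration of such a capability registry and validation check; the core service names follow the slide, while the data structures, site names, and kit names are invented.

# Hypothetical sketch of a capability registry for services-based integration.
# Core service names follow the slide; sites and kits are illustrative only.
CORE_SERVICES = {
    "authentication-authorization",
    "information-service",
    "auditing-accounting-usage-reporting",
    "verification-validation",
}

resources = {
    # Each resource provider advertises the core services plus optional kits.
    "site-a": {"services": CORE_SERVICES | {"job-execution", "data-movement"}},
    "site-b": {"services": CORE_SERVICES | {"science-gateway-hosting"}},
    "site-c": {"services": {"information-service", "job-execution"}},  # incomplete core
}

def is_teragrid_resource(description):
    """A resource qualifies only if all core services are present."""
    return CORE_SERVICES <= description["services"]

def kits(description):
    """Value-added kits are whatever a provider offers beyond the core."""
    return sorted(description["services"] - CORE_SERVICES)

for name, desc in resources.items():
    print(name, "core complete:", is_teragrid_resource(desc), "kits:", kits(desc))

Because the core set is small, each provider can add or evolve kits independently without breaking the definition of what a TeraGrid Resource is.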

Example Value-Added Service Kits
– Job execution
– Application development
– Science gateway hosting
– Application hosting (dynamic service deployment)
– Data movement
– Data management
– Science workflow support
– Visualization

Lower User Barriers; Increase Security
[Diagram: authentication flow involving a UID/password, a database, certificate authorities (CA), a resource, and job execution.]
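
The elements in the diagram suggest the single sign-on pattern common in grids of this period: a user authenticates once with a UID/password, a CA or credential service issues a short-lived certificate-based credential, and that credential is then used to execute jobs on the resource. The sketch below is a hypothetical Python illustration of that general flow only; the function names, data, and checks are invented and do not describe TeraGrid's actual implementation.

# Hypothetical illustration of single sign-on followed by job execution.
# All names and checks are invented; this only sketches the general flow.
import hashlib
import time

USER_DB = {"alice": hashlib.sha256(b"secret").hexdigest()}   # toy user database

def authenticate(uid, password):
    """Check a UID/password against the database (stand-in for a portal login)."""
    return USER_DB.get(uid) == hashlib.sha256(password.encode()).hexdigest()

def issue_credential(uid, lifetime_s=3600):
    """Stand-in for a CA/credential service issuing a short-lived credential."""
    return {"subject": uid, "expires": time.time() + lifetime_s}

def execute_job(credential, command):
    """A resource accepts the job only while the credential is valid."""
    if credential["expires"] < time.time():
        raise PermissionError("credential expired")
    print(f"running '{command}' for {credential['subject']}")

if authenticate("alice", "secret"):
    cred = issue_credential("alice")
    execute_job(cred, "/bin/hostname")

The benefit for users is one login instead of per-site passwords; the benefit for security is that resources see only short-lived credentials rather than long-lived secrets.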

Cyberinfrastructure User Advisory Committee
– PK Yeung, Georgia Institute of Technology
– Gerhard Klimeck, Purdue University
– Thomas Cheatham, University of Utah
– Gwen Jacobs, Montana State University
– Luis Lehner, Louisiana State University
– Philip Maechling, University of Southern California
– Roy Pea, Stanford University
– Alex Ramirez, Hispanic Association of Colleges and Universities
– Nora Sabelli, Center for Innovative Learning Technologies
– Patricia Teller, University of Texas at El Paso
– Cathy Wu, Georgetown University
– Bennett Bertenthal, University of Chicago