Where do we go from here? “Knowledge Environments to Support Distributed Science and Engineering” Symposium on Knowledge Environments for Science and Engineering.

Presentation transcript:

Where do we go from here? “Knowledge Environments to Support Distributed Science and Engineering,” Symposium on Knowledge Environments for Science and Engineering, November 26, 2002. Mary Anne Scott, Department of Energy, Office of Science.

Distributed Resources; Distributed Expertise. Categories: Major User Facilities, User Institutions, Multiprogram Laboratories, Program-Dedicated Laboratories, Specific-Mission Laboratories. Laboratories: Pacific Northwest National Laboratory, Ames Laboratory, Argonne National Laboratory, Brookhaven National Laboratory, Oak Ridge National Laboratory, Los Alamos National Laboratory, Lawrence Livermore National Laboratory, Lawrence Berkeley National Laboratory, Sandia National Laboratories, Fermi National Accelerator Laboratory, Princeton Plasma Physics Laboratory, Thomas Jefferson National Accelerator Facility, Idaho National Environmental and Engineering Laboratory, National Renewable Energy Laboratory, Stanford Linear Accelerator Center.

DOE Office of Science Context. Research: Pre-1995, foundational technology (Nexus, MPI, MBone, …); Distributed Collaborative Experiment Environment projects (testbeds and supporting technology); DOE 2000 Program (pilot collaboratories and technology projects); 2000-present, National Collaboratories Program; 2001-present, Scientific Discovery Through Advanced Computing (SciDAC). Planning: To inform the development and deployment of technology, a set of high-impact science applications in the areas of high energy physics, climate, chemical sciences, magnetic fusion energy, and molecular biology was analyzed* to characterize their visions for the future process of science and the networking and middleware capabilities needed to support those visions. (*DOE Office of Science, High Performance Network Planning Workshop, August 13-15, 2002, Reston, Virginia, USA.)

MAGIC (Middleware And Grid Infrastructure Coordination): a team for addressing the coordination problem? A team under the Large Scale Networking interagency coordination group. Meets monthly (1st Wednesday of each month). Federal participants: ANL, DOE, LANL, LBL, NASA, NCO, NIH, NIST, NOAA, NSF, PNL, UCAR. Other participants: Boeing, Cisco, Educause, HP, IBM, Internet2, ISI, Level3, Microsoft, U-Chicago, UIUC, U-Wisconsin. Workshop held in Chicago, Aug: editors, contributors, and participants from the Federal Government, agencies and labs, industry, universities, and international organizations; ~100 participants; “Blueprint for Future Science Middleware and Grid Research and Infrastructure.”

Driving Factors for Middleware and Grids. Science push: New classes of scientific problems are enabled by technology development. High energy physicists will harness tens of thousands of CPUs in a worldwide data grid. On-line digital sky surveys require mechanisms for data federation and effective navigation. Advances in medical imaging and technologies enable collaboration across disciplines and scales. Coupling of expertise, collaboration, and disciplines encourages the development of new science and research. Technology pull: Continuing exponential advances in sensor, computer, storage, and network capabilities will occur. Sensor networks will create experimental facilities. PetaByte and ExaByte databases will become feasible. Increases in numerical and computer modeling capabilities broaden the base of science disciplines. Increases in network speed make it feasible to connect distributed resources as never before.

Future Science (~5 yr)

Climate
Characteristics: Many simulation elements/components added as understanding increases; 100 TBy/100 yr of generated simulation data; 1-5 PBy/yr (per institution) distributed to major users in large chunks for post-simulation analysis.
Vision for the future process of science: Enable the analysis of model data by all of the collaborating community.
Networking requirements: Authenticated data streams for easier site access through firewalls; robust access to large quantities of data.
Middleware requirements: Server-side data processing (compute/cache embedded in the net); reliable data/file transfer (accounting for system/network failures).

High Energy Physics
Characteristics: Instrument-based data sources; hierarchical data repositories; hundreds of analysis sites; 100s of petabytes of data; global collaboration.
Vision for the future process of science: Compute and storage requirements satisfied by optimal use of all available global resources; productivity aspects of rapid response; worldwide collaboration will cooperatively analyze data and contribute to a common knowledge base; discovery of published (structured) data and its provenance.
Networking requirements: 100 Gbit/s; lambda-based point-to-point links for single high-bandwidth flows; capacity planning; network monitoring.
Middleware requirements: Track worldwide resource usage patterns to maximize utilization; direct network access to data management systems; monitoring to enable optimized use of network, caching/compute, and storage resources; publish/subscribe and global discovery.

Chemical Sciences
Characteristics: 3D simulation sets ( TB); coupling of MPP quantum chemistry and molecular dynamics simulations; validation using large experimental data sets.
Vision for the future process of science: Remote steering of simulation time steps; remote data sub-setting, mining, and visualization; shared data/metadata with annotation, evolving to a knowledge base.
Networking requirements: ~100 Gbit for distributed computational chemistry and molecular dynamics simulations.
Middleware requirements: Management of metadata; global event services; cross-discipline repositories; international interoperability for collaboration infrastructure, repositories, search, and notification; archival publication.
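To make one of these middleware requirements concrete, the sketch below illustrates the “reliable data/file transfer (accounting for system/network failures)” item from the Climate row: retry a transfer with backoff and verify an end-to-end checksum. This is a minimal Python illustration only; the dataset URL, chunk size, retry policy, and SHA-256 checksum are assumptions for this example, not part of any specific DOE or Grid middleware named in the talk.

```python
# Minimal sketch: reliable file transfer with retry and end-to-end integrity
# checking. URL, retry policy, and checksum below are illustrative assumptions.
import hashlib
import time
import urllib.request

def reliable_fetch(url, dest_path, expected_sha256, retries=5, delay_s=2.0):
    """Download url to dest_path, retrying on failure and verifying a checksum."""
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(url, timeout=30) as resp, \
                 open(dest_path, "wb") as out:
                digest = hashlib.sha256()
                while True:
                    chunk = resp.read(1 << 20)   # read in 1 MiB chunks
                    if not chunk:
                        break
                    digest.update(chunk)
                    out.write(chunk)
            if digest.hexdigest() == expected_sha256:
                return True                      # transfer complete and verified
            raise OSError("checksum mismatch; data corrupted in transit")
        except OSError:
            if attempt == retries:
                raise                            # give up after the last attempt
            time.sleep(delay_s * attempt)        # back off before retrying

# Example call (hypothetical dataset URL and checksum):
# reliable_fetch("https://example.org/climate/run42.nc", "run42.nc",
#                expected_sha256="...64 hex characters...")
```

Production Grid middleware layers this kind of retry and integrity checking beneath higher-level services such as replica catalogs and the publish/subscribe discovery mechanisms listed in the table above.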