Realizing the Promise of Grid Computing
Ian Foster
Mathematics and Computer Science Division, Argonne National Laboratory
and Department of Computer Science, The University of Chicago
Presentation to the NSF Advisory Committee on CyberInfrastructure, November 30, 2001

The Grid Opportunity
- What Grids are about: “Resource sharing & coordinated problem solving in dynamic, multi-institutional virtual organizations” = entirely new tools, with often revolutionary impacts (sketched below)
- The opportunity: advance transition to routine use by multiple years
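To make the “virtual organization” idea a bit more concrete, here is a rough, purely illustrative sketch, not from the talk; all names, resource kinds, and policies below are hypothetical. It treats a VO as a dynamic grouping of institutions that contribute resources under their own sharing policies:

    from dataclasses import dataclass, field

    # Hypothetical sketch only: a virtual organization (VO) as a dynamic,
    # multi-institutional pool of resources, each shared under its owner's policy.
    @dataclass
    class Resource:
        name: str
        kind: str      # e.g. "compute", "storage", "instrument"
        policy: str    # owner-defined sharing rule, e.g. "idle cycles only"

    @dataclass
    class VirtualOrganization:
        name: str
        members: dict = field(default_factory=dict)   # institution -> list of Resources

        def join(self, institution, resources):
            # Membership is dynamic: institutions join (and leave) over time.
            self.members.setdefault(institution, []).extend(resources)

    vo = VirtualOrganization("shake-table-experiments")
    vo.join("Site A", [Resource("cluster-1", "compute", "idle cycles only")])
    vo.join("Site B", [Resource("archive-1", "storage", "project members only")])

The point of the sketch is only that membership is dynamic and sharing is policy-mediated; no institution gives up local control of its resources.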

Why Grids?
- A biochemist exploits 10,000 computers to screen 100,000 compounds in an hour
- 1,000 physicists worldwide pool resources for petaop analyses of petabytes of data
- Civil engineers collaborate to design, execute, & analyze shake table experiments
- Climate scientists visualize, annotate, & analyze terabyte simulation datasets
- An emergency response team couples real-time data, weather model, population data

Why Grids? (contd)
- A multidisciplinary analysis in aerospace couples code and data in four companies
- A home user invokes architectural design functions at an application service provider
- An application service provider purchases cycles from compute cycle providers
- Scientists at a multinational company collaborate on the design of a new product
- A community group pools members’ PCs to perform an environmental impact study

Grids: Why Now?
- Moore’s law improvements in computing produce highly functional end systems
- The Internet and burgeoning wired and wireless networks provide universal connectivity
- Changing modes of working and problem solving emphasize teamwork, computation
- Network exponentials produce dramatic changes in geometry and geography (see the worked example below)
  - 9-month doubling: double Moore’s law!
  - x340,000; x4000?
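As a rough check on the “double Moore’s law” remark, the snippet below assumes roughly an 18-month doubling time for compute and a 9-month doubling time for network capacity; the ten-year window is illustrative only and is not the exact period behind the x340,000 and x4000 figures on the slide:

    # Illustrative arithmetic only: growth factor after a given number of years
    # for a quantity that doubles every `doubling_months` months.
    def growth_factor(years, doubling_months):
        return 2 ** (years * 12 / doubling_months)

    # Over one decade: compute (18-month doubling) vs. network capacity (9-month doubling).
    print(f"compute: x{growth_factor(10, 18):,.0f}")   # ~x100
    print(f"network: x{growth_factor(10, 9):,.0f}")    # ~x10,000

When network capacity grows that much faster than computing, the effective “distance” between resources keeps shrinking, which is the change in geometry and geography the slide refers to.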

The Grid World: Current Status
- An exciting time, in many ways
  - Dozens of major Grid projects worldwide
    - Deployment, technology, application
  - Consensus on key concepts & technologies
    - E.g., Globus Toolkit as de facto standard
  - Growing industrial interest
- But also:
  - Funded by an inadequate patchwork of diverse, mostly short-term sources
  - No long-term coordinated plan aimed at injecting Grid technologies into the community
  - International programs outpacing U.S. efforts!

PACIs and Grids
- PACIs play a critical role in Grid development
  - Act very effectively as nucleation point, bully pulpit, technology explorer
  - Major resource providers for the community
- But Grid technologies & applications are essentially unfunded mandates for PACIs
  - “Grids” a tiny fraction of total PACI budget
  - Situation only worse for TeraGrid!
- Current situation untenable long term
  - New scientific tools are not created for free

What is Needed: A National Grid Program
- Goal: accelerate “Grid-enablement” of entire science & engineering communities
  - Don’t wait the 20 years it took the Internet!
- Program components
  1) Persistent research, development, outreach, and support organization
  2) Application-oriented “Grid challenge” projects
  3) Infrastructure: campus, national, international
  4) Basic research, engaging the CS community
  5) Explicit international component
- Explicit and strong interagency coordination

A Persistent Grid Technology Organization
- We’re talking about a complete retooling of entire science and engineering disciplines
  - Not a part-time, or three-year, or graduate student business
  - Also not something we can buy (yet)
- We need a persistent national organization that can support this process
  - Technology R&D, packaging, delivery
  - Training, outreach, support
  - GRIDS Center an (unproven) existing model

Application “Grid Challenge” Projects
- Goal: engage a significant number of communities in the transition to Grids
  - GriPhyN, NVO, NEESgrid existing models
- Emphasize innovation in application of technology and impact on the community
  - May be data-, instrumentation-, compute-, and/or collaboration-intensive
  - Aim is to achieve improvement in the quality and/or quantity of science or engineering
  - And to entrain the community in new approaches

Upgrade National Infrastructure
- Seed the nation with innovative Grid resources
  - iVDGL one existing model
- Encourage formation of campus Grids
  - Re-think campus infrastructure program?
  - U.Tenn SinRG one existing model
- Enhance national & international networks, link with TeraGrid
  - Advanced optical nets, StarLight, etc.
- Operations and monitoring

Basic Research
- Engage researchers in imagining & creating new tools & problem-solving methods
  - In a world of massive connectivity, data, sensors, computing, collaboration, …
- And in understanding and creating the new supporting services & infrastructure needed
- This is not “CS as usual”

Explicit International Component
- International connections are important, to
  - Support international science & engineering
  - Connect with international Grid R&D
  - Achieve consensus and interoperability
- But international cooperation, especially in technology R&D, is hard
- A National Grid Program needs to provide explicit support for international work
  - Infrastructure: networks
  - Support for projects

Resource Requirements
- Persistent technology/support org: $30M
  - ~200 people
- Application “Grid challenges”: $20M
  - 10 teams, with application & CS involvement
- Infrastructure upgrades: $20M
  - Tb networks, campus infrastructures
- Research: $15M
  - Grids of tomorrow, & 100s of grad students
- International projects: $5M
  - Support international work in other projects