
National e-Science Core Programme & Grid Highlights
BiGUM1, eSI, 30th October 2001

Contents
- Welcome
- NeSC and e-Science Support
- Grid Definitions
- Grid Examples
- Grid Architectures

e-Science Programme (from Tony Hey, 27 July 01)
[Organigram: DG Research Councils and the E-Science Steering Committee oversee the Director.]
- Director's management role: Generic Challenges: EPSRC (£15m), DTI (£15m); Industrial Collaboration (£40m); £80m collaborative projects.
- Director's awareness and co-ordination role: Academic Application Support Programme: Research Councils (£74m), DTI (£5m), broken down as PPARC (£26m), BBSRC (£8m), MRC (£8m), NERC (£7m), ESRC (£3m), EPSRC (£17m), CLRC (£5m).
- Grid TAG.

UK Grid Network (from Tony Hey, 27 July 01)
Centres: Cambridge, Newcastle, Edinburgh, Oxford, Glasgow, Manchester, Cardiff, Southampton, London, Belfast, DL, RAL, Hinxton.

Key Elements of the UK Grid Development Plan (from Tony Hey, 27 July 01)
1. Network of Grid Core Programme e-Science Centres
2. Development of generic Grid middleware
3. Grid IRC Grand Challenge Project
4. Support for e-Science testbeds
5. International involvement via GGF
6. Grid Network Team

NeSC’s Context
[Diagram: NeSC and the eSI sit among the GSC, Application Pilots, IRCs, and the other e-Science Centres, serving e-Scientists, Grid users, Grid services & Grid developers; coordination links run to the UK Core Programme Team, the Global Grid Forum, CS Research, and the TAG, DBTF, ATF and GNT task forces.]

NeSC: The Team
- Director: Malcolm Atkinson (Universities of Glasgow & Edinburgh)
- Deputy Director: Arthur Trew (Director, EPCC)
- Commercial Director: Mark Parsons (EPCC)
- Regional Director: Stuart Anderson (Edinburgh Informatics)
- Chairman: Richard Kenway (Edinburgh Physics & Astronomy)
- Initial Board Members: Muffy Calder (Glasgow Computing Science), Tony Doyle (Glasgow Physics & Astronomy)
- Centre Manager: Anna Kenway

NeSC’s Roles
- Stimulation of Grid & e-Science activity: users, developers, researchers; education, training, support; think tank & research.
- Coordination of Grid & e-Science activity: regional centres, task forces, pilots & IRCs; technical and managerial fora; support for training, travel, participation.
- Developing a high-profile institute: meetings, visiting researchers, regional support, portfolio of industrial research projects.

eSI Highlights
- Report given yesterday.
- History: 3 workshops in week 1 (DF1, GUM1 & DBAG1); HEC; preGGF3 & DF2.
- October: Steve Tuecke Globus tutorial (oversubscribed); 4-day workshop "Getting Going with Globus" (G3), with reports on DataGrid & GridPP experience; Biologists' Grid Users' Meeting 1 (BiGUM1).
- November: GridPP; configuration management.
- December: AstroGrid.

eSI Highlights (cont.): 2002 & 2003
- January: Steve Tuecke 4-day Globus Developers' Workshop.
- February: UKOLN.
- March: Protein Folding Workshop, 14th to 17th, sponsored by IBM.
- May: Mind and Brain Workshop, 22nd to 26th.
- July: GGF5 & HPDC-11, EICC.
- August: Research Festival, 4 juxtaposed 1-week in-depth workshops; topics under consideration: Dependability and Security for the Grid; Metadata and the Grid; Provenance, Annotation and Archiving; The Knowledge Grid; Programming Models for the Grid.
- 14th to 16th April 2003: Dependability.

Motivation for IPG
Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for "Grids" is to facilitate the routine interactions of these resources in order to support large-scale science and engineering.
(From Bill Johnston, 27 July 01)

Why Grids?
- A biochemist exploits 10,000 computers to screen 100,000 compounds in an hour.
- 1,000 physicists worldwide pool resources for petaop analyses of petabytes of data.
- Civil engineers collaborate to design, execute, & analyze shake-table experiments.
- Climate scientists visualize, annotate, & analyze terabyte simulation datasets.
- An emergency response team couples real-time data, weather models, and population data.
(From Steve Tuecke, 12 Oct. 01)

Why Grids? (contd.)
- A multidisciplinary analysis in aerospace couples code and data in four companies.
- A home user invokes architectural design functions at an application service provider.
- An application service provider purchases cycles from compute-cycle providers.
- Scientists working for a multinational soap company design a new product.
- A community group pools members' PCs to analyze alternative designs for a local road.
(From Steve Tuecke, 12 Oct. 01)

The Grid Problem
"Flexible, secure, coordinated resource sharing among dynamic collections of individuals, institutions, and resources" (from "The Anatomy of the Grid: Enabling Scalable Virtual Organizations").
Enable communities ("virtual organizations") to share geographically distributed resources as they pursue common goals, assuming the absence of central location, central control, omniscience, and existing trust relationships.
(From Steve Tuecke, 12 Oct. 01)

Elements of the Problem
- Resource sharing: computers, storage, sensors, networks, …; sharing is always conditional: issues of trust, policy, negotiation, payment, …
- Coordinated problem solving: beyond client-server: distributed data analysis, computation, collaboration, …
- Dynamic, multi-institutional virtual organisations: community overlays on classic organisational structures; large or small, static or dynamic.
(From Steve Tuecke, 12 Oct. 01)

Why Now?
- Moore's-law improvements in computing produce highly functional end systems.
- The Internet and burgeoning wired and wireless networks provide universal connectivity.
- Changing modes of working and problem solving emphasize teamwork and computation.
- Network exponentials produce dramatic changes in geometry and geography.
(From Steve Tuecke, 12 Oct. 01)

Network Exponentials
- Network vs. computer performance: computer speed doubles every 18 months; network speed doubles every 9 months; difference = an order of magnitude per 5 years (checked in the sketch below).
- 1986 to 2000: computers x 500; networks x 340,000.
- 2001 to 2010: computers x 60; networks x 4,000.
- Graph from Scientific American (Jan. 2001) by Cleo Vilett, comparing Moore's Law vs. storage improvements vs. optical improvements; source Vinod Khosla, Kleiner Perkins Caufield & Byers.
(From Steve Tuecke, 12 Oct. 01)
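The "order of magnitude per 5 years" figure follows directly from the two doubling times. A minimal arithmetic check, assuming clean exponential doubling (an idealization of the slide's figures, not measured data):

```python
# Idealized check of the slide's doubling-time arithmetic.

def growth_factor(months: float, doubling_months: float) -> float:
    """Speed multiplier after `months`, given one doubling every `doubling_months`."""
    return 2 ** (months / doubling_months)

FIVE_YEARS = 60  # months
computers = growth_factor(FIVE_YEARS, 18)  # speed doubles every 18 months
networks = growth_factor(FIVE_YEARS, 9)    # speed doubles every 9 months
print(f"computers x{computers:.1f}, networks x{networks:.1f}, "
      f"gap x{networks / computers:.1f}")
# -> computers x10.1, networks x101.6, gap x10.1 (one order of magnitude)
```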

Broader Context
- "Grid computing" has much in common with major industrial thrusts: business-to-business, peer-to-peer, application service providers, storage service providers, distributed computing, Internet computing, …
- Sharing issues are not adequately addressed by existing technologies.
- Complicated requirements: "run program X at site Y subject to community policy P, providing access to data at Z according to policy Q" (see the sketch below).
- High performance: unique demands of advanced & high-performance systems.
(From Steve Tuecke, 12 Oct. 01)
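One way to see why such requirements defeat simple per-machine access control is to write the slide's example out as a structured request. A minimal sketch; every name below is invented for illustration and no real Grid middleware API is assumed:

```python
# Illustrative only: invented names, not a real Grid middleware API.
from dataclasses import dataclass

@dataclass
class DataAccess:
    location: str   # where the data lives ("Z")
    policy: str     # policy governing that access ("Q")

@dataclass
class SharingRequest:
    program: str           # program to run ("X")
    site: str              # execution site ("Y")
    community_policy: str  # policy the run must satisfy ("P")
    data: DataAccess

# A single request couples an execution site, a data site, and two
# independent policies, which is what per-machine access-control
# lists cannot express.
request = SharingRequest(
    program="X",
    site="Y",
    community_policy="P",
    data=DataAccess(location="Z", policy="Q"),
)
print(request)
```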

The Globus Project™: Making Grid Computing a Reality
- Close collaboration with real Grid projects in science and industry.
- Development and promotion of standard Grid protocols to enable interoperability and shared infrastructure.
- Development and promotion of standard Grid software APIs and SDKs to enable portability and code sharing.
- The Globus Toolkit™: open-source, reference software base for building Grid infrastructure and applications.
- Global Grid Forum: development of standard protocols and APIs for Grid computing.
(From Steve Tuecke, 12 Oct. 01)

Online Access to Scientific Instruments
DOE X-ray grand challenge: ANL, USC/ISI, NIST, U. Chicago.
[Diagram: the Advanced Photon Source feeds real-time collection, tomographic reconstruction, archival storage, wide-area dissemination, and desktop & VR clients with shared controls.]
(From Steve Tuecke, 12 Oct. 01)

Supernova Cosmology Requires Complex, Widely Distributed Workflow Management

Mathematicians Solve NUG30
- Looking for the solution to the NUG30 quadratic assignment problem (the problem's form is sketched below).
- An informal collaboration of mathematicians and computer scientists.
- Condor-G delivered 3.46E8 CPU-seconds in 7 days (peak 1,009 processors) across 8 sites in the U.S. and Italy.
- Solution: 14, 5, 28, 24, 1, 3, 16, 15, 10, 9, 21, 2, 4, 29, 25, 22, 13, 26, 17, 30, 6, 20, 19, 8, 18, 7, 27, 12, 11, 23.
- MetaNEOS: Argonne, Iowa, Northwestern, Wisconsin.
(From Miron Livny, 7 Aug. 01)
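For context, the quadratic assignment problem asks for the facility-to-location permutation minimizing total flow-times-distance cost. A minimal brute-force sketch in the standard Koopmans-Beckmann form, with toy data that is NOT the NUG30 instance (NUG30 has 30 facilities, i.e. 30! candidate permutations, which is exactly why it needed a week of Condor-G time):

```python
# Brute-force QAP solver: feasible only for tiny toy instances.
from itertools import permutations

def qap_cost(perm, flow, dist):
    """Total cost when facility i is placed at location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def solve_qap(flow, dist):
    """Exhaustively search all n! assignments for the cheapest one."""
    n = len(flow)
    return min(permutations(range(n)), key=lambda p: qap_cost(p, flow, dist))

# Illustrative 3-facility instance (invented data).
flow = [[0, 3, 1],
        [3, 0, 2],
        [1, 2, 0]]
dist = [[0, 1, 2],
        [1, 0, 1],
        [2, 1, 0]]
print(solve_qap(flow, dist))  # best facility-to-location permutation
```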

Network for Earthquake Engineering Simulation
- NEESgrid: national infrastructure to couple earthquake engineers with experimental facilities, databases, computers, & each other.
- On-demand access to experiments, data streams, computing, archives, collaboration.
- NEESgrid partners: Argonne, Michigan, NCSA, UIUC, USC.
(From Steve Tuecke, 12 Oct. 01)

Home Computers Evaluate AIDS Drugs
- Community: 1000s of home computer users.
- Philanthropic computing vendor (Entropia).
- Research group (Scripps).
- Common goal: advance AIDS research.
(From Steve Tuecke, 12 Oct. 01)

Layered Grid Architecture (by analogy to the Internet architecture)
- Application
- Collective ("coordinating multiple resources"): ubiquitous infrastructure services, application-specific distributed services.
- Resource ("sharing single resources"): negotiating access, controlling use.
- Connectivity ("talking to things"): communication (Internet protocols) & security.
- Fabric ("controlling things locally"): access to, & control of, resources.
Internet protocol architecture, for comparison: Application / Transport / Internet / Link.
(From Steve Tuecke, 12 Oct. 01) A sketch of how these layers compose follows.
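A conceptual sketch of the layering: a collective-layer operation coordinates several resource-layer negotiations, each of which talks over the connectivity layer to fabric-level resources. No real Globus API is assumed; every class and method name here is invented for illustration.

```python
class Fabric:
    """Fabric layer: local access to, and control of, one resource."""
    def __init__(self, name: str, free_cpus: int):
        self.name, self.free_cpus = name, free_cpus

class Connectivity:
    """Connectivity layer: authenticated communication with a resource."""
    def query_free_cpus(self, fabric: Fabric) -> int:
        return fabric.free_cpus  # stand-in for a secure remote call

class Resource:
    """Resource layer: negotiates access to a single resource."""
    def __init__(self, conn: Connectivity):
        self.conn = conn
    def reserve(self, fabric: Fabric, cpus: int) -> bool:
        if self.conn.query_free_cpus(fabric) >= cpus:
            fabric.free_cpus -= cpus
            return True
        return False

class Collective:
    """Collective layer: coordinates many resources for one request."""
    def __init__(self, resource: Resource, fabrics: list):
        self.resource, self.fabrics = resource, fabrics
    def co_allocate(self, cpus_each: int) -> list:
        return [f.name for f in self.fabrics
                if self.resource.reserve(f, cpus_each)]

# Application layer: ask for 16 CPUs at every site that can provide them.
sites = [Fabric("site-A", 64), Fabric("site-B", 8)]
print(Collective(Resource(Connectivity()), sites).co_allocate(16))
# -> ['site-A']
```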

Architecture of a Grid
- Discipline-specific portals and scientific workflow management systems.
- Applications: simulations, data analysis, etc.
- Toolkits: visualization, data publication/subscription, etc.
- Grid Common Services (standardized service and resource interfaces): Grid information service, uniform resource access, brokering, global queuing, global event services, co-scheduling, data cataloguing, uniform data access, collaboration and remote instrument services, network cache, communication services, authentication, authorization, security services, auditing, fault management, monitoring. [The original diagram's legend marks a subset of these as Globus services.]
- Distributed resources: Condor pools, network caches, tertiary storage, national user facilities, clusters, national supercomputer facilities.
- High-speed networks and communications services.

Architecture of a Grid: Upper Layers
- Problem Solving Environments: knowledge-based query; tools to implement the human interfaces (e.g. SCIRun, ECCE, WebFlow, …); mechanisms to express, organize, and manage the workflow of problem solutions ("frameworks"); access control.
- Applications and Supporting Tools: application codes; visualization, collaboration, and instrument-management toolkits; data publish-and-subscribe toolkits.
- Application Development and Execution Support: Grid-enabled libraries (security, communication services, data access, global event management, etc.); Globus, MPI, CORBA, Condor-G, Java/Jini, DCOM.
- Grid Common Services.
- Distributed Resources.
(From Steve Tuecke, 12 Oct. 01)

Three-Layer Grid Abstraction (from Tony Hey, 12 Sep. 01)
- Knowledge Grid
- Information Grid
- Computation/Data Grid
Data flows upward from the Computation/Data Grid ("data to knowledge"); control flows back down.

Data, Information and Knowledge
- Data: uninterpreted bits and bytes.
- Information: data equipped with meaning.
- Knowledge: information applied to achieve a goal, solve a problem or enact a decision.
(From Tony Hey, 12 Sep. 01)

Biological Grid Users: Are They Different?
- Do they have different collaborations?
- Do they have different data?
- Do they have different computations?
- Do they have the same shared "instruments"?
- Can they be supported using the same infrastructure, architecture, and policies?