International Grid Communities
Dr. Carl Kesselman
Information Sciences Institute, University of Southern California

The Grid Problem
Resource sharing & coordinated problem solving in dynamic, multi-institutional virtual organizations.

Enabling International Cooperation
- International cooperation is valuable because:
  – the scale of the Grid problem is large
  – expertise exists on both sides of the Atlantic & Pacific
  – there are important international applications
  – the cost of noncooperation can be high
- Useful cooperation will not just happen; it must be explicitly encouraged:
  – substantial testbed & application projects, jointly sponsored by the EU, US, and others
  – Transatlantic Terabit Testbed, etc.
  – International Virtual Data Grid Laboratory

Grid Forum
- An IETF-like body to codify standard practice
- Two meetings held so far; the next is in April
- A European Grid Forum has been established to address Europe-specific issues

Layered Grid Architecture (by analogy to the Internet architecture)
- Application
- Collective: coordinating multiple resources (ubiquitous infrastructure services, app-specific distributed services)
- Connectivity: talking to things (communication via Internet protocols, plus security)
- Resource: sharing single resources (negotiating access, controlling use)
- Fabric: controlling things locally (access to, & control of, resources)
For comparison, the Internet protocol architecture stacks Application over Transport over Internet over Link.
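To make the layering concrete, here is a minimal sketch in Python. Every name in it is an illustrative stand-in (this is not Globus or any real Grid toolkit API); the point is only that each layer talks to the one directly below it.

```python
from dataclasses import dataclass

@dataclass
class FabricResource:            # Fabric: local access to & control of a resource
    name: str
    free_cpus: int

class Connectivity:              # Connectivity: communication & security
    def authenticated_call(self, res: FabricResource, op: str) -> str:
        # stand-in for an authenticated protocol exchange
        return f"{op} on {res.name} (authenticated)"

class ResourceLayer:             # Resource: negotiate access to one resource
    def __init__(self, conn: Connectivity):
        self.conn = conn
    def allocate(self, res: FabricResource, cpus: int) -> bool:
        if res.free_cpus >= cpus:
            self.conn.authenticated_call(res, f"allocate {cpus} cpus")
            res.free_cpus -= cpus
            return True
        return False

class Collective:                # Collective: coordinate multiple resources
    def __init__(self, layer: ResourceLayer, pool: list):
        self.layer, self.pool = layer, pool
    def first_fit(self, cpus: int):
        # application-specific brokering: first resource that can satisfy us
        return next((r for r in self.pool if self.layer.allocate(r, cpus)), None)

pool = [FabricResource("cern-farm", 8), FabricResource("fnal-farm", 64)]
collective = Collective(ResourceLayer(Connectivity()), pool)
print(collective.first_fit(32).name)   # fnal-farm
```

The narrow waist of the stack is what lets many applications and many kinds of local fabric interoperate without knowing about each other.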

The Grid Physics Network (GriPhyN)
- A petabyte-scale computational environment for data-intensive science:
  – the CMS and ATLAS projects of the Large Hadron Collider
  – the Laser Interferometer Gravitational-Wave Observatory
  – the Sloan Digital Sky Survey (200 million objects, each with ~100 attributes)

Data Grids
- Integrate data archives into a distributed data management and analysis Grid
- More than storage & networks; also, for example:
  – caching and mirroring to exploit locality
  – intelligent scheduling to determine the appropriate replica, the site for (re)computation, etc. (see the sketch below)
  – coordinated resource management for performance guarantees
  – embedded security, policy, and agent technologies for effective distributed analysis
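As one concrete reading of the replica-selection bullet, a hedged sketch: the Replica record and the cost model are my assumptions, not any real replica-catalog interface.

```python
from dataclasses import dataclass

@dataclass
class Replica:
    site: str
    size_bytes: int
    bandwidth_bps: float     # estimated link bandwidth from the replica to us
    latency_s: float

def best_replica(replicas):
    # crude cost model: startup latency plus serialized transfer time
    return min(replicas, key=lambda r: r.latency_s + 8 * r.size_bytes / r.bandwidth_bps)

replicas = [
    Replica("CERN",     2_000_000_000, 622e6, 0.09),  # ~622 Mbit/s transatlantic
    Replica("FermiLab", 2_000_000_000, 100e6, 0.03),  # closer, but a slower link
]
print(best_replica(replicas).site)   # CERN: bandwidth dominates latency here
```

A real scheduler would fold in policy and current load as well, as the other bullets on the slide suggest.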

Virtual Data Grids
- Only the raw data must exist:
  – dynamic data production
- Large extent and scale:
  – national or worldwide, multiple distance scales
  – large numbers of resources
- Sophisticated new services:
  – coordinated use of remote resources
- Transparency in data handling and processing:
  – optimize for cost, time, policy constraints, …
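The "only raw data must exist" idea can be sketched in a few lines: a derived dataset is either fetched from an existing replica or re-derived on demand from its recipe. The names here are illustrative, not the GriPhyN interfaces.

```python
replicas = {"raw/run42": b"<detector data>"}          # only the raw data exists
recipes = {                                           # name -> (transform, parent)
    "reco/run42": (lambda raw: b"<reconstructed>" + raw, "raw/run42"),
}

def materialize(name):
    if name in replicas:                  # cheap path: a replica already exists
        return replicas[name]
    transform, parent = recipes[name]     # otherwise re-derive it on demand
    data = transform(materialize(parent))
    replicas[name] = data                 # cache the product for later requests
    return data

print(materialize("reco/run42"))
```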

Grid Communities & Applications: Data Grids for High Energy Physics
(Diagram of the tiered LHC computing model; image courtesy Harvey Newman, Caltech.)
- Tier 0: the CERN Computer Centre. The online system draws on a physics data cache at ~PBytes/sec and feeds an offline processor farm of ~20 TIPS at ~100 MBytes/sec.
- Tier 1: regional centres such as FermiLab (~4 TIPS) and the France, Italy, and Germany regional centres, connected at ~622 Mbits/sec (or by air freight, deprecated).
- Tier 2: centres of ~1 TIPS each (e.g. Caltech), connected at ~622 Mbits/sec.
- Institutes (~0.25 TIPS) serve physicist workstations (Tier 4) at ~1 MBytes/sec.
- There is a bunch crossing every 25 nsec and 100 triggers per second; each triggered event is ~1 MByte in size.
- Physicists work on analysis channels. Each institute will have ~10 physicists working on one or more channels; data for these channels should be cached by the institute server.
- 1 TIPS is approximately 25,000 SpecInt95 equivalents.
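The rates on the diagram are internally consistent, as a quick back-of-envelope check shows; the ~1e7 live seconds per accelerator year is my assumption, not a figure from the slide.

```python
trigger_rate_hz = 100       # 100 triggers per second (from the slide)
event_size_bytes = 1e6      # ~1 MByte per triggered event (from the slide)
seconds_per_year = 1e7      # assumed accelerator live time per year

rate = trigger_rate_hz * event_size_bytes     # 1e8 B/s, i.e. ~100 MBytes/sec
yearly = rate * seconds_per_year              # ~1e15 B, i.e. ~1 PB of raw data
print(f"{rate / 1e6:.0f} MB/s raw, ~{yearly / 1e15:.0f} PB/year")
```

This matches both the ~100 MBytes/sec online-to-offline rate on the diagram and GriPhyN's petabyte-scale framing.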

GriPhyN Architecture
(Diagram.) Users (a production team, individual investigators, other users) work through interactive user tools, which drive virtual data tools, request planning and scheduling tools, and request execution management tools. These in turn rest on resource management services, security and policy services, and other Grid services, and operate over transforms, a raw data source, and distributed resources (code, storage, computers, and networks).
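One way to see the planning/execution split in this architecture, as a hedged sketch (the recipe table and function names are mine, not GriPhyN's): planning linearizes the derivation chain into an ordered list of transforms, which the execution manager can then run wherever the resource services allocate.

```python
def plan(name, recipes):
    """Return the transforms needed to produce `name`, dependencies first."""
    if name not in recipes:
        return []                          # raw data: nothing left to derive
    transform, parent = recipes[name]
    return plan(parent, recipes) + [transform]

recipes = {"reco/run42": ("reconstruct", "raw/run42"),
           "aod/run42":  ("summarize",   "reco/run42")}
print(plan("aod/run42", recipes))          # ['reconstruct', 'summarize']
```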

GriPhyN Usage Scenario
(Diagram.) Requests flow from local sites through network caches & regional centers to the major archive facilities.
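Read as a hierarchy, the scenario is essentially multi-level caching; here is a hedged sketch (the tier names come from the diagram, everything else is assumed):

```python
tiers = [
    {"name": "local site",       "store": {}},
    {"name": "regional cache",   "store": {}},
    {"name": "archive facility", "store": {"dataset-A": b"..."}},
]

def fetch(key):
    for i, tier in enumerate(tiers):          # walk up the hierarchy
        if key in tier["store"]:
            for lower in tiers[:i]:           # populate the caches below the hit
                lower["store"][key] = tier["store"][key]
            return tier["store"][key]
    raise KeyError(key)

fetch("dataset-A")                            # first request reaches the archive
assert "dataset-A" in tiers[0]["store"]       # the next one is served locally
```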

iVDGL
- International Virtual-Data Grid Laboratory:
  – a place to conduct Data Grid tests at scale
  – a concrete manifestation of worldwide Grid activity
  – a continuing activity that will drive Grid awareness
  – a basis for further funding
- Scale of effort:
  – national- and international-scale Data Grid tests and operations
  – computationally and data-intensive computing
  – fast networks
- Who:
  – initially US, UK, EU; then Japan, Australia
  – other world regions later
  – discussions with Russia, China, Pakistan, India, South America

Structure of the iVDGL

iVDGL Architecture
(Diagram.) Application experiments sit at the top. The iGOC provides experiment management, experiment data collection, an experiment scheduler (iGLS), health and status monitoring, iVDGL configuration information, and access control and policy services. It reaches each site through iVDGL monitoring, management, and control interfaces, which map onto a local management interface over the site's compute and storage platforms.
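As a hedged illustration of that interface split (the class and method names are taken from the box labels on the diagram, not from any published iVDGL API): sites expose a local management interface, and the iGOC drives all of them uniformly.

```python
from abc import ABC, abstractmethod

class LocalManagementInterface(ABC):          # per-site view of a platform
    @abstractmethod
    def status(self) -> dict: ...
    @abstractmethod
    def run(self, job: str) -> str: ...

class ComputePlatform(LocalManagementInterface):
    def status(self):
        return {"type": "compute", "load": 0.4}
    def run(self, job):
        return f"job {job} queued"

class iVDGLControl:                           # iGOC side: one face for all sites
    def __init__(self, sites):
        self.sites = sites
    def health(self):                         # health & status monitoring
        return [s.status() for s in self.sites]
    def schedule(self, job):                  # experiment scheduler (iGLS)
        return min(self.sites, key=lambda s: s.status()["load"]).run(job)

igoc = iVDGLControl([ComputePlatform()])
print(igoc.health(), igoc.schedule("cms-sim-001"))
```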

iVDGL Map
(Diagram legend: Tier0/1 facility, Tier2 facility, Tier3 facility; links at 10 Gbps, 2.5 Gbps, 622 Mbps, and other speeds.)

iVDGL as a Laboratory
- Grid exercises:
  – easy, intra-experiment tests first (10-20% of resources; national, transatlantic)
  – harder wide-scale tests later (50-100% of all resources)
- Local control of resources is vitally important:
  – experiments and politics demand it
- Strong interest from other disciplines:
  – HEP + NP experiments
  – the Virtual Observatory (VO) community in Europe/US
  – the gravity-wave community in Europe/US/(Japan?)
  – earthquake engineering
  – bioinformatics
  – computer scientists (wide-scale tests)

Conclusions
- Application communities for major Grid experiments are international:
  – more communities than those mentioned here
- International testbeds are coming
- Wires are only part of the solution
- A common middleware architecture is the enabling technology