Interoperability Grids, Clouds and Collaboratories Ruth Pordes Executive Director Open Science Grid, Fermilab.

Our (e.g. OSG’s) past – the 3 types of grids: (inter-)national, community, and campus.

And now: clouds; national & international communities & ad-hoc VOs; campuses & regions; networks; HPC; digital libraries. The software is inevitably different and changing, with different external semantics.

National Science Foundation Landscape

We must integrate & interface many more types of resource services:
-Cross-campus infrastructures & community grids.
-Multi-organization grids & multi-site uniform clusters.
-Unique leadership-class, peta-scale, and exascale machines & the ubiquitous laptop.
-Huge shared data repositories & large local data silos.
-Shared services & encapsulated virtual environments.
-Commercial & community clouds.
And these systems also overlap.

People work across them all: clouds, national & international communities & ad-hoc VOs, campuses & regions, networks, HPC, and digital libraries.

Bridges (aka gateways, adaptors, routers) connect the diverse resources layered on the network fabric.

With Capability

Why include collaboratories as well as virtual organizations?

A “center without walls, in which the nation’s researchers can perform their research without regard to physical location, interacting with colleagues, accessing instrumentation, sharing data and computational resources, [and] accessing information in digital libraries” (Wulf, 1989).

“a system which combines the interests of the scientific community at large with those of the computer science and engineering community to create integrated, tool-oriented computing and communication systems to support scientific collaboration” (Bly, 1998, p. 31).

An experimental and empirical research environment in which scientists work and communicate with each other to design systems, participate in collaborative science, and conduct experiments to evaluate and improve systems.

“a new networked organizational form that also includes social processes; collaboration techniques; formal and informal communication; and agreement on norms, principles, values, and rules” (Cogburn, 2003, p. 86).

Goals remain - supporting people to do their own scholarship (research, science, education...):
-Have local access to their local system using local methods.
-Access unique computing capabilities remote from their site.
And collaborate with others - both ad hoc and sustained:
-Geographically and/or organizationally distributed.
-Use data from the same or different instrument(s).
-Pool resources.
-Share and reuse their resources, software & services.
-Combine their knowledge.

Collaboratory Workflow

Bridges integrate the pieces and workflows manage the use.

Of course, communities also work together: clouds, national & international communities & ad-hoc VOs, campuses & regions, networks, HPC, and digital libraries.

Our Original Vision

e.g. CDF and D0 combining results. e.g. BioEnergy Centers collaborating to share data.

e.g. Climate applications using both weather data and geophysical history. Material scientists using data across X-ray, neutron, and other probes.

Bridge capabilities:
-Transform, interpret & translate.
-Act as borders.

Bridge capabilities:
-Transform, interpret & translate.
-Act as borders.
-Have multiple end points.
-Control the flow and queue the traffic.

Bridge capabilities:
-Transform, interpret & translate.
-Act as borders.
-Have multiple ends.
-Control the flow and queue the traffic.
-Filter access and identify problem requests.
-Include information kiosks that provide maps, dictionaries, etc.

Bridge capabilities:
-Transform, interpret & translate.
-Act as borders.
-Have multiple ends.
-Control the flow and queue the traffic.
-Filter access and identify problem requests.
-Can have information kiosks that provide maps and dictionaries.
-Allow each side to evolve independently.
-Apply and check policy.

Bridge capabilities:
-Can be opened and closed.

Bridge capabilities:
-Can be opened and closed.
-Offer further discussion of the boundary and interaction between the network and middleware layers?
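The bridge capabilities accumulated in the slides above can be sketched in code. This is a minimal illustration of the pattern only, with entirely hypothetical names (`Bridge`, `Request`): a component at the border between two infrastructures that translates requests, filters problem requests, applies policy, queues traffic, and can be opened or closed.

```python
# Hypothetical sketch of the "bridge" pattern from the slides; not an OSG API.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Request:
    user: str
    payload: dict

class Bridge:
    def __init__(self, translate: Callable[[dict], dict],
                 policy: Callable[[Request], bool]):
        self.translate = translate   # map one side's semantics to the other's
        self.policy = policy         # apply and check policy at the border
        self.open = True             # bridges can be opened and closed
        self.queue = []              # control the flow and queue the traffic

    def submit(self, req: Request):
        if not self.open:
            raise RuntimeError("bridge closed")
        if not self.policy(req):     # filter access / identify problem requests
            return None
        self.queue.append(Request(req.user, self.translate(req.payload)))
        return self.queue[-1]

# Usage: translate a local job description into a remote one,
# while a policy filter blocks one user.
bridge = Bridge(
    translate=lambda p: {"remote_cmd": p["cmd"]},
    policy=lambda r: r.user != "blocked",
)
job = bridge.submit(Request("alice", {"cmd": "run_analysis"}))
```

Because `translate` and `policy` are plugged in rather than hard-coded, each side of the bridge can evolve independently, as the slide notes.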

Some examples: jobs, information, management
-Advanced Resource Connector (ARC) Compute Element gateway:
 -Multiplexes submissions from a variety of remote execution clients into a common queue and ARC submission framework.
 -Centralized data management across all ARC sites.

Information Service Adaptors
-Apply policy filters as well as translate.
-Information can be injected at different stages in the flow.
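The two adaptor behaviors above can be sketched as a small pipeline: records flow through stages that translate attributes or apply a policy filter, and new records can be injected between stages. The record fields here are hypothetical illustrations, not a real GLUE or MDS schema.

```python
# Hypothetical information-service adaptor pipeline; invented field names.

def translate(record):
    # rename a site-local attribute into the common schema
    record = dict(record)
    record["FreeSlots"] = record.pop("free_cpus")
    return record

def policy_filter(records):
    # publish only resources the site has marked shareable
    return [r for r in records if r.get("shareable")]

def inject(records, extra):
    # information can be injected at different stages in the flow
    return records + extra

site_records = [
    {"site": "A", "free_cpus": 40, "shareable": True},
    {"site": "B", "free_cpus": 10, "shareable": False},
]
published = policy_filter([translate(r) for r in site_records])
published = inject(published, [{"site": "C", "FreeSlots": 5, "shareable": True}])
```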

Condor job router - routing jobs between different security infrastructures, executing locally or remotely as needed.
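The routing idea can be sketched as follows. This is a conceptual illustration only: the real HTCondor JobRouter is configured declaratively (route entries in the condor configuration), not coded like this, and the threshold and attribute names here are invented.

```python
# Conceptual sketch of local-vs-remote job routing; not HTCondor's actual API.

def route(job, local_queue_depth, threshold=100):
    """Run locally when the local queue is short; otherwise route remotely,
    rewriting the job's security/submission attributes for the target."""
    if local_queue_depth < threshold:
        return {**job, "target": "local"}
    # route remotely: swap in the remote infrastructure's credential style
    return {**job, "target": "remote", "auth": "x509_proxy"}

local_job = route({"id": 1}, local_queue_depth=10)
remote_job = route({"id": 2}, local_queue_depth=500)
```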

FermiGrid OSG-TeraGrid gateway:
 -Maintains state in “OSG space” while dispatching work and data to TeraGrid.
 -Will map users in VOs to TeraGrid allocation policies.
 -Collects MDS4 into a common pool with LDAP information.
End-point identity (Shibboleth, Bridge-CAs) - X509:
 -Allows application of varying security policy.
Virtual machine/cloud administration services.
Experiment submit-agents.
Portals and hubs.

Human – Software Bridges?

Bridges are part of the interoperability solution.
-Integration: adapt to the most cost-effective computing, storage, software, and services that are out there.
-Agility: follow several cycles of technology buzzwords simultaneously; mix and match.
-A natural place to handle inter-domain differences beyond APIs.

It’s not just that bridges are topical...