
Open Science Grid
By Zoran Obradovic, CSE 510, November 1, 2007

The OSG is a continuation of Grid3, a community grid built in 2003 through a joint project of the U.S. LHC software and computing programs, the National Science Foundation's GriPhyN and iVDGL projects, and the Department of Energy's PPDG project.

The goal of the Open Science Grid (OSG) is to meet the expanding computing and data-management needs of scientific researchers, especially collaborative science requiring high-throughput computing. It is an association of service and resource providers as well as researchers, including universities, national laboratories and computing centers across the U.S.

This association, also known as a Consortium, includes members from particle and nuclear physics, astrophysics, bioinformatics, gravitational-wave science and computer science collaborations

Who are the Consortium Members?

Some of the Consortium members:
- DZero Collaboration
- Dartmouth College
- Deutsches Elektronen-Synchrotron (DESY)
- Fermi National Accelerator Laboratory (FNAL)
- Florida International University
- Georgetown University
- The Globus Alliance
- Grid Physics Network (GriPhyN)
- Grid Resources for Advanced Science and Engineering (GRASE)
- Hampton University
- Harvard University
- Indiana University
- Indiana University-Purdue University, Indianapolis
- International Virtual Data Grid Laboratory (iVDGL)
- Kyungpook National University
- Laser Interferometer Gravitational Wave Observatory (LIGO)
- Lawrence Berkeley National Laboratory (LBL)
- Lehigh University
- Massachusetts Institute of Technology
- National Energy Research Scientific Computing Center (NERSC)
- New York University
- Northwest Indiana Computational Grid
- Notre Dame University
- Oak Ridge National Laboratory
- OSG Grid Operations Center (GOC)
- Particle Physics Data Grid (PPDG) and PPDG Common Project
- Pennsylvania State University
- Purdue University
- Renaissance Computing Institute
- Rice University
- Rochester Institute of Technology
- São Paulo Regional Analysis Center (SPRACE)
- Sloan Digital Sky Survey (SDSS)
And many more…

Consortium Members 2005

Consortium Members 2007

Partners: grid and network organizations as well as international, national, regional and campus grids.
Some of the partners:
- APAC National Grid
- Data Intensive Science University Network (DISUN)
- Enabling Grids for E-SciencE (EGEE)
- Grid Laboratory of Wisconsin (GLOW)
- Grid Operations Center at Indiana University
- Grid Research and Education Group at Iowa (GROW)
- Nordic Data Grid Facility (NorduGrid)
- Northwest Indiana Computational Grid (NWICG)
- Oxford e-Research Centre (OxGrid)

Who Manages OSG?

Several sub-groups within the Consortium manage, advise, oversee and govern the OSG. These groups include the Executive Board, the Executive Team, the OSG Council, the Users Group, the Scientific Oversight Group and the Finance Board.

The OSG is governed by the Council, which includes a representative from each Consortium member. The Users Group, for example, provides a venue for OSG user representatives to share requirements and experiences from developing and running applications on the OSG. It ensures that all parts of the scientific mission and all applications in use on the OSG are represented.

Some of the council members:
Brookhaven National Laboratory - Howard Gordon
Collider Detector at Fermilab (CDF) - TBD
Condor Project - Miron Livny
DZero Collaboration - Brad Abbott
DOSAR - Dick Greenwood
Fermi National Accelerator Laboratory - Vicky White
Globus Alliance - Ian Foster

The Scientific Oversight Group represents the scientific community and directs the Council and Executive Board. The Finance Board manages all matters related to OSG costs and resources.

The administration of the OSG is led by the Executive Director and the Executive Board:
Executive Director - Ruth Pordes, Fermi National Accelerator Laboratory
Council Chair - Bill Kramer, Lawrence Berkeley National Laboratory
Facility Coordinator - Miron Livny (Deputy: Todd Tannenbaum, interim), University of Wisconsin, Madison
They direct the OSG program of work, write policy and represent the OSG Consortium in relations with other organizations and committees.

Who are the Virtual Organizations?

A Virtual Organization (VO) is a collection of people (VO members), and it encompasses the group's computing/storage resources and services. VOs are responsible for corresponding individually with each other to guarantee access to resources. In order to be accepted at another VO's site, a user's grid job must be able to present a verification token along with a token indicating the desired computing privileges.
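To make the token mechanism concrete, here is a minimal sketch (in Python, shelling out to the standard VOMS client tools) of obtaining and inspecting a proxy certificate that carries VO attributes. The VO name "myvo" is a hypothetical placeholder, and the sketch assumes the VOMS clients shipped with the VDT are installed and a valid user certificate is in place.

import subprocess

VO_NAME = "myvo"  # hypothetical placeholder; use your VO's registered name

# Create a short-lived proxy certificate with the VO's attribute assertions attached
subprocess.run(["voms-proxy-init", "-voms", VO_NAME], check=True)

# Print the proxy contents, including the VO attributes a site authorizes against
subprocess.run(["voms-proxy-info", "-all"], check=True)

The proxy and its embedded VO attributes correspond to the two tokens described above: one proves who the user is, the other states which VO privileges the job is requesting.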

Some of the Virtual Organizations at OSG:
- (NYSGRID)
- (CIGI)
- Collider Detector at Fermilab (CDF)
- Compact Muon Solenoid (CMS)
- CompBioGrid (CompBioGrid)
- D0 Experiment at Fermilab (DZero)
- Dark Energy Survey (DES)
- Distributed Organization for Scientific and Academic Research (DOSAR)
- Engagement (Engage)
- Fermi National Accelerator Center (Fermilab)
- Functional Magnetic Resonance Imaging (fMRI)
- Geant4 Software Toolkit (geant4)
- Genome Analysis and Database Update (GADU)

How to Form a VO at OSG?

An organization needs…
- A Charter statement describing the purpose of the VO
- A VO Membership Service which meets the requirements of an OSG Release. This is done by deploying the VOMS package (a system that manages real-time user authorization information for a VO)
- A support organization (called a Support Center in OSG parlance) that will support the VO in OSG Operations
- Completion of the registration form

After the registration form is submitted, the OSG Operations Activity will review the information. If there are no issues preventing acceptance, they will send a welcome message and add support for your VO to the OSG software infrastructure.

Making resources available on OSG
You do not have to be a member of a VO to share resources on OSG, even though it is recommended (in order to test a resource on OSG you must be a member). Resources are typically presented to OSG in one of two modes:
- Resources controlled by a single VO and made available as part of the VO's commitment to OSG
- Resources provided by a Facility (a collection of resources or sites under the same administrative domain, not necessarily affiliated with a VO) or provided by a group of VOs.

Software Stacks

The Virtual Data Toolkit (VDT) provides the underlying software stack for the OSG, but it also provides software to other grids. It is separated into two caches. The goal of the VDT software cache is to be grid-agnostic. The OSG software stack is a thin layer on top of the VDT that does two things: it selects the subset of the VDT that OSG uses, and it provides OSG-specific configuration. Pacman installs and configures it all.

What is Pacman?
- Packaging system that installs the Virtual Data Toolkit (VDT)
- It is installed via a downloaded tarball
- Upward compatibility with all existing caches
- Flexible command-line based cache and package browsing
- Snapshots, cache hierarchies and installation caches available
- Globus and/or SSH access as well as HTTP access to caches and downloads
- Updating, verification and repair of installations
- Multi-version installations
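As an illustration of a typical Pacman installation, here is a minimal sketch (in Python, shelling out to Pacman) of pulling one package from a VDT cache. The cache URL and package name below are illustrative placeholders rather than a specific OSG release; the current OSG release notes name the actual cache to use.

import subprocess

# Illustrative placeholders: substitute the cache URL and package name
# given in the OSG/VDT release you are installing.
CACHE = "http://vdt.cs.wisc.edu/vdt_181_cache"
PACKAGE = "Condor"

# Pacman fetches the package (and its dependencies) from the named cache
# into the current directory and then runs its configuration steps.
subprocess.run(["pacman", "-get", f"{CACHE}:{PACKAGE}"], check=True)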

VDT will run on the following systems:
- Debian 3.1 (Sarge)
- Fedora Core 3
- Fedora Core 4
- Fedora Core 4 (x86-64)
- Fedora Core 4 (x86 on x86-64)
- RedHat Enterprise Linux 3 AS
- RedHat Enterprise Linux 3 AS (x86-64)
- RedHat Enterprise Linux 3 AS (x86 on x86-64)
- RedHat Enterprise Linux 3 AS (IA-64)
- RedHat Enterprise Linux 4 AS
- RedHat Enterprise Linux 4 AS (x86-64)
- RedHat Enterprise Linux 4 AS (x86 on x86-64)
- ROCKS Linux 3.3
- Scientific Linux Fermi 3
- Scientific Linux Fermi 4
- Scientific Linux Fermi 4 (x86-64)
- Scientific Linux Fermi 4 (x86 on x86-64)
- Scientific Linux 4 (IA-64)
- SUSE Linux 9 (IA-64)

The OSG is set up to enable a smooth transition from developing new services to providing them in a production environment. Each set of services and functionality used as a design basis for OSG applications is then turned into an OSG "Release."

The VDT contains three kinds of middleware:
- Basic Grid Services: Condor-G and Globus
- Virtual Data Systems
- Utilities: MonAlisa, VOMS, …

What is supported on different platforms?
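Returning to the basic grid services listed above, here is a minimal sketch of submitting a job through Condor-G to a remote Globus gatekeeper, written in Python for consistency with the other sketches. The compute-element contact string "ce.example.org/jobmanager-condor" is a hypothetical placeholder, and the sketch assumes Condor-G and a valid grid proxy are already set up.

import subprocess

# A small Condor-G submit description: the grid universe hands the job
# to a remote Globus (GT2) gatekeeper instead of the local pool.
submit_description = """\
universe      = grid
grid_resource = gt2 ce.example.org/jobmanager-condor
executable    = /bin/hostname
output        = job.out
error         = job.err
log           = job.log
queue
"""

with open("hostname.sub", "w") as f:
    f.write(submit_description)

# condor_submit queues the job locally; Condor-G then manages it on the remote site
subprocess.run(["condor_submit", "hostname.sub"], check=True)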

The VDT is funded by the National Science Foundation. The National Science Foundation funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the country. The Foundation accounts for about one-fourth of federal support to academic institutions for basic research.

And… the Department of Energy
The Department of Energy's overarching mission is to advance the national, economic, and energy security of the United States; to promote scientific and technological innovation in support of that mission; and to ensure the environmental cleanup of the national nuclear weapons complex. The Department's strategic goals to achieve the mission are designed to deliver results along five strategic themes:
- Energy Security
- Nuclear Security
- Scientific Discovery and Innovation
- Environmental Responsibility
- Management Excellence

The OSG Production software cache is at:
The OSG ITB software cache is at:
The OSG VTB software cache is at:
The VDT software cache (used in this OSG software release) is at:
Contents of the VDT software cache:

When are meetings held at OSG?

Meetings held and scheduled for OSG

Here you can find international news regarding the Grid
Here you can find general news about the Grid

Resources: OSG Facility PPT by Miron Livny