What Can the Open Science Grid Do for You? Ruth Pordes Associate Head, Fermilab Computing Division Executive Director, Open Science Grid US CMS Grid Services.

What Can the Open Science Grid Do for You? Ruth Pordes: Associate Head, Fermilab Computing Division; Executive Director, Open Science Grid; US CMS Grid Services and Interfaces Coordinator; International Science Grid This Week.

Since the 1980s, Fermilab's science mission has supported science with large data and collaboration needs. [Figure: the data stored in 2007 for the two Tevatron experiments (CDF and DZero), pictured as a stack of 4 GB DVDs, each 1.2 mm thick.] Fermilab was an early adopter of networks for data distribution, web information access, and distributed computing.

Open Science Grid emerged from the ad-hoc cooperation of science collaborations, grid projects, and IT facilities. PPDG (DOE) and GriPhyN and iVDGL (NSF), joined as Trillium and then Grid3, led to the Open Science Grid (DOE+NSF), alongside the DOE Science Grid, campus and regional grids, LHC construction, preparation, and operations, LIGO preparation and operation, and the European grids and Worldwide LHC Computing Grid. These autonomous communities, with diverse requirements, are linked through infrastructure integration, operations, interfaces, and bridges into coherent global science communities spanning other nations and nationalities.


Transform processing- and data-intensive science through a cross-domain, shared national distributed cyber-infrastructure that brings together campus and community infrastructure and serves the needs of Virtual Organizations (VOs) at all scales.

Through Practical Steps

Physics Helping Non-Physics Which Helps Physics!

The project is supported by the Department of Energy Office of Science SciDAC-2 program through the High Energy Physics, Nuclear Physics, and Advanced Software and Computing Research programs, and by the National Science Foundation Math and Physical Sciences directorate, Office of Cyber Infrastructure, and Office of International Science and Engineering.

Computers and disks accessible at many places in the US: BU, UNM, UTA, OU, FNAL, ANL, UCHICAGO, WISC, BNL, VANDERBILT, PSU, UVA, IOWA STATE, PURDUE, IU, BUFFALO, TTU, CORNELL, ALBANY, UMICH, INDIANA, IUPUI, UWM, UNL, UFL, KU, UNI, WSU, MSU, LTU, LSU, CLEMSON, MCGILL, UMISS, UIUC, UCLA, SDSC, CALTECH, NERSC, STANFORD, UCR, LEHIGH, MIT, UMD. Plus 4 universities/labs in South America, 3 in Taiwan/Korea, and 1 in the UK!

OSG Consortium members: Academia Sinica; Argonne National Laboratory (ANL); Boston University; Brookhaven National Laboratory (BNL); California Institute of Technology; Center for Advanced Computing Research; Center for Computation & Technology at Louisiana; Center for High Performance Computing at the University of New Mexico; Clemson University; Collider Detector at Fermilab (CDF); Columbia University; Condor Project; Cornell University; DZero Collaboration; Fermi National Accelerator Laboratory (FNAL); Florida International University; Georgetown University; The Globus Alliance; Hampton University; Harvard University; Indiana University; Indiana University-Purdue University; Kyungpook National University; Laser Interferometer Gravitational Wave Observatory (LIGO); Lawrence Berkeley National Laboratory (LBL); Lehigh University; Massachusetts Institute of Technology; National Energy Research Scientific Computing Center (NERSC); National Taiwan University; New York University; Notre Dame University; Oak Ridge National Laboratory; OSG Grid Operations Center (GOC); Pennsylvania State University; Purdue University; Renaissance Computing Institute; Rice University; Rochester Institute of Technology; São Paulo Regional Analysis Center (SPRACE); Sloan Digital Sky Survey (SDSS); Solenoidal Tracker at RHIC (STAR); Southern Methodist University; Stanford Linear Accelerator Center (SLAC); The State University of New York at Buffalo; Syracuse University; Texas Tech University; Thomas Jefferson National Accelerator Facility; Universidade de São Paulo; Universidade do Estado do Rio de Janeiro; University of Arkansas; University of Birmingham; University of California, Davis; University of California, Riverside; University of California, San Diego; University of Chicago; University of Connecticut Health Center; University of Florida; University of Illinois at Chicago; University of Michigan; University of Nebraska-Lincoln; University of New Mexico; University of North Carolina; University of Northern Iowa; University of Oklahoma; University of Rochester; University of South Florida; University of Texas at Arlington; University of Virginia; University of Wisconsin-Madison; University of Wisconsin-Milwaukee Center for Gravitation and Cosmology; US ATLAS; US CMS; Vanderbilt University; Wayne State University.

~43,000 cores; ~6 petabytes of shared disk; tape storage. The facility is shared among all the LHC experiments plus Run II, RHIC, theorists, biologists, chemists, climate scientists, and more. Common software is developed jointly by computer science projects and HENP.

400,000 CPU hours/day; 200,000 jobs/day.
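The headline numbers above can be sanity-checked with a little arithmetic, assuming the throughput figure is in CPU hours per day (a figure in CPU days per day would far exceed the ~43,000-core capacity):

```python
# Back-of-the-envelope check of the quoted utilization figures.
# Assumption: throughput is ~400,000 CPU *hours* per day.

cores = 43_000                       # approximate OSG core count
capacity_hours_per_day = cores * 24  # total available CPU hours each day

used_hours_per_day = 400_000         # quoted daily throughput
jobs_per_day = 200_000               # quoted daily job count

utilization = used_hours_per_day / capacity_hours_per_day
avg_job_hours = used_hours_per_day / jobs_per_day

print(f"capacity: {capacity_hours_per_day:,} CPU hours/day")   # 1,032,000
print(f"utilization: {utilization:.0%}")                       # about 39%
print(f"average job length: {avg_job_hours:.1f} CPU hours")    # 2.0
```

So the quoted numbers imply roughly 39% utilization of the total capacity and an average job of about two CPU hours, both plausible for a shared, opportunistic facility.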

OSG is part of the Worldwide LHC Computing Grid Collaboration (together with EGEE & the Experiments)

Britta Daudert, California Institute of Technology. Britta is an OSG user with the Laser Interferometer Gravitational Wave Observatory (LIGO) experiment; gravitational-wave research is one of OSG's principal science drivers.

Alain Roy, University of Wisconsin-Madison. Alain is the software coordinator for OSG and a computer scientist in the Condor Project. (He is also a master bread maker.)

Mine Altunay, Fermilab, the OSG Security Officer: “Providing the security framework that promotes autonomous and open science collaboration…” Balancing the openness needed for the science with the security needed by OSG members means ensuring security is not compromised because of open science, and ensuring science is not burdened because of security. The goal is a symmetric “eco-system” in which sites, VOs, and users all participate.

Acknowledgement:

Sites

Virtual Organizations are autonomous, self-organized collaborative communities of people and the things they share. They delegate physical identity management to the participating “real” organizations. VOs range from “heavy-weight”, long-lived organizations with strong governance and policies (e.g. the LHC experiments) to “ad-hoc”, self-organizing groups of individuals who dynamically build on a few common ideas, in much the same way as internet-based social networks (e.g. students in a grid school class).
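The delegation idea above can be sketched in a few lines of Python. This is a minimal illustration only: the class names and the admission rule are invented for the example and are not an actual OSG interface.

```python
# Sketch: a VO manages roles and membership, but delegates identity
# vetting to the participating "real" organizations, as described above.
from dataclasses import dataclass, field

@dataclass
class RealOrganization:
    """A home institution that vouches for its members' identities."""
    name: str
    members: set = field(default_factory=set)

    def vouches_for(self, person: str) -> bool:
        return person in self.members

@dataclass
class VirtualOrganization:
    """A self-organized community; trusts identities asserted by real orgs."""
    name: str
    home_orgs: list = field(default_factory=list)
    roles: dict = field(default_factory=dict)   # person -> role within the VO

    def register(self, person: str, role: str = "member") -> bool:
        # Admission requires some participating organization to vouch;
        # the VO itself never checks physical identity.
        if any(org.vouches_for(person) for org in self.home_orgs):
            self.roles[person] = role
            return True
        return False

# Illustrative names, not real OSG entities.
uni = RealOrganization("Example University", {"alice"})
vo = VirtualOrganization("demo-vo", home_orgs=[uni])
assert vo.register("alice", role="production")  # vouched for by her university
assert not vo.register("mallory")               # unknown identity is refused
```

The point of the split is that the VO only manages roles and policy; whether "alice" really is alice remains the home institution's problem.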

Grid Services: OSG integrates, tests, operates, troubleshoots, monitors, manages, and supports them. OSG acts as an agency: it brokers and supports the relationships and expectations between VOs and resource, facility, service, and software providers.

How it all comes together: resources that trust the VO, the VO management service, the OSG infrastructure, and the VO's own middleware and applications. Virtual Organization Management services (VOMS) allow registration, administration, and control of the members of the group. Resources trust and authorize VOs, not individual users. The OSG infrastructure provides the fabric for job submission and scheduling, resource discovery, security, monitoring, and more. And resource selection is based on Condor ClassAds.
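The matchmaking idea above can be sketched in a few lines. The site names, VO names, and attributes here are invented for illustration; real OSG matchmaking uses the Condor ClassAd language, where resources and jobs each advertise attributes and Requirements/Rank expressions.

```python
# Sketch of ClassAd-style matchmaking: resources advertise attributes and
# the VOs they trust; a job states its VO and requirements; a broker pairs
# them and ranks the eligible matches.

resources = [
    {"name": "site-a", "trusted_vos": {"cms", "ligo"},
     "memory_mb": 2048, "free_slots": 120},
    {"name": "site-b", "trusted_vos": {"cms"},
     "memory_mb": 1024, "free_slots": 300},
]

job = {"vo": "cms", "min_memory_mb": 2048}

def matches(resource, job):
    # Authorization is per-VO, not per-user, as on OSG.
    return (job["vo"] in resource["trusted_vos"]
            and resource["memory_mb"] >= job["min_memory_mb"])

# Rank eligible resources, here simply by free slots, highest first.
eligible = sorted((r for r in resources if matches(r, job)),
                  key=lambda r: r["free_slots"], reverse=True)

best = eligible[0]["name"] if eligible else None
print(best)  # site-b lacks the memory, so site-a is chosen
```

The real ClassAd mechanism is symmetric (both sides can express Requirements and Rank), but the shape of the negotiation is the same.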

Common, Reusable Software: the Virtual Data Toolkit (VDT) includes Condor, Globus, VO/group-management security modules, accounting, monitoring, and resource selection. All software is open source. The VDT is also supported for TeraGrid, Enabling Grids for E-sciencE in Europe, and the UK National Grid, and is in test for the Earth System Grid.

Opportunistic Use and Sharing: Helping Other HEP Collaborations. [Chart: D0's use of owned and non-owned CPUs.]

Engage: Helping the Non-Physics Sciences

Chemistry: Andrew Shultz, University of Buffalo. An application to model virial coefficients of water; a research highlight/publication is anticipated this summer. 78,000 jobs, consuming an average of about 100 CPU days/day over 6 months.

Computational Biology: Protein Folding/Structure. An assistant professor and 2 students running fairly steadily: ~620,000 CPU wall-clock hours in 2008 (an average of 120 CPU days/day for ~210 days). A research highlight is expected in the next few months. When does a set of individuals become a community/VO?

Luettich: Coastal Modeling. Kuhlman: Rosetta protein modelling. Protein Analysis: JHU. Deep language processing: UNC.

A LIGO science topic: searching for deviations in pulsar signals. It runs across BOINC home computers and German and US grids, testing the use of a new service on OSG while getting science output. It was using >3,000 nodes steadily before the accounting system was adapted to report it!

Becoming a Full OSG Citizen. Join the OSGEDU VO: run small applications after learning how to use OSG at the grid schools. Be part of the Engagement program and the Engage VO. Be a standalone VO and a member of the Consortium: ongoing use of OSG and participation in one or more activity groups.

Education & Communication

New Technologies, Clouds, Companies. How to: have usable global file systems; make use of the global computer truly transparent; make effective use of multi-core; integrate High Performance and High Throughput computing; use specialized hardware and new operating systems; integrate with Amazon, Google, …

What can You Do for the Open Science Grid ?