Open Science Grid
June 28, 2006
Bill Kramer, Chair of the Open Science Grid Council; NERSC Center General Manager, LBNL

One Minute on OSG
The Open Science Grid is a national, production-quality grid computing infrastructure for large-scale science, built and operated by a consortium of U.S. universities and national laboratories.
– The OSG Consortium formed in 2004 to enable diverse communities of scientists to access a common grid infrastructure and shared resources. Groups that choose to join the Consortium contribute effort and resources to the common infrastructure.
– An aggressive program to federate many previously disjoint (mostly physics-oriented) grid resources.
– Resource providers and consumers from labs and universities join into a single scalable, engineered, and managed grid.
– Expanding the reach of grid capabilities is the cornerstone of OSG and of sustaining grid technologies and infrastructure.
OSG is committed to using what was learned from working with the physics community to help new communities establish community-based grids and adapt their applications and computing "culture".
Currently 25 Virtual Organizations as users. Currently 23 resource providers.
More information at
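In this era, users typically reached OSG resources through Condor-G/Globus-style job submission. As an illustrative sketch only (the gatekeeper hostname, jobmanager suffix, and executable name below are hypothetical, not taken from the slides), an HTCondor grid-universe submit description for an OSG site might look like:

```text
# Hypothetical HTCondor grid-universe submit description for an OSG site.
# The gatekeeper hostname "gatekeeper.example-osg-site.edu" and the
# executable "analyze.sh" are illustrative placeholders.
universe      = grid
grid_resource = gt2 gatekeeper.example-osg-site.edu/jobmanager-condor

executable    = analyze.sh
output        = analyze.out
error         = analyze.err
log           = analyze.log

should_transfer_files   = YES
when_to_transfer_output = ON_EXIT

queue
```

The point of the grid universe is that the same submit-file idiom local Condor users already know carries over to remote, opportunistically shared sites; the `grid_resource` line is the only site-specific piece.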

Integrating Community, Campus, & National Cyberinfrastructures (CIs)
– Science Community Infrastructure (e.g. GADU, LIGO)
– CS/IT Campus Grids (e.g. GLOW, FermiGrid)
– National CyberInfrastructure for Science (e.g. OSG, TeraGrid)

Critical Topics for a Sustainable Grid Infrastructure
– The architecting, construction, and implementation of a globally scalable, real-time infrastructure.
– This is way beyond simple middleware: it requires an SOA and profound network and end-system awareness, end-to-end monitoring, and command and control. The next round of really large science projects will implement such systems; otherwise their science goals will not be met.
– A well-defined and scalable management structure, covering both technical management and people management.
– A structure with solid fundamental supports (user support, helpdesk, office hours) for the core components.
– A framework where development and new ideas can proceed and new resources can "join" (homogeneous or not); this has significant design impact on what the "core" resources and services are.
– A common agreement (user, usage, VO) leading to a common security strategy: open and shared resources, with an advertised agreement (a marketplace).
– An authentication structure that is secure, easy for sites (VOs) to administer, easy for users to use, and conforms to government policy.
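The authentication structure described in the last bullet was, in practice, built on X.509 certificates with short-lived, VO-annotated proxies. A sketch of the usual user-facing workflow with the standard VOMS client tools follows (the VO name "myvo" is an illustrative placeholder, not an OSG VO from the slides):

```text
# Obtain a short-lived proxy certificate carrying VO membership
# attributes; the VO name "myvo" is a placeholder.
voms-proxy-init --voms myvo --valid 24:00

# Inspect the proxy: subject, VO attributes, and remaining lifetime.
voms-proxy-info --all
```

The design point this illustrates is the separation of concerns in the bullet above: sites trust the certificate authorities and VOs rather than enrolling individual users, while users run one command per session instead of managing per-site accounts.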

Backup

OSG Participating Disciplines
– Computer Science: Condor, Globus, SRM, SRB. Test and validate innovations: new services & technologies.
– Physics: LIGO, Nuclear Physics, Tevatron, LHC. Global grid: computing & data access.
– Astrophysics: Sloan Digital Sky Survey. CoAdd: multiply-scanned objects; spectral fitting analysis.
– Bioinformatics: Argonne GADU project; Dartmouth Psychological & Brain Sciences. BLAST, BLOCKS, gene sequences, etc.; functional MRI.
– University campus: CCR (U Buffalo), GLOW (U Wisconsin), TACC (Texas Advanced Computing Center), MGRID (U Michigan), UFGRID (U Florida), Crimson Grid (Harvard), FermiGrid (FermiLab Grid). Resources, portals, apps.

OSG Grid Partners
– TeraGrid: "DAC2005": run LHC apps on TeraGrid resources; TG Science Portals for other applications; discussions on joint activities: security, accounting, operations, portals.
– EGEE: joint operations workshops, defining mechanisms to exchange support tickets; joint security working group; US middleware federation contributions to core middleware (gLite).
– Worldwide LHC Computing Grid: OSG contributes to LHC global data handling and analysis systems.
– Other partners: SURA, GRASE, LONI, TACC. Representatives of VOs provide portals and interfaces to their user groups.