Presentation transcript:

Copyright Notice

Copyright James Kent Blackburn 2007. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.

Open Science Grid
James "Kent" Blackburn
OSG Resources Manager
Senior Scientist, LIGO Laboratory
California Institute of Technology

OSG's Coverage of the CI "Bubble" Diagram

[Diagram: the cyberinfrastructure "bubble" diagram with the OSG and OSG Consortium overlaid on its major areas: instrumentation and data generation; computation, analysis and simulation; data (search, storage, retrieval, schema, metadata, data directories, ontologies, archive); security and access (authentication, access control, authorization); display and visualization; collaboration tools and publishing; human support and help desk; education, outreach and training; networks; and policy and funding (resource providers, funding agencies, campuses).]

The Open Science Grid

The Open Science Grid's mission is to help satisfy the ever-growing computing and data management requirements of researchers by enabling them to share a greater percentage of available computer cycles and software with less effort.

The OSG is a distributed, common cyberinfrastructure spanning campus, regional, national and international boundaries. The distributed facility is made up of independently owned and managed resources at more than 50 provider sites:
- agreements between members provide the glue;
- their requirements drive the evolution;
- their effort helps make it happen.

The facility is dedicated to high throughput computing and is open to researchers from all domains.

OSG is a cyberinfrastructure for research: a framework for large-scale distributed resource sharing that addresses the technology, policy and social requirements of sharing.
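In this context, "high throughput" means sustained aggregate output from many independent jobs rather than the peak speed of one tightly coupled computation. Below is a minimal, purely illustrative Python sketch of how a workload is typically decomposed for such a facility; the file names, chunk size and wrapper script are hypothetical, not part of the OSG software.

```python
# Illustrative only: decompose one analysis into many independent jobs, the
# workload shape that high throughput facilities such as the OSG serve well.
# The input list, chunk size and command are hypothetical placeholders.

INPUT_FILES = [f"segment_{i:04d}.dat" for i in range(1000)]  # hypothetical inputs
CHUNK_SIZE = 10                                              # files per job

def make_job_descriptions(inputs, chunk_size):
    """Yield one self-contained job description per chunk of inputs."""
    for start in range(0, len(inputs), chunk_size):
        chunk = inputs[start:start + chunk_size]
        yield {
            "executable": "analyze.sh",        # hypothetical wrapper script
            "arguments": " ".join(chunk),
            "output": f"result_{start // chunk_size:04d}.out",
        }

jobs = list(make_job_descriptions(INPUT_FILES, CHUNK_SIZE))
print(f"{len(jobs)} independent jobs ready for submission")
```

Because each job description is self-contained, the jobs can be scheduled independently on whichever OSG site has free capacity.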

OSG Consortium Partners

Academia Sinica; Argonne National Laboratory (ANL); Boston University; Brookhaven National Laboratory (BNL); California Institute of Technology; Center for Advanced Computing Research; Center for Computation & Technology at Louisiana State University; Center for Computational Research, The State University of New York at Buffalo; Center for High Performance Computing at the University of New Mexico; Columbia University; Computation Institute at the University of Chicago; Cornell University; DZero Collaboration; Dartmouth College; Fermi National Accelerator Laboratory (FNAL); Florida International University; Georgetown University; Hampton University; Indiana University; Indiana University-Purdue University, Indianapolis; International Virtual Data Grid Laboratory (iVDGL); Thomas Jefferson National Accelerator Facility; University of Arkansas; Universidade de São Paulo; Universidade do Estado do Rio de Janeiro; University of Birmingham; University of California, San Diego; University of Chicago; University of Florida; University of Illinois at Chicago; University of Iowa; University of Michigan; University of Nebraska-Lincoln; University of New Mexico; University of North Carolina/Renaissance Computing Institute; University of Northern Iowa; University of Oklahoma; University of South Florida; University of Texas at Arlington; University of Virginia; University of Wisconsin-Madison; University of Wisconsin-Milwaukee Center for Gravitation and Cosmology; Vanderbilt University; Wayne State University; Kyungpook National University; Laser Interferometer Gravitational Wave Observatory (LIGO); Lawrence Berkeley National Laboratory (LBL); Lehigh University; Massachusetts Institute of Technology; National Energy Research Scientific Computing Center (NERSC); National Taiwan University; New York University; Northwest Indiana Computational Grid; Notre Dame University; Pennsylvania State University; Purdue University; Rice University; Rochester Institute of Technology; Sloan Digital Sky Survey (SDSS); Southern Methodist University; Stanford Linear Accelerator Center (SLAC); State University of New York at Albany; State University of New York at Binghamton; State University of New York at Buffalo; Syracuse University; T2 HEPGrid Brazil; Texas Advanced Computing Center; Texas Tech University

What The OSG Offers

- Low-threshold access to many distributed computing and storage resources
- A combination of dedicated, scheduled and opportunistic computing
- The Virtual Data Toolkit software packaging and distributions
- Grid operations, including facility-wide monitoring, validation, information services and system integration testing
- Operational security
- Troubleshooting of end-to-end problems
- Education and training
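As a hedged illustration of what "low-threshold access" looked like in this era, the sketch below builds a grid-universe (Condor-G) submit description and hands it to condor_submit. It assumes an HTCondor/Condor-G client installation, such as the one distributed with the Virtual Data Toolkit, plus a valid grid proxy; the gatekeeper host name is a placeholder, not a real OSG endpoint.

```python
# Sketch only: build a Condor-G (grid universe) submit description and hand it
# to condor_submit.  Assumes HTCondor client tools on PATH and a valid grid
# proxy; gatekeeper.example.edu is a placeholder, not a real OSG gatekeeper.
import subprocess
import tempfile

SUBMIT_DESCRIPTION = """\
universe      = grid
grid_resource = gt2 gatekeeper.example.edu/jobmanager-condor
executable    = analyze.sh
arguments     = segment_0000.dat
output        = job.out
error         = job.err
log           = job.log
queue
"""

with tempfile.NamedTemporaryFile("w", suffix=".sub", delete=False) as f:
    f.write(SUBMIT_DESCRIPTION)
    submit_file = f.name

# condor_submit is part of HTCondor; this call fails without a working setup.
subprocess.run(["condor_submit", submit_file], check=True)
```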

The OSG as a Community Alliance

The OSG is a grass-roots endeavor bringing together research institutions throughout the U.S. and the world:
- the OSG Consortium brings together the stakeholders;
- the OSG Facility brings together resources and users.

The OSG's growing alliance of universities, national laboratories, scientific collaborations and software developers contributes to the OSG, shares ideas and technologies, and reaps the benefits of the integrated resources through both agreements with fellow members and opportunistic use.

An active engagement effort adds new domains and resource providers to the OSG Consortium. Training is offered at semi-annual OSG Consortium meetings and through educational activities organized in collaboration with TeraGrid. One- to three-day hands-on training sessions are offered around the U.S. and abroad for users, administrators and developers.

OSG Community Structure: Virtual Organizations (VOs)

- The OSG community shares and trades in groups (VOs), not as individuals.
- VO management services allow registration, administration and control of members within VOs.
- Facilities trust and authorize VOs.
- Compute and storage services prioritize according to VO group membership (see the sketch below).

[Diagram: VO management services and applications mediating between campus grids, experimental and project grids, the OSG and wide-area network, and the set of available resources. Image courtesy: UNM]
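The prioritization idea can be sketched in a few lines of Python. This is not an OSG interface; the VO names and share fractions below are invented purely for illustration.

```python
# Illustrative only: resources grant shares to VOs, not to individual users,
# so a user's priority follows from VO membership.  The VO names and share
# fractions below are invented for the example.
VO_SHARES = {"cms": 0.40, "atlas": 0.40, "ligo": 0.15, "engage": 0.05}

def effective_share(vo: str, opportunistic: bool = False) -> float:
    """Return the fraction of a site's capacity a VO can expect.

    An owned or agreed share comes from the VO table; opportunistic use gets
    whatever is idle and carries no guarantee.
    """
    if vo in VO_SHARES and not opportunistic:
        return VO_SHARES[vo]
    return 0.0  # opportunistic: no guaranteed share, only idle cycles

print(effective_share("ligo"))         # 0.15
print(effective_share("astro", True))  # 0.0 (opportunistic only)
```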

Campus Grids

- Campus grids are a fundamental building block of the OSG: the multi-institutional, multi-disciplinary nature of the OSG is a macrocosm of many campus IT cyberinfrastructure coordination issues.
- OSG currently has three operational campus grids on board (Fermilab, Purdue, Wisconsin) and is working to add Clemson, Harvard and Lehigh.
- Elevation of jobs from the campus CI to the OSG is transparent (see the sketch below).
- The campus scale brings value through:
  - richness of a common software stack with common interfaces;
  - a higher common denominator that makes sharing easier;
  - greater collective buying power with vendors;
  - synergy through common goals and achievements.
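The "transparent elevation" idea can be illustrated with a toy routing function. Real deployments relied on mechanisms such as HTCondor flocking or glidein pilots rather than this logic, and the numbers here are invented.

```python
# Conceptual sketch only: a campus submission layer keeps jobs local while the
# campus pool has free slots and overflows the remainder to the wider OSG.
# Thresholds and counts are invented; this is not how any OSG campus grid was
# actually implemented.
def route_jobs(n_jobs: int, free_campus_slots: int):
    """Split a batch between the campus pool and the national grid."""
    local = min(n_jobs, free_campus_slots)
    overflow = n_jobs - local
    return {"campus_pool": local, "osg": overflow}

print(route_jobs(n_jobs=500, free_campus_slots=120))
# {'campus_pool': 120, 'osg': 380}
```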

Current OSG Resources

- OSG has more than 50 participating institutions, including self-operated research VOs, campus grids, regional grids and OSG-operated VOs.
- Provides about 10,000 CPU-days per day in processing.
- Provides 10 Terabytes per day in data transport.
- CPU usage averages about 75%.
- OSG is starting to offer support for MPI.
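Those figures translate into more familiar units with straightforward arithmetic; the conversions below add nothing beyond the numbers quoted in the slide.

```python
# Back-of-the-envelope conversions of the figures quoted above.
cpu_days_per_day = 10_000
avg_utilization = 0.75
tb_per_day = 10

# 10,000 CPU-days delivered per calendar day is equivalent to ~10,000 cores
# busy around the clock; at 75% average utilization the underlying capacity
# is correspondingly larger.
busy_cores = cpu_days_per_day                    # CPU-days/day == continuously busy cores
implied_capacity = busy_cores / avg_utilization  # ~13,300 cores

# 10 TB moved per day expressed as a sustained rate (decimal units).
mb_per_s = tb_per_day * 1_000_000 / 86_400       # ~116 MB/s

print(f"busy cores ~ {busy_cores}, implied capacity ~ {implied_capacity:.0f} cores")
print(f"sustained transfer ~ {mb_per_s:.0f} MB/s")
```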

Weekly OSG Process Hours

Facts and Figures from the First Year of Operations

- OSG contributed an average of over one thousand CPU-days per day for two months to the D0 physics experiment.
- OSG provided the LHC collaborations with more than 30% of their processing cycles worldwide, during which up to 100 Terabytes per day were transferred across more than 7 storage sites.
- LIGO has been running workflows of more than 10,000 jobs across more than 20 different OSG sites.
- A climate modeling application has accumulated more than 10,000 CPU-days of processing on the OSG.
- The Kuhlman Lab completed structure predictions for ten proteins, consuming more than 10,000 CPU-days on the OSG.
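Two of these figures can be turned into a rough total and a rough rate with simple arithmetic; the 60-day duration below is read off "for two months" and is approximate.

```python
# Rough arithmetic behind two of the figures above.
d0_cpu_days_per_day = 1_000
d0_duration_days = 60                                 # "for two months", approximate
d0_total_cpu_days = d0_cpu_days_per_day * d0_duration_days   # ~60,000 CPU-days

lhc_peak_tb_per_day = 100
gb_per_s = lhc_peak_tb_per_day * 1_000 / 86_400       # ~1.2 GB/s aggregate

print(f"D0 contribution ~ {d0_total_cpu_days:,} CPU-days in total")
print(f"LHC peak transfers ~ {gb_per_s:.1f} GB/s aggregated across the storage sites")
```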

Facing the CI Challenge Together

- OSG is looking for a few partners to help deploy campus-wide grid infrastructure that integrates with local enterprise infrastructure and the national CI.
- OSG's Engagement Team is available to help scientists get their applications running on OSG: a low-impact starting point that helps your researchers gain significant compute cycles while exploring OSG as a framework for your own campus CI.
- Direct your inquiries to …
- Learn more about the OSG at …