Some Grid Experiences
Laura Pearlman, USC Information Sciences Institute
ICTP Advanced Training Workshop on Scientific Instruments on the Grid
*Most of these slides are from Lee Liming's GlobusWorld 2006 presentation "A Globus® Primer: What is the Grid and How Do I Use It?"

To be Covered
I. Grid computing problems
II. Some notable U.S. Grids
III. How grids are built and used in real life

Grid Computing Problems
Scientific problems large enough that they require people in several organizations to collaborate and share computing resources, data, and instruments:
- Interactive simulation (climate modeling)
- Very large-scale simulation and analysis (galaxy formation, gravity waves, battlefield simulation)
- Engineering (parameter studies, linked component models)
- Experimental data analysis (high-energy physics)
- Image and sensor analysis (astronomy, climate study, ecology)
- Online instrumentation (microscopes, x-ray devices, etc.)
- Remote visualization (climate studies, biology)
- Engineering (large-scale structural testing, chemical engineering)
- Biomedical applications

Some Core Problems - Heterogeneity
- Different authentication mechanisms across institutions
- Different mechanisms for monitoring system and application status across institutions
- Different ways to submit jobs
- Different ways to store & access files and data
- Different ways to keep track of data
- Different preferences in programming languages and environments
- Difficulty in tracking the causes of failures
- Conflicting requirements among groups that need to interoperate
(A sketch of how middleware hides one such difference follows below.)
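To make the job-submission heterogeneity concrete, here is a minimal, hypothetical Python sketch of the adapter idea that grid middleware applies to real batch systems: clients code against one interface, and per-site adapters translate to whatever scheduler the site runs. All class and site names are invented for illustration; this is not a Globus API.

```python
from abc import ABC, abstractmethod


class JobSubmitter(ABC):
    """The single, uniform interface a grid client codes against."""

    @abstractmethod
    def submit(self, executable: str, args: list[str]) -> str:
        """Submit a job and return a site-independent job ID."""


class PBSSubmitter(JobSubmitter):
    def submit(self, executable, args):
        # Site A would translate this into a PBS 'qsub' call (elided).
        return f"pbs-job-{executable}"


class CondorSubmitter(JobSubmitter):
    def submit(self, executable, args):
        # Site B would generate and queue a Condor submit file (elided).
        return f"condor-job-{executable}"


def run_everywhere(sites: dict[str, JobSubmitter], executable: str) -> dict:
    # The client never needs to know which batch system each site runs.
    return {name: s.submit(executable, []) for name, s in sites.items()}


if __name__ == "__main__":
    sites = {"siteA": PBSSubmitter(), "siteB": CondorSubmitter()}
    print(run_everywhere(sites, "/bin/hostname"))
```

The same pattern applies to the other bullets: one interface for file access, one for monitoring, and so on, with per-site adapters underneath.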

Some Core Problems - Trust
- Rigid use policies (authorization, QoS) vs. rigid application assumptions
- Authorization needs to happen at many levels (communities, organizations, resource owners, etc.)
- Complicated social structures exceed the abilities of simple authorization systems
(A toy model of multi-level authorization follows below.)
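The multi-level authorization point can be illustrated with a toy Python model (hypothetical policy names, not any real grid authorization service): a request is allowed only if the community (VO), the hosting organization, and the resource owner each independently say yes.

```python
from dataclasses import dataclass


@dataclass
class Request:
    user: str
    vo: str        # virtual organization the user acts under
    resource: str


def vo_policy(req: Request) -> bool:
    # Community level: is the request made under a recognized VO?
    return req.vo in {"climate-vo", "physics-vo"}


def org_policy(req: Request) -> bool:
    # Organization level: does the hosting site accept this VO at all?
    return req.vo != "untrusted-vo"


def owner_policy(req: Request) -> bool:
    # Resource-owner level: a per-resource allow list.
    allow = {"cluster1": {"climate-vo"}}
    return req.vo in allow.get(req.resource, set())


def authorize(req: Request) -> bool:
    # Every level must independently grant access.
    return all(policy(req) for policy in (vo_policy, org_policy, owner_policy))


print(authorize(Request("alice", "climate-vo", "cluster1")))  # True
print(authorize(Request("bob", "physics-vo", "cluster1")))    # False: owner says no
```

The "complicated social structures" bullet is exactly what breaks this flat model in practice: real deployments need delegation, role hierarchies, and per-experiment exceptions that simple allow lists cannot express.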

Some Real-World Grids

Earth System Grid
Goal: give climate scientists easier access to the distributed data and resources they require to perform their research. Developed new technologies for (1) creating and operating "filtering servers" capable of performing sophisticated analyses, and (2) delivering results to users.
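The "filtering server" idea is that the analysis runs next to the data and only the reduced result travels to the user. A toy Python illustration (hypothetical interface, not ESG's actual API):

```python
import numpy as np

# Pretend this model output lives at the data center; in practice it
# would be far too large for every scientist to download whole.
dataset = np.random.rand(365, 180, 360)  # day x latitude x longitude


def filtering_server(day_range, lat_range, lon_range):
    # Server-side filtering: subset the cube, then average over time.
    sub = dataset[slice(*day_range), slice(*lat_range), slice(*lon_range)]
    return sub.mean(axis=0)  # only this small 2-D field is shipped back


# The client receives a 30 x 60 field instead of a 365 x 180 x 360 cube.
result = filtering_server((0, 90), (60, 90), (0, 60))
print(result.shape)  # (30, 60)
```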

Collaborative Engineering: NEES [photo: U. Nevada, Reno]

NSF's TeraGrid*
A National Science Foundation investment in cyberinfrastructure: $100M 3-year construction; $150M 5-year operation & enhancement. [Map of sites: UCSD, UT, UC/ANL, NCSA, PSC, ORNL, PU, IU]
- TeraGrid DEEP: Integrating NSF's most powerful computers (60+ TF)
  - 2+ PB online data storage
  - National data visualization facilities
  - World's most powerful network (national footprint)
- TeraGrid WIDE Science Gateways: Engaging scientific communities
  - 90+ community data collections
  - Growing set of community partnerships spanning the science community
  - Leveraging NSF ITR, NIH, DOE and other science community projects
  - Engaging peer Grid projects such as Open Science Grid in the U.S., as well as peer Grids in Europe and Asia-Pacific
- Base TeraGrid Cyberinfrastructure: Persistent, reliable, national
  - Coordinated distributed computing and information environment
  - Coherent user outreach, training, and support
  - Common, open infrastructure services
*Slide courtesy of Ray Bair, Argonne National Laboratory

Open Science Grid
- $30M over five years for effort to:
  - sustain and evolve the distributed facility,
  - bring on board new communities & capabilities,
  - educate & train.
- OSG hardware resources, applications, and many other contributions come from OSG consortium members.
- OSG technical work is performed together with collaborators & external projects.
- OSG has partners in Africa, Asia, Europe, and North and South America.
Text for this slide courtesy of Ruth Pordes

OSG Partners
- Australian Partnerships for Advanced Computing (APAC)
- Data Intensive Science University Network (DISUN)
- Enabling Grids for E-sciencE (EGEE)
- Grid Laboratory of Wisconsin (GLOW)
- Grid Operations Center at Indiana University
- Grid Research and Education Group at Iowa (GROW)
- Nordic Data Grid Facility (NorduGrid)
- Northwest Indiana Computational Grid (NWICG)
- New York State Grid (NYSGrid) (in progress)
- TeraGrid
- Texas Internet Grid for Research and Education (TIGRE)
- TWGrid (from Academia Sinica Grid Computing)
- Worldwide LHC Computing Grid Collaboration (WLCG)
Slide courtesy of Ruth Pordes, OSG All Hands Meeting 2007

OSG Snapshot
- Resources across production & integration infrastructures: an increase of ~15 since Seattle
- 27 Virtual Organizations (+ 3 operations VOs), 25% non-physics
- ~20,000 cores (from 30 to 4,000 cores per cluster)
- ~6 PB accessible tape, ~4 PB shared disk
- Sustained throughput via OSG submissions: measuring ~180K CPU-hours/day, roughly 50% more (being measured) than in Seattle
- Using production & research networks
Slide courtesy of Ruth Pordes, OSG All Hands Meeting 2007

MEDICUS [picture courtesy of Stephan Erberich]

III. How Grids are Built and Used

Methodology
Building a Grid system or application is currently an exercise in software integration:
- Define user requirements
- Derive system requirements or features
- Survey existing components
- Identify useful components
- Develop components to fill the gaps
- Integrate the system
- Deploy and test the system
- Maintain the system during its operation
This should be done iteratively, with many loops and eddies in the flow.

What End Users Need
Secure, reliable, on-demand access to data, software, people, and other resources (ideally all via a Web browser!)

How it Happens
[Architecture diagram. Components shown include a Web browser, Web portal, compute servers, database services, a data catalog, a data viewer tool, a simulation tool, a chat tool, a telepresence monitor, a camera, a certificate authority, a credential repository, and a registration service.] The layers, top to bottom:
- Users work with client applications
- Application services organize VOs & enable access to other services
- Collective services aggregate &/or virtualize resources
- Resources implement standard access & management interfaces

How it Happens
Implementations are provided by a mix of:
- Application-specific code
- "Off the shelf" tools and services
- Tools and services from the Grid community (Globus + others using the same standards)
...glued together by:
- Application development
- System integration
(A sketch of such glue code follows below.)
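As a concrete example of this glue, the sketch below uses application-specific Python to drive off-the-shelf Globus Toolkit command-line clients of the era (grid-proxy-info, globus-job-run, globus-url-copy). The hostnames and paths are hypothetical, and the exact flags should be checked against the installed Toolkit version; this is a sketch of the integration style, not a definitive recipe.

```python
import subprocess


def have_valid_proxy() -> bool:
    # Check for a proxy credential still valid for at least 2 hours
    # (grid-proxy-info's -valid flag takes hours:minutes).
    result = subprocess.run(["grid-proxy-info", "-exists", "-valid", "2:00"])
    return result.returncode == 0


def run_remote(host: str, executable: str) -> str:
    # Application-specific step: run a program via the site's GRAM service.
    out = subprocess.run(["globus-job-run", host, executable],
                         capture_output=True, text=True, check=True)
    return out.stdout


def fetch_result(host: str, remote_path: str, local_path: str) -> None:
    # Off-the-shelf step: pull an output file back over GridFTP.
    subprocess.run(["globus-url-copy",
                    f"gsiftp://{host}{remote_path}",
                    f"file://{local_path}"], check=True)


if __name__ == "__main__":
    if not have_valid_proxy():
        raise SystemExit("No valid proxy; run grid-proxy-init first.")
    print(run_remote("grid.example.org", "/bin/hostname"))
    fetch_result("grid.example.org", "/tmp/result.dat", "/tmp/result.dat")
```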

The Importance of Community
- All Grid technology is evolving rapidly:
  - Web services standards
  - Grid interfaces
  - Grid implementations
  - Grid resource providers (ASP, SSP, etc.)
- Community is important!
  - Best practices (OGF, OASIS, etc.)
  - Open source (Linux, Axis, Globus, etc.)
- Application of community standards is vital:
  - Increases leverage
  - Mitigates (a bit) the effects of rapid evolution
  - Paves the way for future integration/partnership