LIGO LSC DataGrid Workshop March 24-26, 2005 Livingston Observatory.


LIGO LSC DataGrid Workshop March 24-26, 2005 Livingston Observatory

Part One: Introduction A: Workshop Agenda and Pragmatics B: Defining “the Grid” C: Who’s Who in the Grid World D: Overview of the LSC DataGrid E: Lab 1: Getting Started

A: Workshop Agenda and Pragmatics

Workshop Agenda Thursday, March 24 Introduction Grid Security Data Management Friday, March 25 Job Management Workflow Management MyProxy (Coming Attractions!) Saturday, March 26 Local Presentations

Preparation for the Labs We assume a RedHat 9 installation, although other platforms may well work too. We assume you’ve installed the LSC DataGrid Client Toolkit. We assume your security credentials are already in place.

Bio-Imperatives Food Lunches Dinner Plumbing

Temporal Disclaimer The state of the art is: the art is always changing. Grid infrastructure standards are, however, firming up. For the most part, we’re going to be talking about how things work at the moment. We’ll warn you when we go into Coming Attractions mode.

Who Are Those Guys? GRIDS Center David Gehrig, NCSA-UIUC Mike Freemon, NCSA-UIUC Jaime Frey, University of Wisconsin—Madison

Now, everybody—

B: Defining “the Grid”

“Grid” Buzzword of the year(s). In enterprise computing, different meanings at different times. It often simply means “cluster computing.” In research, it usually means…

Definition: 1998 “A computational grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities.” Ian Foster and Carl Kesselman: The Grid: Blueprint for a New Computing Infrastructure

Definition: 2002 “A Grid is a system that coordinates resources that are not subject to centralized control using standard, open, general-purpose protocols and interfaces to deliver nontrivial qualities of service.” Ian Foster, ANL: What is the Grid? A Three-Point Checklist

A Working Definition A distributed computing environment that coordinates Computational jobs Data placement Information management Scales from one computer to thousands Capable of working across many administrative domains

C: Who’s Who in the Grid World

NSF Middleware Initiative (NMI) Middleware: an evolving layer of services that resides between the network and more traditional applications for managing security, access, and information exchange Funds the GRIDS Center Funds the Open Grid Computing Environment

GRIDS Center Grid Research Integration, Deployment, and Support Center Mission: making grid technology deployable and useful outside the development labs Packaging Education

The Globus Alliance Creates core infrastructure services Sponsors include: DARPA, DoE, NSF, NASA e-Science (UK), Vetenskapsrådet (Sweden), KTH (Royal Institute of Technology, Stockholm) IBM, Microsoft Research, Cisco Systems

Globus: Participating Institutions Argonne National Laboratory Information Sciences Institute/USC University of Chicago University of Edinburgh (UK) Center for Parallel Computers (Sweden) “Globus Academic Affiliates”

Globus Toolkit: GT3 Software services and libraries Resource monitoring, discovery, and management Security File management Note! GT4: Expected release sixth quarter of 2004
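As a sketch of what GT3’s resource-management services look like in practice, the commands below submit a trivial job through a gatekeeper. The hostname is a placeholder, and a valid proxy certificate is assumed to be in place.

```
# Run a one-off command on a remote gatekeeper (hostname is hypothetical)
$ globus-job-run gatekeeper.example.edu /bin/hostname

# Submit a job asynchronously; prints a job contact ID for later status checks
$ globus-job-submit gatekeeper.example.edu /bin/date
```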

PyGlobus www-itg.lbl.gov/gtg/projects/pyGlobus/ Lawrence Berkeley National Laboratory An interface to the Globus toolkit using the Python scripting language

Condor A serial/parallel job management system for a pool of compute nodes: job queueing, scheduling policy, priority scheme, resource monitoring, and resource management Can be used with the Globus Toolkit We’ll use both “local Condor” and Condor-G
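To make the job-queueing idea concrete, here is a minimal Condor submit description; the executable and file names are placeholders, not part of the workshop materials.

```
# analyze.sub -- minimal Condor submit description (names are placeholders)
universe   = vanilla
executable = analyze
arguments  = segment.dat
output     = analyze.out
error      = analyze.err
log        = analyze.log
queue
```

You hand this file to `condor_submit analyze.sub`; for Condor-G the universe changes so that the job is routed to a remote Globus resource instead of the local pool.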

iVDGL: International Virtual Data Grid Laboratory Goals Deploy a Grid laboratory Use Grid software tools in experiments Support delivery of Grid technologies Education and outreach iVDGL pacman and VDT LSC is an active participant

GriPhyN: Grid Physics Network Coalesced around four experiments Compact Muon Solenoid and ATLAS (“A Toroidal LHC ApparatuS”) at LHC/CERN Laser Interferometer Gravitational-wave Observatory Sloan Digital Sky Survey Petabytes of data annually

VDT: Virtual Data Toolkit Goal: to make it as easy as possible for users to deploy, maintain, and use grid middleware Initially developed by GriPhyN and iVDGL Now includes the LHC Computing Grid (LCG) and the Particle Physics Data Grid (PPDG).

VDT: Components Basic Grid Services Condor, Globus Virtual Data Tools Virtual Data System Utilities Such as GSI-OpenSSH

D: Overview of the LSC DataGrid

What is the LSC DataGrid? A collection of LSC computational and storage resources… … linked through Grid middleware… … into a uniform LSC data analysis environment.

LSC DataGrid Sites Tier 1: Caltech Tier 2: UWM and PSU Tier 3: UT-Brownsville and Salish Kootenai College (SKC) Linux clusters at GEO sites Birmingham, Cardiff and the Albert Einstein Institute (AEI) LDAS instances at Caltech, MIT, PSU, and UWM

For this Workshop LSC DataGrid Sites ldas-grid.ligo.caltech.edu ldas-grid.ligo-wa.caltech.edu ldas-grid.ligo-la.caltech.edu We’ll use ldas-grid.ligo-la.caltech.edu as our head node Full list of LSC DataGrid resources at group.phys.uwm.edu/lscdatagrid/resources More discussion of LSC DataGrid later

E: Lab 1 — Getting Started

Lab 1 — Getting Started This lab will verify: Your software is installed correctly Your sacrifices have pleased the webgod Ping Your security credential (i.e. proxy certificate) is okay Your environment variables won’t suddenly go away
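The credential check in this lab boils down to a couple of commands; a sketch, assuming the LSC DataGrid Client Toolkit’s setup script has put the tools on your PATH:

```
# Create a short-lived proxy certificate from your grid credentials
$ grid-proxy-init

# Confirm the proxy exists and check its remaining lifetime
$ grid-proxy-info

# Environment sanity check (set by the toolkit's setup script)
$ echo $GLOBUS_LOCATION
```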

Credits Some slides in this presentation were adapted from presentations from GriPhyN Grid Summer Workshop 2004 The Globus Consortium