Slides from R. Jones on setting up an OSG VO for GlueX.

1 Setting up an OSG VO for GlueX (slides from R. Jones)

2 OSG: Open Science Grid
 A collection of Virtual Organizations (VOs) devoted to scientific research that share computation and storage resources within a common grid architecture.
 From the OSG web site: "The Open Science Grid aims to promote discovery and collaboration in data-intensive research by providing a computing facility and services that integrate distributed, reliable and shared resources to support computation at all scales."

3 Partial list of existing VOs
ALICE: ALICE collaboration, High Energy Physics experiment at CERN LHC
ATLAS: United States ATLAS Collaboration
CDF: Collider Detector at Fermilab
CIGI: CyberInfrastructure and Geospatial Information Laboratory
CMS: Compact Muon Solenoid
CompBioGrid: CompBioGrid
DES: Dark Energy Survey
DOSAR: Distributed Organization for Scientific and Academic Research
DZero: D0 Experiment at Fermilab
Engage: Engagement
Fermilab: Fermi National Accelerator Laboratory
FermilabAccelerator: Fermilab/Accelerator
FermilabAstro: Fermilab/Astro
FermilabCdms: Fermilab/Cdms
FermilabGrid: fermilab VO grid group
FermilabHypercp: Fermilab/Hypercp
FermilabKTeV: Fermilab/KTeV
FermilabMinerva: Fermilab/Minerva
FermilabMiniboone: Fermilab/Miniboone
FermilabMinos: Fermilab/Minos
FermilabMipp: Fermilab/Mipp
FermilabMu2e: Fermilab/Mu2e
FermilabNova: Fermilab/Nova
FermilabNumi: Fermilab/Numi
FermilabPatriot: Fermilab/Patriot
FermilabTest: Fermilab/Test
FermilabTheory: Fermilab/Theory
geant4: Geant4 Software Toolkit
GLOW: Grid Laboratory of Wisconsin
GPN: Great Plains Network
GRASE: Group Researching Advances in Software Engineering at University of New York at Buffalo
GROW: Grid Research and Education Group at Iowa
i2u2: Interactions in Understanding the Universe Initiative
IceCube: IceCube Neutrino Telescope
ILC: International Linear Collider
JDEM: Joint Dark Energy Mission, Science Operations Center
LIGO: Laser Interferometer Gravitational-Wave Observatory
mariachi: Mixed Apparatus for Radar Investigation of Cosmic-rays of High Ionization Experiment
MIS: OSG Monitoring Information System
nanoHUB: nanoHUB Network for Computational Nanotechnology (NCN)
NEBioGrid: New England Biomedical Grid
NWICG: Northwest Indiana Computational Grid
NYSGRID: NYSGRID
Ops: WLCG Operations Group
OSG: Open Science Grid
OSGEDU: OSG Education Activity
SBGrid: Structural Biology Grid
STAR: Solenoidal Tracker at RHIC

4 Why are we interested?
 Key component of the Collaborative Analysis Toolkit grant (NSF/PIF-2006). From the proposal: "... this project would fund the development of an analysis suite, built on the backbone of the Open Science Grid, that would allow transparent analysis of current and future experimental data. The collaboration among scientists that will be facilitated by this development will ultimately lead to a better understanding of QCD."

5 but there are other reasons…
Institution         Contact             Nodes  Cores  CPU
JLab                Sandy Philpott      –      –      x86
Indiana U.          Matt Shepherd       –      –      x86
U. Edinburgh        Dan Watts           14     56     ?
U. Glasgow          Ken Livingston      –      –      x86-64
Carnegie Mellon     Curtis Meyer        47     286    x86
Florida State U.    Paul Eugenio        –      –      x86-64
U. of Regina        Zisis Papandreou    10     20     x86
U. of Connecticut   Richard Jones       –      –      x86
Why don't we share – everyone wins!

6 Bringing Gluex into the OSG
 What is involved in making a new VO?
1. A Charter statement describing the purpose of the VO.
2. A VO Membership Service which meets the requirements of an OSG Release.
3. A support organization (called a Support Center in OSG parlance) that will support the VO in OSG Operations.
4. Completion of the registration form using these instructions.

7 1. Gluex VO: Charter statement
 "This should be concise, yet long enough to scope intended usage of OSG resources."
 Example statement for ALICE: The ALICE Collaboration is building a dedicated heavy-ion detector to exploit the unique physics potential of nucleus-nucleus interactions at LHC energies. Our aim is to study the physics of strongly interacting matter at extreme energy densities, where the formation of a new phase of matter, the quark-gluon plasma, is expected. The existence of such a phase and its properties are key issues in QCD for the understanding of confinement and of chiral-symmetry restoration. For this purpose, we intend to carry out a comprehensive study of the hadrons, electrons, muons and photons produced in the collision of heavy nuclei. ALICE will also study proton-proton collisions both as a comparison with lead-lead collisions and in physics areas where ALICE is competitive with other LHC experiments.

8 1. Gluex VO: Charter statement
 RJ's draft for Gluex, modeled after US-ATLAS: The GlueX Collaboration is building a 12 GeV photon beam line and a dedicated spectrometer to study fundamental issues in strong QCD through meson photoproduction at Jefferson Laboratory. Our primary aim is to identify gluonic resonances by detecting their decays into exclusive final states in a hermetic detector with high acceptance and good resolution for both charged and neutral particles. Unambiguous discovery of a multiplet of hybrid mesons will provide answers to long-standing questions regarding how gluonic degrees of freedom are expressed in hadrons. Other related issues in hadronic physics within the scope of GlueX include chiral symmetry-breaking in the pseudoscalar nonet, rare neutral meson decays, quark hadronization in nuclear matter, and nucleon structure through inverse-DVCS.

9 2. Gluex VO: a VO membership service
 "This means being able to provide a full list of members' DNs. The currently recommended way to do this is to deploy the VOMS package from the OSG software package."
 VOMS is just one package from a list of web services that are part of the grid "middleware" infrastructure for supporting a VO.
 A lot of work – happily, most of this has to be done only once (by me).
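As an illustration of what the membership service provides (this example is not from the slides), once the VOMS server is deployed the list of member DNs can be dumped from the command line; a minimal sketch using the standard voms-admin client, where the host name is a hypothetical placeholder:

    # list the DNs of all registered Gluex members
    # (host name is made up; requires a grid proxy with admin privileges)
    voms-admin --vo Gluex --host voms.phys.uconn.edu list-users

Sites never see this list directly; their gatekeepers pull it periodically to refresh their local authorization maps.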

10 2. OSG infrastructure: a lot of stuff
 [Diagram: VO services (VOMS, VOMRS) hosted at UConn; grid services at your site; users register with the VO, admins grant access.]

11 2. OSG infrastructure: a lot of work
 Setting up the VO (first time) – a couple of weeks
 Setting up a site (first time) – a couple of days
 Becoming a user (first time) – a couple of hours
 It looks like once the tools are set up, it will be relatively painless to administer – decentralized authorization management based on groups and roles. Experience will tell, but:
VO admin → Representative → Site admin → LRP
where → means "can delegate authorization to".

12 2. Gluex VOMS: current configuration
 Institutions defined so far:
Carnegie Mellon University
Catholic University of America
Christopher Newport University
Florida International University
Florida State University
Indiana University
Jefferson Lab
North Carolina A&T
Santa Maria University
University of Athens
University of Connecticut
University of Massachusetts
University of North Carolina Wilmington
University of Regina

13 2. Gluex VOMS: what's next
 One person from each institution must agree to serve as VO representative for that group.
1. Go to the grid certificate request page and fill out the form to request a personal grid certificate. [Non-US institutions may decide to use the equivalent certificate provider from their home country.]
2. Under "sponsor" I listed Jefferson Lab, with Elton as my contact. [I know it says to list your VO as your sponsor, except that I need a list of users with certificates before I can complete the VO application process for Gluex!]
3. Wait for a day or so, and they will email you back with instructions for how to fetch your grid certificate. Install it in your browser under "client certificates" and keep a copy in a safe place. Without this, you will only be able to access the Gluex VO facilities as a guest.
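A side note for members who want to use the same certificate from the command line as well as the browser: the usual recipe is to export it from the browser as a PKCS#12 file and unpack it with openssl. A minimal sketch, assuming the export was saved as cert.p12:

    # unpack a browser-exported PKCS#12 certificate for use by grid tools
    mkdir -p $HOME/.globus
    openssl pkcs12 -in cert.p12 -clcerts -nokeys -out $HOME/.globus/usercert.pem
    openssl pkcs12 -in cert.p12 -nocerts -out $HOME/.globus/userkey.pem
    # grid tools insist that the private key be unreadable by others
    chmod 400 $HOME/.globus/userkey.pem
    chmod 444 $HOME/.globus/usercert.pem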

14 2. Gluex VOMS: what's next
 Once your institution has an authorized VO rep, that person will be responsible for granting membership to all of the other members from that institution.
 Resource access is controlled locally at each site, based on what they want to grant. Permission is granted based on the map [user, group, role] → resources.
 The mapping is configured using web GUI tools by the site administrator for each site.
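The simplest concrete form of this map is the classic grid-mapfile on a site's gatekeeper, which pairs certificate DNs with local accounts (the web GUI tools such as GUMS manage the same information centrally). A sketch with entirely hypothetical DNs and account name:

    # /etc/grid-security/grid-mapfile -- one "DN" account entry per line
    "/DC=org/DC=doegrids/OU=People/CN=Richard Jones 12345" gluex
    "/DC=org/DC=doegrids/OU=People/CN=Jane Physicist 67890" gluex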

15 2. Gluex VOMS: what's next
 Only the site reps need to request certificates at this point.
 Once site reps are registered, I will submit the formal request for admission as an OSG VO. Only the UConn cluster resources will be configured for OSG access at that point – enough to get started.
 Groups and roles defined so far are:
groups
 /Gluex – all VO registered people are members
 /Gluex/cat-pwa – CAT project people, restricted
 /Gluex/software – software developers, open
 /Gluex/simulation – simulation producers, open
 /Gluex/production – analysis producers, open
roles
 admin – grants access to group, if restricted
 E.g. /Gluex/software/Role=admin → admin of /Gluex/software
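Once the VO is live, a member picks up these group and role attributes when creating a proxy; a sketch using the standard VOMS client commands, with the group and role names defined above:

    # ordinary member proxy, carrying /Gluex membership only
    voms-proxy-init -voms Gluex
    # explicitly request the software-group admin role
    voms-proxy-init -voms Gluex:/Gluex/software/Role=admin
    # inspect which attributes were actually granted
    voms-proxy-info -all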

16 3. Gluex VO: a support organization
 "A support organization (called a Support Center in OSG parlance) that will support the VO in OSG Operations. The Support Center should provide at least the following:
 a written description of the registration process,
 instructions for the members of the VO on how to complete the VO registration process,
 instructions for the members of the VO on how to report problems and/or obtain help."
 Notice to site reps: I will be seeking help from you in writing these help pages, based on your experience as you complete the registration.
 Ongoing support organization is at UConn for the next 2 years – included in the duties of the CAT project postdoc.

17 4. Gluex VO: complete registration
 "Completion of the registration form using these instructions."
 Will happen as soon as a core set of site administrators are registered, demonstrating that we have the critical mass to justify setting up a new VO.
 Resource questions will follow later. For the moment, it is people and institutional interest that is needed to get the VO in place.

18 Yet to come: client package
 Allows you to:
 submit and track compute jobs
 store data files to the grid
 search grid storage for data files and retrieve them
 monitor resource availability
 All using standard tools you already know:
 condor-g, grid-ftp, globus-url-copy, srm, …
 your web browser
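As a taste of how these pieces fit together, here is a sketch of a minimal Condor-G submission followed by a grid-ftp retrieval of the output; the gatekeeper name, script, and file paths are invented for illustration:

    # sim.sub -- Condor-G submit description (grid universe)
    universe      = grid
    grid_resource = gt2 grinch.phys.uconn.edu/jobmanager-condor
    executable    = run_simulation.sh
    output        = sim.out
    error         = sim.err
    log           = sim.log
    queue

Then, with a valid VOMS proxy in hand:

    # submit the job and later copy the results back with grid-ftp
    condor_submit sim.sub
    globus-url-copy gsiftp://grinch.phys.uconn.edu/data/sim001.hddm file:///tmp/sim001.hddm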