D0 Event Production Volume (Residual impact of Joint D0-OSG Taskforce, overall till July 2010)


Slide: D0 Event Production volume (Residual impact of Joint D0-OSG Taskforce, overall till July 2010)
Chart annotations:
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG
- D0 SRM Debug phase
- Initiation phase
- CAUTION: CYCLIC
- D0 Grid DP Initiative Infrastructure Overhaul phase-2
- D0 Grid DP Initiative Infrastructure Overhaul phase-1
- Sustained production (more CMS T1 slots, more SRM sites, improved D0 infrastructure)

Slide: D0 Event Production volume (Residual impact of Joint D0-OSG Taskforce, overall till April 2010)
Chart annotations:
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG
- D0 SRM Debug phase
- Initiation phase
- CAUTION: CYCLIC
- D0 Grid DP Initiative Infrastructure Overhaul phase-2
- D0 Grid DP Initiative Infrastructure Overhaul phase-1
- Sustained production (more CMS T1 slots, more SRM sites, improved D0 infrastructure)

Slide: D0 Event Production volume (During Joint D0-OSG Taskforce)
Chart annotations:
- D0 SRM Debug phase
- Initiation phase
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG

Slide: Old peaks

Slide: D0 Event Production volume (Residual impact, overall till July 2009)
Chart annotations:
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG
- D0 SRM Debug phase
- Initiation phase
- CAUTION: CYCLIC
- D0 Grid DP Initiative Infrastructure Overhaul phase-2
- D0 Grid DP Initiative Infrastructure Overhaul phase-1

Slide: D0 Event Production volume (Overall till May 2009)
Chart annotations:
- D0 SRM Debug phase
- Initiation phase
- CAUTION: CYCLIC
- D0 Grid DP Initiative Infrastructure Overhaul phase-2
- D0 Grid DP Initiative Infrastructure Overhaul phase-1
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG

Slide: D0 Event Production volume (Overall till Jan 2009)
Chart annotations:
- D0 SRM Debug phase
- D0 Infrastructure Overhaul phase
- Initiation phase
- CAUTION: CYCLIC
- Joint D0-OSG TaskForce
- Started using Opportunistic Storage on OSG