
1 DØ and CDF Detectors/Computing Bill Lee Fermilab DOE Annual Science & Technology Review July 12-14, 2010

2 Acknowledgements My thanks to my CDF colleagues who assisted me with the preparation of this presentation: Massimo Casarsa, Phil Schlabach, Richard St. Denis.

3 Brief Outline: Overview of Detectors; Current Operations; Computing; Future Operations.

4 Fermilab [photo of the Fermilab site with a "You Are Here" marker]

5 CDF and DØ Detectors [detector diagrams with labeled components: muon systems, EM and hadronic calorimeters, solenoid, tracker, silicon vertex detector]

6 CDF and DØ Collaborations The DØ Collaboration: 19 countries, 86 institutions, 492 collaborators, 11% FNAL. The CDF Collaboration: 15 countries, 59 institutions (North America 32, Europe 19, Asia 8), 538 collaborators, 15% FNAL.

7 DØ Technical Organization [organization chart: Spokespersons; Technical Integration Coordinator G. Ginther; leaders for each detector subsystem (muon, fiber tracker/preshowers, SMT, calorimeter, luminosity monitor, solenoid, controls), the online, L3/DAQ, trigger, and global monitoring groups, run coordination, and electrical and mechanical operations (FNAL)]

8 CDF Operations Organization [organization chart: Operations Managers M. Casarsa and P. Schlabach; Associate Heads for Detector Infrastructure, Shift Operations, Online Systems, and Detector Systems; group leaders for the silicon, COT, calorimeter/TOF, muon, trigger, DAQ, slow controls, and monitoring systems; safety coordinator; shift crews (FNAL)]

9 CDF and DØ Collaboration Scientific Operations/Computing Effort Fermilab continues to provide a significant portion of the effort. The ongoing streamlining of detector operations has resulted in a reduction of effort without negatively impacting performance. CDF (today): Operations 40 FTE, Offline 15 FTE, Management 10 FTE, Algorithms 10 FTE; total effort 75 FTE. DØ (2009): Operations 33 FTE, Computing 13 FTE, Management 15 FTE, Algorithms 27 FTE; total technical contributions 88 FTE. Shifts are not included in the effort figures.

10 CDF OPERATIONS

11 CDF Data Taking Efficiency Last 12 months: recorded 83% of delivered luminosity, 79% with the full detector. Run II average: 83% acquired, 73% good with the full detector.

12 CDF Data Taking Performance In the last 12 months (including the 2009 shutdown): delivered 2.13 fb⁻¹; recorded 1.77 fb⁻¹ (83%); with full detector 1.68 fb⁻¹ (79%).
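For reference, the quoted efficiencies are simply ratios of integrated luminosities; a minimal arithmetic check using the numbers above (a sketch, not part of the original slide):

    # Data-taking efficiency as recorded / delivered integrated luminosity
    # (values in fb^-1, taken from the slide above).
    delivered = 2.13
    recorded = 1.77
    full_detector = 1.68
    print(f"recorded:      {recorded / delivered:.0%}")       # ~83%
    print(f"full detector: {full_detector / delivered:.0%}")  # ~79%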

13 CDF Total Integrated Luminosity in Run II Integrated luminosity with the full detector: 6.55 fb⁻¹ (73%). Depending on run quality requirements, analyses use 6.3-7.2 fb⁻¹. [plot: 9.00 fb⁻¹ delivered, 7.48 fb⁻¹ recorded]

14 CDF Shutdown 2009: Highlights Silicon detector: cooling system (replaced the elbows on the ISL cooling lines and attached COT face tubing, sealed a few leaks); power supply maintenance (replaced aged capacitors in 21 CAEN power supply modules). Drift chamber: recovered many channels by replacing blown resistors in the wire readout circuits. Calorimeter: plug calorimeter source maintenance. Front-end crate preventive maintenance: replaced fan packs and filters, installed new fuses, cleaned heat exchangers and drip sensors. Replaced 64 nodes of the L3 farm. Tied in the new diesel emergency generator.

15 CDF Start-up after Shutdowns Luminosity delivered and CDF data-taking efficiency in the first 35 days after the last three shutdowns: 2007 (11 weeks duration), 2008 (1 week duration), 2009 (12 weeks duration).

16 Operations Improvements at CDF Track fitter upgrade (GigaFitter) in the level-2 track trigger: more powerful FPGAs (1 board in place of 16, more compact, easier maintenance); more memory available (possibility to extend the track acceptance in impact parameter and momentum). Optimization of the trigger bandwidth: the level-2 track trigger selection was tuned to fill the available bandwidth at low luminosity.
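To illustrate the kind of computation an FPGA-based level-2 track fitter performs, the sketch below shows a generic linearized fit in which each track parameter is a precomputed linear combination of the hit coordinates in a candidate road. The constants and dimensions are illustrative placeholders only, not the actual GigaFitter firmware, its constants, or CDF code.

    import numpy as np

    # Illustrative linearized track fit: parameters = constants @ hits + offsets.
    # Real FPGA fitters evaluate such sums with precomputed constants;
    # the numbers below are placeholders, not real calibration constants.
    FIT_CONSTANTS = np.array([
        [ 0.12, -0.03,  0.05,  0.00, -0.01],  # curvature
        [-0.40,  0.22,  0.10,  0.05,  0.02],  # azimuthal angle phi
        [ 0.07,  0.01, -0.02,  0.30,  0.15],  # impact parameter
    ])
    OFFSETS = np.array([0.0, 0.0, 0.0])

    def fit_track(hits):
        """Return (curvature, phi, impact parameter) from five hit coordinates."""
        return FIT_CONSTANTS @ np.asarray(hits, dtype=float) + OFFSETS

    # Example candidate road with five hit positions (arbitrary units).
    print(fit_track([1.2, 0.8, 1.1, 0.9, 1.0]))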

17 DØ OPERATIONS

18 DØ has recorded >91% of the delivered luminosity over the past year.

19 Over the past 12 months: delivered luminosity 2.14 fb⁻¹; recorded luminosity 1.98 fb⁻¹.

20 DØ Shutdown 2009: Highlights Replaced the scintillator in the luminosity monitor. Recovered individual silicon HDIs: largest fraction of functioning channels in DØ history. Repaired a liquid nitrogen dewar vacuum leak. Routine maintenance and power supply recovery; refurbished rack blowers. [photos labeled "New" and "Old"]

21 DØ start-up after shutdowns: 2007 (11 weeks); 2008 (1 week); 2009 (12 weeks).

22 Operational Improvements at DØ Reduced the downtime at the beginning and end of stores. New FPGA programming increased the efficiency of the L1 central track trigger. Updated trigger lists to address higher peak luminosities. Enhanced monitoring. Documentation improvements to facilitate smoother downtime recoveries.

23 Accessing the Detectors On average, CDF and DØ access their collision halls 1-2 times per week. Sometimes there is a high-priority need to access the hall (e.g. one of the detectors has a data quality problem); most other accesses are opportunistic (a Tevatron problem or other issue allows access). A few times per year CDF or DØ will need a long access (>6 hours). [photo: installation of Run IIb upgrades]

24 Safety at CDF and DØ Safety remains a central aspect of the collider experiments at Fermilab. Over the past year PPD has not had a DART case. Shutdown safety: safety is integrated into all aspects of shutdown activities; a Job Hazard Analysis is a vital part of the planning of any shutdown job; personnel are reminded to keep safety first.

25 DØ COMPUTING

26 DØ Data Reconstruction The plot includes 125 million events that have been processed twice to remove a calorimeter hot cell. Currently the DØ farms are processing data within 3-4 days of recording; this is our minimum allowed processing delay (to accommodate calibrations).

27 DØ Monte Carlo Production DØ uses farms throughout the world to produce Monte Carlo events. IN2P3 is a dedicated DØ MC site; the other sites are Grid sites. Over the last year, the total number of generated Monte Carlo events has almost doubled, to four billion.

28 DØ Processing Time Data taken at higher luminosity takes longer to process because of the higher occupancy/multiplicity. The average luminosity is not expected to increase greatly.

29 CDF COMPUTING

30 CDF Data Collection for the Past Year

    Data Type      Data Volume (TB)   # Events (M)   # Files
    Raw Data       306.2              1892.2         340487
    Production     404.0              2516.3         331081
    MC             181.3              893.9          224156
    Stripped-Prd   14.1               80.2           11360
    Stripped-MC    0                  0              0
    Ntuple         149.5              4810.9         120416
    MC Ntuple      116.8              1905.8         100308
    Total          1172.0             12099.3        1127808

1.9 billion raw events → 2.5 billion reconstructed events → 4.8 billion ntuple events. Additionally, 1.9 billion Monte Carlo ntuple events were produced. Almost 1.2 petabytes of data.
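As a quick consistency check of the table (a sketch; the Stripped-Prd event count is read as 80.2 M, which is what makes the column totals agree), the per-type volumes and event counts can be summed to reproduce the quoted totals:

    # Per-type data volumes (TB) and event counts (millions) from the table above.
    volumes_tb = {"raw": 306.2, "production": 404.0, "mc": 181.3,
                  "stripped_prd": 14.1, "stripped_mc": 0.0,
                  "ntuple": 149.5, "mc_ntuple": 116.8}
    events_m = {"raw": 1892.2, "production": 2516.3, "mc": 893.9,
                "stripped_prd": 80.2, "stripped_mc": 0.0,
                "ntuple": 4810.9, "mc_ntuple": 1905.8}

    total_tb = sum(volumes_tb.values())    # ~1172 TB, i.e. almost 1.2 PB
    total_events = sum(events_m.values())  # ~12099 million events
    print(f"{total_tb:.1f} TB ({total_tb / 1000:.2f} PB), {total_events:.1f} M events")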

31 CDF Monte Carlo Production The North American Grid (NAmGrid) provides Monte Carlo production. Usage of NAmGrid is steady; peaks tend to occur before conference periods.

32 CDFGrid Usage CDFGrid provides the full environment for data handling and the majority of the computing for analyses. Peaks show over 30k queued jobs, also conference dependent.

33 CDF Open Science Grid Usage The Open Science Grid (OSG) has provided CDF over 40 million hours of computing over the last year. The use of the OSG has been fruitful and more resources are expected to come online.

34 Common Tools DØ and CDF use a variety of tools common to both experiments. Enstore: the underlying data transport mechanism for moving data from online to tape (the tape silos are a shared responsibility). SAM: a data storage and retrieval tool. Grid: the Open Science Grid is used to distribute processing to locations throughout the world. GlideinWMS: a workload management system that eases submission of computing jobs. The FNAL Computing Division provides common system management: Oracle services, farm management, desktop support, security, data storage management, and more. A conceptual sketch of how these pieces fit together follows below.
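To make the division of labor among these tools concrete, here is a purely conceptual sketch of the online-to-analysis data flow. The function names (store_to_tape, declare_to_catalog, submit_grid_job) are hypothetical placeholders for illustration; they are not the real Enstore, SAM, or glideinWMS interfaces.

    # Conceptual data flow only; none of these calls are the real
    # Enstore, SAM, or glideinWMS APIs (those have their own clients).

    def store_to_tape(file_path):
        """Hypothetical stand-in for Enstore: move a raw data file to tape."""
        print(f"Enstore: {file_path} -> tape silo")

    def declare_to_catalog(file_path, metadata):
        """Hypothetical stand-in for SAM: record the file and its metadata
        so analyses can later locate and retrieve it."""
        print(f"SAM: cataloged {file_path} with {metadata}")

    def submit_grid_job(dataset, executable):
        """Hypothetical stand-in for glideinWMS submission: run the
        reconstruction or analysis executable on grid worker nodes."""
        print(f"glideinWMS: {executable} over dataset '{dataset}' on OSG")

    # One raw file's path from the online system to distributed processing.
    raw = "/online/run12345/rawdata_0001.dat"
    store_to_tape(raw)
    declare_to_catalog(raw, {"run": 12345, "stream": "raw"})
    submit_grid_job("run12345_raw", "reconstruction_exe")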

35 FUTURE

36 DØ 2010 Shutdown Plans This is a 4-week shutdown beginning next week. Replace the luminosity scintillator and ~16 PMTs. Silicon HDI recovery, along with other individual channel recovery. Alignment measurements. Calibrations. Trigger framework maintenance. General maintenance.

37 CDF Shutdown 2010 Plans Silicon detector: cooling system check; junction card reseating; power supply maintenance (replacement of aged capacitors in power supply modules). Drift chamber: replacement of failing resistors in the wire readout circuit; low voltage short repair. General preventive maintenance.

38 Running in FY 2011 Staffing will continue to need attention: control room shifters, detector experts, on-call personnel, algorithm developers, support groups, and system administrators. DOE and Fermilab support of visitors and guest scientists for operations/computing continues to be very valuable to the experiments (details in the Kilminster/Verzocchi talks). The Tevatron experiments' computing budgets are 25% below FY10 and half of the experiments' request; this will force the experiments to depend upon beyond-warranty CPUs and will not support planned increases in data storage capacity and analysis speed. The anticipated FY12 computing budget is ~40% lower than FY10 and below that required to efficiently and reliably support the computing and analysis power of the experiments during the active stage of Tevatron data analysis. The experiments expect to maintain the high efficiency of the past year and to keep the delay between acquiring data and reconstructing it to a minimum.

39 Summary The collider experiments are operating smoothly and efficiently. Offline processing is keeping pace with data collection. Total delivered luminosity should reach 12 fb⁻¹, with around 10 fb⁻¹ recorded, by the end of FY11; this will require a dedicated effort from the Tevatron and the collider experiments. Challenges in maintaining effective operations: increasing pressure on available resources (personnel and computing); ageing detectors and infrastructure. CDF and DØ will take advantage of the shutdown to keep up detector maintenance, and both experiments continue to improve their ability to come out of a shutdown efficiently. The DOE's support facilitates our ability to capitalize on these opportunities. Fermilab has made and continues to make significant contributions to CDF and DØ.

