
1 CHEP2007 Highlights

2 Computing in High-Energy Physics
1997 - Berlin
1998 - Chicago
2001 - Beijing
2003 - La Jolla
2005 - Interlaken
2006 - Mumbai

3 “CHEP 2007 -- Before the LHC Turn-on”
CHEP has traditionally been dominated by the core HEP facilities (CERN, SLAC, FNAL, etc.) and their experiments; RHIC and heavy-ion computing played a lesser role
Very few heavy-ion folks on the committees; I scored two plenaries in past years
2007 was LHC-centric like never before (to be fair, they thought they were turning on this year)
More on this later...

4 The Program
Plenary Sessions
2 Poster Sessions
Parallel tracks: Online Computing; Event Processing; Software components, tools and databases; Computer facilities, production grids and networking; Grid middleware and tools; Distributed data analysis and information management; Collaborative tools
On the side:
WLCG Workshop (before CHEP) [Worldwide LHC Computing Grid Project], with a summary at the conference
BOF session about Globus
Integrated Site Security workshop
“Towards Petascale and Exascale Computing” - IBM

5 Opening Day: LHC Machine and Experiments, Status and Prospects

6 WLCG - Worldwide LHC Computing Grid Project

7 “The Rest” (that's us)
Frank Wuerthwein (UCSD), supplied with info from Carla and me
I think we should get on board with this, also for other, non-PHENIX things (RatCAP, for example)

8 This and That - Rene Brun

9 AOD or DOA?
Federico Carminati, for ALICE
“Analysis Object Data” is what we traditionally call “DST”
Did you catch this? The “AOD” (DST) size will go down to < 3% of the raw data size. We'll see.

10 Back to the GRID
I think we should use the technologies more
For valid reasons, we have used only what actually works (“middleware”); compare to STAR, which put a lot of effort into grid-ifying its software, while we use RCF much better
In the eyes of many, this is not really “using the GRID”
Get on board with local expertise --> PanDA to use OSG

11 Power Problems
From Richard Mount, SLAC
I think that, at the very least, future buildings should be integrated into the heat distribution of a facility
We generate heat that we blow off with A/C, while running a furnace in winter to heat office space; that would be illegal in many European countries
It drives me up the wall how the 1008 building, for example, is run
Typical PC power supplies run at 30% efficiency at best; any improvement saves 3x - the primary power, plus the A/C, which again runs at 2x the load (100 watts cost 200 watts to cool)
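The arithmetic behind the "saves 3x" claim can be sketched with the slide's own figures (30% PSU efficiency, A/C drawing 2x the heat load). The function names and the 90%-efficient comparison point are illustrative assumptions, not numbers from the talk:

```python
# Back-of-the-envelope facility power model using the slide's figures.
# Assumption (illustrative): "efficient" PSU at 90%; the slide gives no number.

def wall_power(it_load_w: float, psu_efficiency: float) -> float:
    """Wall power drawn to deliver it_load_w of useful DC power."""
    return it_load_w / psu_efficiency

def total_power(it_load_w: float, psu_efficiency: float, cooling_factor: float) -> float:
    """Wall power plus A/C power; cooling_factor = A/C watts per watt of heat."""
    wall = wall_power(it_load_w, psu_efficiency)
    return wall + cooling_factor * wall

# Slide's figures: 30% PSU efficiency, A/C running at 2x the load.
poor = total_power(100, psu_efficiency=0.30, cooling_factor=2.0)
good = total_power(100, psu_efficiency=0.90, cooling_factor=2.0)
print(round(poor), round(good), round(poor / good, 1))  # 1000 333 3.0
```

With those numbers, 100 W of useful computing costs about 1 kW at the meter, and tripling PSU efficiency cuts the total by the claimed factor of three, since the cooling burden scales with the wasted power.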

12 EVO
We (mostly the “Gabor crowd” around the current PPG080) tried to use EVO for the remote folks (Baldo, Christian, Tadaaki, me on occasion)
Not smooth sailing at all initially; like the early radio days (important that you heard anything at all, not that it was good)
However, meetings are much easier to follow with even a thumbnail-sized video feed (we point a webcam at the screen to give a cue which slide we're looking at)
I met Philippe Galvez at CHEP and we could sort out some issues right on the spot. He identified some firewall/config issues and took home some bug reports that I could demonstrate “live”
Also...

13 Odds 'n Ends

14 Noted

15 LHC DAQs

16 Disappointments
“The one thing that was disappointing was, in my opinion, the too strong focus of the conference program on the LHC.... One would think that the experiences of our experiments would be of interest to the physicists and engineers at the LHC, but virtually all RHIC-related presentations were downgraded to posters, and my own single PHENIX talk was allocated 12 minutes in a parallel session.” (from my trip report)
Does this look like we are well represented?
With all due respect to Jim, his universe holds ATLAS and little else; he isn't even BNL
Michael sees things the same way; lots of grumbling
Good that things changed a bit @ RCF.

17 Summary
On my rating scale, this conference gets a “C+”
As I said, a bit too much bias towards the LHC
Cozy environment; I like it when a single conference hotel hosts everybody, and this one was relatively relaxed
Poster sessions too crowded and environmentally challenged (soggy posters)
The wireless network was atrocious for a computing conference
From Niko's Online Track summary

