October 30, 2001, LHCUSA meeting, BNL. Update on NSF-ITR Proposal. Bjørn S. Nilsen, The Ohio State University.

Ultimate Purpose
To get the needed computing resources for LHC-RHI in the US.
– Hardware: CPU, disk/tape storage, local/national/international networking, room/building.
– Software: AliRoot, CMS offline, Grid middleware (distributed computing).
– Support: personnel to maintain and run these systems; personnel to develop the Grid interfaces.

NSF-ITR (Information Technology Research)
Three categories, 5 years maximum:
– Small (≤ $500K, odds good),
– Medium (≤ $5M, ≤ $1M/year, odds middling),
– Large (≤ $15M, ≤ $3M/year, odds poor).
No $ for Federal employees of other agencies.
$ for equipment < $ for personnel (not an infrastructure program).

NSF-ITR Proposal
Medium option: 5-year, $3-4M project.
– Support for ALICE only.
– Core team: OSU, OSC, NCSU.
– 3+ postdocs for 5 years.
– Hardware placed at OSC: 60% or more for hardware, mostly acquired in the last 3 years.
Small option: 2-year, $500K project.
– Support for ALICE only.
– Core team: OSU, OSC.
– 2 postdocs for 2 years.
– Hardware: use OSC resources via OSC computing grants; maybe add some tapes and workstations; get hardware in 2004 from other sources.

Background Grid Activities
USA: PPDG, GriPhyN/iVDGL
– ATLAS & CMS (and others) well represented.
– Good connections to the Globus & Condor teams.
– Improved connections to CERN needed.
– No ALICE connections; being established now.
– Funding: 2000 GriPhyN $15M over 5 years; 2001 iVDGL $13.65M over 5 years; PPDG $3.1M for 1 year(?).
Europe: DataGrid / DataTAG
– ALICE, ATLAS, CMS & LHCb (and others) well represented.
– Good connections with CERN.
– Poor connections with the Globus & Condor teams, though their software is in use.
– Funding: DataGrid €10M; PPARC Grid €30M; INFN Grid €30M; DataTAG €4M.

LHCUSA Grid
CMS-RHI
– Well connected to DataGrid/DataTAG.
– CMS HEP are full participants in PPDG, GriPhyN, and iVDGL.
– PPDG, GriPhyN, and iVDGL will be integrated into the CMS framework by HEP, PPDG, GriPhyN, and iVDGL people.
– Need hardware and people to do RHI software development.
ALICE-USA
– Well connected to DataGrid/DataTAG.
– Connections with PPDG, GriPhyN, and iVDGL to be established.
– Need support to integrate PPDG, GriPhyN, and iVDGL products into the ALICE framework.
– Need people and facilities to do this integration.
– Need more hardware and people to do RHI software development.

GriPhyN/iVDGL Meeting
Mostly internal matters.
General GriPhyN Grid development – the VDT.
– Many CS people are unfamiliar with how physics is done.
– User interface – Web.
– Talk about using Objectivity – not CERN supported.
iVDGL takes the VDT and disseminates, maintains, and supports it.
First iVDGL meeting:
– Working on organization.
– First year: hardware and < 2 CS FTE; later years: manpower.
– On the order of 20% of the $ goes to hardware.
iVDGL requested draft MOUs from "external experiments", specifically from Federico & Bjørn for ALICE.

iVDGL ALICE MOU
Get ALICE in the US Grid game! Too late to become members of iVDGL/PPDG. What to do?
– Be recognized as an active external partner "experiment".
– Establish a solid working relationship with iVDGL.
– Get good support from iVDGL, including PPDG & GriPhyN (the VDT).
– Be represented on selected iVDGL boards/working groups (steering committees, oversight committees, and the like).
– Get access to the iVDGL test bed for ALICE Grid tests.
– Supply iVDGL with resources via OSC and NERSC.
– Share ALICE technology (AliEn, AliRoot, ALICE PROOF applications).

ALICE Software Activities
Continue evolutionary development of the AliRoot framework.
– After intensive physics testing, we have stopped further development with the GEANT4 Monte Carlo.
– Integration of the FLUKA Monte Carlo.
– Starting development of a geometric modeler & database.
An ALICE distributed data catalog and production framework has been developed: AliEn (Grid in reverse).
First full-scale ALICE production started:
– Test production of pp Pythia events completed.
– Pb-Pb production starts 10/24/01.
AliRoot 3.06 released 10/10/01.
– Patched update released 10/24/01 (requires ROOT ).
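For illustration only, the sketch below shows the kind of standalone pp Pythia event generation exercised in such test productions. It uses ROOT's TPythia6 wrapper rather than the actual AliRoot/AliEn production machinery, and the beam energy, event count, file name, and histogram binning are assumptions chosen for the example, not the production settings.

  // pythia_pp_test.C -- an illustrative standalone sketch, not the ALICE production chain.
  // Generates a few pp Pythia events with ROOT's TPythia6 wrapper (needs ROOT built
  // with Pythia6 support, libEGPythia6) and histograms the generated multiplicity.
  #include "TPythia6.h"
  #include "TClonesArray.h"
  #include "TFile.h"
  #include "TH1F.h"

  void pythia_pp_test(Int_t nEvents = 10)           // event count is an assumption
  {
     TPythia6 pythia;
     pythia.Initialize("CMS", "p", "p", 5500.);     // frame, beams, sqrt(s) in GeV (illustrative)

     TClonesArray particles("TParticle", 10000);
     TFile out("pp_pythia_test.root", "RECREATE");  // hypothetical output file name
     TH1F hMult("hMult", "Generated multiplicity;N;events", 300, 0., 3000.);

     for (Int_t iev = 0; iev < nEvents; ++iev) {
        pythia.GenerateEvent();                                 // one pp event
        Int_t np = pythia.ImportParticles(&particles, "All");   // copy event record to TParticles
        hMult.Fill(np);
     }

     hMult.Write();
     out.Close();
  }

Such a macro only illustrates event generation; the actual test production ran inside AliRoot and was cataloged and distributed through AliEn.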

Conclusion
1. ALICE production is underway using existing technologies.
2. We have a chance to get involved with iVDGL if we act soon.
3. ITR proposal for short-term funding is in the works.
4. Still much work to be done.