Project Coordination
R. Cavanaugh, University of Florida

Important
UltraLight is a project with two equal and symbiotic activities
– Application-driven network R&D
– Not just a networking project, nor solely a distributed data analysis project
UltraLight is a Physics ITR
– Ultimate goal: enable and produce physics (more generally, e-science) that could not otherwise be performed
The Network Technical Group is the “backbone” of the project
– Activity: perform network R&D
– Metric: CS publications, demonstrations
The Applications Technical Group is the “driver” of the project
– Activity: perform “middleware” R&D and LHC physics research
– Metric: CS and physics publications, demonstrations, community adoption

Relationship between UltraLight and the LHC
CMS
– CCS, APROM
– US-CMS S&C: Tier-1, UAF, Tier-2 Program, DYSUN, Tier-2c
ATLAS
ATLAS and CMS will each integrate the UltraLight Application Services into their own separate software stacks

LHC Computing Model: Requirements and Scale

Reminder from the Proposal
Phase 1 (12 months)
– Implementation of network, equipment, and initial services
Phase 2 (18 months)
– Integration
Phase 3 (18 months)
– Transition to production

Connecting to the LHC schedule

Original UltraLight Synchronisation Plan with CMS
[Timeline diagram relating CMS milestones and activities, UltraLight deliverables, UltraLight users, and UltraLight milestones across Phases 1–3, from now to 6 months from now]

Scope of UltraLight
– Original proposed programme of work
– Current amended programme of work

Relationship between UltraLight and the proposed GISNET

Project Management
Web-page portal
– Wiki, etc.
Mailing lists
Regularly scheduled phone and video meetings
Periodic face-to-face workshops
Persistent VRVS room for collaboration
Reports
– Technical
– Annual

Project Structure of UltraLight
In the Proposal
– HEP Application-layer Services
– e-VLBI Application-layer Services
– Global Services
– Testbed Deployment and Operations
– Network Engineering
– Education & Outreach
Current
– Applications: CMS, ATLAS, e-VLBI, Global Services, Testbed
– Network: Global Services, Testbed, Engineering
– Education & Outreach

Connection between Application and Network Tech. Groups

Project Plan
– Short term
– Longer term

6-month Overall Project Goals
Establish early ties to the LHC
– Tightly couple with LHC experiment needs and timelines; possibly take on official roles within the experiments?
– Must aggressively come up to speed to meet LHC milestones
– Already underway…
Establish collaborative ties with external partners
– OSG, Grid3, CHEPREO, AMPATH, etc.
– Already underway…
Establish the scope of the project
– Evaluate trade-offs between R&D interests and application needs (functionality and the LHC timeline)
– Determine which technology we can adopt off the shelf and which we must develop to meet project goals
Establish the initial UltraLight infrastructure and user community

Early Focus of the UltraLight Technical Groups
Networking
– Construct the UltraLight Network Testbed (UNT)
Applications
– Construct the UltraLight Applications Testbed (UAT); leverage GAE, Grid3, CMS, ATLAS, etc.
– Prepare applications that will exploit the Network Testbed
E&O
– Build relationships with external partners: CHEPREO, CIARA, AMPATH, etc.
To first order, the Network Testbed can be instantiated independently from the Application Testbed and E&O activities
– This will be our early strategy
– Later, bring the (largely orthogonal) testbeds together in an evolutionary way

Longer-Term Project Goals (past the initial testbed-building phase)
Global Services
– Develop handles that monitor, control, and provision network resources: manually at first, then move to automation (see the sketch after this slide)
– Close collaborative effort between the Applications Group and the Networking Group
Requires that the UNT and UAT work together as a coherent whole
– Combine to operate a single UltraLight Testbed (UT)
– A smooth transformation to the UT is expected, as the UNT and UAT activities are very complementary
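As an illustration only: a minimal Python sketch of what such a "handle" could look like, assuming a hypothetical monitoring feed and provisioning call. The names poll_link_utilization and request_lightpath, the example links, and the 0.8 threshold are assumptions for this sketch, not part of the UltraLight design.

# Minimal sketch of a global-services handle that watches link
# utilization and asks for extra capacity when a threshold is crossed.
# All names, example values, and thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class LinkStatus:
    name: str              # e.g. "CIT-UF" (hypothetical)
    capacity_gbps: float
    used_gbps: float

    @property
    def utilization(self) -> float:
        return self.used_gbps / self.capacity_gbps

def poll_link_utilization() -> list[LinkStatus]:
    """Placeholder for a real monitoring feed; returns fake readings here."""
    return [LinkStatus("CIT-UF", 10.0, 8.7), LinkStatus("UM-CIT", 10.0, 2.1)]

def request_lightpath(link: LinkStatus) -> None:
    """Placeholder for a manual ticket or, later, an automated provisioning call."""
    print(f"Provisioning request: add capacity on {link.name} "
          f"(utilization {link.utilization:.0%})")

def control_loop(threshold: float = 0.8) -> None:
    # "Manual at first, then automate": this loop could be run by hand,
    # then scheduled once the provisioning call is trusted.
    for link in poll_link_utilization():
        if link.utilization > threshold:
            request_lightpath(link)

if __name__ == "__main__":
    control_loop()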

6-month Network Goals
Inventory the different UltraLight sites (one possible way to record this is sketched after this slide)
– What connections are currently available at each site?
– What type of switches, how many, and where are they located, etc.?
Construct the UltraLight Network Testbed (UNT)
– Persistent development environment: piece together all the different network components
– Create the NOC team
Begin thinking about disk-to-disk transfers
– Interface with Storage Elements
– Important for later integration work with Applications
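One lightweight way to keep the site inventory machine-readable is a simple record per site; a sketch follows. The field names and the example entry are illustrative assumptions, not actual UltraLight site data.

# Minimal sketch of a machine-readable site inventory.
# Field names and the example entry are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class SiteInventory:
    site: str                                             # institution hosting the equipment
    wan_links: list[str] = field(default_factory=list)    # current connections at the site
    switches: list[str] = field(default_factory=list)     # switch type, count, location
    storage_elements: list[str] = field(default_factory=list)

sites = [
    SiteInventory(
        site="UF",
        wan_links=["10 Gb/s to FLR (assumed example)"],
        switches=["core switch, campus machine room (assumed example)"],
        storage_elements=["dCache SE (assumed example)"],
    ),
]

for s in sites:
    print(f"{s.site}: {len(s.wan_links)} WAN link(s), {len(s.switches)} switch(es)")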

6-month Application Goals
Establish the UltraLight Application Grid Testbed (UAT)
– Persistent development environment: deploy Application-layer services and middleware, deploy HEP applications
– Create the GOC team
Perform system integration tests (a minimal smoke-test sketch follows this slide); demonstrate
– Interoperability of existing UltraLight Application-layer services
– Operability of HEP applications on top of existing Application-layer services
Study HEP application (ORCA & ATHENA) behaviour in depth
– Execution environment, data/metadata model, performance profiles
– Current and future versions
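Before full interoperability tests, a first integration step could simply confirm that each deployed service answers at all. The sketch below shows one such smoke test; the service names and endpoints are hypothetical placeholders, not the actual UltraLight service catalogue.

# Minimal sketch of an integration smoke test: check that each
# application-layer service endpoint accepts a TCP connection before
# running real interoperability tests. The endpoint list is hypothetical.
import socket

SERVICES = {
    "catalog (hypothetical)": ("catalog.example.org", 8443),
    "monitoring (hypothetical)": ("monitor.example.org", 8080),
}

def reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for name, (host, port) in SERVICES.items():
        status = "up" if reachable(host, port) else "DOWN"
        print(f"{name:30s} {host}:{port} -> {status}")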

6-month E&O Goals
Earni and Fabian: network engineers
– Networking to Brazil
– Funding for E&O connected with CHEPREO
Julio and Heidi: E&O (CHEPREO and UltraLight)
QuarkNet
– Bring people together to do science
– Community aspect

6-month E&O Goals
QuarkNet
– Research emphasized
– Monte Carlo at FNAL: Z0's with a fake detector; teachers see what it is like to be a particle physicist
– HS teachers are a different group: efforts need to be long-term oriented, substantial, something teachers can sink their teeth into

6-month E&O Goals
Ideas for what is practical and how the pieces fit together
Held a grid needs/assessment workshop
– There is a write-up
Look at live data from CMS; coordinate with the Applications group
– ORCA -> comma-separated file -> Excel (a data-flow sketch follows this slide)
– Web-browser driven: no special software
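To make the "ORCA -> comma-separated file -> Excel" step concrete, here is a minimal sketch of the export stage only. The column names and values are made up for illustration; ORCA's actual output format is not reproduced here.

# Minimal sketch: dump reconstructed-event quantities into a CSV file
# that a spreadsheet or web page can open directly.
# Column names and values below are hypothetical.
import csv

events = [
    {"run": 1, "event": 42, "dimuon_mass_gev": 91.3},   # hypothetical values
    {"run": 1, "event": 57, "dimuon_mass_gev": 89.8},
]

with open("cms_events.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["run", "event", "dimuon_mass_gev"])
    writer.writeheader()
    writer.writerows(events)

print("Wrote cms_events.csv for spreadsheet or browser-based viewing")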

6-month E&O Goals
Showing students how scientists collaborate
– Hold meetings between scientists and students (via VRVS?)
– "Ask a scientist" day to highlight networking and HEP
– ThinkQuest competition awards
Put together ad-hoc grid networks
– Workshops and teaching
– Seamless bridge to UltraLight/HEP tools with a Java applet

6-month User Community Goals
Feedback loop between developers and users
– Update and confirm the proposed UltraLight Grid-enabled Analysis Environment
– Use actual LHC data analysis work
Contribute to the CMS Physics TDR milestones
– Many UltraLight members are strongly engaged; the effort is recognised in CMS
– Application driver for UltraLight! Challenging and essential!
– Very tight, ambitious schedule to meet, however…

Current UltraLight Application Group Status
– Much was delivered and demonstrated at SC04
– CMS P-TDR analysis efforts are already underway (very ambitious schedule)
– The Application Group is already working hard to ramp up in time for the LHC

Current UltraLight Network Group Status
– Also working hard
– Refer to Shawn's UltraLight Network Technical Document

Major Milestones over the Next 6 Months
Dec: Initial UltraLight Collaboration Meeting at CIT
Jan: UltraLight Week at CIT; UltraLight Applications Testbed started
Feb: CMS PRS Meeting; UM connects to MiLR, UF connects to FLR
Mar: CMS Week; FLR declared to be “Production”
Apr:
May: CMS CPT Week; first round of CMS approvals for analyses to enter the P-TDR
Jun: CMS Week; UltraLight Meeting (at UM?); UltraLight Network Testbed in place

Major Project Milestones

Conclusion

Notes
Rick
– Setting LHC requirements and scale
– Connect to the LHC schedule
– Project plan (short term + longer term)
– Project management strategy: regularly scheduled phone and video conferences, persistent VRVS room for collaboration
– Relationship to GISNET, DYSUN, UltraLight
– Make the connection between the Application and Networking groups
Frank
– CMS (LHC) use-cases
– Summary of the Application Services document
– Short-term plan for the Application Technical Group