
1 EGI-InSPIRE SA3 “Heavy User Communities”: Past, Present & Future. Jamie.Shiers@cern.ch (EGI-InSPIRE RI-261323, www.egi.eu)

2 EGI-InSPIRE SA3: Status & Plans. Jamie.Shiers@cern.ch, WLCG Grid Deployment Board, June 2010

3 Overview
Status & directions of EGI-InSPIRE SA3:
– 732 person-month project over 36 months
– ~400 person months for services for HEP (Σ over tasks)
– Duration: 1 May 2010 – 30 April 2013
Workplan and directions, including collaboration with other communities
EGEE – EGI transition & outlook

4 WP6: Tasks
Task (PM) – Purpose – Contact / Leader:
TSA3.1 (18) – Management – Jamie Shiers (50%)
TSA3.2 (315) – Shared Services & Tools (1. Dashboards, 2. Applications, 3. Services, 4. Workflows & Schedulers, 5. MPI) – Jamie Shiers
TSA3.3 (263) – Services for HEP – Maria Girone
TSA3.4 (79) – Services for LS – Johan Montagnat
TSA3.5 (30) – Services for A&A – Claudio Vuerli
TSA3.6 (27) – Services for ES – Horst Schwichtenberg
We need a transparent mechanism for agreeing the workplan, particularly in TSA3.2

5 WP6: Partner / Activity Overview
Partner (by ID) – Areas of work (may not be complete…):
KIT-G – TSA3.6 Services for ES
CSIC – TSA3.2.4 Workflows and Schedulers
CSC – TSA3.2.4 Workflows and Schedulers
CNRS – TSA3.2.3 Services: HYDRA; TSA3.4 Services for LS
TCD – TSA3.2.5 MPI
INFN – TSA3.2.3 Services: GReIC; TSA3.2.5 MPI; TSA3.3 (HEP); TSA3.5 (A&A)
CYFRONET – TSA3.2.4 Workflows and Schedulers
ARNES – TSA3.2.2 Applications
UI SAV – TSA3.2.2 Applications
CERN – TSA3.1, TSA3.2, TSA3.3
EMBL – TSA3.2.4 Workflows and Schedulers

6 WP6 (SA3): Services for Heavy User Communities – Person Months per Task
Partner – PM by task – Total PM:
KIT-G (Fraunhofer) – TSA3.6: 27 – 27
CSIC (CSIC 27, CIEMAT 18) – TSA3.2: 45 – 45
CSC – TSA3.2: 18 – 18
CNRS – TSA3.2: 30, TSA3.4: 53 – 83
TCD – TSA3.2: 21 – 21
INFN (incl. UNIPG, SPACI) – TSA3.2: 36, TSA3.3: 60, TSA3.5: 30 – 126
CYFRONET – TSA3.2: 6 – 6
ARNES – TSA3.2: 3 – 3
UI SAV – TSA3.2: 18 – 18
CERN – TSA3.1: 18, TSA3.2: 120, TSA3.3: 203 – 341
EMBL – TSA3.2: 18, TSA3.4: 26 – 44
TOTALS – TSA3.1: 18, TSA3.2: 315, TSA3.3: 263, TSA3.4: 79, TSA3.5: 30, TSA3.6: 27 – 732
Total “HEP” = 18 + 120 + 263 = 401 PM / 36 months

7 WP6: “HEP” Manpower
TSA3.1 – Management – 18 PM – Jamie Shiers (staff)
TSA3.2.1 – Dashboards – 60 PM – Julia Andreeva (staff); Edward Karavakis (FELL) from 1 July 2010
TSA3.2.2 – Ganga / Diane – 60 PM – Massimo Lamanna (staff); LD / FELL
TSA3.3 – HEP: CERN – 203 PM – Maria Girone (staff); Fernando Barreiro Megino (FELL) from 1 June 2010; Alexander Loth (DOCT) from 1 June 2010; Raffaello Trentadue (FELL) from 1 July 2010; ~3 LD / FELL slots (IT-ES-2010-153-LD)
TSA3.3 – HEP: INFN – 60 PM – 2 slots, 2011?
Total: 401 PM
Constraints: 341 PM (currently at 342); EUR 1,868,618; project (task?) end date (InSPIRE is 48 months…)

8 The EGI-InSPIRE Project
Integrated Sustainable Pan-European Infrastructure for Researchers in Europe
A proposal for an FP7 project – work in progress…, i.e. this may all change!
Targeting call objectives:
– 1.2.1.1: European Grid Initiative
– 1.2.1.2: Service deployment for Heavy Users
Targeting a 3-year project (this did change!)
Seeking a total 25M€ EC contribution
Slides from S. Newhouse

9 PY1–PY2 Trend (chart: normalised CPU wall-clock hours, PY1 vs PY2; II. Resource infrastructure). SA1 and JRA1, June 2012

10 CPU Usage (chart; II. Resource infrastructure). SA1 and JRA1, June 2012

11 Communities & Activities
High Energy Physics (TSA3.3): The LHC experiments use grid computing for data distribution, processing and analysis, with a strong focus on common tools and solutions. Areas supported include Data Management, Data Analysis and Monitoring. Main VOs: ALICE, ATLAS, CMS, LHCb, but covers many other HEP experiments and related projects.
Astronomy & Astrophysics (TSA3.5): Covers the European Extremely Large Telescope (E-ELT), the Square Kilometre Array (SKA), the Cherenkov Telescope Array (CTA) and others. Activities focus on visualisation tools and database/catalogue access from the grid. Main VOs: Argo, Auger, Glast, Magic, Planck, CTA, plus others (23 in total) across 7 NGIs.
Earth Sciences (TSA3.6): A large variety of ES disciplines. Also provides access from the grid to resources within the Ground European Network for Earth Science Interoperations – Digital Earth Community (GENESI-DEC), and assists scientists working on climate change via the Climate-G testbed. Main VOs: esr, egeode, climate-g, env.see-grid-sci.eu, meteo.see-grid-sci.eu, seismo.see-grid-sci.eu; supported by ~20 NGIs.
Life Sciences (TSA3.4): Focuses on the medical, biomedical and bioinformatics sectors to connect worldwide laboratories, share resources and ease access to data in a secure and confidential way. Supports 5 VOs (biomed, lsgrid, vlemed, pneumogrid and medigrid) across 6 NGIs via the Life Science Grid Community.
These and other communities are supported by shared tools & services.
EGI-InSPIRE Review 2012


13 SA3 Overview
SA3 effort: 9 countries, 11 beneficiaries, 725 PMs, 20.1 FTEs
Beneficiaries: CERN, EMBL and partners in France, Slovenia, Slovakia, Italy, Spain, Finland, Poland, Ireland and Germany
EGI-InSPIRE Review 2012

14 SA3 Objectives
Transition to sustainable support:
– Identify tools of benefit to multiple communities, and migrate these into the core infrastructure
– Establish support models for those relevant to individual communities
EGI-InSPIRE Review 2012

15 Achievements in Context
As an explicit example, we use the case of HEP / support for WLCG.
The 3 phases of EGEE (I/II/III) overlapped almost exactly with the final preparations for LHC data taking: WLCG Service Challenges 1–4, CCRC’08, STEP’09.
EGI-InSPIRE SA3 covered virtually all of the initial data-taking run (3.5 TeV/beam) of the LHC: first data taking and discoveries!
– The transition from EGEE to EGI was non-disruptive
– Continuous service improvement has been demonstrated
– Problems encountered during initial data taking were rapidly solved
– Significant progress in the identification and delivery of common solutions
– Active participation in the definition of the future evolution of WLCG
EGI-InSPIRE Review 2012

16 WLCG Service Incidents
These are significant service incidents with respect to targets defined in the WLCG MoU: basically, major disruptions to data taking, distribution, processing or analysis. A Service Incident Report is required.
(Chart annotation: Scale Test.) EGI-InSPIRE Review 2012

17 WLCG Service Incidents
(Chart annotations: Scale Test; Start of Data Taking.) EGI-InSPIRE Review 2012

18 Resolution of Incidents
(Chart: incidents during data taking.) EGI-InSPIRE Review 2012

19 Services for HEP – PY2 Results (Task Leader: Maria Girone)
Distributed Analysis: Common Analysis Framework study for ATLAS and CMS initiated; first stage successfully completed (May 2012); next phase launched (Sep 2012)
Data Management: Dynamic caching / data popularity, moving away from static data placement: common solutions deployed; others under development
Persistency Framework: Handles the event and detector conditions data from the experiments
Monitoring / Dashboards: All aspects of production and analysis: additional common solutions deployed
Focus on common solutions across (all) VOs
EGI-InSPIRE Review 2012

20 The Common Solutions Strategy of the Experiment Support Group at CERN for the LHC Experiments
Maria Girone, CERN, on behalf of the CERN IT-ES Group
CHEP, New York City, May 2012

21 Motivation
Despite their differences as experiments at the LHC, from a computing perspective a lot of the workflows are similar and can be done with common services.
While the collaborations are huge and highly distributed, the effort available in ICT development is limited and decreasing – effort is focused on analysis and physics.
Common solutions are a more efficient use of effort and more sustainable in the long run.
Maria Girone, CERN

22 Anatomy of a Common Solution
Most common solutions can be diagrammed as the interface layer between common infrastructure elements and the truly experiment-specific components.
– One of the successes of the grid deployment has been the use of common grid interfaces and local site service interfaces
– The experiments have environments and techniques that are unique
– In common solutions we target the box in between: a lot of effort is spent in these layers, and there are big savings of effort in commonality – not necessarily in implementation, but in approach & architecture
– The LHC schedule presents a good opportunity for technology changes
(Diagram layers: Experiment Specific Elements → Higher Level Services that translate → Common Infrastructure Components and Interfaces)
Maria Girone, CERN
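To make the layering concrete, here is a minimal Python sketch of the pattern this slide describes: a shared higher-level service that delegates only naming and activity classification to an experiment-specific adapter. All class, method and site names are illustrative assumptions, not taken from any real IT-ES codebase.

```python
from abc import ABC, abstractmethod

class ExperimentAdapter(ABC):
    """Experiment-specific layer: only naming and classification live here."""

    @abstractmethod
    def site_name(self, grid_site: str) -> str:
        """Translate a generic grid site name into the experiment's convention."""

    @abstractmethod
    def classify_activity(self, job: dict) -> str:
        """Map raw job metadata onto an activity such as 'production' or 'analysis'."""

class CMSAdapter(ExperimentAdapter):
    # Illustrative mapping only; real CMS site naming is more involved.
    def site_name(self, grid_site: str) -> str:
        return {"CERN-PROD": "T2_CH_CERN"}.get(grid_site, grid_site)

    def classify_activity(self, job: dict) -> str:
        return "analysis" if job.get("tool") == "crab" else "production"

class CommonMonitor:
    """Common higher-level service: shared by every experiment that supplies
    an adapter; contains no experiment-specific knowledge itself."""

    def __init__(self, adapter: ExperimentAdapter):
        self.adapter = adapter

    def record_job(self, grid_site: str, job: dict) -> dict:
        return {"site": self.adapter.site_name(grid_site),
                "activity": self.adapter.classify_activity(job)}

monitor = CommonMonitor(CMSAdapter())
print(monitor.record_job("CERN-PROD", {"tool": "crab"}))
```

The common layer is reused unchanged by every experiment that supplies an adapter, which is where the savings in approach and architecture come from.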

23 The Group
IT-ES is a unique resource in WLCG:
– The group is currently supported with substantial EGI-InSPIRE project effort
– Careful balance of effort embedded in the experiments & on common solutions
– Development of expertise in experiment systems & across experiment boundaries
– People uniquely qualified to identify and implement common solutions
This matches well with the EGI-InSPIRE mandate of developing sustainable solutions.
A strong and enthusiastic team!
Maria Girone, CERN (EGI-InSPIRE INFSO-RI-261323)

24 Activities
Monitoring and Experiment Dashboards: allows experiments and sites to monitor and track their production and analysis activities across the grid, including services for data popularity, data cleaning, data integrity and site stress testing
Distributed Production and Analysis: design and development for experiment workload management and analysis components
Data Management support: covers development and integration of the experiment-specific and shared grid middleware
The LCG Persistency Framework: handles the event and detector conditions data from the experiments
Maria Girone, CERN

25 Services for LS – LSGC User Management Tools
Life Sciences Grid Community user management tools.
User support: technical assistance for end users; dissemination and knowledge transfer.
Community resource monitoring: all WMSs, CEs and SEs for the biomed VO; storage space monitoring tools; improvements to the Nagios monitoring probes.
Improving user experience: deployment of a redundant VOMS server; investigation of the viability of redundant LFC servers; improvements to storage decommissioning procedures.
Future direction: development of a HUC users database and management tools to assist VO administrators in their daily tasks.
(Diagram: LSGC User Management Tools; GOCDB.)
EGI-InSPIRE Review 2012

26 Achievements in Context
SA3 has fostered and developed cross-VO and cross-community solutions beyond what was previously achieved – the benefit of a multi-community work package.
The production use of the grid at the petascale and “Terra”scale has been fully and smoothly achieved – the benefit of many years of grid funding.
EGI-InSPIRE Review 2012

27 Reviewers’ Comments
“In view of the recent news from CERN, it can easily be seen that the objectives of WP6 (=SA3) for the current period have not only been achieved but exceeded. Technically, the work carried out in WP6 is well managed and is of a consistently high quality, meeting the goals, milestones and objectives described in the DoW.” [etc.]
EGI-InSPIRE Review 2012

28 LHC Timeline (figure). EGI-InSPIRE Review 2012

29 FUTURE OUTLOOK

30 Sustainability Statements (EGI-InSPIRE D6.8 Draft)
Tool / Package – Implementation of Sustainable Support:
Persistency Framework: POOL component maintained by the experiments; COOL and CORAL by CERN-IT and the experiments.
Data Analysis Tools: Proof-of-concept and prototype developed partially using EGI-InSPIRE resources. The production system – if approved – is to be resourced by key sites (e.g. CERN, FNAL, ...) plus the experiments. The development of this system is in any case outside the scope of EGI-InSPIRE SA3.
Data Management Tools: Released to production early in PY3. Long-term support taken over by the PH department at CERN (outside SA3 scope).
Ganga: CERN’s involvement in Ganga-core will cease some months after EGI-InSPIRE SA3 terminates and will be picked up by the remainder of the Ganga project (various universities and experiments). [Ganga allowed us to get other project effort at low cost.]
Experiment Dashboard: All key functionality has been delivered to production before or during PY3. Long-term support is guaranteed through the CERN-Russia and CERN-India agreements, in conjunction with other monitoring efforts within CERN-IT.

31 SA3 – Departures (all on 30.04.2013)
Name – Group – Start date:
MASCHERONI Marco – IT-ES-VOS – 01.07.2011
TRENTADUE Raffaello – IT-ES-VOS – 01.07.2010
GIORDANO Domenico – IT-ES-VOS – 01.05.2011
CINQUILLI Mattia – IT-ES-VOS – 01.09.2010
NEGRI Guido – IT-ES-VOS – 01.07.2011
LANCIOTTI Elisa – IT-ES-VOS – 01.10.2010
KARAVAKIS Edouardos – IT-ES-DNG – 01.07.2010
KENYON Michael John – IT-ES-DNG – 01.11.2010
BARREIRO MEGINO Fernando Harald – IT-ES-VOS – 01.06.2010
DENIS Marek Kamil – IT-ES-VOS – 01.09.2012
KUCHARCZYK Katarzyna – IT-ES-VOS – 01.10.2012

32 FP8 / Horizon 2020
Expect first calls in 2013, with funding from late 2013 / early 2014.
IMHO, calls relating to data management and/or data preservation plus specific disciplines (e.g. LS) are likely.
Will we be part of these projects? Actively pursuing leads now with this objective.
This will not solve the problem directly related to experiment support, nor address “the gap”.
EU projects need not have a high overhead!

33 Summary
EGI-InSPIRE SA3 has provided support for many disciplines – the key grid communities at the end of EGEE III.
It has played a key role in the overall support provided to the experiments by IT-ES.
All of the main grid communities will be affected by the end of the work package.
The “sustainability plans” are documented in D6.8, due January 2013.
Expect no miracles.

34 BACKUP

35 Services for LS – Hydra Encryption Service
Hydra: an EMI-developed service for handling the encryption and decryption of sensitive files.
Development highlights: server deployed on gLite 3.1 and gLite 3.2; improvements to the installation and configuration procedure; work on a provisioning client for sites providing resources to Life Sciences; discussion with EMI to integrate the client in the official middleware releases; prototype service available for testing.
Future direction: distributed (3-server) Hydra key-store deployment; client packages compatible with the latest middleware release.
(Diagram: Hydra key stores, SE, CE, WN; register key; register encrypted file; get encrypted file; fetch key.)
EGI-InSPIRE Review 2012
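The diagram's register/fetch flow can be sketched as below. This is a minimal illustration under stated assumptions: a single in-memory dict stands in for the key store (a real Hydra deployment splits each key across several key servers), and the Fernet cipher from the Python cryptography package stands in for the actual client-side encryption; none of this is the real Hydra client API.

```python
from uuid import uuid4
from cryptography.fernet import Fernet

# Stand-in for the Hydra key store: file GUID -> encryption key.
# (A real deployment would split each key across 3 key servers.)
key_store = {}

def encrypt_and_register(data: bytes) -> tuple[str, bytes]:
    """'Register key' + 'register encrypted file': encrypt locally, store
    the key under a fresh GUID; the ciphertext is what goes to the SE."""
    key = Fernet.generate_key()
    guid = str(uuid4())
    key_store[guid] = key
    return guid, Fernet(key).encrypt(data)

def fetch_and_decrypt(guid: str, ciphertext: bytes) -> bytes:
    """'Get encrypted file' + 'fetch key': done e.g. on a worker node."""
    return Fernet(key_store[guid]).decrypt(ciphertext)

guid, blob = encrypt_and_register(b"sensitive medical record")
assert fetch_and_decrypt(guid, blob) == b"sensitive medical record"
```

The point of the design is that the storage elements only ever see ciphertext, while access to the key store (and hence to the plaintext) can be controlled per VO.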

36 Services for A&A – VisIVO
Visualization Interface for the Virtual Observatory: visualization and analysis software for astrophysical data.
Recent work: a fully compliant implementation of the VisIVO MPI utility was developed to provide compatibility with the gLite grid infrastructure; a hybrid server was procured and deployed as a grid computing node to provide a combined GPU (Graphical Processing Unit) and CPU processing facility.
Plans for the future: ability of VisIVO to utilise GPUs on grid worker nodes; optimisation of the VisIVO visualization component when running on huge user data tables.
EGI-InSPIRE Review 2012

37 Services for A&A – Heterogeneous Resources, Databases and the VObs
Access to heterogeneous resources & databases, and interoperability with the VObs.
Identification of use cases and test beds requiring simultaneous access to astronomical data (including that federated in the Virtual Observatory) and to computing resources of different kinds.
Recent work: work in progress to integrate the BaSTI (A Bag of Stellar Tracks and Isochrones) astronomical database and its feeding FRANEC code with grid technology; a web portal was developed to facilitate grid execution of user-generated stellar evolution simulations.
Plans for the future: GReIC will be evaluated as a mechanism to access databases from DCIs, a crucial requirement of the A&A community; a new upgraded web portal for BaSTI/FRANEC built on top of gUSE/WS-PGRADE; intensify the harvesting of requirements, use cases and test beds to be brought into EGI and other strategic FP7 projects; increase the number of A&A applications and workflows able to run on the EGI DCI.
EGI-InSPIRE Review 2012

38 Services for ES – TSA3.6 PY2 Results
TSA3.6 supports activities in ES communities and projects, carried out by researchers & students in universities.
Main activity by proposal: access to GENESI-DEC. Status: web service available and maintained; extensions of the service by the ESSI-Lab GI-cat service to extend the search of data.
Further developments in the task: integration with available GEOSS services to access ES data not only from GENESI, and climate data from ESG.
Since January 2011, common developments with the Earth System Grid (ESG): a solution to access data from ESG nodes from the EGI infrastructure, and vice versa, is available. Main problem: authentication / authorization due to different federations; a first workaround is in place.
The institute IPSL/CNRS, IPGP is still an unfunded partner in TSA3.6.
Task leader: Horst Schwichtenberg; deputy: André Gemünd.
EGI-InSPIRE Review 2012

39 Services for ES – The Earth System Grid
A distributed infrastructure developed to support CMIP5, the Coupled Model Intercomparison Project, Phase 5: an internationally coordinated set of climate model experiments involving climate modelling centres from all over the world.
Recent work:
– MPI implementation of ESG application code runs on the EGI infrastructure
– Implementation of a multi-threaded climate data transfer program to download data from the ESG nodes
– Investigation of a solution to streamline ESG data access from the EGI infrastructure: review and testing of an instance of the NorduGrid security stack
– Problem communicated to other EGI representatives with a description of the situation and possible solutions
– Collaboration between representatives of EGI, TSA3.6 and the Earth System Grid Federation
EGI-InSPIRE Review 2012

40 Examples: Data Popularity
The experiments want to know which datasets are used, how much, and by whom – a good chance for a common solution.
Data popularity uses the fact that all experiments open files and access storage. The monitoring information can be accessed in a common way using generic and common plug-ins. The experiments have systems that identify how those files are mapped onto logical objects like datasets, reprocessing and simulation campaigns.
(Diagram layers: File opens and reads → Files accessed, users and CPU used → Mapping files to datasets → Experiment booking systems)
Maria Girone, CERN
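A minimal sketch of that split, with illustrative field names and a made-up mapping rule: the aggregation below is fully generic, while the file-to-dataset mapping is passed in as the experiment-specific plug-in.

```python
from collections import defaultdict
from typing import Callable

def dataset_popularity(records: list[dict],
                       to_dataset: Callable[[str], str]) -> dict:
    """Generic aggregation of per-file access records into per-dataset
    popularity; `to_dataset` is the experiment-specific plug-in mapping
    a logical file name onto a dataset."""
    stats = defaultdict(lambda: {"accesses": 0, "users": set(), "cpu_hours": 0.0})
    for rec in records:
        entry = stats[to_dataset(rec["lfn"])]
        entry["accesses"] += 1
        entry["users"].add(rec["user"])
        entry["cpu_hours"] += rec["cpu_hours"]
    # Report the number of distinct users rather than the raw set.
    return {ds: {**e, "users": len(e["users"])} for ds, e in stats.items()}

# Illustrative records and plug-in (here: dataset = first two path components).
records = [
    {"lfn": "/higgs/aod/f1.root", "user": "alice", "cpu_hours": 2.0},
    {"lfn": "/higgs/aod/f2.root", "user": "bob",   "cpu_hours": 1.5},
]
print(dataset_popularity(records, lambda lfn: "/".join(lfn.split("/")[:3])))
```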

41 Popularity Service
Used by the experiments to assess the importance of computing processing work, and to decide when the number of replicas of a sample needs to be adjusted either up or down.
See D. Giordano et al., [176] Implementing data placement strategies for the CMS experiment based on a popularity model.
Maria Girone, CERN

42 Cleaning Service
The Site Cleaning Agent is used to suggest obsolete or unused data that can be safely deleted without affecting analysis. The information about space usage is taken from the experiment's dedicated data management and transfer system.
Maria Girone, CERN
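The selection logic of such an agent might look like the following sketch: replicas are ranked least-used first and suggested for deletion until a space target is met. The field names and the pinning rule are illustrative assumptions, not the actual agent's policy.

```python
def suggest_deletions(replicas: list[dict], space_needed_tb: float):
    """Rank replicas least-used first (fewest recent accesses, then oldest
    last access) and suggest deletions until the space target is met.
    Pinned replicas (e.g. custodial copies) are never suggested."""
    candidates = sorted((r for r in replicas if not r["pinned"]),
                        key=lambda r: (r["accesses_90d"], r["last_access"]))
    freed, suggestions = 0.0, []
    for rep in candidates:
        if freed >= space_needed_tb:
            break
        suggestions.append(rep["dataset"])
        freed += rep["size_tb"]
    return suggestions, freed

replicas = [
    {"dataset": "/mc/sample_a", "size_tb": 5.0, "accesses_90d": 0,
     "last_access": "2012-01-10", "pinned": False},
    {"dataset": "/data/run_b",  "size_tb": 8.0, "accesses_90d": 42,
     "last_access": "2012-09-01", "pinned": True},
]
print(suggest_deletions(replicas, space_needed_tb=4.0))
```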

43 Dashboard Framework and Applications
Dashboard is one of the original common services:
– All experiments execute jobs and transfer data
– Dashboard services rely on experiment-specific information for site names, activity mapping and error codes
– The job monitoring system centrally collects information from workflows about job status and success
The database, framework and visualization are common.
(Diagram layers: Job submission & data transfers → Sites and activities → Framework & visualization)
See D. Tuckett et al., [300], Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework.
Maria Girone, CERN
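As an illustration of "common core, experiment-specific vocabulary", here is a sketch where each experiment supplies only a mapping of its own job-status codes onto a shared set of dashboard states. All status values shown are invented for the example, not the real experiment codes.

```python
# Illustrative status vocabularies; the real experiment codes differ.
CMS_STATES = {"CREATED": "submitted", "RUN": "running",
              "SUCCESS": "done", "ABORTED": "failed"}
ATLAS_STATES = {"defined": "submitted", "running": "running",
                "finished": "done", "failed": "failed"}

def job_summary(state_map: dict, raw_jobs: list[dict]) -> dict:
    """Common dashboard core: count jobs per normalized state; only the
    status vocabulary (state_map) comes from the experiment."""
    counts = {"submitted": 0, "running": 0, "done": 0,
              "failed": 0, "unknown": 0}
    for job in raw_jobs:
        counts[state_map.get(job["status"], "unknown")] += 1
    return counts

print(job_summary(CMS_STATES, [{"status": "RUN"}, {"status": "SUCCESS"}]))
print(job_summary(ATLAS_STATES, [{"status": "finished"}]))
```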

44 Site Status Board
Another example of a good common service:
– Takes specific lower-level checks on the health of common services
– Combines them with some experiment-specific workflow probes
– Includes links into the ticketing system
– Combines everything into a common view
Maria Girone, CERN
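A toy sketch of that combination step, assuming invented probe names and a simple traffic-light policy (the real Site Status Board's rules are richer):

```python
def site_status(service_checks: dict, experiment_probes: dict,
                open_tickets: int) -> str:
    """Fold low-level service checks, experiment-specific workflow probes
    and the ticketing state into one traffic-light view for a site."""
    if not all(service_checks.values()):
        return "red"      # a common service (CE, SE, ...) is failing
    if not all(experiment_probes.values()) or open_tickets:
        return "yellow"   # degraded from the experiment's point of view
    return "green"

print(site_status({"CE": True, "SE": True},
                  {"analysis_probe": True, "transfer_probe": False},
                  open_tickets=1))
```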

45 HammerCloud
HammerCloud is a common testing framework for ATLAS (PanDA), CMS (CRAB) and LHCb (Dirac): a common layer for functional testing of CEs and SEs from a user perspective.
– Continuous testing and monitoring of site status and readiness
– Automatic site exclusion based on defined policies
– Same development, same interface, same infrastructure → less workforce
(Diagram layers: Distributed analysis frameworks → Testing and Monitoring Framework → Computing & Storage Elements)
Maria Girone, CERN
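The "automatic site exclusion based on defined policies" idea can be sketched like this; the thresholds and the per-site data shape are illustrative assumptions, not actual HammerCloud policy values.

```python
def evaluate_sites(test_results: dict, min_success: float = 0.80,
                   min_jobs: int = 10) -> dict:
    """Apply an exclusion policy to per-site functional-test outcomes:
    exclude a site once enough test jobs have completed and the success
    rate drops below the threshold."""
    decisions = {}
    for site, outcomes in test_results.items():
        if len(outcomes) < min_jobs:
            decisions[site] = "insufficient data"
        elif sum(outcomes) / len(outcomes) < min_success:
            decisions[site] = "excluded"
        else:
            decisions[site] = "active"
    return decisions

results = {"SITE_A": [True] * 18 + [False] * 2,   # 90% success
           "SITE_B": [True] * 5 + [False] * 7}    # ~42% success
print(evaluate_sites(results))
```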

46 HammerCloud
See D. van der Ster et al. [283], Experience in Grid Site Testing for ATLAS, CMS and LHCb with HammerCloud.

47 New Activities – Analysis Workflow
Up to now, services have generally focused on monitoring activities. All of these are important, and commonality saves effort, but they are not normally in the core workflows of the experiment.
Success with the self-contained services has provided confidence for moving into a core functionality: looking at the analysis workflow.
Feasibility study for a Common Analysis Framework between ATLAS and CMS.
(Diagram layers: Data discovery, environment configuration, and job splitting → Job tracking, resubmission, and scheduling → Job submission and pilots)
Maria Girone, CERN

48 Analysis Workflow Progress
Looking at ways to make the workflow engine common between the two experiments, improving the sustainability of the central components that interface to low-level services:
– A thick layer that handles prioritization, job tracking and resubmission
– Experiment-specific interfaces are maintained: job splitting, environment, and data discovery would continue to be experiment-specific
(Diagram layers: Data discovery, job splitting and packaging of user environment → Job tracking, resubmission, and scheduling → Job submission and pilots)
Maria Girone, CERN
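A skeleton of that split might look like the sketch below: the common "thick layer" owns prioritization, tracking and resubmission, while splitting and environment packaging stay behind an experiment plug-in. All names are hypothetical, and the `submit` callback stands in for pilot submission; this is not the framework that was actually prototyped.

```python
from abc import ABC, abstractmethod

class ExperimentPlugin(ABC):
    """The layer that stays experiment-specific."""
    @abstractmethod
    def split(self, task: dict) -> list[dict]:
        """Split a user task into jobs (experiment data discovery lives here)."""
    @abstractmethod
    def environment(self, task: dict) -> dict:
        """Package the experiment software environment for the jobs."""

class CommonWorkflowEngine:
    """The common thick layer: prioritization, tracking, resubmission."""
    def __init__(self, plugin: ExperimentPlugin, max_retries: int = 3):
        self.plugin = plugin
        self.max_retries = max_retries

    def run(self, task: dict, submit) -> dict:
        env = self.plugin.environment(task)
        jobs = sorted(self.plugin.split(task),
                      key=lambda j: -j.get("priority", 0))  # prioritization
        report = {}
        for job in jobs:
            for _ in range(self.max_retries):     # resubmission on failure
                if submit(job, env):
                    report[job["id"]] = "done"
                    break
            else:
                report[job["id"]] = "failed"
        return report

class DemoPlugin(ExperimentPlugin):
    def split(self, task):
        return [{"id": f"job{i}", "priority": i} for i in range(task["n"])]
    def environment(self, task):
        return {"release": task["release"]}

engine = CommonWorkflowEngine(DemoPlugin())
print(engine.run({"n": 2, "release": "demo"}, submit=lambda job, env: True))
```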

49 Proof of Concept
The feasibility study proved that there are no show-stoppers to designing a common analysis framework. The next step is a proof of concept.
Maria Girone, CERN

50 Even Further Ahead
As we move forward, we would also like to assess and document the process – this should not be the only common project.
The diagram for data management would look similar: a thick layer between the experiment's logical definitions of datasets and the service that moves files. It deals with persistent location information, tracks files in progress and validates file consistency.
There are currently no plans for common services here, but the problem has the right properties.
(Diagram layers: Datasets-to-file mapping → File locations and files in transfer → File Transfer Service (FTS))
Maria Girone, CERN
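If such a common layer were ever attempted, its skeleton might look like this sketch: dataset resolution stays experiment-specific, while in-flight tracking and consistency validation are shared. The `fts_submit`/`fts_poll` callbacks are placeholders, not the real FTS client API, and the slide explicitly notes no such common service was planned.

```python
class DatasetTransferLayer:
    """Sketch of the 'thick layer': resolve datasets to files via an
    experiment-specific mapping, track in-flight transfers, and validate
    consistency (here: a checksum flag) on completion."""
    def __init__(self, dataset_to_files, fts_submit, fts_poll):
        self.resolve = dataset_to_files   # experiment-specific plug-in
        self.submit = fts_submit
        self.poll = fts_poll
        self.in_flight = {}               # logical file name -> transfer id

    def replicate(self, dataset: str, destination: str) -> None:
        for lfn in self.resolve(dataset):
            if lfn not in self.in_flight:     # avoid duplicate requests
                self.in_flight[lfn] = self.submit(lfn, destination)

    def update(self) -> list[str]:
        """Drop finished, checksum-valid transfers; return completed files."""
        done = [lfn for lfn, tid in self.in_flight.items()
                if self.poll(tid) == ("FINISHED", True)]
        for lfn in done:
            del self.in_flight[lfn]
        return done

layer = DatasetTransferLayer(
    dataset_to_files=lambda ds: [f"{ds}/file{i}.root" for i in range(2)],
    fts_submit=lambda lfn, dest: f"tid-{lfn}",
    fts_poll=lambda tid: ("FINISHED", True))
layer.replicate("/data/run_x", "SITE_B")
print(layer.update())
```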

51 Outlook
IT-ES has a good record of identifying and developing common solutions between the LHC experiments; the setup and expertise of the group have helped.
Several services, focused primarily on monitoring, have been developed and are in production use.
As a result, more ambitious services closer to the experiments' core workflows are under investigation. The first is a feasibility study and proof of concept of a common analysis framework between ATLAS and CMS.
Both better and more sustainable solutions could result, with lower operational and maintenance costs.
Maria Girone, CERN

