
The EGI-InSPIRE Second Year Project Review


1 The EGI-InSPIRE Second Year Project Review
27 & 28 June 2012 EGI.eu, Amsterdam

2 SA3: Heavy User Communities
Jamie Shiers, CERN

3 SA3 Overview
SA3 effort: 9 countries, 11 beneficiaries, 725 PMs, 20.1 FTEs.
Beneficiaries: CERN, ARNES (Slovenia), CNRS (France), CSC (Finland), CSIC (Spain), CYFRONET (Poland), EMBL, INFN (Italy), TCD (Ireland), UI SAV (Slovakia), KIT-G (Germany).
Effort by task (total PMs, WP6-G): TSA3.1: 18; TSA3.2 (shared tools and services, across ARNES, CERN, CNRS, CSC, CSIC, CYFRONET, EMBL, INFN, TCD and UI SAV): sub-total 312; TSA3.3: 263; TSA3.4: 75; TSA3.5 and TSA3.6 (incl. KIT-G): 57; TOTAL: 725. EGI-InSPIRE Review 2012

4 SA3 Objectives
Transition to sustainable support:
- Identify tools of benefit to multiple communities
- Migrate these into the core infrastructure
- Establish support models for those relevant to individual communities

5 Achievements
Achievements and future plans are based on D6.6, the SA3 Annual Report. These are presented in a single pass:
- By community for TSA3.3–TSA3.6 (HEP, LS, A&A, ES)
- By tool for TSA3.2 (shared tools and services)
Sustainability plans and progress are addressed by both D6.6 and D6.5.

6 Achievements in Context
As an explicit example, we use the case of HEP support for WLCG.
- The three phases of EGEE (I/II/III) overlapped almost exactly with the final preparations for LHC data taking: WLCG Service Challenges 1–4, CCRC'08, STEP'09.
- EGI-InSPIRE SA3 covered virtually all of the LHC's initial data-taking run (3.5 TeV/beam): first data taking and discoveries!
- The transition from EGEE to EGI was non-disruptive.
- Continuous service improvement has been demonstrated: problems encountered during initial data taking were rapidly solved.
- Significant progress in the identification and delivery of common solutions.
- Active participation in defining the future evolution of WLCG.

7 LHC(b) – Results! See also plenary at CHEP 2012: LHC experience so far

8 WLCG Service Incidents
These are significant service incidents with respect to the targets defined in the WLCG MoU. They typically mean major disruption to data taking, distribution, processing or analysis, and a Service Incident Report is required.
[Chart: incident counts over time; annotations mark the Scale Test and the Start of Data Taking.]



14 Resolution of Incidents
Many of today's incidents are largely "transparent" to users thanks to the grid architecture and deployment models (there is still an operational cost, so further improvement is needed). Early incidents were highly disruptive.
[Chart: time to resolution of incidents during data taking.]
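The improvement in resolution times can be quantified directly from incident records. A minimal sketch in Python, using hypothetical timestamps rather than the actual WLCG incident log:

```python
from datetime import datetime
from statistics import median

# Hypothetical incident records: (incident start, resolution) timestamps.
incidents = [
    (datetime(2010, 4, 1, 9, 0), datetime(2010, 4, 3, 9, 0)),    # early run: days
    (datetime(2011, 6, 5, 14, 0), datetime(2011, 6, 5, 20, 0)),  # later: hours
    (datetime(2012, 2, 2, 8, 0), datetime(2012, 2, 2, 11, 0)),
]

def median_resolution_hours(records):
    """Median time-to-resolution in hours across a set of incidents."""
    durations = [(end - start).total_seconds() / 3600 for start, end in records]
    return median(durations)

print(median_resolution_hours(incidents))  # 6.0
```

Tracking this figure period by period is one way to demonstrate the continuous service improvement described above.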


16 Achievements & Plans
Presented as summary tables; details, based on D6.6, are provided as backup slides.

17 Communities & Activities
These and other communities are supported by shared tools & services.
High Energy Physics (TSA3.3): The LHC experiments use grid computing for data distribution, processing and analysis, with a strong focus on common tools and solutions. Areas supported include data management, data analysis and monitoring. Main VOs: ALICE, ATLAS, CMS, LHCb, but the task covers many other HEP experiments and related projects.
Life Sciences (TSA3.4): Focuses on the medical, biomedical and bioinformatics sectors, connecting laboratories worldwide to share resources and ease access to data in a secure and confidential way. Supports 5 VOs (biomed, lsgri, vlemed, pneumogrid and medigrid) across 6 NGIs via the Life Science Grid Community.
Astronomy & Astrophysics (TSA3.5): Covers the European Extremely Large Telescope (E-ELT), the Square Kilometre Array (SKA), the Cherenkov Telescope Array (CTA) and others. Activities focus on visualisation tools and database/catalogue access from the grid. Main VOs: Argo, Auger, Glast, Magic, Planck, CTA, plus others (23 in total) across 7 NGIs.
Earth Sciences (TSA3.6): A large variety of ES disciplines. Also provides grid access to resources within the Ground European Network for Earth Science Interoperations - Digital Earth Community (GENESI-DEC), and assists scientists working on climate change via the Climate-G testbed. Main VOs: esr, egeode, climate-g, env.see-grid-sci.eu, meteo.see-grid-sci.eu, seismo.see-grid-sci.eu; supported by ~20 NGIs.


20 Services for HEP
Task Leader: Maria Girone. Focus on common solutions across (all) VOs.
Activity and PY2 results:
- Distributed Analysis: Common Analysis Framework study for ATLAS and CMS initiated; first stage successfully completed (May 2012); next phase launched (Sep 2012).
- Data Management: dynamic caching / data popularity, moving away from static data placement; common solutions deployed, others under development.
- Persistency Framework: handles the event and detector conditions data from the experiments.
- Monitoring / Dashboards: all aspects of production and analysis; additional common solutions deployed.

21 LHC Timeline

22 LSGC User Management Tools
Services for LS: Life Sciences Grid Community user management tools.
User support: technical assistance for end users; dissemination and knowledge transfer.
Community resource monitoring: all WMSs, CEs and SEs for the biomed VO; storage space monitoring tools; improvements to the Nagios monitoring probes.
Improving user experience: deployment of a redundant VOMS server; investigation of the viability of redundant LFC servers; improvements to storage decommissioning procedures.
Future direction: development of a HUC users database and management tools to assist VO administrators in their daily tasks (GOCDB).

23 Hydra Encryption Service
Services for LS: Hydra, an EMI-developed service for handling the encryption and decryption of sensitive files.
Development highlights: server deployed on gLite 3.1 and gLite 3.2; improvements to the installation and configuration procedure; work on provisioning the client for sites providing resources to Life Sciences; discussions with EMI to integrate the client into the official middleware releases; prototype service available for testing.
Future direction: distributed (3-server) Hydra key-store deployment; client packages compatible with the latest middleware release.
[Diagram: worker nodes on a CE register/fetch keys from the Hydra key stores and register/get encrypted files on an SE.]
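The planned distributed deployment splits each file key across the key stores, so that no single compromised server can decrypt user data. A minimal sketch of the idea using an n-of-n XOR split (a simplification: the real Hydra service uses Shamir-style secret sharing, which also tolerates unavailable stores):

```python
import os

def split_key(key: bytes, n: int = 3) -> list:
    """Split a key into n shares; all n are needed to reconstruct it."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for share in shares:  # fold the random shares into the final share
        last = bytes(a ^ b for a, b in zip(last, share))
    shares.append(last)
    return shares

def combine_key(shares: list) -> bytes:
    """XOR all shares together to recover the original key."""
    key = shares[0]
    for share in shares[1:]:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

file_key = os.urandom(16)       # e.g. an AES-128 key for one sensitive file
shares = split_key(file_key)    # one share per Hydra key store
assert combine_key(shares) == file_key
```

Each share on its own is indistinguishable from random bytes, which is why spreading them across independently administered servers protects the data.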

24 Services for A&A: VisIVO
Visualization Interface for the Virtual Observatory (VisIVO): visualization and analysis software for astrophysical data.
Recent work: a fully compliant implementation of the VisIVO MPI utility was developed to provide compatibility with the gLite grid infrastructure; a hybrid server was procured and deployed as a grid computing node to provide a combined GPU (Graphics Processing Unit) and CPU processing facility.
Plans for the future: enable VisIVO to utilise GPUs on grid worker nodes; optimise the VisIVO visualization component when running on very large user data tables.

25 Services for A&A: Heterogeneous resources, databases and the VObs
Access to heterogeneous resources and databases, and interoperability with the VObs: identification of use cases and test-beds requiring simultaneous access to astronomical data (including data federated in the Virtual Observatory) and to computing resources of different kinds.
Recent work: work in progress to integrate the BaSTI (A Bag of Stellar Tracks and Isochrones) astronomical database and its feeding FRANEC code with grid technology; a web portal was developed to facilitate grid execution of user-generated stellar evolution simulations.
Plans for the future: GReIC will be evaluated as a mechanism to access databases from DCIs, a crucial requirement of the A&A community; a new, upgraded web portal for BaSTI/FRANEC built on top of gUSE/WS-PGRADE; intensify the harvesting of requirements, use cases and test-beds to be brought into EGI and other strategic FP7 projects; increase the number of A&A applications and workflows able to run on the EGI DCI.

26 Services for ES: TSA3.6 PY2 Results
Task leader: Horst Schwichtenberg; deputy: André Gemünd.
Support for ES activities in ES communities and projects, carried out by researchers and students at universities.
Main activity (by proposal): access to GENESI-DEC. Status: web service available and maintained; extended by the ESSI-Lab GI-cat service to broaden data search.
Further developments in the task: integration with available GEOSS services to access ES data beyond GENESI, and climate data from ESG.
Since January 2011, common developments with the Earth System Grid (ESG): a solution to access data from ESG nodes from the EGI infrastructure, and vice versa, is available. Main problem: authentication/authorization across different federations; a first workaround is in place.
The institute IPSL/CNRS, IPGP is still an unfunded partner in TSA3.6.

27 Services for ES: The Earth System Grid
A distributed infrastructure developed to support CMIP5, the Coupled Model Intercomparison Project Phase 5: an internationally coordinated set of climate model experiments involving climate modelling centres from all over the world.
Recent work: an MPI implementation of ESG application code runs on the EGI infrastructure; implementation of a multi-threaded climate data transfer program to download data from ESG nodes; investigation of a solution to streamline ESG data access from the EGI infrastructure; review and testing of an instance of the NorduGrid security stack; the authentication problem was communicated to other EGI representatives with a description of the situation and possible solutions; collaboration between representatives of EGI, TSA3.6 and the Earth System Grid Federation.
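The multi-threaded transfer program mentioned above downloads many climate data files from ESG nodes in parallel. A minimal sketch of that pattern with Python's thread pool; the URLs and the fetch body are placeholders, not real ESG endpoints:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder file list -- a real client would obtain these from an ESG catalogue.
urls = [f"https://esg-node.example.org/cmip5/file{i}.nc" for i in range(8)]

def fetch(url):
    """Stand-in for a download; real code would stream the file over HTTP."""
    return url, 1024  # pretend each transfer returned 1024 bytes

def parallel_download(urls, workers=4):
    """Fetch many files concurrently, as a multi-threaded transfer tool does."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(fetch, urls))

results = parallel_download(urls)
print(len(results))  # 8
```

Threads suit this workload because transfers are I/O-bound; the per-federation authentication noted above would be handled inside the fetch step.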

28 Plans for next year (1/3)
Build on success in identifying and delivering common solutions: some significant successes seen, beyond expectations!
Further development of sustainability plans, which have in some cases already been demonstrated during the project: collaborative support, less sensitive to changes at individual sites.
Minor modifications to the DoW are proposed: synchronisation of MS619 with the EGI CF 2013 PC; moving MS620 to PM30 (from PM34) to reflect the end of SA3 in PM36 (see next slide); D6.8 (sustainability, PM33) and D6.9 (annual report, PM35) could also be merged, as there is significant overlap: both address sustainability.

29 Plans for next year (2/3)
The SA3 work package ends in April 2013, and all of the objectives must be achieved by then: any developments must be completed, fully documented, put into production, and their long-term support handed over. Even if, in some cases, work is on-going, the plans must take this reality into account.

30 Plans for next year (3/3)
All disciplines supported by SA3 rely on external funding, typically through national funding agencies (including international organisations such as CERN, funded by its member states). The specific support provided by SA3 has ensured a smooth transition to EGI, brought new services to production and increased commonality, reducing long-term support costs. It will be missed, but individual communities will (need to) adapt to ensure continued support. This adaptation, together with the completion of existing tasks, will define the work plan for PY3.

31 Review of Objectives 2011
Objective / Status:
- Supporting the tools, services and capabilities required by different HUCs: Achieved; work continues in PY2/PY3.
- Identifying the tools, services and capabilities currently used by the HUCs that can benefit all user communities, and promoting their adoption: several additional items identified and shared across other communities; work will expand and continue in PY2/PY3.
- Migrating the tools, services and capabilities that could benefit all user communities into a sustainable support model as part of the core EGI infrastructure: not started; needs further work and discussion. First steps: D6.2 (sustainability) and the EGI TF workshop.
- Establishing a sustainable support model for the tools, services and capabilities that will remain relevant to single HUCs: collaborative support is the basic sustainability model, depending neither on individual partners nor on specific project funding. A workshop on sustainability at the EGI TF is proposed; work continues and expands in PY2 and PY3. See also D6.2.

32 Review of Objectives 2012
Objective / Status:
- Supporting the tools, services and capabilities required by different HUCs: Achieved; work continues in PY3; support must be handed over prior to the WP end!
- Identifying the tools, services and capabilities currently used by the HUCs that can benefit all user communities, and promoting their adoption: several additional items identified and shared across communities; work expanded and will continue in PY3. By definition, the work has focused on production use, including enabling new (and old) VOs.
- Migrating the tools, services and capabilities that could benefit all user communities into a sustainable support model as part of the core EGI infrastructure: the long-term support of these tools will depend on the communities that require them.
- Establishing a sustainable support model for the tools, services and capabilities that will remain relevant to single HUCs: collaborative support is the basic sustainability model, depending neither on individual partners nor on specific project funding. A workshop on sustainability at the EGI TF 2011 was held and is documented in D6.5.

33 Review of Objectives 2012 (cont.)
Some features required by HUC(s) have become an integral part of the core infrastructure, e.g. GGUS enhancements.

34 (Some) Data-Related Issues
Data preservation; storage management; data access; data management. (From IEEE MSST, May 2010.)

35 Decomposition
Area / Situation / Opportunity:
- Storage Management. Situation: back-ends both commercial and home-grown; front-ends home-grown, with standard solutions gaining ground at T2/T3s (CASTOR, dCache+DMF|Enstore|HPSS|TSM, DPM, StoRM, BeStMan, ...). Opportunity: IMHO, "home-grown" solutions cannot be considered a long-term requirement; but maybe post-LHC...
- Data Access. Situation: SRM for bulk operations, e.g. transfers; Xrootd for LAN/WAN access and transfers; HTTP widely accepted as promising; NFS 4.x controversial. Opportunity: move to standard protocols (standard = used outside HEP).
- Data Management. Situation: data placement, caching, popularity and more; strong coupling to computing models (which in HEP started similar and are converging); T0/T1/T2 adopted by Fusion; cf. Jim Gray. Opportunity: significant progress in finding commonalities across experiments; can we share with other communities?
- Data Preservation. Situation: finally(?) accepted as critical in HEP. Opportunity: an area where we have much to learn from other communities; see key use cases.

36 Achievements in Context
SA3 has fostered and developed cross-VO and cross-community solutions beyond what was previously achieved: the benefit of a multi-community WP. The production use of the grid at the petascale and "Terra"scale has been fully and smoothly achieved: the benefit of many years of grid funding.


39 Summary: Reliability, Innovation, Openness, Leadership
- Successfully supported major production computing at an unprecedented scale, both quantitatively and qualitatively.
- Successfully delivered common solutions in a variety of areas, with exciting new activities in progress.
- Actively participated in the EGI Technical & Community Forums via presentations, tutorials and demos.
- Completed the second round of milestones and deliverables, together with the associated technical work.
- Extended and demonstrated sustainability models (e.g. collaborative development) in a number of key areas.
- Actively participated in the WLCG Technology Evolution working groups and will be active in their implementation (extends post-SA3).


42 BACKUP – Detailed Achievements & Plans
Summarized from D6.6

43 HMMXV – Tentative Schedule
[Timeline sketch, 2012-2015, marking the start of LS1, the LHC pp restart, CHEP12/CHEP13/CHEP15, the end of EMI and EGI SA3, and possible additional workshops. Open points include: network matters (IPv6?), Frontier/squid as WLCG services, EMI StAR, federated storage, FTS3 testing, the information system, CVMFS readiness for launch, Streams (T0-T1) decommissioning, WMS decommissioning, glexec, CE stress testing, non-critical-path items such as monitoring, and WLCG OPS 2.0.]

44 Analysis Tools & Support
Services for HEP: HammerCloud, a grid-site testing service.
Modus operandi: site availability and functionality via frequent, short jobs; site commissioning and benchmarking via on-demand stress testing.
Development highlights: rationalisation of the codebase; extension of the "job robot" functionality for grid-site validation; migration of databases to the CERN-IT-hosted MySQL service.
Future direction: standard benchmarks for storage-element evaluation; improvements to packaging and installation procedures; central reporting for LHCb functionality tests; pan-VO error and performance correlation; an interface for user-initiated testing of grid sites.
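The frequent-short-jobs approach reduces to a simple calculation: a site's availability is the success fraction of its recent test jobs. A minimal sketch with made-up outcomes (the site names and results are hypothetical, not HammerCloud data):

```python
# Hypothetical outcomes of frequent, short test jobs: True = job succeeded.
results = {
    "SITE-A": [True, True, True, False, True],
    "SITE-B": [True, False, False, True, False],
}

def availability(outcomes):
    """Fraction of successful test jobs, used as a site-availability estimate."""
    return sum(outcomes) / len(outcomes)

report = {site: availability(jobs) for site, jobs in results.items()}
print(report)  # {'SITE-A': 0.8, 'SITE-B': 0.4}
```

On-demand stress testing follows the same scheme, but with many concurrent jobs, so throughput and error rates under load can also be measured.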

45 Analysis Tools & Support
Services for HEP: the CMS CRAB client, the CMS Remote Analysis Builder.
Development highlights: maintenance of CRAB2; development and commissioning of CRAB3.
Future direction: wide-scale testing of the entire CRAB3 software chain; reimplementation of the monitoring service as a centralised CouchDB-based solution, with monitoring test results collected from distributed agents; support for merging and publication of (user) output files; support for HammerCloud-driven task submission; further developments based on user feedback.

46 Data Management Tools & Support
Services for HEP: ATLAS Distributed Data Management (DDM) - replication, access and bookkeeping of ATLAS data.
Development highlights: DDM infrastructure made visible in the CERN-IT-hosted Service Level Status overview pages; DDM site-service optimisation (e.g. data-placement utilities).
Future direction: focus on code maintenance and support operations.

47 Data Management Tools & Support
Services for HEP: the CMS Popularity Service - monitoring data-access patterns.
Development highlights: deployment of a site-cleaning agent, based on the original ATLAS codebase (both implementations share a common core) with a modular, expandable architecture; fully automated scanning of grid-site storage elements, with obsolete and "unpopular" datasets removed from sites approaching capacity; improved database backend and web frontend.
Future direction: usage-pattern analysis for datasets accessed via XRootD; dynamic data placement; replication of "popular" datasets.
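The site-cleaning agent's core decision, which datasets to evict when a site approaches capacity, can be sketched as a greedy selection of the least-accessed data. The dataset names, sizes and thresholds below are illustrative, not the CMS production values:

```python
# Hypothetical per-dataset records at one site: name -> (size in TB, recent accesses).
datasets = {
    "ds/run2011A": (12.0, 0),     # unpopular: candidate for deletion
    "ds/run2011B": (8.0, 3),
    "ds/run2012A": (20.0, 450),   # popular: keep (and perhaps replicate)
}

def cleanup_candidates(datasets, used_tb, capacity_tb, watermark=0.9):
    """Greedily pick the least-popular datasets until usage drops below the watermark."""
    excess = used_tb - capacity_tb * watermark
    victims = []
    for name, (size, accesses) in sorted(datasets.items(), key=lambda kv: kv[1][1]):
        if excess <= 0:
            break
        victims.append(name)
        excess -= size
    return victims

print(cleanup_candidates(datasets, used_tb=95, capacity_tb=100))  # ['ds/run2011A']
```

The same popularity statistics drive the placement side: datasets with high access counts become candidates for additional replicas (dynamic data placement).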

48 Data Management Tools & Support
Services for HEP: LHCb DIRAC - the framework for all grid computing activities of the LHCb VO (also used by others).
Development highlights: LHCb-specific functionality separated from DIRAC; streamlined development cycle for LHCb community feature requests.
Future direction: consolidation of the data-consistency service; support and improvement of the recently deployed storage accounting tools; handling anticipated changes to data-management middleware; further development of the data-popularity service; LHCb computing operations support.

49 WLCG Technical Evolution Groups
Services for HEP: the WLCG Technical Evolution Groups (TEGs), established in the second half of 2011.
Recent work: reassessment of the grid infrastructure implementation, based on LHC run experience and the evolution of technology; the TEGs' findings were presented in February 2012.
Future direction: a document summarising the TEG reports, prepared by the TEG chairs, was presented at the WLCG Collaboration Workshop in New York (May 2012); recommendations from the TEGs are to be implemented within a relatively short timescale (based on the LHC shutdown schedule).


52 Coordination of the A&A Community
Services for A&A: aims to provide requirements, use cases and test-beds for EGI. The scope includes interoperation between heterogeneous e-Infrastructures (e.g. grid, HPC, cloud) and support for access and management of grid-based astronomical databases.
Recent work: coordination of activities within the community concerning the use of DCIs by both small-scale and large-scale projects has been intensified; input was encouraged during requirements gathering for EGI; a coordination workshop was organised at the Astronomical Observatory of Paris on 7 November 2011, with major astronomical projects and research areas represented.
Plans for the future (based on the conclusions of the Paris workshop): community effort should focus on identifying major A&A projects and institutes suited to DCI adoption; the process aimed at the creation of new VRCs is in progress, with regular checkpoints to verify its progress.


54 Services for A&A: Grid & HPC
Working towards a stable environment for A&A activities.
Recent work: close coordination of activities between EGI and IGI (the Italian NGI) resources; workflows defined to facilitate the smooth operation of complex algorithms running on very large data files.
Plans for the future: identify and define cosmological simulation routines and validate their output; deploy small-scale HPC resources at grid sites (challenges include system configuration and integration into the grid middleware infrastructure); validate resources based on results from commonly used applications; the HPC activity will continue in Italy, driven by the demands of interested communities.


56 Services for ES: GENESI-DR
A standardised data-discovery interface for a federation of repositories.
Recent work: a smarter data-transfer component capable of discovering and downloading resources; the ability to set configurable parameters on a per-user basis from a dedicated configuration file; improvements to end-user documentation, including typical use-case and "getting started" examples; a flexible web GUI was deployed.
Plans for the future: development of a command-line client to provide GENESI-DEC features to jobs running on the EGI grid infrastructure.


58 Introduction to Earth System Grid
Recent work (continued):
Improvements to the smart data transfer tool 'synchro-data'.
Contribution to the Federated Identity Management Workshop.
Maintenance of loose coordination within the ES VRC through EGU sessions and possible collaborations.
Plans for the future:
The ESG interoperability team is developing and testing a scenario based on an application from IPSL which uses CMIP5 data (climate model data stored on the ESG).
A prototype modification of MyProxy is currently being developed, which will issue ESG certificates based on EGI certificate authentication.

59 Other Communities & Shared Services & Tools
Monitoring & Dashboards
Experiment Dashboard: monitoring of distributed computing infrastructure and activities.
Development highlights:
Performance, scalability and functionality improvements driven by the growth of LHC computing activity.
Hardware migration and Oracle upgrade for the Dashboard database infrastructure.
Common monitoring solutions for the LHC communities.
Migration to the new Service Availability Monitoring architecture.
Future direction:
Development guided by requests from the LHC communities.

60 Other Communities & Shared Services & Tools
GRelC
A grid database management service providing access and management functionality for relational and non-relational databases in a grid environment.
Recent work:
The first production releases of the DashboardDB registry and of global monitoring for GRelC servers were deployed on the GRelC website as web gadgets (connection with the NA2 activity, EGI Gadgets).
The porting of the GRelC software to the gLite 3.2 middleware release was completed; RPMs are available.
Two use cases (UNIPROT, GeneOntology) related to the LS community were jointly defined with the bioinformatics group of the University of Salento and implemented.
Another use case (Invasive Alien Species), supporting a platform for sharing invasive species information across Italy in the Italian LifeWatch Virtual Laboratory, was defined; this activity is ongoing.
Plans for the future:
The support for management, monitoring and control of the GRelC services provided through DashboardDB will be further extended and improved.
Strong dissemination of the DashboardDB registry, so that the GRelC service instances deployed in EGI are registered.
Additional support will be provided to the HUC to address new user needs and requirements.
The website, training events, tutorials, talks and papers will further disseminate the results of this activity.
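GRelC mediates SQL access to remote databases for grid jobs. As a stand-in, the sketch below shows the kind of query-then-serialize step such a service performs, using SQLite locally; the schema and the service wrapper are hypothetical (loosely inspired by the UNIPROT use case), not GRelC's actual interface.

```python
import json
import sqlite3

def run_query(conn, sql, params=()):
    """Execute a read-only query and serialize rows to JSON,
    the typical shape of a response from a grid database service."""
    cur = conn.execute(sql, params)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

# Hypothetical example table standing in for a remote bioinformatics DB.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE protein (accession TEXT, organism TEXT)")
conn.executemany("INSERT INTO protein VALUES (?, ?)",
                 [("P12345", "Oryctolagus cuniculus"),
                  ("P68871", "Homo sapiens")])
```

Parameterized queries and a neutral wire format (JSON here) are what let the same service front many different back-end databases.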

61 GRelC Monitoring gadget

62 GRelC Monitoring gadget

63 DashboardDB (social) gadget

64 DashboardDB (social) gadget

65 Other Communities & Shared Services & Tools
SOMA2
A versatile modelling environment for computational drug discovery and molecular modelling.
Recent work:
AutoDock 4 integration work continued, and investigations started into how to set up a service for user communities.
Program development to upgrade the core UI library components, plus associated migration work; basic grid support in SOMA2 along with other minor fixes and improvements.
A new version of SOMA2, version 1.4 (Aluminium), was released and is available on the web site.
CSC maintained and operated the SOMA2 service.
Plans for the future:
Enable a grid-enabled application service for interested user communities.
Continue to operate and support the existing SOMA2 services.
Advertise the grid-enabled SOMA2 service to different user communities.
Longer term: expand the scientific applications integrated into the SOMA2 service, and integrate application services from different grids hosted by different virtual organizations into SOMA2.

66 Other Communities & Shared Services & Tools
Workflow & Schedulers
These tools are critical for integrating complex processes, generally involving multiple data sources and different computational resources, as needed within many disciplines.
Recent work:
New template use cases based on user requirements, improvements to the performance of the Kepler actors, and tests of the different use cases.
Use cases initially prepared for the Fusion community have been customised and reused by other communities.
Initial work has started on a workflow in which DKES measures the transport of particles using an equilibrium previously calculated by VMEC.
The first astrophysics workflow has been customised and tested; it controls the production of realistic simulations of the anisotropies of the Cosmic Microwave Background (CMB) radiation.
The first workflows from the CompChem area have been arranged and customised.
Plans for the future:
Dissemination activities have allowed us to show the impact of this work to other communities. The improvements introduced in the actors to increase reliability and scalability can be of interest to many other communities and can potentially extend the usage of this tool.
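The workflows above chain steps whose outputs feed later ones, e.g. an equilibrium computed by VMEC feeding a DKES transport calculation. The sketch below shows the underlying idea, dependency-ordered execution of named steps; the step names and callables are illustrative only, not Kepler's actual actor model.

```python
from graphlib import TopologicalSorter

def run_workflow(steps, deps):
    """Run `steps` (name -> callable) respecting `deps`
    (name -> set of prerequisite step names).
    Returns the execution order actually used."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        steps[name]()  # each step runs only after its prerequisites
    return order

# Illustrative fusion-style pipeline: equilibrium before transport.
log = []
steps = {
    "vmec": lambda: log.append("vmec"),  # compute equilibrium
    "dkes": lambda: log.append("dkes"),  # transport using the equilibrium
    "plot": lambda: log.append("plot"),  # visualize results
}
deps = {"dkes": {"vmec"}, "plot": {"dkes"}}
```

A topological sort is what guarantees a reused template stays correct when a community inserts extra steps into the dependency graph.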

67 Other Communities & Shared Services & Tools
MPI
The MPI sub-task produces numerous workbenches of increasing complexity, with specific high impact on the Computational Chemistry, Earth Sciences, Fusion, and Astronomy & Astrophysics communities.
Recent work:
TCD and UNIPG started independent work on exploiting GPGPU-enabled devices.
UNIPG worked on grid implementations of many parallel codes, including CHIMERE, RWAVEPR, ABC, DL_POLY, NAMD and VENUS96.
CSIC made continuous improvements to the MPI documentation wiki, and tested MPI-START integration with SGE- and PBS-like job managers against newer versions of MPICH2.
TCD produced proof-of-concept work using MPI-START to support generic parallel workloads.
Plans for the future:
TCD is investigating proof-of-concept execution of generic parallel codes using MPI-START.
The current MPI Virtual Team will come to an end in May; the focused outcome of this group's work should be complete by then, including a better MPI testing infrastructure.
TCD will propose a GPGPU virtual team, focused on how best to support GPGPUs (and other generic resources) on the EGI grid infrastructure.
CSIC continues to lead the MPI accounting sub-task, ensuring correct accounting for MPI and parallel jobs, and aims to improve detection of the PBS/Torque per-process limitations that affect parallel jobs at EGI sites.
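MPI-START's role is to hide batch-system differences by composing the parallel launch command from information the site provides. The sketch below imitates that idea with a made-up set of environment variables; it is not MPI-START's real interface, only an illustration of the wrapper pattern.

```python
import os
import shlex

def build_mpi_command(executable, args=(), env=None):
    """Compose an mpirun command line from environment hints,
    in the spirit of MPI-START. The GRID_* variable names below
    are invented for this sketch."""
    env = os.environ if env is None else env
    nprocs = env.get("GRID_NP", "1")        # processes requested by the LRMS
    hostfile = env.get("GRID_HOSTFILE")     # node list, if the LRMS provides one
    cmd = ["mpirun", "-np", nprocs]
    if hostfile:
        cmd += ["-hostfile", hostfile]
    cmd.append(executable)
    cmd += list(args)
    return shlex.join(cmd)
```

Keeping the user's job script free of scheduler-specific flags is exactly what lets the same MPI job run under SGE, PBS/Torque, or other managers.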

68 2013 – 2014: Long Shutdown 1 (LS1) – consolidate for 6.5 / 7 TeV
Not yet approved!
Measure all splices and repair the defective ones
Consolidate interconnects with a new design (clamp, shunt)
Finish installation of pressure release valves (DN200)
Magnet consolidation
Measures to further reduce SEE (R2E): relocation, redesign, …
Install collimators with integrated button BPMs (tertiary collimators and a few secondary collimators)
Experiments consolidation/upgrades

69 2015 – 2017: Physics at 6.5 / 7 TeV
Not yet approved!

70 2018: LS2 to prepare for ‘ultimate LHC’ parameter set:
Not yet approved!
Phase II collimation upgrade
Major injectors upgrade (LINAC4, 2 GeV PS Booster, SPS coating, …)
Prepare for crab cavities (for HL-LHC)

71 Physics with ‘ultimate LHC’ parameter set
Not yet approved!
2019 – 2021: Physics with the 'ultimate LHC' parameter set

Parameters ('ultimate' set):
k (number of bunches): 2808
N (bunch intensity): 1.7×10^11 p
β*: 0.5 m
Luminosity [cm⁻²s⁻¹]: 2.4×10^34
E [TeV]: 7
Stored energy [MJ]: 541
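The slide's luminosity and stored-energy figures can be cross-checked with the standard collider luminosity formula L = k N² f_rev γ / (4π ε_n β*) · F. The normalized emittance ε_n = 3.75 µm and the geometric crossing-angle factor F ≈ 0.84 are assumed typical LHC design values, not given on the slide.

```python
import math

# Parameters from the slide ('ultimate' set)
k = 2808            # number of bunches
N = 1.7e11          # protons per bunch
beta_star = 0.5     # m
E_TeV = 7.0         # beam energy

# Assumed design-report values (not stated on the slide)
f_rev = 11245.0     # LHC revolution frequency, Hz
eps_n = 3.75e-6     # normalized emittance, m
F = 0.84            # crossing-angle geometric reduction factor

gamma = E_TeV * 1e12 / 938.272e6  # proton Lorentz factor E / (m c^2)
L = k * N**2 * f_rev * gamma / (4 * math.pi * eps_n * beta_star) * F
L_cm = L * 1e-4                   # convert m^-2 s^-1 to cm^-2 s^-1

e_charge = 1.602176634e-19        # C (and J per eV)
stored_MJ = k * N * E_TeV * 1e12 * e_charge / 1e6  # energy per beam
```

With these assumptions the computed values land within a few percent of the slide's 2.4×10^34 cm⁻²s⁻¹ and 541 MJ.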

72 2022: LS3
Not yet approved!
Installation of HE-LHC and/or
Installation of LHeC and/or
Preparation for HL-LHC

