1
www.see-grid.eu SEE-GRID-2 The SEE-GRID-2 initiative is co-funded by the European Commission under the FP6 Research Infrastructures contract no. 031775 WP3 progress report Antun Balaz WP3 Leader Institute of Physics, Belgrade antun@phy.bg.ac.yu SEE-GRID-2 PSC05 Meeting, Thessaloniki, Greece 11-12 September 2007
2
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20072 Overview WP3 objectives and activities WP3 position in SEE-GRID-2 WP3 deliverables and milestones WP3 schedule WP3 activities reports WP3 country reports WP3 Action points and issues
3
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20073 WP3 objectives & activities (1) Develop the next-generation SEE-GRID infrastructure The next generation of EGEE middleware (gLite), the VOMS, the WMS, information services and file catalogue services will be assessed with the project and WP3 objectives in mind SEE-GRID middleware deployment will follow and adapt its services according to the results of this assessment. Support in deployment and operations of the Resource Centres Next-generation monitoring services will be deployed so as to support across-the-board infrastructure monitoring. The current SEE-GRID helpdesk will be expanded in SEE-GRID-2, with the main goal of full EGEE interoperability. Support the expansion and overall upgrade of the current infrastructure by proliferation of RCs in each SEE country, increasing: the total available regional resources (CPUs, storage, etc.), thus boosting the capacity and reliability of Grid service provision at regional level, and the diversity and distribution of participating teams per country, thus strengthening cooperation and collaboration at national level.
4
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20074 WP3 objectives & activities (2) Network resource provision and assurance WP3 will also deal with network resource provision, in close cooperation with the SEEREN2 project, thus ensuring stable connectivity for the RCs in the region. Attention will be paid to Bandwidth-on-Demand requirements, to cater for bandwidth-intensive applications in case they need dedicated resources for particular experiments. CA and RA guidelines and deployment The regional SEE-GRID catch-all Certification Authority (CA) will continue to operate, providing certificates for countries without a CA. An experienced CA team will provide support for per-country CA deployment and accreditation. The cycle to establish a national Grid CA will comply with the procedures and accreditation process of the EU Grid Policy Management Authority (EUGridPMA). Operations will be strengthened so as to support per-country CA operations.
5
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20075 WP3 objectives & activities (3) User portal deployment and operations A user-friendly multi-Grid access portal will be deployed, enabling universal and more flexible user access to the regional infrastructure. The work on the SEE-GRID-2 portal should make it easy to select a grid and execute a workflow on the selected grid, so that interoperability of different grids is solved seamlessly and transparently at the application (workflow) level.
6
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20076 WP3 objectives & activities (4) A3.1 - Implementation of the advanced SEE-GRID infrastructure (UOB-IPB/IPP) Deals with support for configuration, deployment and operations of the Resource Centres within the SEE-GRID pilot infrastructure, as well as transition of mature centres into EGEE. Effort: 89 PMs Subactivities: A3.1.1 - Expand the existing SEE-GRID topology by inclusion of new sites per SEE country A3.1.2 - Deploy M/W components and OS in SEE Resource Centers A3.1.3 - Test the site installations in local and Grid mode A3.1.4 - Operate the SEE-GRID infrastructure A3.1.5 - Monitor the infrastructure performance and assess its usage A3.1.6 - Certify and Migrate SEE-GRID sites from Regional Pilot to Global production-level eInfrastructures
7
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20077 WP3 objectives & activities (5) A3.2 - Network Resource Provision and BoD requirements (IPP) Support liaison actions to ensure adequate network provision, including the requirements for Bandwidth-on-demand, if and where necessary depending on the application. Effort: 39 PMs A3.3 - Deploy and operate Grid CAs (GRNET) Should provide CA and RA guidelines and help establish per-country CAs to cover the authentication issues Effort: 73 PMs A3.4 - Provide a user portal (SZTAKI) Supports the deployment of a user-friendly and multi-grid interoperable portal for convenient Grid access and usage. Effort: 15 PMs
8
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20078 WP3 position in SEE-GRID-2 (1) [Diagram of WP3's position among the project work packages] WP1: Project Administrative and Technical Management WP2 Strategies & Policies (A2.1: Study grid deployment solutions; A2.2: Deliver sustainable roadmap for SEE NGIs) WP3 Infrastructure & Operations (A3.1: Implementation (deploy, test, operate, monitor, certify, migrate) of an advanced SEE Grid Infrastructure; A3.2: Network resource provision; A3.3: Deployment and Operational support for accredited Grid Certification Authorities; A3.4: Grid access Portal) WP4 Users & Applications (A4.1: Select multi-disciplinary applications; A4.2: Adopt applications for the SEE user communities; A4.3: Support deployed applications; A4.4: Assess application usage) WP5 Training, Dissemination and Communication
9
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 20079 WP3 position in SEE-GRID-2 (2) Results of WP2 will be used as inputs D2.1 - Regional and National Organisational and Policy Schemes D2.2 - Sustainable organizational and operational approach D2.3(a,b) - Sustainability and Impact Analysis of SEE National Grid Initiatives Results of WP3 used as input to WP4 All partners participate in WP3 Activities start in M1, end in M24 WP3 planned budget is 672,512.00 €, or ~33.6% of the SEE-GRID-2 budget WP3 is planned to take 216 PMs, or ~34.2% of all SEE-GRID-2 PMs
10
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200710 WP3 Deliverables & Milestones (1) D3.1a - Infrastructure Deployment Plan, M04 (CERN) Describes the envisaged infrastructure deployment execution plan to be followed in the region. Prepared and submitted on time D3.2 - CA and RA guidelines for new candidates, M05 (GRNET/AUTH) This deliverable describes the guidelines and best practices of the per-country CA and RA organization and policies. Prepared and submitted on time D3.3 - Portal specifications and functionality, M06 (SZTAKI) This deliverable provides the characteristics and structure of the multi-grid user-oriented portal. Prepared and submitted on time
11
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200711 WP3 Deliverables & Milestones (2) D3.1b - Infrastructure Deployment Plan, M14 (CERN) Final version of D3.1. Prepared and submitted on time We are currently in M17 Future: D3.4 - Infrastructure overview and assessment, M23 (UOB-IPB) This deliverable presents an overview and assessment of the progress in the regional infrastructure and operations in the life of the project
12
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200712 WP3 Deliverables & Milestones (3) M3.1 - Infrastructure Deployment Plan Defined M04, Status: OK M3.2 - CA and RA guidelines for new candidates defined M05, Status: OK M3.3 - Portal operational across the pilot Grid M12, Status: OK
13
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200713 WP3 schedule M01 - Start of WP3 M04 - Infrastructure Deployment Plan (M3.1, D3.1a) M05 - CA and RA guidelines for new candidates (M3.2, D3.2) M06 - Portal specifications and functionality (D3.3) M12 - Portal operational across the pilot Grid (M3.3) M14 - Final Infrastructure Deployment Plan (D3.1b) M17 – This is where we are currently M23 - Infrastructure overview and assessment (D3.4) M24 - End of WP3 (and of the project)
14
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200714 WP3 activities overview A3.1: Implementation of the advanced SEE-GRID infrastructure (UOB-IPB/IPP) A3.2: Network Resource Provision and BoD requirements (IPP) A3.3: Deploy and operate Grid CAs (GRNET) A3.4: Provide a user portal (SZTAKI)
15
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200715 A3.1: Overview Infrastructure status gLite deployment status SEEGRID VO metrics and accounting Operations SLA conformance monitoring per site Helpdesk tickets procedures and statistics analysis GOOD shifts MPI support on SEE-GRID sites Operational & monitoring tools deployment & integration HGSM SAM (+ porting to MySQL) WiatG R-GMA Pakiti SEE-GRID Wiki status WP3 developments Infrastructure, Site and VO metrics
16
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200716 Infrastructure status (1) Concerning the middleware deployments, the current SEE-GRID infrastructure supports a set of core services which provide user access to resources: The catch-all Certification Authority for the region has been officially accredited by the EU Grid Policy Management Authority (EUGridPMA) and is currently operational, thus enabling regional sites to obtain user and host certificates A Virtual Organisation Management Service (VOMS) server has been installed as the authorization system for the SEE-GRID Virtual Organisation (VO); it provides information on the user's relationship with the Virtual Organisation: his/her groups, roles and capabilities Workload management service (lcg-RB and glite-WMSLB) and Information Service (BDII) nodes (several instances) have been installed at partners' sites and are operational MyProxy is operational, and supports certificate renewal FTS deployed and used in production
17
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200717 Infrastructure status (2) SEE-GRID total and free CPUs in the last year
18
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200718 Infrastructure status (3) The SEE-GRID infrastructure currently contains the following resources: 30 sites in SEE-GRID production 6 sites in certification phase (2 AL + 1 HR + 2 RO + 1 MD) CPUs: 1105 total, but unknown number available to SEEGRID VO; increase of approx. 400 CPUs compared to PSC04 Storage: 17.6 TB (no increase) All sites on gLite-3, with 3 sites on gLite-3.1 and the rest on gLite-3.0 glite-CE final assessment by EGEE is that this service is not stable enough for production; we agree glite-WMSLB actively used Guides provided for deployment of gLite-3.1 WNs on SL4.5 (32-bit); a guide for 64-bit WNs is in preparation
19
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200719 Infrastructure (4): VO membership Steady growth: ~90 members at the start of the project, ~110 members at the end of 02/07, ~160 members at the end of 08/07
20
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200720 Infrastructure (5): VO members per country
21
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200721 Operations (1) SLA conformance monitored per site; tools used: HGSM SAM GStat WiatG Helpdesk SLA conformance analysis: SEEGRID2-WP3-RS-018-SLA-Q5-2007-08-06.xls Helpdesk tickets procedures GOOD shifts introduced, initial results positive Tickets handling: response times need to be improved! Problems with GOOD shifts – some partners not performing duty! Helpdesk statistics analysis: SEEGRID2-WP3-RO-008-PSC05-Helpdesk_Statistics-2007-09-10.xls SEEGRID2-WP3-RS-020-PSC05-Tickets_not_closed-a-2007-09-11.xls OPS role in VOMS: defined, documented & used
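To illustrate the kind of helpdesk statistics analysis referred to above, the sketch below computes per-partner median first-response times from exported ticket records. The record layout, partner names and timestamps are invented placeholders, not the actual Helpdesk schema.

```python
from datetime import datetime
from statistics import median
from collections import defaultdict

# Hypothetical ticket export: (assigned partner, opened, first response) timestamps.
tickets = [
    ("AEGIS01-PHY-SCL", "2007-08-01 09:10", "2007-08-01 11:40"),
    ("BG01-IPP",        "2007-08-02 14:00", "2007-08-03 09:15"),
    ("TR-01-ULAKBIM",   "2007-08-03 08:30", "2007-08-03 08:55"),
]

def hours_between(start, end, fmt="%Y-%m-%d %H:%M"):
    """Elapsed hours between two timestamp strings."""
    delta = datetime.strptime(end, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 3600.0

by_partner = defaultdict(list)
for partner, opened, responded in tickets:
    by_partner[partner].append(hours_between(opened, responded))

for partner, times in sorted(by_partner.items()):
    print(f"{partner}: median first response {median(times):.1f} h over {len(times)} tickets")
```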
22
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200722 Operations (2) MPI support on SEE-GRID sites Important for many applications Current support not sufficient Seems there is a problem with setup on some sites – GOOD shifts are addressing this, but this is not sufficient Proposal: WG to be created which should define minimal standards for MPI support, and provide template scripts and JDL files for submission of MPI jobs WG can be composed of representatives from sites supporting MPI and having experience: Bulgaria Turkey Greece Serbia
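To make the proposal concrete, here is a minimal sketch of the kind of template the WG could provide: a small Python helper that writes a gLite/LCG-style JDL for an MPI job. JobType "MPICH" with the NodeNumber attribute was a commonly used convention at the time, but the exact attributes, the executable name and the CPU count below are placeholders to be settled by the WG.

```python
# Sketch of a template generator for MPI job JDLs (assumed gLite/LCG "MPICH"
# JobType convention; executable name and node count are placeholders).

MPI_JDL_TEMPLATE = """\
Type        = "Job";
JobType     = "MPICH";
NodeNumber  = {nodes};
Executable  = "{executable}";
Arguments   = "{arguments}";
StdOutput   = "mpi.out";
StdError    = "mpi.err";
InputSandbox  = {{"{executable}"}};
OutputSandbox = {{"mpi.out", "mpi.err"}};
Requirements  = Member("MPICH",
    other.GlueHostApplicationSoftwareRunTimeEnvironment);
"""

def write_mpi_jdl(path, executable, nodes, arguments=""):
    """Write a JDL file for an MPI job with the given executable and CPU count."""
    with open(path, "w") as jdl:
        jdl.write(MPI_JDL_TEMPLATE.format(
            executable=executable, nodes=nodes, arguments=arguments))

# Example: a 4-process job running a (hypothetical) mpi-test binary.
write_mpi_jdl("mpi-test.jdl", "mpi-test", 4)
```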
23
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200723 Operational & monitoring tools Operational & monitoring tools deployment status HGSM – Turkey SAM (+ porting to MySQL) – Bosnia and Herzegovina with CERN support BBmSAM (Bosnia and Herzegovina) WiatG R-GMA – Bulgaria Pakiti
24
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200724 HGSM is a central database that holds all crucial information about a grid site It has an interactive interface that lets users view and update the information in the database It provides exports that allow other services to integrate with HGSM and make use of its information HGSM is constantly improved with new features and new exports to enhance its functionality, its role, and its information exchange with other grid services for administrative purposes HGSM (1): Background
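Since HGSM's exports are meant to be consumed by other services, a sketch of such an integration is shown below. The export URL and the XML element/attribute names are assumptions for illustration only; the real export format should be taken from the HGSM documentation.

```python
# Sketch of a consumer of an HGSM export (URL and XML layout are hypothetical).
import urllib.request
import xml.etree.ElementTree as ET

HGSM_EXPORT_URL = "https://hgsm.example.org/export/sites.xml"  # placeholder

def fetch_sites(url=HGSM_EXPORT_URL):
    """Download a site export and return (name, country, ce_host) tuples."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    sites = []
    for site in tree.getroot().findall("site"):       # assumed element name
        sites.append((site.get("name"),               # assumed attributes
                      site.get("country"),
                      site.findtext("ce")))
    return sites

if __name__ == "__main__":
    for name, country, ce in fetch_sites():
        print(f"{country:4s} {name:20s} CE={ce}")
```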
25
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200725 The period started in May 2007, when requests for new functionality and enhancements appeared on the GIM list due to HGSM's inefficiency in some areas. Since the requests were substantial and required some major changes to the HGSM structure, an "Internals Document" was prepared and passed to the people in the discussion at the end of May. During June, the requests were analyzed and refined so that everybody was aware of the missing parts, possible solutions and impacts of the new functionality HGSM (2): Latest Development Period (Initiation and Planning)
26
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200726 At the end of June a detailed "Roadmap Document" - covering the problems, solutions and a step-by-step action plan in every detail - was prepared and opened for discussion on the GIM list. The document was revised and updated according to feedback from the list. It went through 4 revisions: three drafts and a final version, published on Jul 12, 2007. Active development started on Jul 16, according to the plan announced in the document. HGSM (3): Latest Development Period (Finalizing and Taking Action)
27
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200727 The following improvements are qualified for this development period (progress so far in parentheses; dates indicate rough estimates of kick-off dates): Revision of visual interface fields (100%). Introduction of a universal exporting subsystem which can be used by humans and other computers (80%). Introduction of an importing subsystem which can import exported data for administrators (0%, 10 Sep). An automatic field filling tool for administrators (0%, 24 Sep). Functionality improvements on various pages of the visual interface (0%, 8 Nov). Improved handling of supported applications in HGSM and a MAUI configuration editing tool for reflecting changes to grid sites directly and automatically (0%, 22 Nov). Enabling HGSM to track site information history for statistical purposes (0%, 3 Dec). HGSM (4): The Improvements (and Progress So Far)
28
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200728 SAM/BBmSAM SAM Server Portal BBmSAM Service availability and SLA calculations implemented MAINTENANCE status implemented in a better way Enhanced OVERVIEW (main) page of BBmSAM –Showing uptime for last 24h –Filters available for: country, tier, certified status, last test state BBmobileSAM Now also showing uptime percentage for last 24h HGSM integration Preparation for HGSM shift to new version Database Now running strictly off MySQL, no Oracle used Reorganization of indexes – improved performance
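The sketch below illustrates the kind of calculation behind the "uptime for last 24h" figure mentioned above: given a time-ordered list of SAM test results for a service, it computes the fraction of the last 24 hours spent in an OK state. The sample data and the data layout are invented for the example; BBmSAM's own schema and rules may differ.

```python
from datetime import datetime, timedelta

# Invented sample data: (test time, status) for one service, oldest first.
results = [
    (datetime(2007, 9, 10, 14, 0), "ok"),
    (datetime(2007, 9, 10, 22, 0), "error"),
    (datetime(2007, 9, 11, 1, 0),  "ok"),
]

def uptime_last_24h(results, now=datetime(2007, 9, 11, 14, 0)):
    """Percentage of the last 24 hours during which the last known status was 'ok'."""
    window_start = now - timedelta(hours=24)
    ok_seconds = 0.0
    # Walk through consecutive result pairs; each status holds until the next test.
    for (t0, status), (t1, _) in zip(results, results[1:] + [(now, None)]):
        start, end = max(t0, window_start), min(t1, now)
        if status == "ok" and end > start:
            ok_seconds += (end - start).total_seconds()
    return 100.0 * ok_seconds / (24 * 3600)

print(f"uptime last 24h: {uptime_last_24h(results):.1f}%")
```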
29
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200729 WiatG: Introduction Web application for visualization of BDII information http://bdii.phy.bg.ac.yu/WiatG/pl/WiatG.pl Used as an operational tool for site monitoring Highly responsive tool because it uses AJAX Partial refresh (client receives the page part by part) Asynchronous (server is processing in the background, so one may send several requests) The current version looks for: CE, gCE, RB, gRB, SE, LFC, FTS and GridICE Documentation available: http://wiki.egee-see.org/index.php/WiatG SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
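For illustration, the same kind of BDII lookup can be sketched in Python (WiatG itself uses Perl, as noted on the technologies slide). The sketch assumes the third-party ldap3 package and the standard Glue schema/BDII conventions (LDAP port 2170, base mds-vo-name=local,o=grid); the chosen BDII host is the one published above.

```python
# Sketch of a BDII query similar to what WiatG does over LDAP.
# Assumes the third-party ldap3 package is installed.
from ldap3 import Server, Connection, ALL

BDII_HOST = "bdii.phy.bg.ac.yu"          # BDII behind the WiatG instance above
BDII_BASE = "mds-vo-name=local,o=grid"   # standard Glue information base

def list_computing_elements(host=BDII_HOST, port=2170):
    """Return (GlueCEUniqueID, GlueCEStateStatus) for every CE in the BDII."""
    conn = Connection(Server(host, port=port, get_info=ALL), auto_bind=True)
    conn.search(BDII_BASE,
                "(objectClass=GlueCE)",
                attributes=["GlueCEUniqueID", "GlueCEStateStatus"])
    return [(str(e.GlueCEUniqueID), str(e.GlueCEStateStatus)) for e in conn.entries]

if __name__ == "__main__":
    for ce, status in list_computing_elements():
        print(f"{status:12s} {ce}")
```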
30
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200730 WiatG: Who is using it Several regional projects EUMedGRID (bdii.isabella.grnet.gr) EUChinaGrid (euchina-bdii-1.cnaf.infn.it) EELA (lnx112.eela.if.ufrj.br) BalticGrid (bdii.mif.vu.lt) Int-EU-Grid (i2g-ii01.lip.pt) Health-e-Child(hec-maat-server2.cern.ch) http://hec-maat-server1.cern.ch/WiatG/pl/WiatG.pl ROC CERN PROD (lcg-bdii.cern.ch) PPS (pps-bdii.cern.ch) OPS (sam-bdii.cern.ch) SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
31
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200731 WiatG: Technologies The following technologies are used in WiatG: Perl is used for the LDAP connection to the BDII and generation of HTML and XML data. XML, the format for sending data from the web server to the client. Cascading Style Sheets (CSS), a style sheet language used to define the presentation of a page. JavaScript, a scripting language. XMLHttpRequest, an object that is used to exchange data between web client and web server. Document Object Model (DOM), which provides a logical view of a web page as a tree structure. SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
32
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200732 WiatG: Architecture [Architecture diagram: the browser client (user interface, JavaScript, XMLHttpRequest and its callback) exchanges HTTP requests and XML/HTML/CSS data with the WiatG server side (Apache HTTP server, LDAP-to-XML script), which in turn queries the BDII over LDAP.] SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
33
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200733 WiatG in action SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
34
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200734 WiatG: Further development (short term) Addition of new services (MyProxy, localLFC, VO software tags, …) Development of the new tool “What should be at the Grid” (WsbatG) Based on the site configuration exported from HGSM/GOCDB Visually identical tool, providing the expected status of BDII in WiatG SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
35
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200735 WiatG: Further development (long term) [Diagram of the planned long-term architecture: WiatG and WsbatG feed a WiatGS user interface and an alarms dashboard UI; web services in front of the BDII, sBDII and HGSM/GOCDB allow checks of the correctness of BDII, sBDII and HGSM/GOCDB data, of the equality of sBDII-BDII information and of BDII-HGSM/GOCDB information, with links to SAM.] SEE-GRID-2 PSC05 meeting, Thessalonica, Greece - September 11-12, 2007
36
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200736 R-GMA (1) Accounting views for SEEGRID-only sites per site accounting: ● https://gserv1.ipp.acad.bg:8443/Accounting
37
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200737 R-GMA Accounting views for SEEGRID and EGEE sites that support SEEGRID – per country/institution user accounting https://gserv1.ipp.acad.bg:8443/Accounting-2
38
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200738 R-GMA (2) Accounting views for Job success rates and other statistics –in progress, currently running on our own data only. https://gserv1.ipp.acad.bg:8443/Jmon
39
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200739 R-GMA (3) Accounting views for Job success rates and other statistics –in progress, currently running on our own data only. https://gserv1.ipp.acad.bg:8443/Jmon
40
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200740 Pakiti: Overview Pakiti Client Installed on all nodes Checks software versions against configured repositories Sends a report once per day to the Pakiti server Pakiti Server Running at the Aristotle University of Thessaloniki Main Components: Feed – Daily reports from clients Site Administrator's front-end – Detailed view of the rpm package status at each node – Access is permitted only to the administrators of each site via TLS Authentication using X.509v3 Certificates Addon Components ROC Manager's front-end – Aggregated view of the status of all the sites in the ROC – Developed by the AUTH GOC
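A heavily simplified sketch of what such a client does is shown below: collect the installed package list and post it to a report collector once a day (e.g. from cron). The report URL and payload format are placeholders, not the real Pakiti protocol, and the real client is configured per site.

```python
# Minimal sketch of a Pakiti-style client (placeholder URL and payload format).
import socket
import subprocess
import urllib.parse
import urllib.request

REPORT_URL = "https://pakiti.example.org/feed/"   # placeholder collector

def installed_packages():
    """Return the installed RPMs as 'name version release arch' lines."""
    out = subprocess.run(
        ["rpm", "-qa", "--qf", "%{NAME} %{VERSION} %{RELEASE} %{ARCH}\n"],
        capture_output=True, text=True, check=True)
    return out.stdout

def send_report(url=REPORT_URL):
    """POST the host name and package list to the collector."""
    data = urllib.parse.urlencode({
        "host": socket.getfqdn(),
        "packages": installed_packages(),
    }).encode()
    with urllib.request.urlopen(url, data=data) as response:
        return response.status

if __name__ == "__main__":
    print("report accepted" if send_report() == 200 else "report failed")
```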
41
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200741 Pakiti: Status of the Service Pakiti enabled sites in SEE-GRID ROC: Bosnia Herzegovina BA-04-PMFSA Bulgaria BG01-IPP BG02-IM BG04-ACAD BG05-SUGRID Croatia HR-01-RBI Greece HG-01-GRNET HG-03-AUTH Romania RO-01-ICI Serbia AEGIS01-PHY-SCL AEGIS02-RCUB AEGIS03-ELEF-LEDA AEGIS04-KG AEGIS05-ETFBG Turkey TR-01-ULAKBIM TR-05-BOUN
42
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200742 Pakiti and reporting The deployment of Pakiti on sites is voluntary Sites deploying it provide accurate information on the update status of their nodes Proposal: in order to further improve the status of SEE-GRID sites, all sites should report the following in their 3M reports: Middleware version changes during the quarter Status of updates (not needed if Pakiti is deployed) Major operational issues A template will be provided for this; if Pakiti is deployed, such a report would just contain one line with the gLite version change and a paragraph or two describing operational issues encountered (basically, a structured version of section 2.3 of the 3M reports now submitted by each site)
43
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200743 SEE-GRID reorganized Wiki The reorganized SEE-GRID Wiki is now the main Wiki http://wiki.egee-see.org/index.php/SEE-GRID_Wiki Many documents are still missing, the main ones being: Participating in SEEGRID as a Site (AP1) Policy Documents (AP3) SEEGRID certification procedure (AP4) LFC (AP6) RGMA (AP7) My Proxy (AP8) BDII/RB (AP9) FTS (AP10) SAM (AP11) GridICE (AP12) How to Join the SEEGRID infrastructure as a user (AP14) Grid usage basics (AP16) Effort needed by all partners; UKIM coordinating
44
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200744 WP3 developments HGSM WiatG Accounting glite-yaim-seegrid soon-to-be deployed apt/yum repository
45
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200745 Infrastructure, Site and VO metrics Infrastructure growth (CPUs, storage, memory?) Bandwidth growth? Site availabilities and downtimes (CE, SE) Accounting data (per site, per country, per application, per VO, per user community, time distribution); here SEEGRID VO and national VOs should be considered Job success rates (per site, per country, per application, per VO, per user community, time distribution); here SEEGRID VO and national VOs should be considered VO membership time evolution, distribution per country, per application, per user community; here SEEGRID VO and national VOs should be considered
46
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200746 A3.2: Status Concerning bandwidth-on-demand, some tests were done in order to investigate the following protocols and services using our new router - CISCO 12000 XR: Resource Reservation Protocol (RSVP); Generalized Multi Protocol Label Switching (GMPLS); Differentiated Services (DiffServ); Standard activities for BoD.
47
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200747 A3.3: Overview of the work CA establishment in the SEE Region Each country must set up its own national Certification Authority Each CA must be accredited by the EUGridPMA see-ca-incubation mailing list (see-ca-incubation@grid.auth.gr) Support during the process of establishing a new CA and during the accreditation period Advice on common CA procedures and best practices is provided Help on writing the CP/CPS documents The process for establishing a new CA takes around one year
48
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200748 A3.3: Status Newly accredited CAs in the Region Serbian CA (AEGIS CA) Accreditation request on August 23, 2006 Under review by GridAUTH team and SRCE CA (on behalf of EUGridPMA) Accredited on June 1, 2007 Operational since June 10, 2007 Romanian CA (ROSA CA) Accreditation request on January 25, 2006 Under review by GridAUTH team, PK-GRID CA and CESNET CA (on behalf of EUGridPMA) Accredited on August 1, 2007 Grid CA candidates Montenegro CA (MREN CA) CP/CPS reviewed by GridAUTH (via see-ca-incubation mailing list) on July 10, 2007 F.Y.R.O.M. CA (MARGI CA) Accreditation request on May 4, 2007 First CP/CPS not yet available All candidates are encouraged to participate in the EUGridPMA meetings (next meeting in Thessaloniki, Sept 19-21)
49
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200749 A3.4: Overview Official SEE-GRID2 Portal user maintenance (52 users) Quota management, user management Official SEE-GRID2 Portal maintenance Software (P-GRADE Portal version 2.5) P-GRADE Portal software bug fixing
50
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200750 A3.4: Status and development Portlet development has been done by the Turkish partner (P-GRADE Portal Development Alliance) Already active (beta test) in a private portal installation. New portlets File Management Portlet to manage remote files through the LFC catalog and the LCG interface. Hot topic! Intended to be merged into the official P-GRADE Portal v2.5. Credential management portlet to complement the existing certificate portlet with info, change pass-phrase and destroy operations. Intended to be merged with the default certificates portlet.
51
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200751 WP3 country reports Greece Switzerland/CERN Bulgaria Romania Turkey Hungary Albania Bosnia and Herzegovina FYR of Macedonia Serbia Montenegro Moldova Croatia Work performed since PSC04 Conformance to WP3 objectives Issues, if any
52
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200752 Greece Pakiti Service enabled for the SEE-GRID infrastructure https://monitor.grid.auth.gr/services/pakiti/ROC/SEE-GRID In October 2007 HG-06-EKT will support the SEEGRID VO: 228 CPUs, 9.3 TB storage Exact resources dedicated to the SEEGRID VO are to be decided.
53
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200753 Support to WP3 operations GOOD shifts Solving operational problems Support for SAM deployment and improvements, as well as liaison activities with SAM development team Liaison activities with operations in other regional Grid projects Strong involvement in operations-related developments: WiatG, WsbatG SEE-GRID apt/yum repository Switzerland/CERN
54
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200754 Bulgaria (1) Status of the infrastructure and plan for expansion 5-site infrastructure – significantly stable with good up-time All sites are now upgrading to SL4 WNs. Core services, monitoring tools: R-GMA: graphical on-line user interface Accounting views for –SEEGRID-only sites per site accounting –SEEGRID and EGEE sites that support SEEGRID – per country/institution user accounting –Job success rates and other statistics – in progress, currently running on our own data only. FTS: used in production by SALUTE WMS – moved to better hardware. BDII stability and capability improvements
55
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200755 Bulgaria (2) 5 production sites in BG After the start of EGEE II, a new cluster was added with 80 CPUs and a low-latency Myrinet interconnect for the 80 CPUs – a unique resource for special MPI jobs BG04-ACAD (80 CPU) BG01-IPP (12 CPU) Resource growth (CPU / Storage / Tape): April 06 – 30 CPUs, 1 TB, no tape; September 07 – 132 CPUs, 3.2 TB, 10 TB
56
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200756 Bulgaria (3): Petri net performance analysis Anastas Misev from Macedonia visited IPP, sponsored by the BIS 21++ project, working on his Ph.D. thesis on Grid scheduling and failover. The host professor was E. Atanassov His analysis of our RB rb001 shows that: The overall success rate of the analyzed data is somewhere near 70% The percentage of successful jobs greatly depends on the user's experience. Jobs by more experienced users have a success rate above 80%. An interactive diagram helps in identifying the bottlenecks in the process model. It can show the throughput time between any 2 transitions Color coding to indicate low, middle and high waiting times Conclusions: Job re-submission does not help in 99% of cases 20% of the 470 jobs submitted by one user had waiting times above 57 hours, and all of them failed.
57
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200757 Bulgaria (4): Job success rate – Top ten CEs We have made a pivot analysis of the CEs and the final statuses of the jobs. The top 10 CEs are shown in the table below. Note that the percentage of successful jobs is more than 90%.
Top 10 CEs – jobs per final status (DONE / ABORT / CANCEL / Grand Total):
ce.ulakbim.gov.tr:2119/jobmanager-lcgpbs-see: 754 / 222 / - / 976
ce002.ipp.acad.bg:2119/jobmanager-lcgpbs-see: 639 / 66 / - / 705
ce01.ariagni.hellasgrid.gr:2119/jobmanager-pbs-see: 1317 / 2 / - / 1319
ce01.athena.hellasgrid.gr:2119/jobmanager-pbs-see: 3008 / 105 / - / 3113
ce01.isabella.grnet.gr:2119/jobmanager-pbs-see: 768 / 358 / 1 / 1127
ce01.kallisto.hellasgrid.gr:2119/jobmanager-pbs-see: 2188 / 193 / - / 2381
ce01.marie.hellasgrid.gr:2119/jobmanager-pbs-see: 1296 / 40 / - / 1336
ce02.grid.acad.bg:2119/jobmanager-pbs-myrinet: 1120 / 19 / 2 / 1141
ce02.grid.acad.bg:2119/jobmanager-pbs-see: 2883 / 160 / - / 3043
ce101.grid.ucy.ac.cy:2119/jobmanager-lcgpbs-see: 1008 / 3 / - / 1011
Grand Total: 14981 / 1168 / 3 / 16152
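The sketch below reproduces the kind of pivot used for this table: counting job final statuses per CE and reporting the success rate. The sample records are invented; in practice the input would come from the L&B/accounting data behind the analysis above.

```python
from collections import Counter, defaultdict

# Invented sample of (CE name, final status) records.
records = [
    ("ce.ulakbim.gov.tr:2119/jobmanager-lcgpbs-see", "DONE"),
    ("ce.ulakbim.gov.tr:2119/jobmanager-lcgpbs-see", "ABORT"),
    ("ce002.ipp.acad.bg:2119/jobmanager-lcgpbs-see", "DONE"),
]

pivot = defaultdict(Counter)
for ce, status in records:
    pivot[ce][status] += 1

total = Counter()
for ce, counts in sorted(pivot.items()):
    total.update(counts)
    done, all_jobs = counts["DONE"], sum(counts.values())
    print(f"{ce}: DONE={done} ABORT={counts['ABORT']} "
          f"CANCEL={counts['CANCEL']} success={100.0 * done / all_jobs:.1f}%")

print(f"overall success rate: {100.0 * total['DONE'] / sum(total.values()):.1f}%")
```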
58
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200758 Accredited by EUGridPMA on March 05, 2007 Included in the IGTF CA RPM distribution from version 1.13 Effective operations started March 21, 2007 Web-page: http://www.ca.acad.bg/ Location: IPP-BAS, Sofia, Bulgaria Personnel: 4 CA staff members; 2 RA. Bulgaria (5): BG.ACAD CA – Status overview
59
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200759 Bulgaria (6): BG.ACAD CA – Status overview During the period March-August 2007: A total of 40 certificates were signed by BG.ACAD CA, including: 30 user certificates 10 host certificates A total of 3 certificates were revoked by BG.ACAD CA. Regular patches and updates to the CA's OS and software are applied.
60
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200760 Bulgaria (7): BG.ACAD CA - Development A concise end-user guide has been written and published on the web-site. It covers the basics of the application process. A shell script for easier certificate request generation has been developed and published. It contains step-by-step instructions and examples. Three cron jobs have been developed on the CA's web server. These scripts monitor the following: Validity of the published certificates Expiration of the published certificates Expiration of the published CRL Instant e-mail notifications to the CA's staff members are provided.
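A sketch of one such monitoring job is given below: it uses the standard openssl command line to read the notAfter/nextUpdate fields of a certificate and the CRL and warns when expiry is near. The file paths and the warning threshold are placeholders, and the real cron jobs presumably send e-mail rather than just printing.

```python
# Sketch of an expiration-monitoring cron job for published certificates and
# the CRL, in the spirit of the scripts described above (placeholder paths).
import subprocess
from datetime import datetime, timezone, timedelta

WARN_BEFORE = timedelta(days=14)   # placeholder threshold

def openssl_date(args):
    """Run openssl and parse a 'notAfter=...' / 'nextUpdate=...' style line."""
    out = subprocess.run(["openssl"] + args, capture_output=True,
                         text=True, check=True).stdout.strip()
    value = out.split("=", 1)[1]
    return datetime.strptime(value, "%b %d %H:%M:%S %Y %Z").replace(tzinfo=timezone.utc)

def check(label, expires):
    """Print a warning if the given expiry time is within the threshold."""
    left = expires - datetime.now(timezone.utc)
    if left < WARN_BEFORE:
        print(f"WARNING: {label} expires in {left.days} days ({expires:%Y-%m-%d})")

check("CRL", openssl_date(["crl", "-noout", "-nextupdate", "-in", "/var/www/ca/crl.pem"]))
check("certificate", openssl_date(["x509", "-noout", "-enddate", "-in", "cert.pem"]))
```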
61
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200761 The Romanian CA was accredited by the EUGridPMA and will be operated by the Romanian Space Agency (ROSA) Site operational problems Technical: air cooling (RO-03-UPB), room renovation (RO-06-UNIBUC) Non-technical (vacations and other personnel issues): all RO-01-ICI, RO-03-UPB, RO-05-INCAS, RO-06-UNIBUC: uncertified status RO-07-NIPNE, RO-08-UVT: certified Objective no. 1: Re-certify all the sites by 1 October Migrate RO-05-INCAS to EGEE if possible Romania
62
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200762 Turkey (1): Sites Operation/Ticket Handling A new EGEE site (TR-05-BOUN) was added at the beginning of April 2007. Dedicated resources: TR-01-ULAKBIM site (48 CPUs for seegrid, 16 CPUs for sgdemo) and TR-05-BOUN (8 CPUs for seegrid). Migration from Classic SE to DPM has been completed at TR-01-ULAKBIM and TR-05-BOUN. DPM ownership patches have been applied. SEE-GRID-2 accounting patches have been applied.
63
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200763 Turkey (2): Sites Operation/Ticket Handling Although seegrid jobs have run successfully, there have been frequent SAM failures at TR-05-BOUN in the last three months due to unknown prd/sgm account problems. We will compensate for this lack of availability with the forthcoming good performance of the site. During October 2007, the SEE-GRID-2 TR-* sites are planned to be upgraded to Scientific Linux 4.5 together with the gLite 3.1 middleware. Periodic updates and security patches have been applied to all SEE-GRID-2 TR-* sites. Regular user and site problems were handled through the SEE-GRID and national helpdesks.
64
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200764 Turkey (3): Core services Smooth operation of the following core services has been enabled: Core services supporting seegrid VO: RB, BDII, WMS, MYPROXY, P-GRADE Portal Core services supporting sgdemo VO: RB, BDII, WMS, MYPROXY, LFC, P-GRADE Portal RB/WMS statistics have been provided for D3.1b.
65
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200765 Turkey (4): P-GRADE portal The File Manager Portlet for Remote Storage Elements has been reviewed and tested together with SZTAKI. The specification of the file management portlet for remote storage elements was co-authored by SZTAKI and METU, and the portlet is being developed by METU. The portlet supports: LFC interaction commands for directory management and file management through the LCG file naming conventions, namely LFN and GUID. The Credential Manager Portlet for MyProxy has been reviewed and tested. The portlet is to be integrated with the P-GRADE Grid Portal.
66
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200766 Hungary OS upgrade to SL 4.4 gLite upgrade to 3.0x Recabling of the internal network (new switch deployed) Maintenance Grid-Operator-On-Duty (GOOD): 1 week in May/June Infrastructure support, resolving Helpdesk tickets (script installation/hardware changes/security updates)
67
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200767 Albania (1): Overall progress Change of CA Follow-up of certificate problems Received local funding for equipment Installed gLite on most of the equipment Creation of new sites New experimental site at INIMA New experimental site at FIE Preparations for the site of FNS Preparations for the site of FECO Plans for the Universities of Elbasani and Shkodra
68
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200768 Albania (2): INIMA Old site AL-01-INIMA following upgrades New site AL-04-INIMA with 9 nodes (4 nodes will be transferred to other universities) Problems with new status and building of INIMA …
69
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200769 Albania (3): FIE Power supply problems; some money will have to be spent to resolve the problem, i.e. to buy inverters. Switched off during vacations
70
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200770 Albania (4): FNS Cluster installed Problems with real IPs in some institutions Administrative problems in getting a separate Internet link … …
71
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200771 Bosnia and Herzegovina (1) BA-01-ETFBL functioning correctly New 4 x WN (C2D, 1 GB RAM, 80 GB HDD) New SE – C2D, 2GB RAM, 2x320GB HDD BA-03-ETFSA New server node - HP ML110G4, X3040, 4GB, 2x160GB New 11 WNs - HP dc5750, MT A64-35, 1GB, 160GB New Switch BA-04-PMFSA New 4 x WN (C2D, 1 GB RAM, 160 GB HDD) New Switch BA total now: Total CPUs: 50+ Total Storage: 1+ TB Availability much better
72
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200772 Bosnia and Herzegovina (2) SAM Server Portal BBmSAM Service availability and SLA calculations implemented MAINTENANCE status implemented in a better way Enhanced OVERVIEW (main) page of BBmSAM –Showing uptime for last 24h –Filters available for: country, tier, certified status, last test state BBmobileSAM Now also showing uptime percentage for last 24h HGSM integration Preparation for HGSM shift to new version Database Now running strictly off MySQL, no Oracle used Reorganization of indexes – improved performance
73
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200773 FYR of Macedonia (1): Cluster addition equipment MK-01-UKIM cluster 18 new nodes added using the VirtualBox WN, tested with the support of Antun Currently installed on the old CE node, but a new CE will be installed to support these WNs By the end of the year 16 new CPUs will be installed (non-VirtualBox) 2 TB storage
74
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200774 FYR of Macedonia (2): Cluster addition equipment MK-02-ETF cluster 24 new CPUs added By the end of September a new SE and CE will be installed SE with 1 TB storage
75
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200775 FYR of Macedonia (3): New Clusters MK-03 Still in progress Hardware purchase is done Consultations on the initial installation
76
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200776 FYR of Macedonia (4): Other activities Wiki pages will be provided for the installation of the VirtualBox WN CA status: Software is installed. We will proceed in September with the review by the EUGridPMA. Our representative attended the last EUGridPMA meeting in Turkey.
77
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200777 Serbia (1): Infrastructure status 6 sites across the country Current number of CPUs: 195 (increase of 43 compared to PSC04) Storage: 0.4 TB (approx. the same) Expansion plans All 3rd parties already have a site We expect two new sites, one in Novi Sad (Faculty of Agriculture, University of Novi Sad), and one in Nis (IRVAS SME) Hardware delivery for AEGIS01-PHY-SCL is expected this week, 32 64-bit cores, and storage upgrade to more than 20 TB; another purchase is being finalized (more CPUs) AEGIS will propose hardware purchase from the Serbian National Investment Plan for the whole NGI
78
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200778 Serbia (2): Core services / SEEGRID Resources LCG-RB, GLITE-WMS, BDII, MyProxy, LFC at IPB LFC at UOB for SEEGRID and SGDEMO VO VOMS for AEGIS VO, can be deployed as a backup for SEEGRID VO if necessary Support to T-infrastructure In all core services In sites: AEGIS02-RCUB, AEGIS04-KG
79
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200779 CP/CPS Document finalised Object identifier: 1.3.6.1.4.1.23658.10.1.1.0 Date: 02 December 2006 DNs: Issuer: C=RS, O=AEGIS, CN=AEGIS-CA Subject: C=RS, O=AEGIS, OU=XXX, CN=Subject-name Country: Must be “RS” Organization: Must be “AEGIS” OrganizationUnit: Must be the name of the subject's institute CommonName: First name and last name of the subject for user certificates, DNS Serbia (3): AEGIS CA
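As a small illustration of how RA/CA tooling can enforce this DN profile, the sketch below checks a subject DN string against the structure defined above (C=RS, O=AEGIS, OU=institute, CN=name). The checking logic itself is an assumption for illustration; the AEGIS CA's actual tooling is not described here.

```python
# Sketch of a check that a subject DN matches the AEGIS CA profile above;
# illustration only, not part of the CA's actual software.
import re

AEGIS_SUBJECT = re.compile(
    r"^C=RS,\s*O=AEGIS,\s*OU=(?P<institute>[^,]+),\s*CN=(?P<cn>[^,]+)$")

def check_subject(dn):
    """Return (institute, common name) if the DN follows the profile, else raise."""
    match = AEGIS_SUBJECT.match(dn)
    if not match:
        raise ValueError(f"DN does not follow the AEGIS CA profile: {dn!r}")
    return match.group("institute"), match.group("cn")

print(check_subject("C=RS, O=AEGIS, OU=Institute of Physics Belgrade, CN=Some User"))
```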
80
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200780 Accreditation request on August 23, 2006 Under review by GridAUTH team and SRCE CA (on behalf of EUGridPMA) Accredited on June 1, 2007 Operational since June 10, 2007 Already issued 53 user and host certificates Serbia (4): AEGIS CA
81
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200781 Serbia (5): Other activities Leading WP3; overall activities coordination and representation at various Grid meetings Providing core services Wiki contributions: GLITE-3 guide SL4.5 WN gLite-3.1 guide for 32 bit and 64 bit architectures Grid-Operator-On-Duty, doing it on shifts and coordinating MW deployment, assessment, upgrade coordination Operations coordination Development coordination Collaboration with EGEE Collaboration with other regional Grid projects Development involvement Problems identification, support for debugging and patching Customizations of YAIM and providing glite-yaim-seegrid
82
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200782 Montenegro Sites & upgrades 1 site (MREN-01-CIS) WN: upgrade from 4 to 24 CPUs and upgrade to SL 4.4 Storage: upgrade to 0.54 TB Migration from Classic SE to DPM SE MPI support No centralized SEE-GRID services at UoM CA The CP/CPS document was written and sent to the see-ca-incubation mailing list for approval and suggestions Sites operation and ticket handling status/problems A few problems with the DPM SE installation
83
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200783 Moldova (1) MD-01-TUM site configuration, set-up and internal test procedures MD-01-TUM site hardware issues resolved (they caused a delay in internal tests and further site operation) Working on CA service development Learning from other countries' CA experience. Study of EUGridPMA documents and selection of those appropriate for the conditions in Moldova Determination of future sites' hardware configuration Organization of the tender for purchasing the equipment according to the MoUs with 3 institutions (Institute of Mathematics and Computer Science, State University of Medicine and Pharmaceutics, Faculty of Radio Electronics of the Technical University of Moldova)
84
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200784 Moldova (2) New sites are expected to join the MD-GRID infrastructure by the end of the project: MD-02-IMI site, which will be installed at the Institute of Mathematics and Computer Science (8 Intel Xeon quad-core CPUs, 1.5 TB of storage) MD-03-SUMP site, which will be installed at the State University of Medicine and Pharmaceutics (5 Intel PIV CPUs, 1 TB of storage) MD-04-RENAM, which will be placed in the FRE TUM (Faculty of Radio Electronics of the Technical University of Moldova) NOC of the RENAM Association (8 Intel Xeon dual-core CPUs, 2 TB of storage)
85
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200785 Croatia (1): Site status HR-01-RBI site WNs upgraded to Debian 4.0 gLite 3.1 TAR sgdemo and MPI enabled node packages up-to-date HR-02-GRF site hardware being purchased 4 nodes ~ 30 CPUs: 2 x Intel Xeon 5310 QuadCore CPU, 1.6 GHz, 8 MB L2C 8 GB ECC FBD RAM 2 x 500 GB SATA HDD 2 x Gigabit Ethernet
86
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200786 Croatia (2): Other activities VOMS server regular maintenance primary for seegrid backup for sgdemo, see National CA operated by SRCE GOOD shifts Wiki updates BDII response time standalone SAM VOMS configuration local user support for middleware problems preparation of review reports about seegrid VO
87
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200787 WP3 APs and Issues (1) Communication / response problems Deadlines must be reasonably set but also respected All sites need to resolve their operational problems, and solve all Helpdesk tickets, esp. the outstanding ones SLA conformance monitoring will continue Helpdesk improvements needed – statistics extraction needs to be perfected; currently this is difficult (end of October) Wiki reorganization and updates to be finished ASAP SEEGRID VO VOMS application roles to start being used by WP4 application developers ASAP Application-level accounting implemented partially; should be fully implemented and used ASAP
88
SEE-GRID-2 PSC05 Meeting - Thessaloniki, Greece, 11-12 September 200788 WP3 APs and Issues (2) Moldova to join the infrastructure by finishing certification of the MD-01-TUM site ASAP 3M reporting of sites to include information on updates and operational problems according to the template (to be provided) Partners should be more responsible when performing GOOD shifts MPI WG to be established, to define a standard for the MPI setup of SEE-GRID sites, and to finish its work by mid-October Infrastructure, site and VO metrics to be precisely defined What happened to the live UI (Boro)? SEEGRID VO commitments ASAP Review of critical SAM tests