www.see-grid.eu SEE-GRID-2 The SEE-GRID-2 initiative is co-funded by the European Commission under the FP6 Research Infrastructures contract no. 031775.


SEE-GRID-2
The SEE-GRID-2 initiative is co-funded by the European Commission under the FP6 Research Infrastructures contract no. 031775.
WP3 progress report
Antun Balaz, WP3 Leader, Institute of Physics, Belgrade
SEE-GRID-2 PSC05 Meeting, Thessaloniki, Greece, 11-12 September 2007

Overview
 WP3 objectives and activities
 WP3 position in SEE-GRID-2
 WP3 deliverables and milestones
 WP3 schedule
 WP3 activities reports
 WP3 country reports
 WP3 action points and issues

WP3 objectives & activities (1)
Develop the next-generation SEE-GRID infrastructure
 The next generation of EGEE middleware (gLite) - the VOMS, the WMS, information services and file catalogue services - will be assessed with project and WP3 objectives in mind.
 SEE-GRID infrastructure deployment will follow and adapt its middleware services according to the results of this assessment.
Support deployment and operations of the Resource Centres
 Next-generation monitoring services will be deployed to support across-the-board infrastructure monitoring.
 The current SEE-GRID helpdesk will be expanded in SEE-GRID-2, with the main goal of full EGEE interoperability.
 Support the expansion and overall upgrade of the current infrastructure by proliferation of RCs in each SEE country, increasing:
 – the total available regional resources (CPUs, storage, etc.), thus boosting the capacity and reliability of Grid service provision at the regional level, and
 – the diversity and distribution of participating teams per country, thus strengthening cooperation and collaboration at the national level.

WP3 objectives & activities (2)
Network resource provision and assurance
 WP3 will also deal with network resource provision, in close cooperation with the SEEREN2 project, thus ensuring stable connectivity for the RCs in the region.
 Attention will be paid to Bandwidth-on-Demand requirements, to cater for bandwidth-intensive applications in case they need dedicated resources for particular experiments.
CA and RA guidelines and deployment
 The regional SEE-GRID catch-all Certification Authority (CA) will continue to operate, providing certificates for countries without a CA.
 The experienced CA team will provide support for per-country CA deployment and accreditation. The cycle to establish a national Grid CA will comply with the procedures and accreditation process of the EU Grid Policy Management Authority (EUGridPMA).
 Operations will be strengthened to support per-country CA operations.

WP3 objectives & activities (3)
User portal deployment and operations
 A user-friendly multi-Grid access portal will be deployed, enabling universal and more flexible user access to the regional infrastructure.
 Work on the SEE-GRID-2 portal should make it easier to select a grid and execute a workflow on the selected grid, so that interoperability of different grids is solved seamlessly and transparently at the application (workflow) level.

WP3 objectives & activities (4)
A3.1 - Implementation of the advanced SEE-GRID infrastructure (UOB-IPB/IPP)
 Deals with support for configuration, deployment and operations of the Resource Centres within the SEE-GRID pilot infrastructure, as well as transition of mature centres into EGEE.
 Effort: 89 PMs
 Subactivities:
 – Expand the existing SEE-GRID topology by inclusion of new sites per SEE country
 – Deploy m/w components and OS in SEE Resource Centres
 – Test the site installations in local and Grid mode
 – Operate the SEE-GRID infrastructure
 – Monitor the infrastructure performance and assess its usage
 – Certify and migrate SEE-GRID sites from the regional pilot to global production-level eInfrastructures

WP3 objectives & activities (5)
A3.2 - Network Resource Provision and BoD requirements (IPP)
 Supports liaison actions to ensure adequate network provision, including the requirements for Bandwidth-on-Demand, if and where necessary depending on the application.
 Effort: 39 PMs
A3.3 - Deploy and operate Grid CAs (GRNET)
 Provides CA and RA guidelines and helps establish per-country CAs to cover authentication issues
 Effort: 73 PMs
A3.4 - Provide a user portal (SZTAKI)
 Supports the deployment of a user-friendly and multi-grid interoperable portal for convenient Grid access and usage.
 Effort: 15 PMs

WP3 position in SEE-GRID-2 (1)
WP1: Project Administrative and Technical Management
WP2: Strategies & Policies
 A2.1: Study grid deployment solutions
 A2.2: Deliver sustainable roadmap for SEE NGIs
WP3: Infrastructure & Operations
 A3.1: Implementation (deploy, test, operate, monitor, certify, migrate) of an advanced SEE Grid Infrastructure
 A3.2: Network resource provision
 A3.3: Deployment and operational support for accredited Grid Certification Authorities
 A3.4: Grid access Portal
WP4: Users & Applications
 A4.1: Select multi-disciplinary applications
 A4.2: Adopt applications for the SEE user communities
 A4.3: Support deployed applications
 A4.4: Assess application usage
WP5: Training, Dissemination and Communication

WP3 position in SEE-GRID-2 (2)
 Results of WP2 will be used as inputs:
 – D2.1 - Regional and National Organisational and Policy Schemes
 – D2.2 - Sustainable organizational and operational approach
 – D2.3(a,b) - Sustainability and Impact Analysis of SEE National Grid Initiatives
 Results of WP3 are used as input to WP4
 All partners participate in WP3
 Activities start in M1 and end in M24
 WP3 planned budget is 672, €, or ~33.6% of the SEE-GRID-2 budget
 WP3 is planned to take 216 PMs, or ~34.2% of all SEE-GRID-2 PMs

WP3 Deliverables & Milestones (1)
D3.1a - Infrastructure Deployment Plan, M04 (CERN)
 Describes the envisaged infrastructure deployment execution plan to be followed in the region.
 Prepared and submitted on time
D3.2 - CA and RA guidelines for new candidates, M05 (GRNET/AUTH)
 Describes the guidelines and best practices of per-country CA and RA organization and policies.
 Prepared and submitted on time
D3.3 - Portal specifications and functionality, M06 (SZTAKI)
 Provides the characteristics and structure of the multi-grid user-oriented portal.
 Prepared and submitted on time

WP3 Deliverables & Milestones (2)
D3.1b - Infrastructure Deployment Plan, M14 (CERN)
 Final version of D3.1.
 Prepared and submitted on time
We are currently in M17.
Future:
D3.4 - Infrastructure overview and assessment, M23 (UOB-IPB)
 Presents an overview and assessment of the progress of the regional infrastructure and operations over the life of the project

WP3 Deliverables & Milestones (3)
M3.1 - Infrastructure Deployment Plan defined
 M04, Status: OK
M3.2 - CA and RA guidelines for new candidates defined
 M05, Status: OK
M3.3 - Portal operational across the pilot Grid
 M12, Status: OK

WP3 schedule
 M01 - Start of WP3
 M04 - Infrastructure Deployment Plan (M3.1, D3.1a)
 M05 - CA and RA guidelines for new candidates (M3.2, D3.2)
 M06 - Portal specifications and functionality (D3.3)
 M12 - Portal operational across the pilot Grid (M3.3)
 M14 - Final Infrastructure Deployment Plan (D3.1b)
 M17 - This is where we are currently
 M23 - Infrastructure overview and assessment (D3.4)
 M24 - End of WP3 (and of the project)

WP3 activities overview
 A3.1: Implementation of the advanced SEE-GRID infrastructure (UOB-IPB/IPP)
 A3.2: Network Resource Provision and BoD requirements (IPP)
 A3.3: Deploy and operate Grid CAs (GRNET)
 A3.4: Provide a user portal (SZTAKI)

A3.1: Overview
Infrastructure status
 gLite deployment status
 SEEGRID VO metrics and accounting
Operations
 SLA conformance monitoring per site
 Helpdesk ticket procedures and statistics analysis
 GOOD shifts
 MPI support on SEE-GRID sites
Operational & monitoring tools deployment & integration
 HGSM
 SAM (+ porting to MySQL)
 WiatG
 R-GMA
 Pakiti
SEE-GRID Wiki status
WP3 developments
Infrastructure, site and VO metrics

Infrastructure status (1)
Concerning middleware deployments, the current SEE-GRID infrastructure supports a set of core services which provide user access to resources:
 The catch-all Certification Authority for the region has been officially accredited by the EU Grid Policy Management Authority (EUGridPMA) and is currently operational, enabling regional sites to obtain user and host certificates
 A Virtual Organisation Management Service (VOMS) server has been installed as the authorization system for the SEE-GRID Virtual Organisation (VO); it provides information on a user's relationship with the VO: groups, roles and capabilities
 Workload management (lcg-RB and glite-WMSLB) and Information Service (BDII) nodes (several instances) have been installed at partners' sites and are operational
 MyProxy is operational and supports certificate renewal
 FTS is deployed and used in production

Infrastructure status (2)
[Chart: SEE-GRID total and free CPUs in the last year]

Infrastructure status (3)
The SEE-GRID infrastructure currently contains the following resources:
 30 sites in SEE-GRID production
 6 sites in certification phase (2 AL + 1 HR + 2 RO + 1 MD)
 CPUs: 1105 total, but an unknown number available to the SEEGRID VO; an increase of approx. 400 CPUs compared to PSC04
 Storage: 17.6 TB (no increase)
 All sites on gLite-3, with 3 sites on gLite-3.1 and the rest on gLite-3.0
 glite-CE: the final assessment by EGEE is that this service is not stable enough for production; we agree
 glite-WMSLB actively used
 Guides provided for deployment of gLite-3.1 WNs on SL4.5 (32-bit); a guide for 64-bit WNs is in preparation

Infrastructure (4): VO membership
Steady growth:
 start of project: ~90 members
 end of 02/07: ~110 members
 end of 08/07: ~160 members

Infrastructure (5): VO members per country

Operations (1)
SLA conformance monitored per site; tools used:
 HGSM
 SAM
 GStat
 WiatG
 Helpdesk
SLA conformance analysis:
 SEEGRID2-WP3-RS-018-SLA-Q xls
Helpdesk ticket procedures:
 GOOD shifts introduced; initial results positive
 Ticket handling: response times need to be improved!
 Problems with GOOD shifts - some partners not performing their duty!
Helpdesk statistics analysis:
 SEEGRID2-WP3-RO-008-PSC05-Helpdesk_Statistics xls
 SEEGRID2-WP3-RS-020-PSC05-Tickets_not_closed-a xls
OPS role in VOMS: implemented, documented and used
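The per-site SLA conformance check described above boils down to comparing each site's test-success ratio against an agreed availability target. A minimal sketch, assuming a simple model where availability is the fraction of passed SAM-style tests in the reporting window (the records, site names and the 80% target are illustrative, not the project's actual SLA formula):

```python
from datetime import datetime

# Hypothetical SAM-style test records: (site, timestamp, status)
results = [
    ("AEGIS01-PHY-SCL", datetime(2007, 9, 1, 0), "ok"),
    ("AEGIS01-PHY-SCL", datetime(2007, 9, 1, 1), "ok"),
    ("AEGIS01-PHY-SCL", datetime(2007, 9, 1, 2), "error"),
    ("BG01-IPP", datetime(2007, 9, 1, 0), "ok"),
]

def availability(records, site):
    """Fraction of successful tests for one site (ignores scheduled downtime)."""
    site_tests = [status for s, _, status in records if s == site]
    if not site_tests:
        return None
    return sum(1 for st in site_tests if st == "ok") / len(site_tests)

def sla_conformant(records, site, target=0.80):
    """True if the site's availability meets the assumed SLA target."""
    av = availability(records, site)
    return av is not None and av >= target
```

A real implementation would also account for scheduled maintenance windows and test criticality, which the SAM/BBmSAM tooling handles.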

Operations (2)
MPI support on SEE-GRID sites:
 Important for many applications
 Current support is not sufficient
 There seems to be a problem with the setup on some sites - GOOD shifts are addressing this, but this is not sufficient
Proposal: create a WG to define minimal standards for MPI support and provide template scripts and JDL files for submission of MPI jobs
The WG can be composed of representatives from sites supporting MPI and having experience:
 Bulgaria
 Turkey
 Greece
 Serbia
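As an illustration of the kind of template the proposed WG could provide, a sketch of a gLite-era JDL file for an MPICH job follows (the wrapper script name, application binary and node count are placeholders; the actual attributes and required site tags would be defined by the WG):

```
Type        = "Job";
JobType     = "MPICH";                  # parallel job handled by the RB/WMS
NodeNumber  = 4;                        # number of requested CPUs
Executable  = "mpi-start-wrapper.sh";   # hypothetical site-independent wrapper
Arguments   = "my-mpi-app";             # placeholder application binary
InputSandbox  = {"mpi-start-wrapper.sh", "my-mpi-app"};
OutputSandbox = {"std.out", "std.err"};
StdOutput   = "std.out";
StdError    = "std.err";
# only match sites advertising MPICH support in the information system
Requirements = Member("MPICH", other.GlueHostApplicationSoftwareRunTimeEnvironment);
```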

Operational & monitoring tools
Deployment status:
 HGSM - Turkey
 SAM (+ porting to MySQL) - Bosnia and Herzegovina, with CERN support
 BBmSAM - Bosnia and Herzegovina
 WiatG
 R-GMA - Bulgaria
 Pakiti

HGSM (1): Background
 HGSM is a central database that holds all crucial information about a grid site
 It has an interactive interface for users to view and update the information available in the database
 It has exports that interface with other services, enabling integration between HGSM and the services that make use of its information
 HGSM is constantly improved with new features and new exports to enhance its functionality, role and information exchange with grid services for administrative purposes

HGSM (2): Latest Development Period (Initiation and Planning)
 Started in May 2007, when requests for new functionality and enhancements appeared on the GIM list due to inefficiency of HGSM in some areas.
 Since the request list was large and required some major changes to the HGSM structure, an "Internals Document" was prepared and passed to the people in the discussion at the end of May.
 During June, the requests were analyzed and refined so that everybody was aware of the missing parts, possible solutions and impacts of the new functionality.

HGSM (3): Latest Development Period (Finalizing and Taking Action)
 At the end of June a detailed "Roadmap Document" - covering the problems, solutions and a step-by-step action plan in every detail - was prepared and opened for discussion on the GIM list.
 The document was revised and updated according to feedback from the list. It went through 4 revisions: three drafts and a final version, published on Jul 12.
 Active development started on Jul 16, according to the plan announced in the document.

HGSM (4): The Improvements (and Progress So Far)
The following improvements are qualified for this development period (dates indicate rough estimates of kick-off dates):
 Revision of visual interface fields (100%)
 Introduction of a universal exporting subsystem which can be used by humans and other computers (80%)
 Introduction of an importing subsystem which can import exported data for administrators (0%, 10 Sep)
 An automatic field-filling tool for administrators (0%, 24 Sep)
 Functionality improvements on various pages of the visual interface (0%, 8 Nov)
 Improved handling of supported applications in HGSM and a MAUI configuration editing tool for reflecting changes to grid sites directly and automatically (0%, 22 Nov)
 Enabling HGSM to track site information history for statistical purposes (0%, 3 Dec)

SAM/BBmSAM
SAM Server Portal / BBmSAM:
 Service availability and SLA calculations implemented
 MAINTENANCE status implemented in a better way
 Enhanced OVERVIEW (main) page of BBmSAM
 – Showing uptime for the last 24h
 – Filters available for: country, tier, certified status, last test state
BBmobileSAM:
 Now also showing uptime percentage for the last 24h
HGSM integration:
 Preparation for HGSM shift to new version
Database:
 Now running strictly on MySQL; no Oracle used
 Reorganization of indexes - improved performance

WiatG: Introduction
 Web application for visualization of BDII information
 – Used as an operational tool for site monitoring
 Highly responsive tool because it uses AJAX
 – Partial refresh (the client receives the page part by part)
 – Asynchronous (the server processes in the background, so one may send several requests)
 The current version looks for: CE, gCE, RB, gRB, SE, LFC, FTS and GridICE
 Documentation available

WiatG: Who is using it
Several regional projects:
 EUMedGRID (bdii.isabella.grnet.gr)
 EUChinaGrid (euchina-bdii-1.cnaf.infn.it)
 EELA (lnx112.eela.if.ufrj.br)
 BalticGrid (bdii.mif.vu.lt)
 Int-EU-Grid (i2g-ii01.lip.pt)
 Health-e-Child (hec-maat-server2.cern.ch)
ROC CERN:
 PROD (lcg-bdii.cern.ch)
 PPS (pps-bdii.cern.ch)
 OPS (sam-bdii.cern.ch)

WiatG: Technologies
The following technologies are used in WiatG:
 Perl, used for the LDAP connection to the BDII and generation of HTML and XML data
 XML, the format for sending data from the web server to the client
 Cascading Style Sheets (CSS), used to define the presentation style of a page
 JavaScript, a scripting language
 XMLHttpRequest, an object used to exchange data between web client and web server
 Document Object Model (DOM), which provides a logical view of a web page as a tree structure

WiatG: Architecture
[Diagram] The browser-side user interface (HTML, CSS, JavaScript) issues XMLHttpRequest calls to an Apache HTTP server on the WiatG server side; an LDAPtoXML script queries the BDII over LDAP and the query response is returned as XML data to the client's JavaScript callback.

WiatG in action

WiatG: Further development (short term)
 Addition of new services (MyProxy, local LFC, VO software tags, …)
 Development of the new tool "What should be at the Grid" (WsbatG)
 – Based on the site configuration exported from HGSM/GOCDB
 – A visually identical tool, providing the expected status of the BDII shown in WiatG

WiatG: Further development (long term)
[Diagram] An Alarms Dashboard UI on top of WiatG, WsbatG and a new WiatGS user interface, backed by BDII, sBDII and HGSM/GOCDB web services and integrated with SAM, with checks for: correctness of BDII data, correctness of sBDII data, equality of sBDII and BDII information, correctness of HGSM/GOCDB data, and equality of BDII and HGSM/GOCDB information.

R-GMA (1)
 Accounting views for SEEGRID-only sites - per-site accounting

R-GMA
 Accounting views for SEEGRID and EGEE sites that support SEEGRID - per-country/institution user accounting

R-GMA (2)
 Accounting views for job success rates and other statistics - in progress, currently running on our own data only

R-GMA (3)
 Accounting views for job success rates and other statistics - in progress, currently running on our own data only

Pakiti: Overview
Pakiti client:
 Installed on all nodes
 Checks software versions against configured repositories
 Sends a report once per day to the Pakiti server
Pakiti server:
 Running at the Aristotle University of Thessaloniki
 Main components:
 – Feed: daily reports from clients
 – Site administrator's front-end: detailed view of the RPM package status at each node; access is permitted only to the administrators of each site, via TLS authentication using X.509v3 certificates
 Add-on components:
 – ROC manager's front-end: aggregated view of the status of all the sites in the ROC; developed by the AUTH GOC
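The client's core check - installed package versions against repository versions - can be sketched as follows. This is a naive illustration with invented package data; real RPM version comparison (epoch, version, release, alphanumeric segments) is considerably more involved than this parse:

```python
def parse_version(v):
    """Naive numeric version parse for the sketch; drops non-numeric segments."""
    return tuple(int(p) for p in v.replace("-", ".").split(".") if p.isdigit())

def outdated_packages(installed, repo):
    """Return names of packages whose installed version lags the repository."""
    return sorted(
        name for name, ver in installed.items()
        if name in repo and parse_version(ver) < parse_version(repo[name])
    )

# Hypothetical client inventory vs. repository metadata
installed = {"openssl": "0.9.7a-43.1", "kernel": "2.6.9-42"}
repo = {"openssl": "0.9.7a-43.16", "kernel": "2.6.9-42"}
```

The daily report sent to the Pakiti server would essentially be such an inventory, with the server side doing the comparison and presentation.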

Pakiti: Status of the Service
Pakiti-enabled sites in the SEE-GRID ROC:
 Bosnia and Herzegovina: BA-04-PMFSA
 Bulgaria: BG01-IPP, BG02-IM, BG04-ACAD, BG05-SUGRID
 Croatia: HR-01-RBI
 Greece: HG-01-GRNET, HG-03-AUTH
 Romania: RO-01-ICI
 Serbia: AEGIS01-PHY-SCL, AEGIS02-RCUB, AEGIS03-ELEF-LEDA, AEGIS04-KG, AEGIS05-ETFBG
 Turkey: TR-01-ULAKBIM, TR-05-BOUN

Pakiti and reporting
 Deployment of Pakiti on sites is voluntary
 Sites deploying it provide accurate information on the update status of their nodes
 Proposal: in order to further improve the status of SEE-GRID sites, all sites should report the following in their 3M reports:
 – Middleware version changes during the quarter
 – Status of updates (not needed if Pakiti is deployed)
 – Major operational issues
 A template will be provided for this; if Pakiti is deployed, such a report would contain just one line with the gLite version change and a paragraph or two describing operational issues encountered (basically, a structured version of section 2.3 of the 3M reports now submitted by each site)

SEE-GRID reorganized Wiki
The reorganized SEE-GRID Wiki is now the main Wiki
 Many documents are still missing, the main ones being:
 – Participating in SEEGRID as a Site (AP1)
 – Policy Documents (AP3)
 – SEEGRID certification procedure (AP4)
 – LFC (AP6)
 – RGMA (AP7)
 – MyProxy (AP8)
 – BDII/RB (AP9)
 – FTS (AP10)
 – SAM (AP11)
 – GridICE (AP12)
 – How to join the SEEGRID infrastructure as a user (AP14)
 – Grid usage basics (AP16)
Effort needed from all partners; UKIM coordinating

WP3 developments
 HGSM
 WiatG
 Accounting
 glite-yaim-seegrid
 soon-to-be-deployed apt/yum repository

Infrastructure, site and VO metrics
 Infrastructure growth (CPUs, storage, memory?)
 Bandwidth growth?
 Site availabilities and downtimes (CE, SE)
 Accounting data (per site, per country, per application, per VO, per user community, time distribution)
 Job success rates (per site, per country, per application, per VO, per user community, time distribution)
 VO membership time evolution; distribution per country, per application, per user community
For the last three, both the SEEGRID VO and national VOs should be considered
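The job-success-rate metric listed above can be computed from accounting records along these lines. A minimal sketch; the record layout, site names and VO field are invented for illustration, and real accounting data (e.g. from the R-GMA views) would carry many more fields:

```python
from collections import defaultdict

# Hypothetical accounting records: (site, vo, job_succeeded)
jobs = [
    ("AEGIS01-PHY-SCL", "seegrid", True),
    ("AEGIS01-PHY-SCL", "seegrid", True),
    ("AEGIS01-PHY-SCL", "seegrid", False),
    ("BG01-IPP", "seegrid", True),
]

def success_rates(records):
    """Per-site fraction of successfully completed jobs."""
    total = defaultdict(int)
    ok = defaultdict(int)
    for site, _vo, succeeded in records:
        total[site] += 1
        ok[site] += int(succeeded)
    return {site: ok[site] / total[site] for site in total}
```

The same aggregation keyed on country, application or VO instead of site gives the other breakdowns mentioned above.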

A3.2: Status
Concerning bandwidth-on-demand, some tests were done in order to investigate the following protocols and services using our new router (Cisco XR):
 Resource Reservation Protocol (RSVP)
 Generalized Multi-Protocol Label Switching (GMPLS)
 Differentiated Services (DiffServ)
 Standards activities for BoD

A3.3: Overview of the work
CA establishment in the SEE region:
 Each country must set up its own national Certification Authority
 Each CA must be accredited by the EUGridPMA
 see-ca-incubation mailing list
 Support during the process of establishing a new CA and for the accreditation period
 Common CA procedures and best-practice advice are provided
 Help with writing the CP/CPS documents
 The process of establishing a new CA takes around one year

A3.3: Status
Newly accredited CAs in the region:
 Serbian CA (AEGIS CA)
 – Accreditation request on August 23, 2006
 – Reviewed by the GridAUTH team and SRCE CA (on behalf of EUGridPMA)
 – Accredited on June 1, 2007
 – Operational since June 10, 2007
 Romanian CA (ROSA CA)
 – Accreditation request on January 25, 2006
 – Reviewed by the GridAUTH team, PK-GRID CA and CESNET CA (on behalf of EUGridPMA)
 – Accredited on August 1, 2007
Grid CA candidates:
 Montenegro CA (MREN CA)
 – CP/CPS reviewed by GridAUTH (via the see-ca-incubation mailing list) on July 10, 2007
 F.Y.R.O.M. CA (MARGI CA)
 – Accreditation request on May 4, 2007
 – First CP/CPS not yet available
All candidates are encouraged to participate in the EUGridPMA meetings (next meeting in Thessaloniki, Sept 19-21)

A3.4: Overview Official SEE-GRID-2 Portal user maintenance (52 users)  Quota management, user management Official SEE-GRID-2 Portal maintenance  Software (P-GRADE Portal version 2.5)  P-GRADE Portal software bug fixing

A3.4: Status and development Portlet development has been done by the Turkish partner (P-GRADE Portal Development Alliance)  Already active (beta test) in a private portal installation. New portlets  File Management Portlet to manage remote files through the LFC catalog and the LCG interface.  Hot topic!  To be merged into the official P-GRADE Portal v2.5.  Credential Management Portlet to complement the existing certificate portlet with info, change-passphrase and destroy operations.  To be merged with the default certificates portlet.

WP3 country reports Greece Switzerland/CERN Bulgaria Romania Turkey Hungary Albania Bosnia and Herzegovina FYR of Macedonia Serbia Montenegro Moldova Croatia Work performed since PSC04 Conformance to WP3 objectives Issues, if any

Greece Pakiti service enabled for the SEE-GRID infrastructure  October 2007  HG-06-EKT will support the SEEGRID VO  228 CPUs, 9.3 TB storage Exact resources dedicated to the SEEGRID VO are still to be decided.

Switzerland/CERN Support to WP3 operations  GOOD shifts  Solving operational problems Support for SAM deployment and improvements, as well as liaison activities with the SAM development team Liaison activities with operations in other regional Grid projects Strong involvement in operations-related developments:  WiatG, WsbatG  SEE-GRID apt/yum repository

Bulgaria (1) Status of the infrastructure and plans for expansion  5-site infrastructure – significantly stable, with good uptime  All sites are now upgrading to SL4 WNs  Core services, monitoring tools:  R-GMA: graphical on-line user interface  Accounting views for –SEEGRID-only sites: per-site accounting –SEEGRID and EGEE sites that support SEEGRID: per-country/institution user accounting –Job success rates and other statistics – in progress, currently running on our own data only  FTS: used in production by SALUTE  WMS moved to better hardware  BDII stability and capability improvements

Bulgaria (2) 5 production sites in BG After the start of EGEE II, a new cluster was added with 80 CPUs and a low-latency Myrinet interconnect for 80 CPUs – a unique resource for special MPI jobs BG04-ACAD (80 CPU) BG01-IPP (12 CPU)
Date | CPU | Storage | Tape
April | … | … TB | –
September | … | … TB | 10 TB

Bulgaria (3): Petri net performance analysis Anastas Misev from Macedonia visited IPP, sponsored by the project BIS 21++, working on his Ph.D. thesis on Grid scheduling and failover. The host professor was E. Atanassov His analysis of our RB rb001 shows that:  The overall success rate of the analyzed data is somewhere near 70%  The percentage of successful jobs greatly depends on the user's experience; jobs by more experienced users have a success rate above 80% An interactive diagram helps in identifying the bottlenecks in the process model: it can show the throughput time between any 2 transitions, with color coding for low, middle and high waiting times Conclusions:  Job re-submission does not help in 99% of cases  20% of the 470 jobs submitted by one user had waiting times above 57 hours, and all of them failed
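The color coding of waiting times in the interactive diagram amounts to bucketing each job's waiting time against two thresholds. A minimal sketch of that logic (the threshold values here are illustrative assumptions, not the ones used in the actual analysis):

```python
def classify_wait(hours, low=1.0, high=24.0):
    """Bucket a job's waiting time for color coding.
    The thresholds `low` and `high` are illustrative, not the values
    used in the rb001 analysis."""
    if hours < low:
        return "low"
    if hours <= high:
        return "middle"
    return "high"

# Toy waiting times (in hours) for four jobs
waits = [0.2, 5.0, 60.0, 70.0]
buckets = [classify_wait(h) for h in waits]

# Share of jobs waiting longer than 57 hours, the threshold highlighted
# in the conclusions above
share_above_57h = sum(h > 57 for h in waits) / len(waits)
```

The same bucketing can be applied per user to reproduce the observation that long-waiting jobs by a single user all failed.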

Bulgaria (4): Job success rate – Top ten CEs We made a pivot analysis of the CEs and the final statuses of the jobs. The top 10 CEs are shown in the table below. Note that the percentage of successful jobs is more than 90%.
Top 10 CEs (columns: CE name | DONE | ABORT | CANCEL | Grand Total):
 ce.ulakbim.gov.tr:2119/jobmanager-lcgpbs-see
 ce002.ipp.acad.bg:2119/jobmanager-lcgpbs-see
 ce01.ariagni.hellasgrid.gr:2119/jobmanager-pbs-see
 ce01.athena.hellasgrid.gr:2119/jobmanager-pbs-see
 ce01.isabella.grnet.gr:2119/jobmanager-pbs-see
 ce01.kallisto.hellasgrid.gr:2119/jobmanager-pbs-see
 ce01.marie.hellasgrid.gr:2119/jobmanager-pbs-see
 ce02.grid.acad.bg:2119/jobmanager-pbs-myrinet
 ce02.grid.acad.bg:2119/jobmanager-pbs-see
 ce101.grid.ucy.ac.cy:2119/jobmanager-lcgpbs-see
Grand Total
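A pivot of CE against final job status is a straightforward two-key count. A minimal sketch with toy data (the job tuples below are illustrative, not the real dataset behind the table):

```python
from collections import Counter

# Illustrative (CE name, final status) job records; not the real data.
jobs = [
    ("ce02.grid.acad.bg:2119/jobmanager-pbs-see", "DONE"),
    ("ce02.grid.acad.bg:2119/jobmanager-pbs-see", "ABORT"),
    ("ce101.grid.ucy.ac.cy:2119/jobmanager-lcgpbs-see", "DONE"),
]

def pivot(jobs):
    """Count final statuses per CE, i.e. the (CE x status) pivot table."""
    table = Counter()
    for ce, status in jobs:
        table[(ce, status)] += 1
    return table

table = pivot(jobs)
grand_total = sum(table.values())
```

Sorting the per-CE totals descending and keeping the first ten rows yields the top-10 CE table shown above.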

Bulgaria (5): BG.ACAD CA – Status overview Accredited by EUGridPMA on March 05, 2007 Included in the IGTF CA RPM distribution from version 1.13 Effective operations started March 21, 2007 Web-page: Location: IPP-BAS, Sofia, Bulgaria Personnel:  4 CA staff members;  2 RAs.

Bulgaria (6): BG.ACAD CA – Status overview During the period March-August 2007:  A total of 40 certificates were signed by BG.ACAD CA, including:  30 user certificates  10 host certificates  A total of 3 certificates were revoked by BG.ACAD CA.  Regular patches and updates to the CA's OS and software were applied.

Bulgaria (7): BG.ACAD CA – Development A concise end-user guide has been written and published on the web-site. It covers the basics of the application process. A shell script for easier certificate request generation has been developed and published. It contains step-by-step instructions and examples. Three cron jobs have been developed on the CA's web server. These scripts monitor the following:  Validity of the published certificates  Expiration of the published certificates  Expiration of the published CRL Instant notifications to the CA's staff members are provided.
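The core of such an expiration-monitoring cron job is a comparison of each expiry timestamp against "now plus a warning window". A minimal sketch of that check, assuming the expiry dates have already been extracted from the certificates and CRL (e.g. with `openssl x509 -enddate` and `openssl crl -nextupdate`):

```python
from datetime import datetime, timedelta

def expiring_soon(expiry, now=None, days=30):
    """Return True if `expiry` falls within the next `days` days.
    `expiry` is a datetime extracted beforehand from a certificate's
    notAfter field or a CRL's nextUpdate field; the 30-day warning
    window is an illustrative choice."""
    now = now or datetime.utcnow()
    return expiry <= now + timedelta(days=days)
```

A cron job would run this over every published certificate and the CRL, and e-mail the CA staff for each item that returns True.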

Romania The Romanian CA was accredited by the EUGridPMA body and will be operated by the Romanian Space Agency (ROSA) Site operational problems  Technical: air cooling (RO-03-UPB), room renovation (RO-06-UNIBUC)  Non-technical (vacations and other personnel issues): all sites  RO-01-ICI, RO-03-UPB, RO-05-INCAS, RO-06-UNIBUC: uncertified status  RO-07-NIPNE, RO-08-UVT: certified Objective no. 1: re-certify all the sites by 1 October  Migrate RO-05-INCAS to EGEE if possible

Turkey (1): Sites Operation/Ticket Handling A new EGEE site (TR-05-BOUN) was added at the beginning of April Dedicated resources: TR-01-ULAKBIM site (48 CPUs for seegrid, 16 CPUs for sgdemo) and TR-05-BOUN (8 CPUs for seegrid). Migration from Classic SE to DPM has been completed at TR-01-ULAKBIM and TR-05-BOUN. DPM ownership patches have been applied. SEE-GRID-2 accounting patches have been applied.

Turkey (2): Sites Operation/Ticket Handling Although seegrid jobs have run successfully, there have been frequent SAM failures at TR-05-BOUN in the last three months due to unknown prd/sgm account problems. We will compensate for this lack of availability with the forthcoming good performance of the site. Within October 2007, the SEE-GRID-2 TR-* sites are planned to be upgraded to Scientific Linux 4.5 together with the gLite 3.1 middleware. Periodic updates and security patches have been applied to all SEE-GRID-2 TR-* sites. Regular user and site problems were handled through the SEE-GRID and national helpdesks.

Turkey (3): Core services Smooth operation of the following core services has been ensured:  Core services supporting the seegrid VO: RB, BDII, WMS, MyProxy, P-GRADE Portal  Core services supporting the sgdemo VO: RB, BDII, WMS, MyProxy, LFC, P-GRADE Portal RB/WMS statistics have been provided for D3.1b.

Turkey (4): P-GRADE portal The File Manager Portlet for remote Storage Elements has been reviewed and tested together with SZTAKI. The specification of the file-management portlet for remote storage elements was co-authored by SZTAKI and METU, and the portlet was developed by METU. The portlet supports LFC interaction commands for directory and file management through the LCG file naming conventions, namely LFN and GUID. The Credential Manager Portlet for MyProxy has been reviewed and tested. The portlet is to be integrated into the P-GRADE Grid Portal.
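In the LCG naming conventions the portlet relies on, logical file names carry an `lfn:` prefix and grid unique identifiers a `guid:` prefix. A minimal sketch of how a file-management helper might tell the two apart (this is an illustrative helper, not code from the actual portlet):

```python
def classify_identifier(name):
    """Distinguish the LCG file naming schemes by their conventional
    prefixes: 'lfn:' for logical file names, 'guid:' for grid unique
    identifiers. Anything else is left as 'unknown' in this sketch."""
    if name.startswith("lfn:"):
        return "LFN"
    if name.startswith("guid:"):
        return "GUID"
    return "unknown"
```

A front end can use this to decide whether to resolve the name through the LFC namespace or look it up directly by GUID.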

Hungary OS upgrade to SL 4.4 gLite upgrade to 3.0.x Recabling of the internal network (new switch deployed) Maintenance  Grid-Operator-On-Duty (GOOD), 1 week in May/June  Infrastructure support, resolving Helpdesk tickets (script installation/hardware changes/security updates)

Albania (1): Overall progress Change of CA  Follow-up of certificate problems Received local funding for equipment  Installed gLite on most of the equipment Creation of new sites  New experimental site at INIMA  New experimental site at FIE  Preparations for a site at FNS  Preparations for a site at FECO  Plans for the Universities of Elbasani and Shkodra

Albania (2): INIMA Old site AL-01-INIMA undergoing upgrades New site AL-04-INIMA with 9 nodes (4 nodes will be transferred to other universities) Problems with the new status and building of INIMA …

Albania (3): FIE Power supply problems; some money has to be invested to resolve the problem, i.e. to buy inverters Switched off during vacations

Albania (4): FNS Cluster installed Problem with real IPs in some institutions  Administrative problems in getting a separate Internet link  …

Bosnia and Herzegovina (1) BA-01-ETFBL functioning correctly  New 4 x WN (C2D, 1 GB RAM, 80 GB HDD)  New SE – C2D, 2GB RAM, 2x320GB HDD BA-03-ETFSA  New server node - HP ML110G4, X3040, 4GB, 2x160GB  New 11 WNs - HP dc5750, MT A64-35, 1GB, 160GB  New Switch BA-04-PMFSA  New 4 x WN (C2D, 1 GB RAM, 160 GB HDD)  New Switch BA total now:  Total CPUs: 50+  Total Storage: 1+ TB Availability much better

Bosnia and Herzegovina (2) SAM Server Portal  BBmSAM  Service availability and SLA calculations implemented  MAINTENANCE status implemented in a better way  Enhanced OVERVIEW (main) page of BBmSAM –Showing uptime for the last 24h –Filters available for: country, tier, certified status, last test state  BBmobileSAM  Now also showing the uptime percentage for the last 24h  HGSM integration  Preparation for the HGSM shift to a new version  Database  Now running strictly on MySQL, no Oracle used  Reorganization of indexes – improved performance
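An uptime percentage over the last 24 hours, as shown on the BBmSAM overview page, reduces to the fraction of SAM test results in that window that passed. A minimal sketch of the calculation (the status strings are illustrative, not the actual BBmSAM result codes):

```python
def availability(results):
    """Uptime percentage over a window of SAM test results.
    `results` is the list of status strings recorded in the window;
    "OK" marks a passed test (an illustrative encoding)."""
    if not results:
        return 0.0
    ok = sum(1 for r in results if r == "OK")
    return 100.0 * ok / len(results)
```

SLA conformance then becomes a comparison of this percentage against the agreed threshold, with MAINTENANCE intervals excluded from the window before the calculation.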

FYR of Macedonia (1): Cluster equipment additions MK-01-UKIM cluster  18 new nodes added using VirtualBox WNs  Tested with the support of Antun  Currently installed on the old CE node, but a new CE will be installed to support these WNs By the end of the year  16 new CPUs will be installed (non-VirtualBox)  2 TB storage

FYR of Macedonia (2): Cluster equipment additions MK-02-ETF cluster  24 new CPUs added  By the end of September a new SE and CE will be installed  SE with 1 TB storage

FYR of Macedonia (3): New Clusters MK-03  Still in progress  Hardware purchase is done  Consultations on the initial installation

FYR of Macedonia (4): Other activities Wiki pages will be provided for the installation of the VirtualBox WN CA status:  Software is installed  We will proceed in September with the review by the EUGridPMA  Our representative attended the last EUGridPMA meeting in Turkey

Serbia (1): Infrastructure status 6 sites across the country Current number of CPUs: 195 (an increase of 43 compared to PSC04) Storage: 0.4 TB (approx. the same) Expansion plans  All 3rd parties already have a site  We expect two new sites, one in Novi Sad (Faculty of Agriculture, University of Novi Sad) and one in Nis (IRVAS SME)  Hardware delivery for AEGIS01-PHY-SCL is expected this week, with additional cores and a storage upgrade to more than 20 TB; another purchase is being finalized (more CPUs)  AEGIS will propose a hardware purchase from the Serbian National Investment Plan for the whole NGI

Serbia (2): Core services / SEEGRID Resources LCG-RB, GLITE-WMS, BDII, MyProxy, LFC at IPB LFC at UOB for SEEGRID and SGDEMO VO VOMS for AEGIS VO, can be deployed as a backup for SEEGRID VO if necessary Support to T-infrastructure  In all core services  In sites: AEGIS02-RCUB, AEGIS04-KG

Serbia (3): AEGIS CA CP/CPS document finalised Object identifier: Date: 02 December 2006 DNs:  Issuer: C=RS, O=AEGIS, CN=AEGIS-CA  Subject: C=RS, O=AEGIS, OU=XXX, CN=Subject-name  Country: Must be “RS”  Organization: Must be “AEGIS”  OrganizationUnit: Must be the name of the subject's institute  CommonName: First name and last name of the subject for user certificates, DNS
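The CP/CPS naming rules above amount to a simple check on the subject DN components. A minimal sketch of such a validation, assuming the DN has already been parsed into its RDN components (a simplified representation, not code from the actual CA software):

```python
def valid_aegis_dn(dn):
    """Check a subject DN against the CP/CPS rules quoted above:
    C must be "RS", O must be "AEGIS", and OU (the subject's institute)
    and CN must both be present. `dn` is a dict of RDN components,
    a simplified stand-in for a parsed X.509 subject."""
    return (
        dn.get("C") == "RS"
        and dn.get("O") == "AEGIS"
        and bool(dn.get("OU"))
        and bool(dn.get("CN"))
    )
```

An RA could run this check on incoming certificate requests before forwarding them to the CA for signing.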

Serbia (4): AEGIS CA Accreditation request on August 23, 2006 Under review by the GridAUTH team and SRCE CA (on behalf of EUGridPMA) Accredited on June 1, 2007 Operational since June 10, 2007 Already issued 53 user and host certificates

Serbia (5): Other activities Leading WP3: overall activities coordination and representation at various Grid meetings Providing core services Wiki contributions:  GLITE-3 guide  SL4.5 WN gLite-3.1 guide for 32-bit and 64-bit architectures Grid-Operator-On-Duty: performing shifts and coordinating MW deployment, assessment and upgrade coordination Operations coordination Development coordination  Collaboration with EGEE  Collaboration with other regional Grid projects Development involvement  Problem identification, support for debugging and patching  Customization of YAIM and providing glite-yaim-seegrid

Montenegro Sites & upgrades  1 site (MREN-01-CIS)  WN: upgrade from 4 to 24 CPUs and upgrade to SL 4.4  Storage: upgrade to 0.54 TB  Migration from Classic SE to DPM SE  MPI support No centralized SEE-GRID services at UoM CA  The CP/CPS document was written and sent to the see-ca-incubation mailing list for approval and suggestions Site operation and ticket handling status/problems  A few problems with the DPM SE installation

Moldova (1) MD-01-TUM site configuration, set-up and internal test procedures MD-01-TUM site hardware issues resolved (they caused a delay in internal tests and further site operation) Working on CA service development  Learning from other countries' CA experience  Study of EUGridPMA documents and selection of those appropriate for the conditions in Moldova Determination of future sites' hardware configuration Organization of the tender for purchasing the equipment according to the MoUs with 3 institutions (Institute of Mathematics and Computer Science, State University of Medicine and Pharmaceutics, Faculty of Radio Electronics of the Technical University of Moldova)

Moldova (2) New sites are expected to join the MD-GRID infrastructure by the end of the project:  MD-02-IMI, to be installed at the Institute of Mathematics and Computer Science (8 Intel Xeon quad-core CPUs, 1.5 TB of storage)  MD-03-SUMP, to be installed at the State University of Medicine and Pharmaceutics (5 Intel PIV CPUs, 1 TB of storage)  MD-04-RENAM, to be placed in the FRE TUM (Faculty of Radio Electronics of the Technical University of Moldova) NOC of the RENAM Association (8 Intel Xeon dual-core CPUs, 2 TB of storage)

Croatia (1): Site status HR-01-RBI site  WNs upgraded to:  Debian 4.0  gLite 3.1 TAR  sgdemo and MPI enabled  node packages up-to-date HR-02-GRF site  hardware being purchased  4 nodes, ~30 CPU cores:  2 x Intel Xeon 5310 QuadCore CPU, 1.6 GHz, 8 MB L2C  8 GB ECC FBD RAM  2 x 500 GB SATA HDD  2 x Gigabit Ethernet

Croatia (2): Other activities VOMS server regular maintenance  primary for seegrid  backup for sgdemo, see National CA operated by SRCE GOOD shifts Wiki updates  BDII response time  standalone SAM  VOMS configuration Local user support for middleware problems Preparation of review reports about the seegrid VO

WP3 APs and Issues (1) Communication/response problems Deadlines must be reasonably set, but also respected All sites need to resolve their operational problems and solve all Helpdesk tickets, esp. the outstanding ones SLA conformance monitoring will continue Helpdesk improvements needed – statistics extraction needs to be perfected, as it is currently difficult (end of October) Wiki reorganization and updates to be finished ASAP SEEGRID VO VOMS roles to start being used by WP4 application developers ASAP Application-level accounting implemented partially; should be fully implemented and used ASAP

WP3 APs and Issues (2) Moldova to join the infrastructure by finishing certification of the MD-01-TUM site ASAP 3M reporting of sites to include information on updates and operational problems according to the template (to be provided) Partners should be more responsible when performing GOOD shifts MPI WG to be established and to define a standard for the MPI setup of SEE-GRID sites; to finish its work by mid-October Infrastructure, site and VO metrics to be precisely defined What happened to the live UI (Boro)? SEEGRID VO commitments ASAP Review of critical SAM tests