T.Strizh (LIT, JINR) — Distributed grid infrastructure for processing and analysis of data from the Large Hadron Collider. V.V. Korenkov (LIT, JINR), Dubna

Some history
– EU DataGrid project: middleware & testbed for an operational grid
– LHC Computing Grid (LCG): deploying the results of DataGrid to provide a production facility for the LHC experiments
– EU EGEE project, phase 1: starts from the LCG grid; a shared production infrastructure, expanding to other communities and sciences
– EU EGEE-II: building on phase 1, expanding applications and communities …
– EU EGEE-III → EGI-InSPIRE

350 sites · 55 countries · 150,000 CPUs · 150 petabytes · >15,000 users · >300 VOs · >1 million jobs/day
Application areas: archeology, astronomy, astrophysics, civil protection, computational chemistry, earth sciences, finance, fusion, geophysics, high energy physics, life sciences, multimedia, material sciences, …

Tier 0 at CERN: acquisition, first-pass reconstruction, storage & distribution. 1.25 GB/sec (ions)

Tier 0 – Tier 1 – Tier 2
– Tier-0 (CERN): data recording; initial data reconstruction; data distribution
– Tier-1 (11 centres): permanent storage; re-processing; analysis
– Tier-2 (>200 centres): simulation; end-user analysis


Russian Data Intensive Grid infrastructure (RDIG)
The Russian consortium RDIG (Russian Data Intensive Grid) was set up in September 2003 as a national federation in the EGEE project. The RDIG infrastructure now comprises 17 resource centres with > kSI2K CPU and >3500 TB of disk storage.
RDIG Resource Centres:
– ITEP
– JINR-LCG2 (Dubna)
– RRC-KI
– RU-Moscow-KIAM
– RU-Phys-SPbSU
– RU-Protvino-IHEP
– RU-SPbSU
– Ru-Troitsk-INR
– ru-IMPB-LCG2
– ru-Moscow-FIAN
– ru-Moscow-MEPHI
– ru-PNPI-LCG2 (Gatchina)
– ru-Moscow-SINP
– Kharkov-KIPT (UA)
– BY-NCPHEP (Minsk)
– UA-KNU

The main directions in the development and maintenance of the RDIG e-infrastructure are the following:
– support of basic grid services;
– support of the Regional Operations Center (ROC);
– support of Resource Centers (RC) in Russia;
– RDIG Certification Authority;
– RDIG monitoring and accounting;
– participation in integration, testing and certification of grid software;
– support of users, Virtual Organizations (VO) and applications;
– user & administrator training and education;
– dissemination, outreach and communication grid activities.

Worldwide LHC Computing Grid Project (WLCG)
The protocol between CERN, Russia and JINR on participation in the LCG Project was approved in the MoU on the Worldwide LHC Computing Grid (WLCG), signed by Russia and JINR in October 2007.
The tasks of Russia & JINR in the WLCG (2011):
– Task 1. MW (gLite) testing (supervisor O. Keeble)
– Task 2. LCG vs Experiments (supervisor I. Bird)
– Task 3. LCG monitoring (supervisor J. Andreeva)
– Task 4. Tier-3 monitoring (supervisors J. Andreeva, A. Klementov)
– Task 5/6. Genser / MCDB (supervisor W. Pokorski)

JINR Central Information and Computing Complex (CICC)
CICC comprises 1584 cores; total performance ~4100 kSI2K; disk storage capacity 1068 TB; availability and reliability = 99%.
~300 CPUs and ~320 TB of disk storage will be added during the next few months.
(Chart: estimates of the growth of the JINR CICC resources — CPU performance in kSI2K and disk systems in TB.)
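As a rough back-of-the-envelope check, the upgrade figures quoted above can be combined with the current totals. This is only an illustrative sketch; the per-core average assumes performance scales linearly with core count:

```python
# Projected JINR CICC capacity after the planned upgrade,
# using the figures quoted on this slide.
cores, perf_ksi2k, disk_tb = 1584, 4100, 1068

# "~300 CPU and ~320 TB disk storage will be added"
added_cores, added_disk_tb = 300, 320

projected_cores = cores + added_cores        # 1884 cores
projected_disk_tb = disk_tb + added_disk_tb  # 1388 TB
per_core_ksi2k = perf_ksi2k / cores          # average performance per core

print(projected_cores, projected_disk_tb, round(per_core_ksi2k, 2))
```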

Integration with Google Earth: user-interface and visualization service development for virtual organization support in high energy physics.

Tier-3 sites monitoring project
Tier-3 sites consist of resources mostly dedicated to data analysis by geographically close or local scientific groups. A set of Tier-3 sites can be joined into a federation.
Many institutes and national communities have built (or plan to build) Tier-3 facilities. Tier-3 sites comprise a range of architectures, and many do not run grid middleware, which renders the existing grid monitoring systems inapplicable.
A joint effort of ATLAS, JINR and CERN IT (ES group).
Objectives for Tier-3 monitoring:
– Monitoring of a Tier-3 site:
  – detailed monitoring of the local fabric (overall cluster monitoring, monitoring of each individual node in the cluster, network utilization);
  – monitoring of the batch system;
  – monitoring of the mass storage system (total and available space, number of connections, I/O performance);
  – monitoring of VO computing activities at the site.
– Monitoring of a Tier-3 sites federation:
  – monitoring of the VO usage of the Tier-3 resources in terms of data transfer and job processing, and of the quality of the provided service, based on job-processing and data-transfer monitoring metrics.
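To make the two monitoring levels concrete, the sketch below shows how per-site fabric metrics of the kind listed above might be rolled up into a federation-level summary. All names and the metric schema are hypothetical illustrations, not the actual ATLAS/JINR/CERN IT design:

```python
from dataclasses import dataclass

@dataclass
class SiteMetrics:
    """Hypothetical per-site snapshot of Tier-3 fabric metrics."""
    name: str
    storage_total_tb: float  # mass storage: total space
    storage_free_tb: float   # mass storage: available space
    jobs_done: int           # batch system: completed jobs
    jobs_failed: int         # batch system: failed jobs

def federation_summary(sites):
    """Aggregate site-level metrics into federation-level indicators."""
    total = sum(s.storage_total_tb for s in sites)
    free = sum(s.storage_free_tb for s in sites)
    done = sum(s.jobs_done for s in sites)
    failed = sum(s.jobs_failed for s in sites)
    # A simple service-quality metric based on job processing:
    success_rate = done / (done + failed) if done + failed else 1.0
    return {"storage_used_frac": 1 - free / total,
            "job_success": success_rate}

# Example with two made-up sites:
sites = [SiteMetrics("siteA", 100.0, 40.0, 90, 10),
         SiteMetrics("siteB", 50.0, 10.0, 45, 5)]
summary = federation_summary(sites)
```

In a real deployment the per-site snapshots would come from local fabric monitors (cluster, batch, storage probes) rather than being constructed by hand.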

Normalized CPU time per country, LHC VOs (July – October 2011). Total: 3,520,919,756 hours; Russia: 88,014,873 hours (2.5%).

Russia: normalized CPU time per site, LHC VOs (September – September 2011)
RDIG total: 84,057,552 hours
JINR: 34,244,089 hours (40.7%)
RRC KI: 18,073,012 hours (21.5%)
IHEP: 10,610,480 hours (12.6%)
PNPI: 5,509,617 hours (6.6%)
ITEP: 5,150,221 hours (6.1%)
SINP MSU: 4,522,040 hours (5.4%)
INR: 2,093,048 hours (2.5%)
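The percentages on this slide are simple shares of the RDIG total. A quick sketch of the computation, using the site figures quoted above:

```python
# Share of RDIG normalized CPU time per site (figures from the slide).
rdig_total = 84_057_552  # hours

site_hours = {
    "JINR": 34_244_089,
    "RRC KI": 18_073_012,
    "IHEP": 10_610_480,
    "PNPI": 5_509_617,
    "ITEP": 5_150_221,
    "SINP MSU": 4_522_040,
    "INR": 2_093_048,
}

shares = {site: 100 * h / rdig_total for site, h in site_hours.items()}
print({s: round(p, 1) for s, p in shares.items()})  # JINR: 40.7, RRC KI: 21.5, ...
```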

Russia: normalised CPU time per LHC VO (September 2009 – September 2011)

Russia: normalised CPU time by site and VO (September 2009 – September 2011)

Member states of JINR: normalized CPU time per country (September – September 2011)
Armenia: 21
Belarus: 60,225
Bulgaria: 1,623,837
Czech Republic: 21,948,110
Poland: 43,682,563
Romania: 13,391,632
Russia: 84,057,552 (of which JINR: 34,244,089)
Slovakia: 3,428,272
Ukraine: 700 + 1,312,299 (Kharkov-KIPT-LCG2)

Top 12 WLCG Tier-2 sites (September 2009 – May 2011), normalised CPU time [HEPSPEC06·hours]:
US-AGLT2: 331,223,400
FR-GRIF: 229,281,540
US-MWT2_UC: 189,488,376
DE-DESY-HH: 176,955,276
FR-IN2P3-CC-T2: 171,903,924
US-WT2: 129,650,112
UKI-SCOTGRID-GLASGOW: 125,785,100
CYFRONET-LCG2: 119,736,668
JINR-LCG2: 116,427,880
UKI-NORTHGRID-MAN-HEP: 115,410,580
T2_US_UCSD: 110,288,348
GLOW: 107,864,204

JINR grid infrastructure for training and education – a first step towards construction of the JINR Member States grid infrastructure
It consists of three grid sites at JINR and one at each of the following institutions:
– Institute of High-Energy Physics, IHEP (Protvino);
– Institute of Mathematics and Information Technologies of the AS of the Republic of Uzbekistan, IMIT (Tashkent, Uzbekistan);
– Sofia University "St. Kliment Ohridski", SU (Sofia, Bulgaria);
– Bogolyubov Institute for Theoretical Physics, BITP (Kiev, Ukraine);
– National Technical University of Ukraine "Kyiv Polytechnic Institute", KPI (Kiev, Ukraine).
Letters of Intent with Moldova ("MD-GRID"), Mongolia ("Mongol-Grid") and Kazakhstan. A project with Cairo University.

Web portal "GRID AT JINR" ("ГРИД В ОИЯИ")
A new informational resource has been created at JINR: the web portal "GRID AT JINR". Its content includes detailed information on the JINR grid site and JINR's participation in grid projects:
– Grid concept; grid technologies; grid projects; RDIG consortium
– JINR grid site: infrastructure and services; scheme; statistics
– VO and experiment support: ATLAS; CMS; CBM and PANDA; HONE; how to start
– JINR in grid projects: WLCG; GridNNN; EGEE; RFBR projects; INTAS projects; SKIF-Grid; grid middleware testing; cooperation with JINR Member States
– Monitoring & accounting: RDIG monitoring; dCache monitoring; Dashboard; FTS monitoring; H1 MC monitoring
– Grid conferences of JINR: GRID; NEC
– Education: grid infrastructure for education; courses & lectures; textbooks; documentation; articles; materials for education
– News

Frames for grid cooperation of JINR:
– Worldwide LHC Computing Grid (WLCG);
– Enabling Grids for E-sciencE (EGEE), now EGI-InSPIRE;
– RDIG development, now E-ARENA;
– CERN-RFBR project "Grid Monitoring from VO perspective";
– BMBF grant "Development of the Grid-infrastructure and tools to provide joint investigations performed with participation of JINR and German research centers";
– "Development of Grid segment for the LHC experiments", supported in the frames of the JINR–South Africa cooperation agreement;
– development of a grid segment at Cairo University and its integration into the JINR GridEdu infrastructure;
– NATO project "DREAMS-ASIA" (Development of gRid EnAbling technology in Medicine & Science for Central ASIA);
– JINR – FZU AS Czech Republic project "The GRID for the physics experiments";
– NASU-RFBR project "Development and support of LIT JINR and NSC KIPT grid-infrastructures for distributed CMS data processing of the LHC operation";
– JINR–Romania cooperation: Hulubei–Meshcheryakov programme;
– JINR–Moldova cooperation (MD-GRID, RENAM);
– JINR–Mongolia cooperation (Mongol-Grid);
– JINR–Slovakia cooperation;
– JINR–Kazakhstan cooperation (ENU Gumilyov);
– project "SKIF-GRID" (Program of the Belarusian-Russian Union State);
– project GridNNN (National Nanotechnological Network).

Perspectives of JINR grid activities:
– Tier-3 sites monitoring project;
– common JINR-CERN project "Global data transfer monitoring system for WLCG infrastructure";
– development of a unified grid environment of the JINR Member States;
– participation in the project "WLCG Tier-1 center in Russia":
  – a proposal to create an LCG Tier-1 center in Russia (an official letter by the Minister of Science and Education of Russia A. Fursenko was sent to CERN DG R. Heuer in March 2011);
  – the corresponding point is to be included in the agenda of the next 5x5 Russia-CERN meeting;
  – for all four experiments: ALICE, ATLAS, CMS and LHCb;
  – ~10% of the total Tier-1 resources (without CERN);
  – an increase by 30% each year;
  – draft planning (proposal under discussion): a prototype by the end of the year / beginning of 2012, and full resources in 2013 to meet the start of the next working LHC session;
– discussion of a distributed Tier-1 in Russia for LHC and FAIR.

The Fourth International Conference "Distributed Computing and Grid-technologies in Science and Education" (GRID):
– 252 participants from 21 countries: Armenia, Belarus, Bulgaria, Hungary, Germany, Greece, Georgia, Iceland, Kazakhstan, Moldova, Myanmar, Poland, Russia, Romania, USA, Uzbekistan, Ukraine, France, Czechia, Switzerland and Sweden, as well as from CERN and JINR;
– 56 universities and research centers of Russia;
– 8 sections: WLCG (the worldwide grid for processing data from the LHC at CERN); grid applications; grid in business; distributed computing and grid technologies in education; GridNNN (the grid of the national nanotechnology network); methods and algorithms for distributed computing; grid infrastructure and "cloud" computing;
– round tables on the use of grid technologies in business, and on training in grid technologies and their application in education;
– 36 plenary talks, 78 sectional talks.