1
EUROPEAN UNION
PL-Grid: National Grid Initiative in Poland for Supporting Computational Science in the European Research Space
Jacek Kitowski, Institute of Computer Science AGH-UST, ACK CYFRONET AGH, Cracow, Poland
in collaboration with PL-Grid Representatives: Michał Turała, Kazimierz Wiatr, Marian Bubak, Tomasz Szepieniec, Marcin Radecki, Jacek Niwicki, Alex Kusznir, Zofia Mosurska, Mariusz Sterzel, Piotr Bała, Wojciech Wiślicki, Norbert Meyer, Krzysztof Kurowski, Józef Janyszek, Bartłomiej Balcerek, Mścisław Nakonieczny, Rafał Tylman
NEC 2009: XXII International Symposium on Nuclear Electronics and Computing, Varna, September 7-14, 2009
2
Outline: National Grid Initiative in Poland in a nutshell
- Motivation
  - E-Science approach to research
  - Integration activities ongoing in the world
- Systematic approach to establishing the Consortium and NGI
  - Rationales and Foundations
  - International Collaboration Status
- PL-Grid Project
  - Infrastructure
  - Workpackages Activities
  - Grid Examples
- Summary
3
Need for an e-Science Approach, in spite of complexity: computing, storage, infrastructure
- E-Science: collaborative research supported by advanced distributed computations
  - Multi-disciplinary, multi-site and multi-national
  - Building with, and demanding advances in, Computing/Computer Sciences
  - Goal: to enable better research in all disciplines
- System-level Science: beyond individual phenomena; components interact and interrelate
  - To generate, interpret and analyse rich data resources: from experiments, observations and simulations; quality management, preservation and reliable evidence
  - To develop and explore models and simulations: computation and data at all scales; trustworthy, economic, timely and relevant results
  - To enable dynamic distributed collaboration: facilitating collaboration with information and resource sharing; security, trust, reliability, accountability, manageability and agility
M. Atkinson, e-Science (...), Grid2006 & 2nd Int. Conf. e-Social Science 2006, National e-Science Centre UK
I. Foster, System Level Science and System Level Models, Snowmass, August 1-2, 2007
4
Rationales behind the PL-Grid Consortium
- Thanks to CYFRONET activity, the five Polish High Performance Computing centres were analysed according to:
  - Participation in international and national projects and collaboration
  - Needs of Polish scientific communities
  - Computational resources to date
  - European/worldwide integration activities
- National network infrastructure ready (thanks to the Pionier national project), connected to GEANT2
5
Participation in Projects and Collaboration
- ~35 international FP5, FP6 and FP7 projects on Grids (50% in common): CrossGrid, GridStart, GridLab, Unicore, EGEE I/II/III, K-WfGrid, CoreGRID, ViroLab, Gredia, int.eu.grid, EuroGrid, GRIP, UniGrids, BalticGrid I/II, IntelliGrid, RinGrid, BREIN, BEinGRID, QosCosGrid, Chemomentum, EUChinaGRID, PRACE, ...
- ~15 Polish projects (50% in common): Pionier, Progress, SGIgrid, Clusterix, NDS, Platon, IT-SOA, POWIEW, NewMAN, ...
- Close collaboration with the Institute of Computer Science AGH-UST
6
Need by Polish Scientific Communities

Distribution of Polish publications, 01.2004 - 04.2008, according to the Science Citation Index Expanded + Social Science Citation Index + Arts & Humanities Citation Index:

Scientific Community / Representative | % of Polish publications
Warsaw / ICM       | 29.0 %
Cracow / Cyfronet  | 16.4 %
Wrocław / WCSS     | 11.1 %
Poznań / PCSS      | 10.1 %
Gdańsk / TASK      |  6.8 %
Sum                | 73.4 %
Poland             | 100 %
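As a quick sanity check, the listed centre shares do add up to the quoted 73.4 % total. A minimal sketch (centre names transliterated to ASCII for convenience):

```python
# Share of Polish publications (01.2004 - 04.2008) per centre,
# taken from the table above (values in percent).
shares = {
    "Warsaw / ICM": 29.0,
    "Cracow / Cyfronet": 16.4,
    "Wroclaw / WCSS": 11.1,
    "Poznan / PCSS": 10.1,
    "Gdansk / TASK": 6.8,
}

combined = sum(shares.values())
print(f"Consortium centres combined: {combined:.1f} % of Polish publications")
# -> Consortium centres combined: 73.4 % of Polish publications
```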
7
Partners’ Resources to Date TOP500 – Nov.2008 Polish Sites
8
Partners’ Resources to Date TOP500 – June 2009 Polish Sites
9
PL-Grid Consortium Founders
- Academic Computer Center Cyfronet AGH (ACK CYFRONET AGH) – Coordinator
- Poznań Supercomputing and Networking Center (PCSS)
- Wrocław Centre for Networking and Supercomputing (WCSS)
- Academic Computer Center in Gdańsk (TASK)
- Interdisciplinary Center for Mathematical and Computational Modelling, Warsaw University (ICM)
Connected via GEANT2
10
PL-Grid Foundations – Summary
- Motivation:
  - E-Science approach to research
  - EGI initiative ongoing in collaboration with NGIs
- Milestones:
  - Creation of the Polish Grid (PL-Grid) Consortium: http://plgrid.pl; Consortium Agreement signed in January 2007
  - PL-Grid Project (2009-2012): application in the Operational Programme Innovative Economy, Activity 2.3 (September 2008); funding granted March 2, 2009 (via European Structural Funds)
- Consortium made up of the five largest Polish supercomputing and networking centres (founders), with ACK CYFRONET AGH (Cracow) as Coordinator
- Goal: a Polish Infrastructure for Supporting Computational Science in the European Research Space, responding to the needs of Polish scientists and to ongoing Grid activities in Poland, other European countries and worldwide
11
PL-Grid Base Points

[Architecture figure: domain Grids and Advanced Service Platforms built on top of the PL-Grid Grid infrastructure (Grid services), which runs over application clusters, high-performance computers and data repositories, all connected by the National Computer Network PIONIER]

Assumptions:
- The Polish Grid will have a common base infrastructure, similar to solutions adopted in other countries. Specialized domain Grid systems – including services and tools focused on specific types of applications – will be built upon this infrastructure.
- These domain Grid systems can be further developed and maintained in the framework of separate projects; such an approach should enable efficient use of the available financial resources.
- The Grid infrastructure will be fully compatible and interoperable with European and worldwide Grids, thanks to cooperation with teams involved in the development of European Grid systems (EGI, EGEE, DEISA, OMII, C-OMEGA, ESFRI).
- Plans for HPC and Scalability Computing enabled; tightly-coupled activities.
12
Elements and Functionality

PL-Grid software will comprise:
- user tools: portals, systems for application management and monitoring, result visualization and other purposes, compatible with the lower-layer software used in PL-Grid;
- software libraries;
- virtual organization systems: certificates, accounting, security, dynamic ...;
- data management systems: metadata catalogues, replica management, file transfer;
- resource management systems: job management; applications, Grid services and infrastructure monitoring; license management; local resource management.

[Layer figure: users – Grid portals and development tools – Grid Application Programming Interface – virtual organizations and security systems – basic Grid services – Grid services (LCG/gLite from EGEE, UNICORE from DEISA, other Grid systems) – Grid resources: distributed computational resources and distributed data repositories – national computer network]

Three Grid structures will be maintained: production, research, and development/testing.
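To make the job-management layer concrete: in the gLite middleware named above, a job is described in JDL (Job Description Language). A minimal sketch follows; the VO name and the CPU-time requirement are illustrative, not actual PL-Grid settings:

```
// Minimal gLite-style JDL job description (illustrative values).
Type                = "Job";
VirtualOrganisation = "plgrid";        // hypothetical VO name
Executable          = "/bin/hostname"; // runs on the worker node
StdOutput           = "std.out";
StdError            = "std.err";
OutputSandbox       = {"std.out", "std.err"};
// Match only compute elements allowing at least 60 minutes of CPU time:
Requirements        = other.GlueCEPolicyMaxCPUTime >= 60;
```

Such a file would typically be submitted with the gLite workload-management command-line tools; the resource broker then matches the Requirements expression against published site information.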
13
Challenges
- Short term:
  - To start – establishing the PL-Grid VO using Partners' local and EGEE resources
  - To provide resources for covering operational costs
  - To select computational/storage resources for the PL-Grid infrastructure
- Long term – continuously:
  - To provide the necessary infrastructure (!!): computer rooms, electrical power, many organizational issues
  - To be prepared for, and work on, approaching paradigms and integration development:
    - HPC and Scalable Computing (Capability and Capacity Computing)
    - Clouds (internal/external; computing clouds, data clouds)
    - SOA paradigm, knowledge usage, ...
    - "Future Internet" as defined by the EC in its Workprogramme
14
Planned Realization of Aims

The PL-Grid Project is split into several workpackages:
- P1: Project Management – coordination, structure, operation rules, dissemination
- P2: Planning and Development of Infrastructure
- P3: Operations Center – collaboration with EGEE, DEISA, ...
- P4: Grid Software and Users Tools Development
- P5: Support for Various Domain Grids – training
- P6: Security Center

Main project indicators: peak performance 215 Tflops, disk storage 2500 TB.
15
WP1: Management and Organizations
16
WP2: Planning of Infrastructure Development (Hardware)
- Analysis of users' requirements
- Continuous analysis of worldwide activity in infrastructure development and of vendors' offers
- Expected results:
  - Current activity: organization of a pilot PL-Grid VO using the EGEE infrastructure
  - Current and long-term activity: analysis of hardware trends and offers
  - Recommendation from the analysis: technical details of clusters with x86 processors
17
WP3: Operations Center's Tasks
- Coordination of operation; management and accounting
- EGI collaboration: EGI Group, EGI testing middleware, EGI production middleware
- EGI and DEISA collaboration (Scalability Computing and HPC)
- Users' requirements analysis for operational issues
- Running infrastructure for: production, developers, research
- To consider: computational cloud, data cloud, internal and external clouds, virtualization aspects
18
WP4: Grid Software and Users Tools – Workpackage on Development/Research
- Analysis of users' requirements
- Software repository (different tools and components developed by Partners)
- Reengineering of tools and applications (legacy codes, APIs, SOA applications)
- Workflows, application composition
- Virtual Laboratory (component applications, workflows, application monitoring) – from the EU FP6 ViroLab project
- High-level virtual organization construction using knowledge (semi-automatic creation using contracts, semantic description of resources, monitoring of SLAs against QoS), data access, output contracts in OWL for security configuration – from the EU FP6 Gredia project
- Tools for management, proactive monitoring and security (Bazaar, PERMIS)
- In general: software services; virtualization on various levels (computing, storage, network, operating system, ...); knowledge, semantics, QoS and SLA in various aspects; XaaS (SaaS, HPCaaS, Scalable Computing aaS)
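The workflow composition mentioned above boils down to running tasks in dependency order. A minimal sketch, assuming a toy four-step workflow (the task names are illustrative, not part of the PL-Grid toolset):

```python
# Sketch: a workflow is a DAG of tasks; execution order is any
# topological order, so dependencies always run before dependants.
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of tasks it depends on.
workflow = {
    "fetch_input": set(),
    "simulate":    {"fetch_input"},
    "analyse":     {"simulate"},
    "visualise":   {"analyse"},
}

order = list(TopologicalSorter(workflow).static_order())
print(order)
# -> ['fetch_input', 'simulate', 'analyse', 'visualise']
```

A real workflow engine adds scheduling, monitoring and fault handling on top of this ordering, but the dependency resolution is the same idea.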
19
WP5: Users' Support and Training – the "User Interface"
- Running a help-desk
- Identification of software and of problems with licences:
  - ATLAS, ALICE, CMS, LHCb
  - Gaussian, Turbomole, NAMD, CPMD, GAMESS, Dalton, NWChem, ADF, COMPASS, MOPAC, Blast, ClustalW2, OpenBabel, MacroModel, AutoDock, CHARMM, Compchem
- Making commercial software available to the users (license activity)
- Training, education
20
WP6: Security
- Users' support
- Authorization, authentication, ...
- Certificates, ...
21
Some examples
- Concerning applications
- Concerning operational activities
22
Enabling Grids for E-sciencE (EGEE-II INFSO-RI-031688)
GAUSSIAN in Grid
- Gaussian VO created, supported by several Partners (EGEE activity); accepted by the vendor
- Registration: https://voms.cyf-kr.edu.pl:8443/vo/gaussian/vomrs
- Computational Chemistry VO Manager: Mariusz Sterzel (CYFRONET), EGEE II Computational Chemistry coordinator, m.sterzel@cyfronet.pl
ACK: Tomasz Szepieniec, Mariusz Sterzel
23
Biotechnology in Grid – Never Born Protein Folding
ACK: Irena Roterman, Jagiellonian University; Tomasz Szepieniec
24
Other Activities – Collaboration and Dissemination
- EGI_DS – CYFRONET activity:
  - Policy Board (deputy: M. Turała); NGI Observers – "Polish Experts" (WP3, WP5); participation in workshops, ...
  - EGI Steering Group of the Policy Board – proposal preparation (Alex Kusznir from Cyfronet and Ivan Maric from Croatia)
  - Contribution to the EGI.org and SSC proposals (from PL-Grid: T. Szepieniec, M. Sterzel, P. Bała)
- EGEE I-III:
  - From Cyfronet: chairing the Resource Allocation Group in EGEE-III – trying to influence EGI; more: https://twiki.cern.ch/twiki/bin/view/EGEE/RAG
  - Operating the CE ROC
- e-IRG – CYFRONET activity: workshops; discussions on the EGEE – EGI – NGI future
- EU Unit F3 "Research Infrastructure": experts in the Programme Committee, nominated by the Polish Ministry of Science and Higher Education (since 2007), from CYFRONET (M. Bubak, J. Kitowski)
- Dissemination, promotion, conferences:
  - International ENPG Seminar in Cracow (2008)
  - International Cracow Grid Workshop (2007, 2008, 2009)
  - Cyfronet Users' Annual Conference (2008, 2009)
  - Cyfronet Open Day (2007, 2008)
  - International Conference on Computational Science 2008
  - EGEE 2009 event, Barcelona
  - International Users Conference HP-CAST 2009, Madrid
25
Polish NGI (PL-Grid): Expression of Interest for Hosting Selected EGI Global Tasks (EGI.org and SSC Proposals)
- O-E-10: Coordination of resource allocation and of brokering support for VOs from NGIs (Cyfronet – T. Szepieniec)
- O-E-5: Grid operation and oversight of the e-Infrastructure (Cyfronet – M. Radecki)
- O-E-11: Coordination of interoperations between NGIs and with other Grids (Cyfronet – T. Szepieniec)
- O-E-13: Coordination of definition of best practices, operations procedures, operations requirements (Cyfronet – T. Szepieniec)
- O-E-14: Operation of production Grid core services, catch-all services for international VOs (Cyfronet – T. Szepieniec)
- O-E-3: Operation of the Grid repositories storing monitoring and performance data, and other related information (PCSS Poznań – M. Lichwała)
- O-E-9: Coordination of middleware roll-out and deployment, middleware pilot and certification testbeds (Cyfronet – M. Radecki)

Chemical and Material Science and Technology Specialised Support Centre (CMST SSC), based on the Computational Chemistry Cluster of Excellence in EGEE
- Coordinator: Cyfronet – M. Sterzel; Deputy: University of Perugia – A. Laganà
- Participants:
  - University of Perugia, Dept. of Chemistry (Italy)
  - ACC Cyfronet AGH (Poland)
  - National Center for Biomolecular Research (Czech Republic)
  - IT Center for Science Ltd (Finland)
  - Ente nazionale energie alternative (Italy)
  - Consorzio Interuniversitario CINECA (Italy)
  - Theoretical chemistry and computational grid applications (Switzerland)
  - Foundation for Research and Technology Hellas, Inst. of Electronic Structure and Lasers (Greece)
  - Democritos, ICTP, Trieste (Italy)
  - University of Barcelona (Spain)
26
EGEE-III / CE ROC
- CE ROC coordinator: ACK CYFRONET-AGH
- Consortium consists of institutes from Austria, Belarus, Croatia, the Czech Republic, Hungary, Poland, Slovakia and Slovenia
- Operates, maintains and supports the EGEE Grid infrastructure in the CE region
- Virtual organizations supported:
  a) LHC: alice, atlas, lhcb, cms
  b) Biomed: medical imaging, bioinformatics, drug discovery
  c) vo.cta.in2p3.fr: Monte Carlo and data production for the Cherenkov Telescope Array
  d) imon, imain: VOs from Interactive Grid, for management
  e) auger: Pierre Auger Cosmic Ray Observatory

[Chart: cores and storage published by the sites Poznań (PSNC), Cracow (CYFRONET) and Warsaw (ICM)]
27
EGEE-III / CE ROC
28
Summary – Activities
- Short term:
  - To start – establishing the PL-Grid VO using Partners' local and EGEE resources
  - To provide resources for covering operational costs
  - To provide resources for keeping international collaboration ongoing
  - To select and install computational/storage resources for the PL-Grid infrastructure; plans for 2009: 100 Tflops, 700 TB
- Long term – continuously:
  - To provide, keep and extend the necessary infrastructure (!!): computer rooms, electrical power, many organizational issues
  - Towards Cloud Computing (internal, external; computational, data)
  - Towards HPCaaS and Scalable Computing
29
http://plgrid.pl
30
Back-up slides
31
National and International Integration Initiatives
- Ongoing European and worldwide activities and consolidation: EGEE, EGI_DS, EGI, DEISA, PRACE, ...
[Timeline figure: the European e-Infrastructure evolving from testbeds (2000) to routine usage as a utility service (2010), at national and international level]
- National: SGI Grid, Progress (Clusterix, National Data Store, ...)
- International: Chemomentum, ViroLab, CoreGRID, Gredia, int.eu.grid, BalticGrid, GridLab, Porta Optica, RinGrid, Phosphorus, QosCosGrid, IntelliGrid, K-WfGrid, Unicore, ...
32
Other Activities
- Computational Chemistry (VO) in EGEE:
  - Third-largest CPU power consumer
  - Nearly 3 million jobs executed during 2007
  - Led by Cyfronet since 2006; recently, management of the Computational Chemistry Cluster of Excellence
  - In parallel, current efforts include: Grid ports of commercial and non-commercial software packages, with particular focus on their parallel versions; development of an "experiment-centric" Grid web portal for chemists
- EGEE III / CE ROC
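The ~3 million jobs quoted for 2007 imply a substantial sustained throughput; a rough back-of-the-envelope sketch (the total is approximate, so the rates are order-of-magnitude only):

```python
# Average throughput implied by ~3 million Computational Chemistry VO
# jobs executed during 2007 (approximate figure from the slide).
jobs_2007 = 3_000_000
per_day = jobs_2007 / 365
per_hour = per_day / 24
print(f"~{per_day:.0f} jobs/day, ~{per_hour:.0f} jobs/hour on average")
# -> ~8219 jobs/day, ~342 jobs/hour on average
```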
33
PL-Grid Infrastructure

[Architecture figure: domain Grids and Advanced Service Platforms built on top of the PL-Grid Grid infrastructure (Grid services), which runs over application clusters, high-performance computers and data repositories, all connected by the National Computer Network PIONIER]

Assumptions:
- The Polish Grid will have a common base infrastructure, similar to solutions adopted in other countries. Specialized domain Grid systems – including services and tools focused on specific types of applications – will be built upon this infrastructure.
- These domain Grid systems can be further developed and maintained in the framework of separate projects; such an approach should enable efficient use of the available financial resources.
- The Grid infrastructure will be fully compatible and interoperable with European and worldwide Grids, thanks to cooperation with teams involved in the development of European Grid systems (EGEE, DEISA, OMII, C-OMEGA, ESFRI).
- Plans for HPC and Scalability Computing enabled.

[Layer figure: users – Grid portals and development tools – Grid Application Programming Interface – virtual organizations and security systems – basic Grid services – Grid services (LCG/gLite from EGEE, UNICORE from DEISA, other Grid systems) – Grid resources: distributed computational resources and distributed data repositories – national computer network]
34
Petascale is coming (TOP500, Nov. 2008)
- Earth Simulator: 35 TFlops peak; vector processing
- BlueGene/L: 596 TFlops peak, 212992 cores; low-power, cheap processors
- Jaguar / Franklin: 120 / 100 TFlops peak
- Roadrunner: 1457 TFlops peak, 129600 cores; heterogeneous: AMD Opteron DC 1.8 GHz + PowerXCell 8i 3.2 GHz (12.8 Gflops)
ACK: Hank Childs, Lawrence Livermore National Laboratory, ICCS 2008
35
Explosion of Data
- Experiments, archives, literature, simulations
- Petabytes of data, doubling every 2 years
ACK: Fabrizio Gagliardi, Microsoft, ICCS 2008
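"Doubling every 2 years" is exponential growth: over n years the volume grows by a factor of 2^(n/2). A minimal sketch of what that rate implies over a decade:

```python
# Growth factor for data that doubles every `doubling_period` years.
def growth_factor(years, doubling_period=2.0):
    return 2 ** (years / doubling_period)

# A decade at this rate means five doublings: a 32x increase.
print(growth_factor(10))
# -> 32.0
```

So an archive of 1 PB in 2008 would, at this rate, reach 32 PB by 2018; the starting size here is purely illustrative.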
36
e-Infrastructures in Europe
- Research network infrastructure: GEANT – pan-European network interconnecting National Research and Education Networks
- Computing Grid infrastructure: Enabling Grids for E-sciencE (EGEE project); transition to the sustainable European Grid Initiative (EGI), currently worked out through the EGI_DS project
- Data & knowledge infrastructure: digital libraries (DILIGENT) and repositories (DRIVER-II)
- A series of other projects: middleware interoperation, applications, policy and support actions, etc.
- Cyber-infrastructures around the world: similar efforts in the US and Asia-Pacific
ACK: Fabrizio Gagliardi, Microsoft, ICCS 2008
37