Creation of data storage and analysis center Dmytro IAKUBOVKYI (BITP, Kiev, Ukraine), Igor TELEZHINSKY (KNU, Kiev, Ukraine), Andrii ELYIV (MAO, Kiev, Ukraine)


Creation of data storage and analysis center Dmytro IAKUBOVKYI (BITP, Kiev, Ukraine), Igor TELEZHINSKY (KNU, Kiev, Ukraine), Andrii ELYIV (MAO, Kiev, Ukraine) On behalf of the Virtual Roentgen and Gamma Observatory in Ukraine (VIRGO.UA)

Off-line analysis in high-energy astrophysics and its challenges
- Complexity of the instruments; large amounts of analyzed data;
- Some tasks are starting to challenge the existing computing capabilities;
- An example of a computationally challenging task: the search for a Dark Matter decay line in X-rays;
- INTEGRAL/SPI: processing more than ScWs (Science Windows) from the GC and outer regions --> thousands of CPU-hours;
- XMM-Newton/EPIC: excluding more than 200 point sources from the M31 halo; extended-source analysis (generation of precise response matrices) --> hundreds of CPU-hours;
- Xspec: joint spectral fitting of several spectra with complex models (e.g. XMM-ESAS analysis for extended sources), confidence-range calculation --> hundreds of CPU-hours.
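The scale of these CPU budgets can be illustrated with a back-of-envelope estimate. The job counts and per-job times below are hypothetical round numbers chosen only to reproduce the "thousands vs. hundreds of CPU-hours" orders of magnitude quoted above, not measured values from the analysis.

```python
# Back-of-envelope CPU-budget estimate for batch X-ray analysis.
# All job counts and per-job CPU times are illustrative assumptions.

tasks = {
    # name: (number of independent jobs, CPU-hours per job)
    "INTEGRAL/SPI ScW processing": (10_000, 0.5),
    "XMM-Newton/EPIC sources + response matrices": (200, 1.0),
    "Xspec joint fits + confidence ranges": (50, 4.0),
}

def total_cpu_hours(tasks):
    """Sum the CPU budget over all task classes."""
    return sum(n_jobs * hours for n_jobs, hours in tasks.values())

def wall_clock_days(cpu_hours, n_cpus):
    """Ideal wall-clock time in days, assuming perfect parallel scaling."""
    return cpu_hours / n_cpus / 24

budget = total_cpu_hours(tasks)
print(f"Total budget:      {budget:.0f} CPU-hours")
print(f"Single CPU:        {wall_clock_days(budget, 1):.0f} days")
print(f"100-CPU cluster:   {wall_clock_days(budget, 100):.1f} days")
```

Even under these rough assumptions, a single workstation needs months while a ~100-CPU cluster finishes in days, which is the motivation for the next slide.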

Possible solution: on-line analysis
- These tasks cannot be run on a single local computer (they would take several months);
- The alternative, a computing cluster (usually about 100 CPUs), takes several hours to several days per task (if the cluster is free);
- Hundreds of active INTEGRAL and XMM-Newton users work at dozens of different places. Do they each need their own computing cluster?
- The other alternative is a single "supercluster" with on-line (web-based) analysis (versus off-line);
- THAT'S WHAT WE PROPOSE!

How it works (workflow diagram: USER -- INTERFACE -- COMPUTER CLUSTERS):
1. The user makes a request to the Analysis System via the web site;
2. The user-friendly Interface module converts the request into high-level GRID tasks;
3. The Task Execution and Error Control module runs the tasks on the computing clusters, with error localization and task re-submission;
4. The final data and logs are delivered back to the user.
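The execution-and-error-control step above can be sketched as a retry loop: submit each GRID task, detect failures, and re-submit up to a fixed limit. This is a minimal illustrative sketch; the names (`Task`, `run_task`, `MAX_RETRIES`) and the simulated failures are assumptions, not the actual VIRGO.UA implementation.

```python
# Sketch of task execution with error localization and re-submission.
# Failures are simulated with a probability; a real system would inspect
# the job's exit status and logs from the cluster middleware.

import random
from dataclasses import dataclass, field

MAX_RETRIES = 3  # assumed re-submission limit

@dataclass
class Task:
    name: str
    attempts: int = 0
    done: bool = False
    logs: list = field(default_factory=list)

def run_task(task, fail_prob=0.3, rng=random):
    """Pretend to run one task on a cluster node; fails with some probability."""
    task.attempts += 1
    if rng.random() < fail_prob:
        task.logs.append(f"{task.name}: attempt {task.attempts} failed")
        return False
    task.logs.append(f"{task.name}: attempt {task.attempts} succeeded")
    task.done = True
    return True

def execute_with_resubmission(tasks, fail_prob=0.3, rng=random):
    """Error-control loop: re-submit each failed task up to MAX_RETRIES times."""
    for task in tasks:
        while not task.done and task.attempts < MAX_RETRIES:
            run_task(task, fail_prob, rng)
    done = [t for t in tasks if t.done]
    failed = [t for t in tasks if not t.done]
    return done, failed

tasks = [Task(f"ScW-{i:04d}") for i in range(5)]
done, failed = execute_with_resubmission(tasks)
print(f"{len(done)} done, {len(failed)} failed after retries")
```

The per-task logs collected here correspond to the "logs delivery" box of the diagram: both the production data and the logs of every attempt are returned to the user.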

Current progress:
- The data storage and analysis center in Kiev (part of VIRGO.UA), created with the help of ISDC, has been operating since 2005;
- 16 computing clusters in Ukraine have been working under the Ukrainian Academic GRID (UAGRID) since 2006;
- We have had access to UAGRID resources since the end of 2008, when the virgo_ua Virtual Organization was formed;
- The implementation of INTEGRAL/OSA and XMM-Newton/SAS started in 2009;
- Contacts have been established with European groups pursuing similar ideas (e.g. RISA for XMM-Newton/SAS).

Thank you for your attention!