INAF - National Institute for Astrophysics

U. Becciani, A. Costa, P. Massimino, F. Vitello, C. Vuerli, S. Cassisi, P. Manzato, M. Molinaro, F. Pasian, A. Pietrinferni, M. Sponza, G. Taffoni - INAF - Istituto Nazionale di Astrofisica - ITALY

INAF is actively engaged in projects, both at the national and international level, aimed at fostering the use of Grid technology for astronomers to carry out their everyday work, through the deployment of computing infrastructures and the integration of domain-specific applications. At the European level, INAF currently plays a leading role in promoting tight interoperability between the Grid and the International Astronomical Virtual Observatory. In this poster we present some important research projects that make extensive use of computational Grid resources.

The National Institute for Astrophysics coordinates and participates in the Astronomy and Astrophysics (A&A) cluster in EGI, running extensive and large applications on the Grid. INAF is a partner of IGI (the Italian Grid Initiative) and participates in its activities, contributing skilled personnel and computational resources at several INAF local institutes.

VisIVO tools for Large Dataset Exploration on gLite Grid

The Visualisation Interface to the Virtual Observatory (VisIVO) is a suite of software tools for creating customized 3D views of many types of widely used data sets. Its distinguishing characteristic is that it places no limit on the size of the input tables containing the data to be processed, so it can support very large datasets (tens of Terabytes and more). VisIVO is designed to create images and movies from user files.

The first fundamental goal of porting VisIVO to the Grid is to have movies and images generated and stored directly on the Grid, using the internal arrays of a code running on the worker nodes, and to reduce the overall time for movie production.

The VisIVO library was designed between November and December 2010, and the development phase took place in the first four months of 2011. The library represents an important step towards bringing the 3D visualization, rendering and movie creation tools to many user communities. The user can set environment variables to select the multithread or the multiprocess features; by default the multiprocess features are adopted. A user application may call several instances of VA_Importer, VA_Filter and VA_View, and several threads or processes will be started simultaneously.

VisIVO may be compiled on a gLite 3.1 system; in this case the GLITE option must be given when compiling the tool. We used the lcg_util APIs, high-level gLite data management APIs that provide a single interface to Storage Elements and the LFC. They offer the same functionality available through the lcg_util command line tools (lcg-cr, lcg-cp, lcg-rep, lcg-del, etc.). We used these functions to create a local copy of the input table on the local WN.

[Poster figure: the VisIVO Grid workflow. User code linked with the VisIVO library is submitted to the Grid and runs on a WN, where it produces images and movies using VisIVO; outputs are registered in the Grid Catalogue, and data production can be monitored during the run. Example invocation: --fformat votable /home/user/demo/vizier.xml --x x --y y --z z --color --colortable --colorscalar scalar0 --glyphs sphere. The figure also contrasts VisIVO Desktop (interactive fast navigation), VisIVOServer and VisIVOWeb with the VisIVO library, whose non-blocking functions execute threads or separate processes.]
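To make this workflow concrete, here is a minimal sketch of user code that stages a table to the WN and drives the library. The entry points VA_Importer, VA_Filter and VA_View are named on the poster, but their signatures, the extern declarations, the environment variable and all option strings below are illustrative assumptions; lcg-cp is the standard gLite client mentioned above.

#include <cstdlib>   // std::system; setenv is POSIX
#include <string>

extern "C" {
  // Hypothetical prototypes: the poster names these entry points but does
  // not show their signatures; a single option string is assumed here.
  int VA_Importer(const char *options);
  int VA_Filter(const char *options);
  int VA_View(const char *options);
}

int main() {
  // The poster says environment variables select the multithread rather
  // than the default multiprocess behaviour; this variable name is a
  // placeholder, not the library's documented setting.
  setenv("VISIVO_PARALLEL_MODE", "thread", 1);

  // Stage the input table from a Storage Element to the local WN with the
  // standard lcg-cp client, as described above (paths are illustrative).
  if (std::system("lcg-cp --vo astro lfn:/grid/astro/demo/vizier.xml "
                  "file:/tmp/vizier.xml") != 0)
    return 1;

  // Import, filter and render; each call may start several threads or
  // separate processes without blocking the caller.
  VA_Importer("--fformat votable /tmp/vizier.xml --out /tmp/table.bin");
  VA_Filter("--op randomizer --perc 10 --file /tmp/table.bin");
  VA_View("--x x --y y --z z --color --colorscalar scalar0 "
          "--glyphs sphere /tmp/table.bin");
  return 0;
}

In the same spirit, the resulting images and movies could be registered back on the Grid with lcg-cr, so that data production can be monitored while the job runs.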
Grid and Databases: the BaSTI/FRANEC Example

Key Concepts
BaSTI (the Bag of Stellar Tracks and Isochrones) is a theoretical astrophysical catalogue that collects fundamental data sets on star formation and evolution. BaSTI is also a use case for the Virtual Observatory and a testbed for the definition of standards to access numerical simulations. Creating and populating BaSTI requires a large number of stellar evolutionary computations.

Work already done to gridify FRANEC, the code used to feed BaSTI:
- Creation of specific services in the gLite environment, used to submit both SMRs (Synthetic Model Runs) and FIRs (Full Isochrone Runs):
  - a desktop environment (to hide the complexity of configuring and submitting FRANEC tasks);
  - computational services (to manage the configuration and execution of FRANEC tasks);
  - data oriented services (to manage simulated data).
- Services for gLite are designed and implemented in a modular way to make them reusable.

Activity currently in progress
- Integration in P-GRADE: a specialized dedicated portal has been built on top of P-GRADE. The portal is modular, with components reused multiple times:
  - a page was added allowing users to insert values for all parameters of the job to run;
  - inserted data are checked automatically to verify their formal correctness;
  - a new P-GRADE component (JavaScript) produces a P-GRADE compatible workflow starting from the data provided by users.
- Improvements of FRANEC: the values of some fundamental parameters requested to run the pipelines are automatically generated and fed to FRANEC through input files.
- Modification of some P-GRADE components to automate the porting of jobs to the Grid, and construction of the appropriate P-GRADE compatible workflows in charge of the application. Workflows are submitted through the standard tools provided by P-GRADE.

Next Steps
- Interfacing the BaSTI DB with the Grid and integrating BaSTI in P-GRADE.
- Repeated runs with fixed values for a set of K parameters and values ranging over predefined intervals for the remaining N-K parameters (e.g. running X jobs with different values of the mass); a sketch of this pattern follows the list.
- Use of robot certificates to make it easier for users to approach and use the portal and the underlying Grid infrastructure.
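A toy illustration of the repeated-runs pattern, in the same hedged spirit as the earlier sketch: the parameter names, values and the franec command line are hypothetical (the real FRANEC parameters are fed through input files, as noted above); only the fixed-K / swept-(N-K) structure is the point.

#include <cstdio>

int main() {
  // Fixed (K) parameters for this sweep; names and values are illustrative.
  const double z = 0.0198;  // metallicity
  const double y = 0.273;   // helium abundance

  // The remaining (N-K) parameter ranges over a predefined interval:
  // here, one job per stellar mass value, as in the "X jobs with
  // different values of the mass" example.
  for (int i = 0; i <= 20; ++i) {
    const double mass = 0.5 + 0.1 * i;  // solar masses
    // In production, each line would become a job description that the
    // P-GRADE portal wraps into a workflow and submits to the Grid.
    std::printf("franec --mass %.2f --z %.4f --y %.3f\n", mass, z, y);
  }
  return 0;
}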