VisIVO: data visualization on the grid. Scientific Multidimensional Data Exploration. 4th EGEE User Forum / OGF 25 and OGF Europe's 2nd International Event.

Presentation transcript:

VisIVO: data visualization on the grid. Scientific Multidimensional Data Exploration. 4th EGEE User Forum / OGF 25 and OGF Europe's 2nd International Event, 2-6 March 2009, Catania. Ugo Becciani, A. Costa, G. Caniglia, C. Gheller, M. Comparato, P. Massimino, M. Krokos, L. Zef Han

The VisIVO family: VisIVO Desktop, VisIVO Server and VisIVO Web. Closely integrated, complementary and independent!
- VisIVO Desktop: PC GNU/Linux, interactive fast navigation
- VisIVO Server: Grid environment
Example invocation fragment: --fformat votable /home/user/demo/vizier.xml --x x --y y --z z --color --colortable --colorscalar scalar0 --glyphs sphere

VisIVO Desktop
VisIVO Desktop is a visualisation package developed in collaboration between INAF (Catania Astrophysical Observatory), CINECA (the largest Italian academic supercomputing centre) and the University of Portsmouth, with the specific objective of supporting visualisation and analysis of astrophysical data. It was developed as part of the Italian participation in the VO-TECH project.
- Free software
- Cross-platform: MacOSX, Windows and Linux (Ubuntu, Debian, Fedora)
- N-dimensional (>3) visualization of astrophysical data (LUTs and glyphs map scalar properties of each element)
- Point, volume and vector visualization
- Astrophysical data analysis
- VO (Virtual Observatory) compliance (interoperability: VOTable, PLASTIC, VizieR)
- Easy to use: an intuitive menu drives the user

VisIVO Desktop: input file formats
- ASCII, CSV
- Binary
- HDF5
- FITS
- Gadget
- VOTable
Uniform management: all internal data are represented as a table, a sequence of float values (VisIVO Binary Table - VBT).

Visualisations Navigation -- Zoom -- Lookup table -- Algorithms -- Data selection -- Picker op. -- Interoperability

LOD (Level Of Detail): while interacting with the visualized data (i.e. zooming and rotating), a simplified graphical representation is used in order to keep the manipulation interactive.

Data manipulation
- New fields can be created as mathematical functions of existing fields
- Tables can be merged
- Subsets can be extracted using different sampling techniques

Vectors: vector fields can be created from a VBT by selecting any three loaded fields as their components.

VisIVO Server on the Grid
- Non-interactive command-line application that implements visualisation functionalities; its output is a static 2D image of a 3D object
- Easy integration in Virtual Observatory compliant Web Services and in the Grid environment
- It will provide the user with a 3D preview of huge data, exploiting the powerful facilities of the Grid environment
- Easy to use
- Open Source code: project maintained on SourceForge
- NO LIMIT on data size!

VisIVO Server basic architecture: user data or a TVO XML document, local or remote (http ref), is converted into a VBT (VisIVO Binary Table), from which new VBTs can be derived.

VisIVO Importer
Input: local or remote file: ASCII-CSV, Binary, VOTable, FLY, Gadget, TVO XML Document, FITS Table, HDF5
Op: data format conversion: VBT creation
Output: VBTs (optimised for VisIVO Filters and Viewer)
NOTE: no limit on data dimension for input data

VBT: VisIVO Binary Table
ASCII header (.bin.head): contains basic metadata information:
- internal data representation: float
- number of fields
- number of elements of each field (including mesh data for volumes)
- data format identifier: little endian
- list of field names (e.g. X Y Z Vx ...)
Binary file (.bin): a sequence of binary arrays (fields).
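The layout described above can be sketched in a few lines of Python. This is a toy illustration only: the real VisIVO header almost certainly differs in its exact line order and syntax (an assumption here), but the binary part - one contiguous little-endian float array per field - matches the slide's description.

```python
import struct

def write_vbt(basename, fields):
    """Write a toy VBT-like pair: an ASCII header (.bin.head) plus a raw
    binary file (.bin) holding one little-endian float array per field.
    The exact header line order of real VisIVO VBTs may differ (assumption)."""
    names = list(fields)
    nrows = len(next(iter(fields.values())))
    with open(basename + ".bin.head", "w") as head:
        head.write("float\n")             # internal data representation
        head.write(f"{len(names)}\n")     # number of fields
        head.write(f"{nrows}\n")          # number of elements of each field
        head.write("l\n")                 # data format identifier: little endian
        head.write("\n".join(names) + "\n")  # list of field names
    with open(basename + ".bin", "wb") as bin_file:
        for name in names:                # one contiguous binary array per field
            bin_file.write(struct.pack(f"<{nrows}f", *fields[name]))

write_vbt("demo", {"X": [0.0, 1.0], "Y": [2.0, 3.0], "Z": [4.0, 5.0]})
```

With 3 fields of 2 single-precision floats each, demo.bin comes out at 24 bytes, and the header records the representation, counts, endianness and field names in sequence.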

VisIVO Importer

VisIVOImporter --fformat filetype [--out name] [--volume] [--compx X --compy Y --compz Z] [--sizex DX --sizey DY --sizez DZ] [--bigendian] [--double] [--npoints N] filename

--fformat: specifies the input file data format
--out name (optional): output filename; the default name is VisIVOServerBinary.bin
--volume (optional): used to create a table that contains volume data
--compx X --compy Y --compz Z (optional): if the data is a volume, the user must enter the mesh geometry; if the data size fits a cubic mesh, these values are computed automatically
--sizex DX --sizey DY --sizez DZ (optional): cell geometry
--bigendian (optional): used only for big-endian Gadget and FLY input files; by default files are assumed little endian
--double (optional): used only for double-precision FLY input files; the default is float
--npoints N (optional): used only for FLY input files; number of data points
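As an illustration, two importer invocations built only from the flags listed above (the file paths and names are hypothetical):

```shell
# Convert a VOTable into a VBT named vizier.bin (hypothetical paths)
VisIVOImporter --fformat votable --out vizier.bin /home/user/demo/vizier.xml

# Import a big-endian, double-precision FLY snapshot (hypothetical file)
VisIVOImporter --fformat fly --bigendian --double --npoints 1000000 snapshot.fly
```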

VisIVO Filters
Input: a VBT data file
Op: creates a new VBT or gives information on specified fields
Output: a new VBT or other information
NOTE: filters must be applied in order to visualise huge data
General syntax: VisIVOFilters --op filterOpCode InputFile (VBT file)

VisIVO Filters
- Append Tables: creates a new table by appending data from a list of existing tables
- Coarse Volume: produces a coarsened subvolume with plane extraction from the original volume
- Decimator: creates a subtable as a regular subsample of the input table
- Extract Subregion: creates a new table from an input table: sub-box or sphere extraction
- Extract Subvolume: produces a table which represents a subvolume of the original volume
- Math. Operations: creates a new field in a data table as the result of a mathematical operation between existing fields
- Merge Tables: creates a new table from two or more existing data tables
- Point Distribution: creates a volume using a field distribution (CIC algorithm) on a regular mesh
- Point Property: assigns a property using a field distribution
- Randomizer: creates a random subset of the original data table
- Select Fields: creates a new table using one or more fields of a data table
- Select Rows: creates a new table using limits on one or more fields of a data table
- Show Table: produces an ASCII table of selected fields
- Statistic: creates and plots a histogram of a scalar field in the table
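A typical pre-visualisation step would be thinning a huge table with the Randomizer filter. The slides give only the filter descriptions, not the actual op codes, so the `randomizer` op code and output option below are assumptions, shown purely to illustrate the general syntax:

```shell
# Extract a random subset of a VBT before viewing it
# (op code name and options are assumptions, not documented on the slide)
VisIVOFilters --op randomizer input.bin
```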

VisIVO Viewer
VisIVO Viewer is a command-line application that produces 3D images from the binary internal data format table (VBT). The user must specify three fields of the table for the 3D representation. The user can also specify the following main options:
- Camera position (azimuth and elevation)
- Zoom level
- Transparency / opacity
- Point shape (pixel/sphere, cube, cone, etc.)
- Lookup table

VisIVO Viewer
General syntax: VisIVOViewer inputFile (VBT file) [--volume (for volume representation)]
Input: a VBT data file
Op: creates one or more views of the data
Output: PNG files
LIMITS: the input VBT data file must fit in the available RAM

VisIVO Viewer
Default output: 4 images:
- Az=0, Elev=0
- Az=90, Elev=0
- Az=0, Elev=90
- Az=45, Elev=45
Customized images can be obtained by fixing Az, Elev, transparency, ...
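Putting importer and viewer together, a complete pipeline might look like the following. The --x/--y/--z, --color, --colortable, --colorscalar and --glyphs options appear in the opening slide; the file names are hypothetical:

```shell
# 1. Import a VOTable into a VBT (hypothetical paths)
VisIVOImporter --fformat votable --out stars.bin /home/user/demo/vizier.xml

# 2. Render it: map fields x, y, z to the axes, colour by scalar0, draw spheres
VisIVOViewer --x x --y y --z z --color --colortable --colorscalar scalar0 --glyphs sphere stars.bin

# Output: PNG images (by default, the four Az/Elev views listed above)
```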

Cometa Consortium: COnsorzio Multi Ente per la promozione e l'adozione di Tecnologie di calcolo Avanzato (Multi-Institution Consortium for the Promotion and Adoption of Advanced Computing Technologies). PI2S2 Project, funded by the Italian MUR using EU funds for Objective 1.

The Sicilian Grid in one slide

Available computing and storage resources
- ~2500 AMD Opteron cores
- GB RAM per core
- LSF as LRMS everywhere
- Infiniband-4X at all sites
- TB of raw disk storage
- Distributed/parallel GPFS filesystem
- gLite 3.0 middleware everywhere

PI2S2 infrastructure and HPC
- Each PI2S2 site has a low-latency Infiniband network
- Intel Fortran and PGI compilers are available
- UI "on-board": shares all resources with the WNs; remote login available; it is possible to compile and test codes
- HPC runs: a special hpc queue is available, allowing CPU core reservation; it is possible to use ALL nodes of a PI2S2 site (more than 200 CPU cores)

PI2S2 infrastructure and HPC
The PI2S2 Grid infrastructure supports the following MPI versions; each MPI version has been built with different compilers. The supported tags are:
Running over the Infiniband 4x network:
- MVAPICH_GCC4, MVAPICH_INTEL9, MVAPICH_PGI706
- MVAPICH2_GCC4, MVAPICH2_INTEL9, MVAPICH2_PGI706
Running over the Gigabit network:
- MPICH_GCC4, MPICH_INTEL9, MPICH_PGI706
- MPICH_SH_GCC4, MPICH_SH_INTEL9, MPICH_SH_PGI706
- MPICH2_GCC4, MPICH2_INTEL9, MPICH2_PGI706

PI2S2 infrastructure and HPC: sample JDL for an MPI job

Type = "Job";
JobType = "MVAPICH2";
Executable = "MPIparallel_exec";
NodeNumber = 200;
Arguments = "arg1 arg2 arg3";
StdOutput = "test.out";
StdError = "test.err";
InputSandbox = {"mpi.pre.sh", "mpi.post.sh", "MPIparallel_exec", "watchdog.sh"};
OutputSandbox = {"test.err", "test.out"};

mpi.pre.sh and mpi.post.sh are pre- and post-processing scripts; MPIparallel_exec is the executable; watchdog.sh runs during the parallel execution.

VisIVO Server: HPC and visualisation on the Grid
- Can be executed on the Grid, either as an independent job or as a sub-process of the watchdog procedure
- Produces images (stored in the Grid catalog)
- Allows visualisation of large datasets during the data production phase
Image: a cosmological half-billion-body simulation, rendered with VisIVO Server, running since January 9th on the Cometa Grid.

VisIVOWeb


The VisIVO team:
- Ugo Becciani, Principal Investigator, Astrophysical Observatory, Catania, Italy
- Mel Krokos, Co-Principal Investigator, University of Portsmouth, UK
- Claudio Gheller, Co-Principal Investigator, CINECA, Bologna, Italy
- Gabriella Caniglia, Software Developer, Astrophysical Observatory, Catania, Italy
- Marco Comparato, Software Manager/Developer, Astrophysical Observatory, Catania, Italy
- Alessandro Costa, Web Services & TVO, Astrophysical Observatory, Catania, Italy
- Zef Han J., Software Developer, University of Portsmouth, UK
- Pietro Massimino, Web Developer, Astrophysical Observatory, Catania, Italy
Thanks!