GRID Activities at ESAC
Science Archives and Computer Engineering Unit, Science Operations Department, ESA/ESAC, Madrid, Spain



Christophe ARVISET, IVOA Interop, A&A cluster, 25/05/2009, ESA Science Operations Department

ESAC GRID Objectives
 ESAC to be a Data Centre and a Data Processing Centre: a natural link between Science Archives, GRID and VO
 ESAC scientists to use the GRID for their daily science tasks
 Must be easy to use ("my homedir on the GRID", "gridlogin")
 User support to port their software to the GRID
 New massive computing facilities to replace user workstations
 GRID in support of the projects
 Project standard data processing
 On-the-fly reprocessing from the Archives
 Instrument calibration monitoring and trend analysis
 HPC on the GRID
 Collaboration with other GRID partners: EGEE, Planck, UCM, IFCA, VO

Herschel Pipeline Data Processing
 Raw data flows from the Mission Operations Centre at ESOC into the Raw Data DB at the Herschel Science Centre (ESAC)
 The Herschel pipeline runs on the GRID, taking raw data and calibration data as input
 Raw data, calibration data and scientific products are stored in the Herschel Science Archive
 Users search and download Herschel data from the archive
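The flow above can be sketched in a few lines of Python: per-observation pipeline runs fanned out in parallel, as a grid scheduler would do across worker nodes, with the products then gathered for archive ingestion. All function names, observation IDs and the calibration label are invented for illustration; this is not the Herschel software.

```python
# Illustrative sketch only: names and IDs below are hypothetical, not the
# real Herschel pipeline interfaces.
from concurrent.futures import ThreadPoolExecutor

def run_pipeline(obs_id, calibration):
    """Stand-in for one Herschel pipeline run on a GRID node."""
    return {"obs": obs_id, "calibration": calibration, "product": f"{obs_id}.fits"}

raw_observations = ["1342182001", "1342182002", "1342182003"]  # from the Raw Data DB

# Fan the per-observation runs out in parallel, grid-style.
with ThreadPoolExecutor(max_workers=3) as pool:
    products = list(pool.map(lambda o: run_pipeline(o, "cal_v1"), raw_observations))

# Products (with raw and calibration data) would then be ingested into the
# Herschel Science Archive for users to search and download.
science_archive = {p["obs"]: p for p in products}
print(sorted(science_archive))  # one archive entry per observation
```

The per-observation independence is what makes this workload a natural fit for a grid: each run needs only its own raw frames plus the shared calibration data.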

XMM-Newton RISA: Remote Interface for Science Analysis
 Search data with the RISA web client (VO search of the XMM-Newton Archive)
 Define SAS tasks and send them to the GRID
 Run SAS tasks on the ESAC GRID (VO access and download of the data)
 Save results on VOSpace
 Visualize results with VO tools (VOSpec, Aladin)
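A minimal sketch of that request lifecycle, assuming a hypothetical `RisaRequest` class (the class, its methods and the dictionary standing in for VOSpace are all invented; this is not the real RISA API):

```python
# Hypothetical sketch of the RISA lifecycle: define SAS tasks, run them on
# the GRID, store results on VOSpace for VO tools to read.

class RisaRequest:
    def __init__(self, observation_id, sas_tasks):
        self.observation_id = observation_id   # target in the XMM-Newton Archive
        self.sas_tasks = sas_tasks             # SAS tasks defined in the web client
        self.results = None

    def submit_to_grid(self):
        """Send the SAS tasks to the ESAC GRID and collect their outputs."""
        self.results = [f"{task}:{self.observation_id}:done" for task in self.sas_tasks]

    def save_to_vospace(self, vospace):
        """Store results on VOSpace so VO tools (VOSpec, Aladin) can read them."""
        vospace[self.observation_id] = self.results

vospace = {}                                   # stand-in for a VOSpace area
req = RisaRequest("0123456789", ["epproc", "emproc"])
req.submit_to_grid()
req.save_to_vospace(vospace)
print(vospace["0123456789"])
```

Saving results to VOSpace rather than returning them to the browser is what decouples the (possibly long) GRID run from the interactive session: any VO tool can pick the products up later.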

XMM-Newton Mosaic Construction
 Input: 10 observations in 5 energy bands, i.e. 10 obs x 3 instruments x 5 bands x 4 maps (cts, exp, var, bkg) = 600 maps before mosaicking; plus 60 x 3 (cts, exp, var) = 180 summed maps (18 per obs); 600 + 180 = 780 maps for the 10 obs
 Basic processing to get the pipeline products (PPS), run in parallel for every observation: 4 maps x 5 bands x 3 instruments
 Construct all images and spectra (per band and per instrument): 18 maps per obs
 Build mosaics for every observation per band: 3 combined images per obs
 Total mosaic (grouping all previously generated mosaics): 3 final mosaics
 Data flow: VO download from the XMM-Newton Archive; processing sent to the GRID; final mosaics produced

XMM-Newton Mosaic Construction (continued)
 Input: 10 observations in 5 energy bands, i.e. 10 obs x 3 instruments x 5 bands x 4 maps (cts, exp, var, bkg) = 600 maps before mosaicking; plus 60 x 3 (cts, exp, var) = 180 summed maps (18 per obs); 600 + 180 = 780 maps for the 10 obs
 Basic processing to get the pipeline products (PPS), run in parallel for every observation: 4 maps x 5 bands x 3 instruments
 Construct all images and spectra (per band and per instrument): 18 maps per obs
 Build mosaics for every observation per band: 1 mosaic per obs
 Total mosaic (grouping all previously generated mosaics): 1 final mosaic
 Data flow: VO download from the XMM-Newton Archive; processing sent to the GRID; final mosaic produced
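The map bookkeeping on these two slides can be checked with a few lines of arithmetic (the 180 summed maps are taken from the slide as given; only the totals are derived):

```python
# Worked check of the map counts on the mosaic-construction slides.

n_obs, n_instruments, n_bands = 10, 3, 5
map_types = ["cts", "exp", "var", "bkg"]        # counts, exposure, variance, background

maps_before_mosaicking = n_obs * n_instruments * n_bands * len(map_types)
summed_maps = 180                               # cts, exp, var maps summed over instruments
total_maps = maps_before_mosaicking + summed_maps
summed_per_obs = summed_maps // n_obs

print(maps_before_mosaicking)  # 600 maps before mosaicking
print(total_maps)              # 780 maps for the 10 observations
print(summed_per_obs)          # 18 summed maps per observation
```

This is the scale argument for using the GRID here: even a modest 10-observation mosaic touches nearly 800 intermediate maps, but each observation's processing is independent and can run on a separate node.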

ESACGRID Hardware Details
 In production since the beginning of 2007
 36 blade servers, for around 250 CPUs and 730 GB RAM
 Hardware features:
 DELL PowerEdge 1855 and
 2 dual/quad-core Intel CPUs, 3.20 GHz and 1.86 GHz
 OS: Scientific Linux CERN 4, 32-bit or 64-bit
 Based on the EGEE middleware gLite 3.1
 Part of EGEE
 Planck and eurovo-dca Virtual Organizations