1
LOFAR project Astroparticle Physics workshop 26 April 2004
2
LOFAR concept
Combine advances in enabling IT:
- inexpensive environmental sensors (10,000s of sensors)
- wide-area optical broadband networks (custom + GigaPort/Géant)
- high-performance computing (IBM BlueGene/L)
to make a 'shared aperture multi-telescope' (the system spec driver), but also to sense and interpret the environment in innovative ways.
3
LOFAR Sensors

Sensor type     Applications
HF antenna      astrophysics; astro-particle physics
VHF antenna     cosmology, the early Universe; solar effects on Earth, space weather
Geophones       ground subsidence; gas/oil extraction
Weather         micro-climate prediction; precision agriculture; wind energy
Water           precision agriculture; habitat management; public safety
Infra-sound     atmospheric turbulence; meteors, explosions, sonic booms
4
LOFAR Phase 1
- Radio telescope
- Seismic imager
- Precision weather for agriculture, wind energy
[Diagram: sensor fields connected to the central processor via fibre data transport]
Integrate the LOFAR network into the regional fibre network, sharing costs with schools, health centres, etc.
5
Radio Telescope Specifications
- Frequency range: 20 – 80 MHz, 120 – 240 MHz
- Angular resolution: few – 10 arcsec (see the sketch below)
- Sensitivity: 100x previous instruments at these frequencies
- Shared aperture multi-telescope:
  - up to 8 independent telescopes, plus geophone, weather, etc. arrays
  - operated from remote Science Operations Centres, similar to LHC 'tier-1' centres
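As a rough consistency check on the angular-resolution figure, the diffraction limit θ ≈ λ/D can be evaluated across the two bands. The sketch below is a minimal Python illustration; the 150 km baseline is taken from the Bsik-funded configuration described later in this deck, and the actual imaging resolution depends on the full array configuration and calibration.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def resolution_arcsec(freq_hz: float, baseline_m: float) -> float:
    """Diffraction-limited resolution theta ~ lambda / D, in arcseconds."""
    wavelength = C / freq_hz
    theta_rad = wavelength / baseline_m
    return math.degrees(theta_rad) * 3600.0

# Assumed 150 km maximum baseline (Bsik-funded configuration, slide 11).
baseline = 150e3
for freq in (20e6, 80e6, 120e6, 240e6):
    print(f"{freq / 1e6:5.0f} MHz -> {resolution_arcsec(freq, baseline):6.1f} arcsec")
```

At 120 – 240 MHz this gives roughly 2 – 3 arcsec, consistent with the quoted few-arcsecond figures; the 20 – 80 MHz band is proportionally coarser.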
6
One day in the life of LOFAR, the radio telescope
[Chart: observation schedule over one day, by telescope nr.]
7
Challenges

Data rate (see the budget sketch after this list)
- ~15 Tbit/s total data generated (increasing later)
- ~330 Gbit/s input data rate to central processor
- ~1 Gbit/s to distributed Science Operations Centres

Computational resources
- ~34 TFLOP/s in custom co-processor (IBM BG/L)
- ~500 TByte on-line temporary storage

Calibration
- adaptive multi-patch all-sky phase correction, 10 s duty cycle

[Dataflow diagram labels: input rate > 300 Gbps; transpose ~300 Gbps; within correlator 20 Tbps, 15 T-ops; further stages of 3 T-ops, 5 T-flops, 2 T-ops; store 25 Gbps; storage > 500 TB; products ~1 Gbps]
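The rates above can be reproduced with a back-of-envelope budget: raw antenna samples per station, a station-level reduction to beamformed data, and the sum over stations into the central processor. All parameters in the sketch below are illustrative assumptions (the station count comes from the 45-station Bsik configuration, the beam count from the 'up to 8 independent telescopes' specification); the real design values differ, but the arithmetic shows how numbers of the quoted order arise.

```python
# Back-of-envelope LOFAR data-rate budget. All parameters are assumptions
# chosen for illustration, not the actual design values.

N_STATIONS   = 45        # stations in the Bsik-funded array (slide 11)
N_ANTENNAS   = 100       # dual-polarisation antennas per station (assumed)
SAMPLE_RATE  = 160e6     # ADC samples/s per signal path (assumed)
ADC_BITS     = 12        # bits per raw sample (assumed)

N_BEAMS      = 8         # simultaneous station beams ("8 independent telescopes")
BEAM_BW      = 32e6      # bandwidth per beam, Hz (assumed)
BEAM_BITS    = 2 * 8     # complex samples, 8 bits per component (assumed)

raw_per_station  = N_ANTENNAS * 2 * SAMPLE_RATE * ADC_BITS   # bit/s, before beamforming
beam_per_station = N_BEAMS * 2 * BEAM_BW * BEAM_BITS         # bit/s, after beamforming

# ~17 Tbit/s and ~370 Gbit/s with these assumptions: the same order as the
# ~15 Tbit/s generated and ~330 Gbit/s central-processor input quoted above.
print(f"raw data generated : {N_STATIONS * raw_per_station / 1e12:5.1f} Tbit/s")
print(f"into central proc. : {N_STATIONS * beam_per_station / 1e9:5.0f} Gbit/s")
```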
8
IBM BlueGene/L: IBM
- 1st research machine on the road to multi-peta-FLOP/s
- 3 BG/L machines under construction: LLNL, LOFAR, IBM Research
- numbers 1 – 10 of the Top-500 supercomputers in one machine (LLNL)
- SOC technology, standard components for reliability
- dual PowerPC 440 cores per node, 700 MHz clock
- scalability to many times 100,000 nodes
- low power, air cooled: ~20 W per node
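The ~34 TFLOP/s quoted for LOFAR's machine on the next slide follows from these per-node specs: each PowerPC 440 core carries a double FPU that can issue two fused multiply-adds per cycle, i.e. four floating-point operations per core per cycle. The node count in the sketch below (6144) is an assumption consistent with the '6k nodes' and '6144 processors' figures elsewhere in this deck.

```python
# Peak-performance and power check for a "6k-node" BlueGene/L, from the
# per-node specs on this slide. The 6144-node count is an assumption.
CLOCK_HZ        = 700e6   # PowerPC 440 clock
CORES_PER_NODE  = 2       # dual cores per node
FLOPS_PER_CYCLE = 4       # double FPU: two fused multiply-adds per cycle per core
WATTS_PER_NODE  = 20      # quoted per-node power
NODES           = 6144    # assumed node count

peak = NODES * CORES_PER_NODE * FLOPS_PER_CYCLE * CLOCK_HZ
print(f"peak : {peak / 1e12:.1f} TFLOP/s")                    # ~34.4 TFLOP/s
print(f"power: {NODES * WATTS_PER_NODE / 1e3:.0f} kW (compute nodes only)")
# ~123 kW from the compute nodes; the ~150 kW quoted for LOFAR's machine
# plausibly includes I/O nodes, network, and power-supply overheads.
```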
9
IBM BlueGene/L: LOFAR
- BG/L is our 1st non-custom central processor
  - total CPU power is 'interesting' (34 TFLOP/s) and scalable
  - component failure rate: one every 3 months, DRAM dominated
- BG/L is an embedded co-processor in a Linux cluster
  - stripped-down Linux kernel on-chip
  - general-purpose capability allows complex modelling on-line, in real time
- efficient for complex arithmetic, streaming applications (see the sketch below)
  - 330 Gb/s input data rate initially; 768 Gb/s max
- low power: 150 kW for LOFAR (6k nodes)
- scalable beyond LOFAR to SKA requirements
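To make 'efficient for complex arithmetic, streaming applications' concrete, the toy NumPy sketch below shows the cross-multiply-and-accumulate pattern that a correlator streams over its input. The array sizes and random 'station voltages' are made up for illustration; this is not LOFAR's actual correlator code.

```python
import numpy as np

# Toy cross-correlation ("X") step: for every pair of stations, multiply one
# channelised voltage stream by the conjugate of the other and integrate over
# time. This complex multiply-accumulate dominates the streaming arithmetic.
n_stations, n_channels, n_samples = 4, 8, 1024
rng = np.random.default_rng(0)
v = (rng.normal(size=(n_stations, n_channels, n_samples))
     + 1j * rng.normal(size=(n_stations, n_channels, n_samples)))

visibilities = np.einsum("ict,jct->ijc", v, v.conj()) / n_samples
print(visibilities.shape)   # (n_stations, n_stations, n_channels)
```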
10
Tier-0 computing: LHC and LOFAR in 2006

LHC / exp't × 4 exp'ts (Tier-0):
- CPU: 2.8 × 10^6 SPECint95
- No. of processors: 5600 / 11200 (?)
- Disk storage: 2160 TB
- Tape storage: 12 PB
- LAN throughput: 368 Gb/s

LOFAR (EOC):
- CPU: 3.4 × 10^6 SPECint95
- No. of processors: 6144 / 12288
- Disk storage: ~500 TB
- Tape storage: ??
- LAN throughput: > 330 Gb/s
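Dividing the CPU figures by the processor counts shows the two machines assume a broadly similar per-processor rating, which is what makes the comparison meaningful; the snippet below just reproduces that arithmetic from the figures above.

```python
# Per-processor rating implied by the comparison above (total CPU / processor count).
machines = {
    "LHC Tier-0 (per exp't)": (2.8e6, 5600),
    "LOFAR EOC":              (3.4e6, 6144),
}
for name, (specint95, processors) in machines.items():
    print(f"{name}: {specint95 / processors:.0f} SPECint95 per processor")
```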
11
LOFAR with Bsik financing: central core plus 45 stations, 150 km maximum baseline.
12
Mid-LOFAR would extend into Lower Saxony, Schleswig-Holstein and North Rhine-Westphalia. Max-LOFAR would have stations from Cambridge (UK) to Potsdam (DE), and from Nançay (FR) to Växjö (SE).
13
[Network map: 1 – 10 Gbps links to China, USA, South Africa, Russia]
Post-2005: JIVE + LOFAR data processing centre, 30 Gbps – 2 Tbps.
LOFAR, the Sensor Network, is under consideration as an FP7 'Technology Platform'.
14
LOFAR project timeline
- PDR in June/Oct 2003: M€ 14 expended
- Dutch funding end 2003: M€ 52 for 'infrastructure'; funding must be matched by 'partners'
  - 18-member consortium; additional partners possible
  - formal goal is economic positioning w.r.t. 'adaptive sensor networks': RF, seismic, infra-sound, wind-energy sensors
- prototyping of a full station is in progress
  - 100 low-frequency antennas in the field, now making all-sky videos
  - end 2004: expect a 2-beam web-based system on-line (to gain experience)
  - issues: calibration, RFI, adaptive re-allocation of resources
- BlueGene/L delivery in 1Q 2005
- FDR: start mid-2004, complete mid-2005
- procurement: start mid-2004, end mid-2006
- initial operational status: end 2006 (solar minimum)
- full operational status: mid-2008
15
Remaining tasks for which partners are being sought
- Array configuration size: new stations!
  - extension of array size to 400+ km is highly desirable
  - cost is ~€ 500k per station
  - fibre connections through Géant and national academic networks
- Definition and designation of operations centres (where?)
  - Science Operations Centres are remote, on-line: basic data taking and archiving of observations; financing mostly local, plus a contribution to common services
  - Engineering Operations Centre in Dwingeloo: monitor the system, perform maintenance; integrated operations team (with WSRT, possibly JIVE)
- Operational modelling and user interface
  - use of (quasi-real-time) GRID technologies foreseen
  - work packages not yet funded / manned
16
User involvement
- Test User Group: Heino Falcke, leader
  - Lars Bähren, Michiel Brentjens, Stefan Wijnholds, etc.
- 'open', 'remote' access to the developing system
  - step-wise functionality improvements until 2006
- 1st user workshop: Dwingeloo, May 24 – 25, 2004
- ASTRON is ready to host a (limited) number of young researchers to test and help develop the system
- formal operations from 2007; scheduling will be an 'interesting' problem
17
LOFAR Research Consortium

Universities: Univ. of Amsterdam, TU Delft, TU Eindhoven, Univ. of Groningen, Leiden Univ., Nijmegen Univ., Uppsala Univ.

Research Institutes: ASTRON (management org.), CWI, IMAG, KNMI, TNO-NITG, LOPES Consortium, MPIfR-Bonn

Commercial: Ordina Technical Automation bv, Dutch Space bv, Twente Institute for Wireless and Mobile Communications bv, Science[&]Technology bv