Interacademiaal 2006 Lecture 6: 1. Spatial Tapering 2. Non-coplanar arrays (w term) 3. Bandwidth & Time smearing 4. Wide-band imaging

Interacademiaal Lecture 6: 1. Spatial Tapering 2. Non-coplanar arrays (w term) 3. Bandwidth & Time smearing 4. Wide-band imaging

Interacademiaal Exam (IAC06 tentamen) There will be a centrally organised written exam. It will take place on June 7 in three locations, from 13:30 to 16:30: Leiden (supervisor Van Langevelde), Groningen (supervisor Oosterloo), Utrecht (supervisor Pols). The exam will be based on several chapters from the Synthesis Imaging book and on the material from the courses, guest lectures and practica. The book is allowed at the exam, as is a scientific calculator; the course material or notes are not allowed, nor is internet access. From the book we have selected chapters 1-12, 17, 20, 22, 28, 29. We have dropped the idea of asking you to prepare by reading scientific papers.

Interacademiaal Spatial tapering

Interacademiaal Previous practicum: spectral-line data of NGC 1023

Interacademiaal Previous practicum: integrated H I image (moment 0), unmasked vs masked

Interacademiaal Weighting schemes Because the image is made after the observation, we can still choose how to make it by assigning a different weight to each visibility. In this way we can “tune” the image to what we want to see: optimise for noise and/or resolution.
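For reference (the weighting formula itself was not written out in the transcript), the weighted dirty image is, in standard notation,

$$ I_D(l,m) \;=\; \frac{\sum_k w_k\, V(u_k,v_k)\, e^{2\pi i (u_k l + v_k m)}}{\sum_k w_k} . $$

Natural weighting takes $w_k \propto 1/\sigma_k^2$ (best point-source sensitivity, poorest resolution); uniform weighting divides the natural weights by the local density of uv samples (best resolution, higher noise); robust (Briggs) weighting interpolates between the two.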

Interacademiaal Weighting and WSRT Because the WSRT has many redundant baselines, the choice of weighting is very important for the WSRT. [Panels: natural weighting, robust = 0, uniform weighting.]

Interacademiaal Weighting and WSRT Because the WSRT has many redundant baselines, the choice of weighting is very important for the WSRT. [Panels: natural weighting (noise = 0.5), robust = 0 (noise = 0.6), uniform weighting (noise = 0.7).] The difference in noise is 40%, which corresponds to a factor of 2 in observing time!

Interacademiaal Alternative: Spatial Tapering To detect an extended source, smooth the image to a lower spatial resolution. Remember: the units in the image are Jy/beam, so if one doubles the area of the beam, the signal per beam of an extended source doubles while the noise increases only by 10-20%, and the signal-to-noise ratio increases. The optimum beam for detecting a source has the same extent as the source.

Interacademiaal Related to Wiener filtering [Plot of power versus baseline length u, with curves for the source and the noise: once the source falls below the noise there is no reason to add those data!]

Interacademiaal [Images of the same field at two resolutions, the coarser being 30" resolution.]

Interacademiaal Tapering Smoothing the image means tapering down the long baselines: smoothing with a Gaussian in the image plane corresponds to multiplying the uv plane with a Gaussian. We are “throwing away” baselines, so the noise increases! Improvement: multi-scale methods (e.g. wavelets).
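As an illustration of how such a taper can be applied, here is a minimal numpy sketch (not from the lecture); the function name and the example baselines are made up, and the Gaussian FWHM relation between image and uv plane is the standard Fourier pair.

```python
import numpy as np

def gaussian_taper_weights(u, v, fwhm_image_rad):
    """Down-weight long baselines so the effective beam is ~fwhm_image_rad.

    A Gaussian of FWHM theta in the image corresponds to a Gaussian in the
    uv plane with FWHM ~ 4 ln 2 / (pi * theta), in wavelengths.
    """
    uvdist = np.hypot(u, v)                      # baseline length in wavelengths
    fwhm_uv = 4.0 * np.log(2.0) / (np.pi * fwhm_image_rad)
    sigma_uv = fwhm_uv / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return np.exp(-0.5 * (uvdist / sigma_uv) ** 2)

# example: taper baselines out to ~3 km at 21 cm down to a ~30" beam
u = np.array([1.0e3, 5.0e3, 1.4e4])             # baseline lengths [wavelengths]
v = np.zeros_like(u)
w_taper = gaussian_taper_weights(u, v, fwhm_image_rad=np.radians(30.0 / 3600.0))
print(w_taper)                                   # long baselines get small weights
```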

Interacademiaal Imaging with non-coplanar arrays

Interacademiaal The problem: You will remember the complete imaging equation: Under certain conditions, the inversion of this is simple: 1. Coplanar arrays (E-W or VLA snapshot) 2. Small field of view (i.e. n-1 can be ignored)
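For reference, the complete imaging (measurement) equation the slide refers to is, in standard notation,

$$ V(u,v,w) \;=\; \int\!\!\int \frac{I(l,m)}{\sqrt{1-l^2-m^2}}\; e^{-2\pi i\,\left[u l + v m + w\left(\sqrt{1-l^2-m^2}-1\right)\right]}\; dl\,dm , $$

which reduces to a simple 2-D Fourier transform between $V(u,v)$ and $I(l,m)/\sqrt{1-l^2-m^2}$ when either $w = 0$ for all baselines (a coplanar array) or $\sqrt{1-l^2-m^2}-1 \approx 0$ (a small field of view).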

Interacademiaal Example


Interacademiaal Simple geometric effect: The apparent shape of the array varies across the field of view. This gives an extra phase term e^(-2πi w(n-1)). This phase term should be small over the field of view.
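A back-of-the-envelope version of that condition (not spelled out on the slide): with $n = \sqrt{1-l^2-m^2} \approx 1 - \theta^2/2$ for a source at angular distance $\theta$ from the phase centre, the extra phase is

$$ \phi \;=\; 2\pi w\,(n-1) \;\approx\; -\pi\, w\,\theta^2 , $$

and requiring $|\phi| \lesssim 1$ radian for the largest $w$ (of order $B/\lambda$ for maximum baseline $B$) gives the familiar field-of-view limit $\theta \lesssim \sqrt{\lambda/(\pi B)}$, i.e. roughly $\sqrt{\lambda/B}$.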

Interacademiaal We must represent the celestial sphere via a projection onto a plane. The “extra” phase is given by the distance AA' (between the sphere and the tangent plane) multiplied by 2πw.

Interacademiaal A simple picture: planar array. A coplanar array is stretched or squeezed when seen from different locations in the field of view.

Interacademiaal A simple picture: non-coplanar array. A non-coplanar array is “distorted” in shape when seen from different locations in the field of view.

Interacademiaal Changes in the beam The different apparent array geometry leads to a position-variant PSF.

Interacademiaal What about snapshots? In a snapshot the VLA is coplanar, but as seen from the sky this plane rotates through the day. The apparent source position in a 2-D image therefore rotates, following a conic section.


Interacademiaal Snapshots: warping Instantaneous planar data can be regridded (the image warped) onto the correct celestial coordinates. This involves coordinate distortions and complications for the deconvolution.

Interacademiaal The 3-D Image Volume After some juggling, one finds a 3-D Fourier relation in which F is related to the desired intensity I(l,m); the brightness is defined only on the “celestial sphere”, the visibility is measured in (u,v,w) space, and the extra w-phase for the field centre is removed.
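Reconstructing the missing formulae in the usual notation: after removing the w phase of the field centre, the visibilities are the 3-D Fourier transform of an image volume F,

$$ V(u,v,w)\,e^{-2\pi i w} \;=\; \int\!\!\int\!\!\int F(l,m,n)\; e^{-2\pi i (ul+vm+wn)}\; dl\,dm\,dn , $$

where the volume is non-zero only on the celestial sphere,

$$ F(l,m,n) \;=\; \frac{I(l,m)}{\sqrt{1-l^2-m^2}}\;\delta\!\left(n-\sqrt{1-l^2-m^2}\right). $$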

Interacademiaal The image volume is not a physical space. It is a mathematical construct.

Interacademiaal 3-D deconvolution of the image volume Taking into account the sampling of the observations: all 2-D deconvolution theory can be extended straightforwardly to 3-D. Solve the 3-D convolution equation using any deconvolution algorithm, but the solution must be constrained to lie on the celestial sphere.

Interacademiaal Schematically [Panels: true image, dirty image, after deconvolution, after projection.]

Interacademiaal How to deal with it? Compute the entire 3-D image volume. This is the most straightforward approach, but it is hugely wasteful in computing resources! The minimum number of ‘vertical planes’ needed is ~BΘ²/λ, so the number of volume pixels to be calculated is ~4B³Θ⁴/λ³, but the number of pixels actually needed is only ~4B²Θ²/λ². The fraction of effort which is wasted is therefore 1 − λ/(BΘ²), and this is about 90% at 20 cm in the VLA A-configuration, for a full primary-beam image.
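A quick numerical check of these numbers, with assumed (illustrative) values for the VLA A-configuration at 20 cm:

```python
import numpy as np

# Back-of-the-envelope check of the waste fraction 1 - lambda/(B * Theta^2),
# using assumed values: ~36 km maximum baseline, 25 m dishes, 20 cm wavelength.
lam = 0.20                      # wavelength [m]
B = 36.0e3                      # longest baseline [m]
D = 25.0                        # dish diameter [m]
Theta = lam / D                 # full primary-beam field of view [rad], ~0.5 deg

n_planes = B * Theta**2 / lam                   # 'vertical' planes needed
n_volume = 4 * B**3 * Theta**4 / lam**3         # volume pixels computed
n_needed = 4 * B**2 * Theta**2 / lam**2         # pixels on the celestial sphere
print(f"planes needed   ~ {n_planes:.0f}")
print(f"fraction wasted = {1 - n_needed / n_volume:.2f}")    # about 0.9
```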


Interacademiaal The polyhedron approach The polyhedron approach approximates the unit sphere with small flat planes (“facets”), each of which stays close to the sphere’s surface. For each facet subimage, the entire dataset must be phase-shifted and the (u,v,w) recomputed for the new tangent plane.

Interacademiaal Polyhedron Approach, (cont.) How many facets are needed? If we want to minimize distortions, the plane mustn’t depart from the unit sphere by more than the synthesized beam, λ/B. Simple analysis (see the book) shows the number of facets will be N_f ~ 2λB/D², or twice the number of planes needed for 3-D imaging. But the size of each image is much smaller, so the total number of cells computed is much smaller. The extra effort in phase computation and (u,v,w) rotation is more than made up by the reduction in the number of cells computed. This approach was the standard until not so long ago.
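A quick way to see where this comes from (a sketch, not from the slide): keeping the tangent plane within about one synthesized beam of the sphere limits the facet size to $\theta_f \approx \sqrt{\lambda/B}$, so covering the primary beam $\Theta \approx \lambda/D$ requires

$$ N_f \;\approx\; \left(\frac{\Theta}{\theta_f}\right)^2 \;\approx\; \frac{(\lambda/D)^2}{\lambda/B} \;=\; \frac{\lambda B}{D^2} , $$

consistent with the $N_f \approx 2\lambda B/D^2$ quoted above up to the exact tolerance adopted.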

Interacademiaal Polyhedron Approach, (cont.) Procedure is then: Determine number of facets, and the size of each. Generate each facet image, rotating the (u,v,w) and phase-shifting the phase center for each. Jointly deconvolve the set. The Clark/Cotton/Schwab major/minor cycle system is well suited for this. Project the finished images onto a 2-d surface. Added benefit of this approach: As each facet is independently generated, one can do a separate antenna-based calibration for each facet. Useful if calibration is a function of direction as well as time. This is needed for meter-wavelength imaging.

Interacademiaal W-projection Is it possible to project the data onto a single (u,v) plane, accounting for all the necessary phase shifts? Answer is YES! Tim Cornwell has developed a new algorithm, termed ‘w-projection’, to do this. Available only in AIPS++, this approach permits a single 2-d image/deconvolution, and eliminates the annoying edge effects which accompany facet re-projection.

Interacademiaal Convolutional relationship between planes in u,v,w space Assume w constant: Frater & Docherty 1980
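Written out in the usual notation (the relation itself was not reproduced in the transcript): for fixed w the measured visibility is the w = 0 visibility convolved with the Fourier transform of the w phase screen,

$$ V(u,v,w) \;=\; V(u,v,0) \,*\, \tilde G_w(u,v), \qquad G_w(l,m) \;=\; e^{-2\pi i\, w\left(\sqrt{1-l^2-m^2}\,-\,1\right)} , $$

where $*$ denotes 2-D convolution and $\tilde G_w$ is the 2-D Fourier transform of $G_w(l,m)$.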

Interacademiaal W-Projection Each visibility, at location (u,v,w), is mapped to the w=0 plane, with a phase shift proportional to the distance. Each visibility is mapped to ALL the points lying within a cone whose full angle is the same as the field of view of the desired map, ~2λ/D for a full-field image. [Diagram: a visibility at (u0, w0) spreads over a region of width ~2λw0/D on the w = 0 plane, corresponding to the cone of full angle ~2λ/D.]

Interacademiaal The W-projection algorithm Setup: calculate the gridding kernel for a range of values of w by Fourier transforming the w phase screens multiplied by the spheroidal function (needed to control aliasing). Image to Fourier: taper the image by the spheroidal function, Fourier transform, and estimate the sampled visibilities by convolving the gridded values with the w-dependent kernel. Fourier to image: convolve the sampled visibilities onto the grid using the w-dependent kernel, inverse Fourier transform, and correct for the spheroidal function. Deconvolution: deconvolve in minor cycles using the PSF for the image centre, and reconcile to the visibility data in major cycles.
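A minimal numpy sketch of the kernel computation in the first step (illustrative only; the field size, grid size and normalisation are assumptions, and real implementations add the anti-aliasing spheroidal function, oversampling and kernel caching):

```python
import numpy as np

def w_kernel(w, fov_rad, npix=64):
    """Gridding kernel for one w value: FT of the w phase screen over the field."""
    # image-plane coordinates (l, m) covering the field of view
    l = np.linspace(-fov_rad / 2, fov_rad / 2, npix)
    l, m = np.meshgrid(l, l)
    n = np.sqrt(np.maximum(1.0 - l**2 - m**2, 0.0))
    phase_screen = np.exp(-2j * np.pi * w * (n - 1.0))      # G_w(l, m)
    # Fourier transform to the uv plane gives the convolution kernel
    kern = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(phase_screen)))
    return kern / np.abs(kern).sum()                         # normalise

# kernels get wider as |w| grows, so far-from-plane visibilities are spread
# over more uv cells
for w in (0.0, 1e3, 1e4):                                    # w in wavelengths
    k = w_kernel(w, fov_rad=np.radians(0.5))
    support = (np.abs(k) > 1e-3 * np.abs(k).max()).sum()
    print(f"w = {w:8.0f}  kernel cells above threshold: {support}")
```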

Interacademiaal Example [Panels: straight Fourier transform, uvw-space facets, w-projection.]

Interacademiaal Bandwidth & Time Smearing

Interacademiaal Effects of finite bandwidth The monochromatic imaging equation assumes a single frequency; in real life the instrument has a finite bandwidth. What happens if we ignore this?

Interacademiaal The unit of length is the wavelength, so the uv coordinates scale with frequency. Finite bandwidth therefore causes radial smearing in the uv plane.
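Writing this out (a standard result, not shown on the slide):

$$ (u,v) \;=\; \frac{(B_x,B_y)}{\lambda} \;=\; \frac{\nu}{c}\,(B_x,B_y) \quad\Rightarrow\quad \frac{\Delta u}{u} \;=\; \frac{\Delta v}{v} \;=\; \frac{\Delta\nu}{\nu} , $$

so a source at radius $\theta$ from the phase centre is smeared radially by roughly $\Delta\theta \approx (\Delta\nu/\nu)\,\theta$.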

Interacademiaal Moreover, the phase is affected due to the wrong delay. [Geometry sketch with source directions s and s0.]

Interacademiaal Averaging of V along the radius in the uv plane, plus a phase turn over the bandpass G. Effect on an off-axis point source (with sampling function S):

Interacademiaal [Sketch: the delay corresponding to the offset position l − l0, the delay (bandpass) function, and the resulting position-dependent distortion function.]

Interacademiaal Radial smearing


Interacademiaal Simpler version If the frequencies range from ν1 to ν2, then spatially a point at radius x moves from xν1/ν0 to xν2/ν0, so the source width becomes x(ν2 − ν1)/ν0 = xΔν/ν0.

Interacademiaal Simpler version

Interacademiaal Time averaging

Interacademiaal The uv tracks move through the uv plane, giving azimuthal smearing. The phase of an off-centre point source changes as the (u,v) rotate during the averaging, which leads to a reduction of its amplitude.
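Filling in the missing expressions in standard form (an assumption about what the slide showed): the phase of a point source at (l,m) is

$$ \phi \;=\; 2\pi\,(u\,l + v\,m) , $$

and since (u,v) rotate at the Earth rotation rate $\omega_E = 2\pi/86164\,\mathrm{s} \approx 7.3\times10^{-5}\ \mathrm{rad\,s^{-1}}$, averaging for a time $\Delta t$ smears a source at radius $\theta$ from the phase centre azimuthally by roughly $\Delta\theta \approx \omega_E\,\Delta t\,\theta$, which reduces its amplitude.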


Interacademiaal E.g. WSRT: 60 sec integration at 21 cm. At 1 x the FWHM of the primary beam: 3% loss; at 2 x the FWHM: 11% loss. Alternative view: in 1 minute, the longest baseline (3 km) sweeps 13 m; this is almost OK for Nyquist sampling the uv plane (the dish diameter is 25 m). In 1 minute, the fan beam sweeps 4.4 arcsec at half power, just acceptable for a 12 arcsec beam. At twice that distance this is 8.8 arcsec and is too much. For imaging large fields, the standard 60 sec is too long.
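A rough numerical check of the numbers quoted above (the primary-beam FWHM and longest baseline are assumed values):

```python
import numpy as np

omega_E = 2 * np.pi / 86164.0        # Earth rotation rate [rad/s]
dt = 60.0                            # averaging time [s]
B = 3000.0                           # longest WSRT baseline [m] (assumed)
pb_fwhm = np.radians(36.0 / 60.0)    # assumed 21-cm primary-beam FWHM [rad]

sweep_m = omega_E * dt * B                                     # uv sweep of longest baseline
sweep_as = np.degrees(omega_E * dt * pb_fwhm / 2.0) * 3600.0   # sky sweep at half-power radius
print(f"baseline sweep in 60 s: {sweep_m:.1f} m")              # ~13 m (dish diameter 25 m)
print(f"sky sweep at half power: {sweep_as:.1f} arcsec")       # ~4-5 arcsec
```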

Interacademiaal Multi-Frequency Synthesis

Interacademiaal The wavelength is the unit of length for an interferometer, so the size of the array (measured in wavelengths) changes over the observing band (chromatic aberration).

Interacademiaal With a single broad band this gives a degradation of the image towards the edges, so observe in spectral mode! Spectral mode allows every data sample to be put at its correct uv coordinate, rather than at the average coordinate of the entire band. A broad band otherwise gives radial smearing in the uv coverage; used channel by channel, it can instead improve the uv coverage and reduce the sidelobe level.

Interacademiaal uv coverage improvement [Panels: a single channel versus a 20-MHz band with 64 channels.]

Interacademiaal Wide band: one has to consider spectral effects. [Sketch: uv tracks at two frequencies ν1 and ν2.] For the PSF every uv point gets the value 1, but the source will have a non-zero spectral index, so that PSF will not deconvolve the source correctly.

Interacademiaal Spectral dirty beam (SDB) The dirty beam is the response to a perfect point source. Linearise the spectral response: now we have TWO dirty beams: B0, the normal dirty beam, and B1, the spectral dirty beam. This can be generalised to higher order (Legendre polynomials).
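Written out (standard multi-frequency synthesis notation; the slide's equations were not reproduced in the transcript): expanding the sky brightness to first order in frequency,

$$ I(l,m,\nu) \;\approx\; I_0(l,m) \;+\; \frac{\nu-\nu_0}{\nu_0}\, I_1(l,m) , $$

the dirty image becomes the sum of two convolutions,

$$ I_D \;=\; B_0 * I_0 \;+\; B_1 * I_1 , $$

where $B_0$ is obtained by gridding a 1 at every uv sample and $B_1$ by gridding the weight $(\nu-\nu_0)/\nu_0$ instead.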

Interacademiaal B0 is the normal dirty beam, i.e. how unit flux appears in the image; B1 gives how “unit” spectral variation is reflected in the image.

Interacademiaal Example of the spectral dirty beam [Panels: B0 and B1.] The amplitude of B1 is much smaller; B0 has its peak at the centre, B1 has a minimum there.

Interacademiaal Deconvolution One can clean every channel separately, but CLEAN is not linear, so this is not optimal. Alternative: include the spectral index in the model; in maximum entropy the expected (default) value of each channel then follows from the model spectral index.

Interacademiaal Deconvolution Normal CLEAN: find the component position j and amplitude a0 that minimise the residual. “Double CLEAN” with B0 and B1: find j, a0 and a1 that minimise the residual (implemented as mfclean in Miriad).
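The objectives being minimised, reconstructed in the notation of the Sault & Wieringa double clean:

$$ \min_{j,\,a_0}\ \big\| I_D - a_0\, B_0 * \delta_j \big\|^2 \qquad \text{(normal CLEAN)}, $$

$$ \min_{j,\,a_0,\,a_1}\ \big\| I_D - a_0\, B_0 * \delta_j - a_1\, B_1 * \delta_j \big\|^2 \qquad \text{(double CLEAN, mfclean)}, $$

where $\delta_j$ denotes a point source at pixel j.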

Interacademiaal Example: NGC 2841, 2 x 130 MHz with frequency switching (23 cm and 17 cm). WSRT continuum setup: 2 x 8 bands of 20 MHz each, each band 64 channels, full polarisation.

Interacademiaal [Panels: a single channel, a single band of 64 channels (20 MHz), and 16 bands (2 x 130 MHz).]

Interacademiaal MFS can give strong sidelobe suppression (used in LOFAR). [Panels: 1 channel versus 2 x 130 MHz.]

Interacademiaal [Panels: MFS only versus MFS + SDB.]

Interacademiaal Additional issues The primary beam is frequency dependent, which introduces an apparent spectral index across the field. [Panels: primary beam at 18 cm and 22 cm.]

Interacademiaal Additional issues Faraday rotation Important at low frequencies!!!!
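For reference (not on the slide), the polarisation angle rotates as the wavelength squared,

$$ \Delta\chi \;=\; \mathrm{RM}\,\lambda^2, \qquad \mathrm{RM} \;=\; 0.81 \int n_e\, B_\parallel\, dl \ \ \mathrm{rad\,m^{-2}} $$

(with $n_e$ in cm⁻³, $B_\parallel$ in μG and dl in pc), which is why it matters most at low frequencies.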

Interacademiaal Acknowledgements For this lecture I made extensive use of material prepared by other people, in particular by Tim Cornwell and Rick Perley, and from the Synthesis Imaging book.