
Slide 1: Interacademiaal 2006, Lecture 6. 1. Spatial tapering, 2. Non-coplanar arrays (the w term), 3. Bandwidth and time smearing, 4. Wide-band imaging

Slide 2: IAC06 exam. There will be a centrally organised written exam. It will take place on June 7, from 13:30 to 16:30, at three locations: Leiden (supervisor Van Langevelde), Groningen (supervisor Oosterloo), Utrecht (supervisor Pols). The exam will be based on several chapters from the Synthesis Imaging book and the material from the courses, guest lectures and practica. The book is allowed at the exam, as is a scientific calculator; course material or notes are not allowed, nor is internet access. From the book we have selected chapters 1-12, 17, 20, 22, 28, 29. We have dropped the idea of asking you to prepare by reading scientific papers.

Slide 3: Spatial tapering

Slide 4: Previous practicum: spectral-line data of NGC 1023

Slide 5: Previous practicum: integrated H I image (moment 0), unmasked vs masked

Slide 6: Weighting schemes. Because we make the image after the observation, we can still choose how to make it: we assign a different weight to each visibility. In this way we can "tune" the image to what we want to see: optimise noise and/or resolution.
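
As an illustration (not the WSRT's or any particular package's implementation), a minimal Python sketch of how natural and uniform weights could be assigned by counting the weighted density of visibilities in each uv cell; the cell size and grid size are arbitrary example values:

import numpy as np

def natural_and_uniform_weights(u, v, sigma, cell=25.0, ngrid=512):
    """Toy weighting sketch: u, v in wavelengths, sigma = per-visibility noise.
    Natural weighting: w_i = 1/sigma_i^2 (maximises sensitivity).
    Uniform weighting: divide by the weight already present in the local uv cell
    (maximises resolution, raises the noise).
    Assumes |u|, |v| < ngrid*cell/2 so every point falls on the grid."""
    w_natural = 1.0 / sigma**2

    # Accumulate the natural weight falling in each uv grid cell.
    iu = np.floor(u / cell).astype(int) + ngrid // 2
    iv = np.floor(v / cell).astype(int) + ngrid // 2
    density = np.zeros((ngrid, ngrid))
    np.add.at(density, (iu, iv), w_natural)

    # Uniform weighting: down-weight densely sampled cells.
    w_uniform = w_natural / density[iu, iv]
    return w_natural, w_uniform

# Robust (Briggs) weighting interpolates between these two extremes.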

Slide 7: Weighting and the WSRT. Because the WSRT has many redundant baselines, the choice of weighting is very important for the WSRT. [Figure panels: natural weighting, robust = 0, uniform weighting]

Slide 8: Weighting and the WSRT. Because the WSRT has many redundant baselines, the choice of weighting is very important for the WSRT. Natural weighting: noise = 0.5; robust = 0: noise = 0.6; uniform weighting: noise = 0.7. The difference in noise is 40% (a factor of 2 in observing time!).

Slide 9: Alternative: spatial tapering. To detect an extended source, smooth the image to lower spatial resolution. Remember that the units in the image are Jy/beam, so if one doubles the area of the beam, the signal per beam of an extended source doubles while the noise increases only by 10-20%, and the signal-to-noise ratio increases. The optimum beam for detecting a source has the same extent as the source.

Slide 10: Related to Wiener filtering. [Figure: visibility amplitude vs u, showing the uv range that contains source signal and the longer baselines that contain only noise.] There is no reason to add the noise-only data!

Slide 11: [Figure: 15" resolution vs 30" resolution]

Slide 12: Tapering. Smoothing the image with a Gaussian is equivalent to tapering down the long baselines: multiply the uv plane by a Gaussian. This amounts to "throwing away" baselines, so the noise increases! An improvement is multi-scale processing (e.g. wavelets).
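
A minimal sketch (an assumed parameterisation, not any package's taper syntax) of applying a Gaussian taper in the uv plane: the taper width in wavelengths is chosen so the image is smoothed to roughly the requested resolution, and the down-weighted long baselines raise the noise:

import numpy as np

def gaussian_taper(weights, u, v, fwhm_arcsec):
    """Multiply existing imaging weights by a Gaussian in the uv plane chosen
    so that the image is smoothed to roughly the requested FWHM (arcsec).
    u, v are in wavelengths."""
    theta = np.deg2rad(fwhm_arcsec / 3600.0)          # target resolution [rad]
    # A Gaussian of FWHM theta in the image plane corresponds to a Gaussian
    # of FWHM ~ 4 ln2 / (pi * theta) in the uv plane (in wavelengths).
    uv_fwhm = 4.0 * np.log(2.0) / (np.pi * theta)
    uv_sigma = uv_fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    r2 = u**2 + v**2
    return weights * np.exp(-r2 / (2.0 * uv_sigma**2))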

Slide 13: Imaging with non-coplanar arrays

Slide 14: The problem. You will remember the complete imaging equation: V(u,v,w) = ∫∫ [I(l,m)/n] exp(-2πi[ul + vm + w(n-1)]) dl dm, with n = √(1 - l² - m²). Under certain conditions the inversion of this is simple: 1. coplanar arrays (E-W arrays, or VLA snapshots); 2. small field of view (i.e. n-1 can be ignored).

Slide 15: Example (figure)

Slide 16: (figure)

Slide 17: A simple geometric effect: the apparent shape of the array varies across the field of view. This gives an extra phase term exp(-2πi w(n-1)). For the 2-D approximation to hold, this phase term must be small over the whole field of view.

Slide 18: We must represent the celestial sphere via a projection onto a plane. The distance AA' between the sphere and the tangent plane at offset (l,m) is 1 - n ≈ (l² + m²)/2. The "extra" phase is this distance AA' multiplied by 2πw.
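
To get a feel for the size of this term, a small sketch (with example numbers that are not from the lecture) evaluating the extra phase 2πw(1 - n) ≈ πw(l² + m²) for a given w and offset from the field centre:

import numpy as np

def w_term_phase(w_wavelengths, offset_deg):
    """Approximate extra phase (radians) picked up by a source at the given
    offset from the field centre, for a baseline with w component given in
    wavelengths: 2*pi*w*(1 - n) ~ pi*w*(l^2 + m^2) for small offsets."""
    r = np.deg2rad(offset_deg)                 # offset from field centre [rad]
    return np.pi * w_wavelengths * r**2

# Example (hypothetical numbers): w = 10,000 wavelengths and a 0.5 deg offset
# give ~2.4 rad of phase error, which is clearly not negligible.
print(w_term_phase(1.0e4, 0.5))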

Slide 19: A simple picture: planar array. A coplanar array is stretched or squeezed when seen from different locations in the field of view.

Slide 20: A simple picture: non-coplanar array. A non-coplanar array is "distorted" in shape when seen from different locations in the field of view.

Slide 21: Changes in the beam. The different apparent array geometry leads to a position-variant PSF.

Slide 22: What about snapshots? In a snapshot the VLA is coplanar, but as seen from the sky this plane rotates through the day. The apparent position of an off-centre source in a 2-D image therefore traces out a conic section over the day.

Slide 23: (figure)

Slide 24: Snapshots: warping. Instantaneous planar data can be regridded onto the celestial sphere. This involves coordinate distortions, which complicates the deconvolution.

Slide 25: The 3-D image volume. After some juggling, one finds that V(u,v,w) = ∫∫∫ F(l,m,n) exp(-2πi[ul + vm + w(n-1)]) dl dm dn, where F is related to the desired intensity I(l,m) by F(l,m,n) = [I(l,m)/n] δ(n - √(1 - l² - m²)). The delta function expresses that the brightness is defined only on the celestial sphere; the visibility is measured in (u,v,w) space, and the extra w-phase for the field centre has been removed (hence w(n-1) rather than wn).

Slide 26: The image volume is not a physical space; it is a mathematical construct.

Slide 27: 3-D deconvolution of the image volume. Taking into account the sampling of the observations, all 2-D deconvolution theory can be extended straightforwardly to 3-D: solve the 3-D convolution equation using any deconvolution algorithm, but constrain the solution to lie on the celestial sphere.

Slide 28: Schematically. [Figure panels: true image, dirty image, after deconvolution, after projection]

Slide 29: How to deal with it? Compute the entire 3-D image volume: the most straightforward approach, but hugely wasteful of computing resources. The minimum number of 'vertical planes' needed is BΘ²/λ. The number of volume pixels to be calculated is then 4B³Θ⁴/λ³, while the number of pixels actually needed is only 4B²Θ²/λ². So the fraction of the effort that is wasted is 1 - λ/(BΘ²), and this is about 90% at 20 cm in the VLA A-configuration for a full primary-beam image.
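
A quick check of the quoted numbers (approximate VLA A-configuration values assumed: longest baseline B ≈ 35 km, dish D = 25 m, λ = 0.2 m, so a full primary-beam field Θ ≈ λ/D):

lam = 0.20          # wavelength [m]
B   = 35.0e3        # longest baseline, VLA A-configuration [m] (approx.)
D   = 25.0          # dish diameter [m]
theta = lam / D     # full primary-beam field of view [rad]

n_planes = B * theta**2 / lam             # 'vertical' planes needed
wasted   = 1.0 - lam / (B * theta**2)     # fraction of wasted effort
print(n_planes, wasted)                   # ~11 planes, ~0.91 -> about 90% wasted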

Slide 30: (figure)

Slide 31: The polyhedron approach. The polyhedron approach approximates the unit sphere with small flat planes (facets), each of which stays close to the sphere's surface. For each facet image, the entire dataset must be phase-shifted and the (u,v,w) recomputed for the new plane.

Slide 32: Polyhedron approach (cont.). How many facets are needed? To minimise distortions, each plane must not depart from the unit sphere by more than the synthesized beam, λ/B. A simple analysis (see the book) shows that for a full primary-beam field the number of facets is N_f ≈ 2Bλ/D², i.e. twice the number of planes needed for 3-D imaging. But each facet image is much smaller, so the total number of cells computed is much smaller. The extra effort in phase computation and (u,v,w) rotation is more than made up for by the reduction in the number of cells computed. This approach was the standard until not so long ago.
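
With the same assumed VLA A-configuration numbers as above, the facet count from this scaling (twice the number of planes needed by the 3-D approach for a full primary-beam field):

lam, B, D = 0.20, 35.0e3, 25.0      # assumed VLA A-configuration values
n_facets = 2.0 * B * lam / D**2     # ~22 facets for a full primary-beam image
print(n_facets)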

Slide 33: Polyhedron approach (cont.). The procedure is then: determine the number of facets and the size of each; generate each facet image, rotating the (u,v,w) and shifting the phase centre for each; jointly deconvolve the set (the Clark/Cotton/Schwab major/minor cycle scheme is well suited for this); project the finished images onto a 2-D surface. An added benefit of this approach: since each facet is generated independently, one can do a separate antenna-based calibration for each facet. This is useful if the calibration is a function of direction as well as time, which is needed for metre-wavelength imaging.

Slide 34: W-projection. Is it possible to project the data onto a single (u,v) plane, accounting for all the necessary phase shifts? The answer is yes! Tim Cornwell has developed a new algorithm, termed 'w-projection', to do this. Available only in AIPS++, this approach permits a single 2-D image/deconvolution and eliminates the annoying edge effects which accompany facet re-projection.

Slide 35: Convolutional relationship between planes in (u,v,w) space. For fixed w, the visibilities are related to those in the w = 0 plane by a convolution, V(u,v,w) = V(u,v,w=0) ∗ G(u,v,w), where G is the Fourier transform of the w-dependent phase screen exp(-2πi w(√(1 - l² - m²) - 1)) (Frater & Docherty 1980).

Slide 36: W-projection. Each visibility at location (u,v,w) is mapped to the w = 0 plane with a phase shift proportional to the distance. Each visibility is mapped to ALL the points lying within a cone whose full angle is the same as the field of view of the desired map, ~2λ/D for a full-field image; the footprint on the w = 0 plane is ~2λw0/D. [Figure: cone from the point (u0, w0) down to the w = 0 plane]

Slide 37: The w-projection algorithm.
Setup: calculate the gridding kernel for a range of w values by Fourier-transforming the w phase screens multiplied by a spheroidal function (needed to control aliasing).
Image to Fourier: taper the image by the spheroidal function; Fourier transform; estimate the sampled visibilities by convolving the gridded values with the w-dependent kernel.
Fourier to image: convolve the sampled visibilities onto the grid using the w-dependent kernel; inverse Fourier transform; correct for the spheroidal function.
Deconvolution: deconvolve in minor cycles using the PSF for the image centre; reconcile with the visibility data in major cycles.
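
A minimal numpy sketch (not the AIPS++/CASA implementation) of how a w-dependent gridding kernel can be built: Fourier-transform the small-angle w phase screen exp(iπ w (l² + m²)) over the imaged field; in a real imager this would also include the anti-aliasing spheroidal function:

import numpy as np

def w_kernel(w_wavelengths, field_of_view_rad, npix=256):
    """Gridding kernel for one value of w: FFT of the Fresnel-like phase
    screen exp(i*pi*w*(l^2+m^2)) over the imaged field of view."""
    l = np.linspace(-field_of_view_rad / 2, field_of_view_rad / 2, npix)
    ll, mm = np.meshgrid(l, l)
    screen = np.exp(1j * np.pi * w_wavelengths * (ll**2 + mm**2))
    # A real imager multiplies this screen by a prolate-spheroidal
    # anti-aliasing function before transforming; omitted here for brevity.
    kernel = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(screen)))
    return kernel / np.abs(kernel).sum()     # normalise the kernel

# Kernels for a range of w values would be precomputed and used to convolve
# each visibility onto the w = 0 grid.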

Slide 38: Example. [Figure panels: direct Fourier transform, (u,v,w)-space facets, w-projection]

Slide 39: Bandwidth & time smearing

Slide 40: Effects of finite bandwidth. The imaging equation is monochromatic, but in real life the instrument has finite bandwidth. What happens if we ignore this?

Slide 41: The unit of length for an interferometer is the wavelength, so the uv coordinates scale with frequency: u = b_x ν/c, v = b_y ν/c. Finite bandwidth therefore causes radial smearing in the uv plane.

Slide 42: Moreover, the phase is affected because the delay is compensated for the reference direction s0, which is the wrong delay for an off-axis direction s. [Figure: geometry with directions s0 and s]

Slide 43: The effect on an off-axis point source (with sampling function S): an averaging of V along the radial direction in the uv plane plus a phase turn, weighted by the bandpass G.

Slide 44: The delay corresponding to the offset position l - l0 enters through the delay (bandpass) function, giving a position-dependent distortion function.

Slide 45: Radial smearing

Slide 46: (figure)

Slide 47: Simpler version. If the frequencies in the band range from ν0 - Δν/2 to ν0 + Δν/2, then a point source at position l is mapped to positions ranging from l(1 - Δν/2ν0) to l(1 + Δν/2ν0), so the apparent source width becomes Δl ≈ l Δν/ν0.

Slide 48: Simpler version (figure)
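
A quick numeric illustration of the simpler version (example numbers, not from the slides): a point source at radius r from the phase centre is smeared radially by roughly r Δν/ν0:

nu0 = 1.4e9         # observing frequency [Hz]
dnu = 10.0e6        # channel (or band) width [Hz]
r   = 600.0         # distance of the source from the phase centre [arcsec]

smear = r * dnu / nu0   # radial extent of the smearing [arcsec]
print(smear)            # ~4.3 arcsec for these example values, already a
                        # sizeable fraction of a ~12 arcsec beam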

Slide 49: Time averaging

Slide 50: As the Earth rotates, the uv tracks move through the uv plane, which causes azimuthal smearing. The phase of an off-centre point source changes as the tracks rotate, so averaging in time reduces the amplitude.

Slide 51: (figure)

Slide 52: E.g. WSRT: 60 sec integration at 21 cm. At 1 x FWHM of the primary beam: 3% loss; at 2 x FWHM: 11% loss. Alternative view: in 1 minute the longest baseline (3 km) sweeps 13 m, which is almost acceptable for Nyquist sampling of the uv plane (the dish diameter is 25 m). In 1 minute the fan beam sweeps 4.4 arcsec at the half-power point, just acceptable for a 12 arcsec beam; at twice the half-power radius this becomes 8.8 arcsec, which is too much. For imaging large fields the standard 60 sec is therefore too long.
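
The WSRT numbers on this slide can be checked with a few lines (sidereal rotation rate and an approximate 18 arcmin half-power radius at 21 cm assumed):

import numpy as np

omega_e = 2.0 * np.pi / 86164.0   # Earth's sidereal rotation rate [rad/s]
t       = 60.0                     # integration time [s]
B       = 3000.0                   # longest WSRT baseline [m]
rot     = omega_e * t              # rotation during one integration [rad]

print(B * rot)                     # ~13 m swept by the longest baseline

# Azimuthal smearing of a source at the primary-beam half-power radius
# (roughly 18 arcmin at 21 cm for a 25 m dish):
r_halfpower = 18.0 * 60.0          # [arcsec]
print(r_halfpower * rot)           # ~4.7 arcsec, close to the quoted 4.4 arcsec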

Slide 53: Multi-frequency synthesis

Slide 54: The wavelength is the unit of length for an interferometer, so the size of the array (measured in wavelengths) changes over the observing band (chromatic aberration).

Slide 55: With a single broad band this degrades the image towards the edges of the field, so observe in spectral (multi-channel) mode! Spectral mode allows every data sample to be put at its correct uv coordinate, not at the average coordinate of the entire band. A broad band treated as one channel gives radial smearing in the uv coverage; used channel by channel, it can instead improve the uv coverage and reduce the sidelobe level.

Slide 56: uv coverage improvement. [Figure: a single channel of 0.3125 MHz vs a 20-MHz band in 64 channels]

Slide 57: Wide band: spectral effects have to be considered. For the PSF every uv point gets the value 1, but a real source will have a non-zero spectral index, so the PSF will not correctly deconvolve the source. [Figure: uv tracks at two frequencies]

Slide 58: Spectral dirty beam (SDB). The dirty beam is the response to a perfect point source. Linearise the spectral response: I(ν) ≈ I0 + ((ν - ν0)/ν0) I1. Now we have TWO dirty beams: B0, the normal dirty beam, and B1, the spectral dirty beam. This can be generalised to higher order (Legendre polynomials).

Slide 59: B0 is the normal dirty beam, i.e. how unit flux appears in the image; B1 gives how "unit" spectral variation is reflected in the image.
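
A sketch of how the two beams could be formed from per-channel PSFs (assumed variable names; the per-channel PSFs would come from gridding each channel's uv coverage with unit visibilities, and the normalisation convention is only indicative):

import numpy as np

def spectral_dirty_beams(psf_per_channel, freqs, nu0):
    """B0: ordinary dirty beam (mean of the per-channel PSFs).
    B1: spectral dirty beam, weighting each channel PSF by its fractional
    frequency offset (nu - nu0)/nu0, so it maps 'unit' spectral variation
    into the image. Normalisation is schematic."""
    psf_per_channel = np.asarray(psf_per_channel)   # shape (nchan, ny, nx)
    x = (np.asarray(freqs) - nu0) / nu0              # fractional offsets
    b0 = psf_per_channel.mean(axis=0)
    b1 = (x[:, None, None] * psf_per_channel).mean(axis=0)
    return b0, b1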

Slide 60: Example of the spectral dirty beam. [Figure: B0 and B1] The amplitude of B1 is much smaller; B0 has a peak at the centre, while B1 has a minimum there.

Slide 61: Deconvolution. One can clean every channel separately, but clean is not linear, so this is not optimal. Alternatively, include the spectral index in the model; in maximum entropy the spectral index is given an expected (default) value with an associated prior.

Slide 62: Deconvolution. Normal clean: find positions x_j and amplitudes a0,j that minimise the residual || I_dirty - Σ_j a0,j B0(x - x_j) ||². "Double clean" with B0 and B1: find x_j, a0,j and a1,j that minimise || I_dirty - Σ_j [a0,j B0(x - x_j) + a1,j B1(x - x_j)] ||² (implemented as mfclean in Miriad).

Slide 63: Example: NGC 2841, 2 x 130 MHz frequency switching (23 cm and 17 cm). WSRT continuum setup: 2 x 8 bands of 20 MHz each, each band 64 channels, full polarisation.

Slide 64: [Figure: a single channel (0.3125 MHz), a single band of 64 channels (20 MHz), and 16 bands spanning 2 x 130 MHz]

Slide 65: MFS can give strong sidelobe suppression (used in LOFAR). [Figure: 1 channel vs 2 x 130 MHz]

Slide 66: [Figure: MFS only vs MFS + SDB]

Slide 67: Additional issues. The primary beam is frequency dependent, which introduces an apparent spectral index across the field. [Figure: primary beams at 18 cm and 22 cm]
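
To see how large this effect is, a rough estimate assuming a Gaussian primary beam whose FWHM scales as 1/ν (an assumption, not a statement about the actual WSRT beam): the beam attenuation then adds an apparent spectral index of about -8 ln2 (θ/FWHM)², which is roughly -1.4 at the half-power point:

import numpy as np

def primary_beam_alpha(offset_over_fwhm):
    """Apparent spectral index added by a Gaussian primary beam whose FWHM
    scales as 1/frequency: alpha_pb = d ln A / d ln nu = -8 ln2 (theta/FWHM)^2,
    evaluated at the band centre."""
    return -8.0 * np.log(2.0) * offset_over_fwhm**2

print(primary_beam_alpha(0.5))   # at the half-power point: about -1.4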

Slide 68: Additional issues. Faraday rotation: important at low frequencies!

Slide 69: Acknowledgements. For this lecture I made extensive use of material prepared by other people, in particular Tim Cornwell, Rick Perley, and the Synthesis Imaging book.

