Wavefront Sensing I
Richard Lane, Department of Electrical and Computer Engineering, University of Canterbury, Christchurch, New Zealand

Location

Astronomical Imaging Group, past and present: Dr Richard Lane, Professor Peter Gough, Associate Professor P. J. Bones, Associate Professor Peter Cottrell, Professor Richard Bates, Dr Bonnie Law, Dr Roy Irwan, Dr Rachel Johnston, Dr Marcos van Dam, Dr Valerie Leung, Richard Clare, Yong Chew, Judy Mohr

Contents
– Session 1 – Principles
– Session 2 – Performance
– Session 3 – Wavefront Reconstruction for 3D

Principles of wavefront sensing
– Introduction
– Closed versus open loop wavefront sensing
– Nonlinear wavefront sensing
– Shack-Hartmann
– Curvature
– Geometric
– Conclusions

Imaging a star

The effect of turbulence

Adaptive Optics system
[Diagram: a distorted incoming wavefront enters the telescope, is corrected by the deformable mirror, and is split between the wavefront sensor and the image plane.]

Closed loop system
– Reduces the effects of disturbances, such as telescope vibration and modelling errors, by the loop gain
– Does not inherently improve the noise performance unless the closed-loop measurements are easier to make
– Design is limited by stability constraints

Postprocessing system (feedforward compensation)
[Diagram: the distorted incoming wavefront enters the telescope and reflects off a fixed mirror; the wavefront sensor and the detector plane feed a computer, which reconstructs the image.]

Open loop system (SPID)
– Sensitive to modelling errors
– No stability issues with computer post-processing
– The problem is not noise but errors in modelling the system
[Diagram: temporal coherence of the atmosphere over time T.]

Modelling the problem (step 1)
The relationship between the measured data, the object and the point spread function (psf) is linear:

  data = object * psf + noise

where * denotes convolution. A linear relationship means that if we multiply the input by α, we multiply the output by α; the output does not change form.
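A minimal numerical sketch of this linear model (the object, the Gaussian psf and the noise level are illustrative assumptions, not values from the lecture):

import numpy as np

rng = np.random.default_rng(0)
n = 256
obj = np.zeros(n)
obj[100], obj[140] = 1.0, 0.5                # toy object: two point sources
xx = np.arange(n) - n // 2
psf = np.exp(-xx**2 / (2 * 4.0**2))
psf /= psf.sum()                             # assumed Gaussian psf, unit sum
# circular convolution via the FFT implements data = object * psf
data = np.real(np.fft.ifft(np.fft.fft(obj) * np.fft.fft(np.fft.ifftshift(psf))))
data += 0.01 * rng.standard_normal(n)        # additive noise
# linearity: scaling obj by alpha scales the noiseless data by alpha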

Modelling the problem (step 2)
The relationship between the phase and the psf is nonlinear:

  psf = |F{A exp(iφ)}|²

where F is the Fourier transform, A the aperture function and φ the phase (equivalently, the optical transfer function is the autocorrelation of the pupil function).
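A sketch of this nonlinear map, assuming a circular pupil on a small grid (the grid size and the tilt aberration are arbitrary choices for illustration):

import numpy as np

n = 128
y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
pupil = (x**2 + y**2) < (n // 4)**2          # assumed circular aperture A
phase = 2 * np.pi * x / n                    # one wave of tilt across the grid
field = pupil * np.exp(1j * phase)           # A exp(i*phi)
psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
# doubling the phase does not double the psf: the map is nonlinear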

Phase retrieval
[Figure panels: correct MAP estimate, wrapped ambiguity, ML estimation, MAP estimation.]
Nonlinearity is caused by 2π wrapping interacting with smoothing.

Role of a typical wavefront sensor
To produce a linear relationship between the measurements and the phase:
– Speeds up reconstruction
– Guarantees a solution
– Degrades the ultimate performance
The phase is expanded as a weighted sum of basis functions, φ(x) = Σᵢ aᵢ ψᵢ(x).

Solution is by linear equations
The measurement vector m is related to the basis-function coefficients a through the interaction matrix Θ:

  m = Θ a

The i-th column of Θ corresponds to the measurement that would occur if the phase were the i-th basis function. Three main issues:
– What has been lost in linearising?
– How well can you solve the system of equations?
– Are they the right equations?
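A hedged sketch of solving m = Θa by least squares; the interaction matrix here is random, standing in for one built column by column as described above:

import numpy as np

rng = np.random.default_rng(1)
n_meas, n_basis = 40, 10
Theta = rng.standard_normal((n_meas, n_basis))   # i-th column: response to i-th basis function
a_true = rng.standard_normal(n_basis)
m = Theta @ a_true + 0.05 * rng.standard_normal(n_meas)  # noisy measurements
a_hat, *_ = np.linalg.lstsq(Theta, m, rcond=None)        # least-squares estimate of a

In practice Θ is ill-conditioned and the inversion is regularized; the plain least-squares solution here is only the simplest choice.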

The effect of turbulence
There is a linear relationship between the mean slope of the phase in a direction and the displacement of the image in that direction.

Trivial example
There is a linear relationship between the mean slope and the displacement of the centroid.
– Measurements are the centroids of the data
– The interaction matrix is a scaled identity
– Reconstruct the coefficients of tip and tilt
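A minimal sketch of the centroid measurement (the spot is a synthetic Gaussian and the shift is an assumed value, purely for illustration):

import numpy as np

n = 64
y, x = np.mgrid[0:n, 0:n]
true_shift = 3.2                                     # assumed spot displacement (pixels)
img = np.exp(-((x - n/2 - true_shift)**2 + (y - n/2)**2) / (2 * 3.0**2))
cx = (img * x).sum() / img.sum()                     # x centroid
cy = (img * y).sum() / img.sum()                     # y centroid
tilt_x = cx - n / 2                                  # proportional to the mean x-slope (tip)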

Quality of the reconstruction
– The centroid is proportional to the mean slope (Primot et al., Welsh et al.).
– The best Strehl requires estimating the least mean square (LMS) phase (Glindemann).
– To distinguish the mean and LMS slope you need to estimate the coma and higher-order terms.
[Figure: mean slope, LMS slope and phase.]

Coma distortion
[Figure: ideal image versus detected image; the difference between the LMS and mean tilt.]
– The peak value is better than the centroid for optimising the Strehl
– Impractical for low-light data

Where to from here?
The real problem is how to estimate higher aberration orders. Wavefront sensors can be divided into:
– Pupil plane techniques, which measure slopes (or curvatures) in the divided pupil plane:
  Shack-Hartmann
  Curvature (Roddier)
  Pyramid (Ragazzoni)
  Lateral shearing interferometers
– Image plane techniques, which go directly (nonlinearly) from data in the image plane to the phase:
  Phase diversity (Paxman)
  Phase retrieval

Geometric wavefront sensing
– Pyramid, Shack-Hartmann and curvature sensors are all essentially geometric wavefront sensors
– They rely on the fact that light propagates perpendicularly to the wavefront
– There is a linear relationship between the displacement and the slope
– Essentially achromatic

Geometric optics model
A slope in the wave-front W(x) causes an incoming photon to be displaced by Δx = z ∂W/∂x after propagating a distance z. The model is independent of wavelength and spatial coherence.

Generalized wave-front sensor
This is the basis of the two most common wave-front sensors.
[Diagram: an aberration passing through a converging lens; measurement planes for the curvature sensor and the Shack-Hartmann relative to the focal plane.]

Trade-off
For a fixed photon count, you trade off the number of modes you can estimate in the phase screen against the accuracy with which you can estimate them:
– To estimate a high number of modes you need good resolution in the pupil plane
– To estimate accurately you need good resolution in the image plane

Properties of a wave-front sensor
– Linearization: want a linear relationship between the wave-front and the measurements
– Localization: the measurements must relate to a region of the aperture
– Broadband: the sensor should operate over a wide range of wavelengths
⇒ Geometric optics regime

Explicit division of the pupil
[Figure: direct image versus Shack-Hartmann spot pattern.]

Shack-Hartmann sensor
Subdivide the aperture and converge each subdivision to a different point on the focal plane. A wave-front slope, W_x, causes a displacement of each image by zW_x.
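A geometric-optics sketch of the Shack-Hartmann measurement: average the wavefront gradient over each subaperture and scale by the propagation distance z. The phase screen and all sizes are assumptions for illustration; a real sensor measures spot centroids rather than reading the gradient directly:

import numpy as np

rng = np.random.default_rng(2)
n, n_sub = 64, 4                       # pupil grid and subapertures per side
# smooth random screen as a crude stand-in for a turbulent wavefront W
W = np.cumsum(np.cumsum(rng.standard_normal((n, n)), axis=0), axis=1) * 1e-3
gy, gx = np.gradient(W)                # wavefront slopes W_y, W_x
z, s = 1.0, n // n_sub                 # propagation distance, subaperture size
shifts = np.zeros((n_sub, n_sub, 2))
for i in range(n_sub):
    for j in range(n_sub):
        blk = (slice(i*s, (i+1)*s), slice(j*s, (j+1)*s))
        shifts[i, j] = [z * gx[blk].mean(), z * gy[blk].mean()]  # spot shift (z*W_x, z*W_y)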

Fundamental problem
Resolution in the pupil plane is inversely proportional to the resolution in the image plane: you can have good resolution in one but not both (an uncertainty principle).
[Diagram: pupil of diameter D and image of width w.]

Loss of information due to subdivision
– Cannot measure the average phase difference between the apertures
– Can only determine the mean phase slope within an aperture
– As the apertures become smaller, the light per aperture drops
– As the aperture size drops below r_0 (the Fried parameter), the spot centroid becomes harder to measure

Subdivided aperture

Implicit subdivision
If you don't image in the focal plane, then the image looks like a blurred version of the aperture. If it looks like the aperture, then you can localise in the aperture.

Explanation of the underlying principle
– If there is a deviation from the average curvature in the wavefront, then the image will be brighter on one side of focus than on the other
– If there is no curvature from the atmosphere, then it is equally bright on both sides of focus

Slope-based analysis of the curvature sensor
The displacement of light from one pixel to its neighbour is determined by the slope of the wavefront.

Slope-based analysis of the curvature sensor (continued)
The signal is the difference between two slope signals → curvature.

Phase information localisation in the curvature sensor
[Diagram: diffraction blurring plus geometric expansion.]

Curvature sensing
– Localization comes from the short effective propagation distance
– There is a linear relationship between the curvature in the aperture and the normalized intensity difference:

  (I₁ − I₂) / (I₁ + I₂) ∝ ∇²W

– Broadband light helps reduce diffraction effects

Curvature sensing signal
[Figure: simulated intensity measurement and the resulting curvature estimate.]
The intensity signal gives an approximate estimate of the curvature. Two planes help remove scintillation effects.

Irradiance transport equation

  ∂I/∂z = −(∇I · ∇W + I ∇²W)

Linear approximation (uniform illumination, I ≈ I₀) gives

  ∂I/∂z ≈ −I₀ ∇²W
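A toy numerical check of the linearized relationship; the wavefront and the distances are assumptions, and the two intensities are generated directly from the linear approximation rather than by wave propagation:

import numpy as np

n = 128
x = np.linspace(-1.0, 1.0, n)
W = 0.05 * x**2                                  # assumed defocus wavefront
curv = np.gradient(np.gradient(W, x), x)         # 1-D curvature d2W/dx2
z, I0 = 0.1, 1.0                                 # propagation distance, mean intensity
I1 = I0 * (1 - z * curv)                         # intensity before focus
I2 = I0 * (1 + z * curv)                         # intensity after focus
signal = (I1 - I2) / (I1 + I2)                   # equals -z * curvature: linear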

Solution inside the boundary
There is a linear relationship between the signal and the curvature. The sensor is more sensitive for large effective propagation distances.

Solution at the boundary (mean slope)
If the intensity is constant at the aperture, the boundary signal measures the mean slope across the edge; H(z) denotes the Heaviside step function in the corresponding expression.
[Figure: I₁ and I₂ at the two planes, and their difference I₁ − I₂.]

The wavefront also changes
As the wave propagates, the wave-front changes according to the wave-front transport equation:

  ∂W/∂z = (1/2) |∇W|²

As the measurement approaches the focal plane, the distortion of the wavefront becomes more important and needs to be incorporated (van Dam and Lane).

Non-linearity due to the wavefront changing
As a consequence, the intensity also changes! So, to second order, the signal acquires extra terms (see the next slide): the sensor is non-linear!

Origin of terms
– One term is due to the difference in the curvature in the x- and y-directions (astigmatism)
– One term is due to the local wave-front slope displacing the curvature measurement

Consequences of the analysis
– As z increases, the curvature sensor is limited by the nonlinearities K and T
– A third-order diffraction term limits the spatial resolution to approximately √(λz) (for example, λ = 0.5 µm and z = 1 m give a resolution of about 0.7 mm)

Analysis of the curvature sensor
As the propagation distance z increases:
– Sensitivity increases
– Spatial resolution decreases
– The relationship between the signal and the curvature becomes non-linear

Tradeoff in the curvature sensor
Fundamental conflict between:
– Sensitivity, which dictates moving the detection planes toward the focal plane
– Aperture resolution, which dictates that the planes should be closer to the aperture

Geometric optics model
Slopes in the wave-front cause the intensity distribution to be stretched like a rubber sheet; wavefront sensing maps the distribution back to uniform.
[Diagram: wavefront W(x) propagating a distance z, displacing rays in x.]

Intensity distribution as a PDF
The intensity can be viewed as a probability density function (PDF) for photon arrival. As the wave propagates, the PDF evolves, and the cumulative distribution function (CDF) also changes.

Take two propagated images of the aperture (D = 1 m, r_0 = 0.1 m and λ = 589 nm).
[Figure: intensity at −z and intensity at +z.]

Using the irradiance transport equation and the wave-front transport equation, one can prove that the cumulative intensity distributions at the two planes match along geometric rays:

  C₋z(x₁) = C₊z(x₂)

This relationship is exact for geometric optics, even when there is scintillation. It can be thought of as the light intensity being a rubber sheet that is stretched unevenly.

Use the cumulative distribution function to match points x₁ and x₂ in the two intensity distributions. The slope is given by

  W_x ≈ (x₂ − x₁) / (2z)
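A one-dimensional sketch of the CDF-matching reconstruction (the two intensity records are synthetic stand-ins, and np.interp performs the point matching):

import numpy as np

n, z = 512, 0.1                                  # samples; assumed half-separation of planes
x = np.linspace(-1.0, 1.0, n)
# synthetic intensities at -z and +z: uniform light stretched in opposite senses
I_minus = 1.0 + 0.2 * np.sin(3 * x)
I_plus = 1.0 - 0.2 * np.sin(3 * x)
C_minus = np.cumsum(I_minus) / np.sum(I_minus)   # CDF at -z
C_plus = np.cumsum(I_plus) / np.sum(I_plus)      # CDF at +z
x2 = np.interp(C_minus, C_plus, x)               # x2 with the same cumulative fraction as x1
slope = (x2 - x) / (2 * z)                       # wavefront derivative estimate W_x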

Results in one dimension
[Figure: actual (black) and reconstructed (red) derivative.]

Simulation results

Comparison with Shack-Hartmann

Conclusions
– Fundamentally, geometric wavefront sensors are all based on the same linear relationship between slope and displaced light
– All sensors trade off the number of modes you can estimate against the quality of the estimate
– The main difference between the curvature and Shack-Hartmann sensors is how they divide the aperture
– The question is how to make this tradeoff optimally