Physics 114: Lecture 20, 2D Data Analysis
Dale E. Gary, NJIT Physics Department

Reminder: 1D Convolution and Smoothing

Let's create a noisy sine wave:

    u = -5:.1:5;
    w = sin(u*pi) + 0.5*randn(size(u));
    plot(u,w)

We can now smooth the data by convolving it with the vector [1,1,1], which performs a 3-point running sum:

    wsm = conv(w,[1,1,1]);
    whos wsm
      Name      Size      Bytes    Class     Attributes
      wsm       1x103     824      double

Notice that wsm is now of length 103. That means we cannot plot(u,wsm), but we can plot(u,wsm(2:102)). Now we see another problem: the running sum is three times too large, so divide by 3 to make it a running mean. Try this:

    plot(u,w)
    hold on; plot(u,wsm(2:102)/3,'r','linewidth',2)

Apr 23, 2010
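For readers following along outside MATLAB, the same 3-point smoothing can be sketched in Python with NumPy (a hedged equivalent of the slide's steps, not the original lecture code):

```python
import numpy as np

# NumPy sketch of the slide's MATLAB steps (assumed equivalent)
u = np.linspace(-5, 5, 101)                     # like u = -5:.1:5
w = np.sin(u * np.pi) + 0.5 * np.random.randn(u.size)

# 'full' convolution, like MATLAB's conv(): length grows to 101 + 3 - 1 = 103
wsm = np.convolve(w, [1, 1, 1], mode='full')

# Trim one sample from each end and divide by 3 for a running mean on u's grid
wsm_trimmed = wsm[1:102] / 3
print(wsm.size, wsm_trimmed.size)               # 103 101
```

The edge samples of wsm involve fewer than three data points, which is the edge effect the slide's trimming sidesteps.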

2D Convolution

To do 2D smoothing, we use a 2D kernel k = ones(3,3) and the conv2() function. To smooth the residuals of our fit, we can use:

    zsm = conv2(ztest-z,k)/9.;
    imagesc(x,y,zsm(2:102,2:102))

Now we can see the effect of missing the value of cx by 0.05 due to our limited search range.

There are other uses for convolution, such as edge detection. For example, we can convolve with a kernel k = [1,1,1,1,1,1], which picks out horizontal features. Or a kernel k = [1,1,1,1,1,1]', which picks out vertical ones. Or even a kernel k = eye(6), or k = rot90(eye(6)), for the two diagonal directions.
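A Python/SciPy equivalent of the 2D smoothing step can be sketched as follows (the residual array is a stand-in, since ztest and z come from the previous lecture):

```python
import numpy as np
from scipy.signal import convolve2d

# Stand-in for the fit residuals ztest - z (assumed 101x101, as in the lecture)
resid = np.random.randn(101, 101)
k = np.ones((3, 3))

# 'full' 2-D convolution, like MATLAB's conv2(): each dimension grows by 2
zsm = convolve2d(resid, k) / 9.0
core = zsm[1:102, 1:102]          # like zsm(2:102,2:102) in the slide
print(zsm.shape, core.shape)      # (103, 103) (101, 101)

# The streak-detecting kernels mentioned in the slide
k_row  = np.ones((1, 6))          # horizontal features
k_col  = np.ones((6, 1))          # vertical features
k_diag = np.eye(6)                # diagonal
k_anti = np.rot90(np.eye(6))      # anti-diagonal
```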

Convolution and Resolution

Convolution can be used for smoothing data, but it is also something that happens naturally whenever measurements are made, due to instrumental resolution limits.

For example, an optical system (telescope, microscope, etc.) has an aperture size that limits the resolution due to diffraction (called the diffraction limit). Looking at a star with a telescope, assuming no other effects such as atmospheric turbulence, results in a star image of a certain size (the "Airy disk"), surrounded by diffraction rings.

This shape is essentially the sinc() function we introduced last time:

    x = -5:0.1:5; y = -5:0.1:5;
    [X, Y] = meshgrid(x,y);
    Z = sinc(sqrt(X.^2 + Y.^2));
    imagesc(x,y,Z);

In fact, this is the electric field pattern; to get the intensity we need to square the electric field:

    imagesc(x,y,Z.^2)
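The same pattern in NumPy (a hedged sketch; np.sinc(r) = sin(pi*r)/(pi*r), the same normalized convention as MATLAB's sinc):

```python
import numpy as np

x = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(x, x)

# Radially symmetric sinc: the electric field pattern from the slide
Z = np.sinc(np.sqrt(X**2 + Y**2))

# Intensity is the square of the field: unit peak at the center, rings outside
I = Z**2
print(I[50, 50])        # 1.0 at the central maximum
```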

Point Spread Function

To make this point clearer, consider a "perfect" instrument that perhaps has noise, but shows stars as perfect point sources. Let's generate an image of some stars:

    stars = randn(size(X))*0.1;
    stars(50,50) = 1;
    stars(20,37) = 4;
    stars(87,74) = 2;
    stars(45,24) = 0.5;
    imagesc(stars)

To see the effect of observing such a star pattern with an instrument, convolve the star image with the sinc function representing the diffraction pattern of the instrument (the point spread function, or PSF):

    Z = sinc(sqrt(X.^2 + Y.^2)*5).^2;   % the *5 makes the PSF smaller/sharper
    imagesc(conv2(stars,Z))

You see that the result is twice as large, due to the way convolution works. Try:

    fuzzy = conv2(stars,Z);
    colormap(gray(256));
    imagesc(stars); axis square
    imagesc(fuzzy(51:150,51:150)); axis square
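A Python/SciPy sketch of the same experiment (indices shifted to 0-based; the star positions are taken from the slide):

```python
import numpy as np
from scipy.signal import convolve2d

x = np.linspace(-5, 5, 101)
X, Y = np.meshgrid(x, x)
psf = np.sinc(np.sqrt(X**2 + Y**2) * 5) ** 2    # sharper PSF, as in the slide

# "Perfect" star field: noise plus a few point sources
stars = np.random.randn(101, 101) * 0.1
stars[49, 49] = 1.0
stars[19, 36] = 4.0
stars[86, 73] = 2.0
stars[44, 23] = 0.5

# Observing = convolving the scene with the PSF; the 'full' output is
# almost twice as large in each dimension (101 + 101 - 1 = 201)
fuzzy = convolve2d(stars, psf)
print(fuzzy.shape)                              # (201, 201)
core = fuzzy[50:150, 50:150]                    # like fuzzy(51:150,51:150)
```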

Deconvolution

It is actually possible to do the inverse of convolution, called deconvolution. Let's read in an image and fuzz it up (download fruit.gif from the course web page):

    [img map] = imread('fruit.gif');
    fuzzy = conv2(single(img),Z)/sum(sum(Z));
    image(img)                      % original image -- observe the sharpness
    image(fuzzy(51:515,51:750))     % fuzzy image

Now let's sharpen it again. MATLAB has a family of deconvolution routines. The standard one is deconvreg():

    image(deconvreg(fuzzy,Z))

The image is dark, because we have to undo the normalization by the convolving function:

    image(deconvreg(fuzzy,Z)*sum(sum(Z)))

This looks pretty good, but note the edge effects. Try another routine:

    image(deconvlucy(fuzzy,Z)*sum(sum(Z)))

This one looks almost perfect. However, if you compare the images you do see differences:

    sharp = deconvlucy(fuzzy,Z)*sum(sum(Z));
    imagesc(sharp(51:515,51:750) - single(img))
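MATLAB's deconvlucy() implements Richardson-Lucy deconvolution. The core iteration is short enough to sketch directly in Python (a minimal, hedged version; production routines add damping, edge tapering, and noise handling):

```python
import numpy as np
from scipy.signal import convolve2d

def richardson_lucy(observed, psf, iterations=50):
    """Minimal Richardson-Lucy iteration (the algorithm behind deconvlucy)."""
    psf = psf / psf.sum()                       # the PSF must be normalized
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        blurred = convolve2d(estimate, psf, mode='same')
        ratio = observed / (blurred + 1e-12)    # guard against division by zero
        estimate = estimate * convolve2d(ratio, psf_mirror, mode='same')
    return estimate

# Toy check: blur a point source with a small PSF, then recover it
psf = np.outer([1., 2., 1.], [1., 2., 1.])
img = np.zeros((21, 21)); img[10, 10] = 1.0
fuzzy = convolve2d(img, psf / psf.sum(), mode='same')
sharp = richardson_lucy(fuzzy, psf)
print(sharp[10, 10] > fuzzy[10, 10])            # True: the peak is re-sharpened
```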

Deconvolution Problems

Any time you do an inversion of data, the result can be unstable. Success depends critically on having the correct point spread function.

The deconvolution we just did assumed a "perfect" instrument and neglected atmospheric turbulence. Further blurring by the atmosphere acts to increase the size of the "Airy disk" and smear out the diffraction rings.

With some time averaging, the above pattern smears out into an equivalent Gaussian. The Gaussian equivalent to

    Zsinc = sinc(sqrt(X.^2 + Y.^2)*5).^2;

is

    Zgaus = exp(-(X.^2 + Y.^2)*(5*1.913)^2);
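A quick numerical check (my own, not in the original slides) of why the factor 1.913 makes the Gaussian "equivalent": it matches the half-power radius of the sinc-squared pattern.

```python
import numpy as np

# Radial profiles of the two PSFs from the slide
r = np.linspace(0, 0.5, 500001)
zsinc = np.sinc(r * 5) ** 2
zgaus = np.exp(-(r * 5 * 1.913) ** 2)

# Radius at which each profile first falls below half its peak (peak = 1 at r = 0)
hwhm_sinc = r[np.argmax(zsinc < 0.5)]
hwhm_gaus = r[np.argmax(zgaus < 0.5)]
print(hwhm_sinc, hwhm_gaus)       # nearly equal half-widths
```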

Incorrect PSF

Let's convolve the image with the Gaussian (i.e., instrument plus atmospheric turbulence), creating a larger PSF:

    Zgaus = exp(-(X.^2 + Y.^2)*(3*1.913)^2);   % note the use of 3 to enlarge the Gaussian

Convolve with this blurred PSF:

    fuzzy = conv2(double(img),Zgaus)/sum(sum(Zgaus));
    image(fuzzy)

Now deconvolve with the instrumental PSF only:

    dconl = deconvlucy(fuzzy,Zsinc)*sum(sum(Zsinc));
    image(dconl)

We see that we cannot recover the original instrumental resolution; the clarity is lost due to atmospheric turbulence.

However, if we measure the PSF of the instrument plus atmosphere, we CAN remove the blurring due to the atmosphere.
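The failure can be seen in a simple 1-D sketch (my own illustration, with made-up toy PSFs): the total blur is the convolution of the instrumental and atmospheric PSFs, so deconvolving with only the instrumental one leaves the atmospheric blur behind.

```python
import numpy as np

# Toy PSFs (assumptions for illustration): a narrow instrumental PSF and a
# wider atmospheric one, both normalized to unit sum
psf_instr = np.array([1., 3., 1.]) / 5.0
psf_atm   = np.array([1., 4., 6., 4., 1.]) / 16.0

signal = np.zeros(32)
signal[16] = 1.0                                 # a point source
observed = np.convolve(np.convolve(signal, psf_instr, 'same'), psf_atm, 'same')

# Exact Fourier-domain deconvolution, using only the instrumental PSF
recovered = np.real(np.fft.ifft(np.fft.fft(observed) / np.fft.fft(psf_instr, 32)))

# The result matches the atmosphere-only blur (up to a shift), not a sharp point
atm_only = np.convolve(signal, psf_atm, 'same')
print(np.allclose(np.sort(recovered), np.sort(atm_only)))   # True
```

Only if the full PSF (instrument plus atmosphere) is divided out does the point source come back, which is the slide's concluding point.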

Laser Guide Stars

Astronomers now use a laser to create a bright artificial "guide star" near the region of the sky of interest.

By imaging the laser scintillation pattern instantaneously, they can effectively freeze the atmosphere and correct the images in real time.