Today’s Topic: (Slight) correction to last week’s lecture

Today’s Topic: (Slight) correction to last week’s lecture. Getting z-resolution—confocal detection and deconvolution algorithms (Lab 1). Polarization assays—important for κ² and for binding assays (Lab 2).

Photobleaching. Important: a dye emits roughly 10^5 to 10^7 photons, then dies! What is fluorescence? (Correction to last week.) Shine light in; it is absorbed and re-emitted at a longer wavelength, the Stokes shift (10-100 nm) between the excitation and emission spectra. The sequence of events is absorption [fsec], thermal relaxation [psec], then fluorescence plus non-radiative decay [nsec], with total rate k = kf + kn.r. (the old diagram was wrong). The fluorescence intensity decays exponentially in time: y = e^(-t/to), with to = 1/k = 1/(kf + kn.r.).
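Written out (a plain restatement of the decay relation and rate constants from the slide, nothing added):

```latex
y(t) = e^{-t/\tau_0}, \qquad
\tau_0 = \frac{1}{k} = \frac{1}{k_f + k_{\mathrm{n.r.}}}
```

so the measured fluorescence lifetime is set jointly by the radiative rate kf and the non-radiative rate kn.r.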

(De-)Convolution

Getting 3D: Confocal (pinhole) vs. Widefield Deconvolution

(Last time) Confocal (pinhole) microscopy: a way to improve z-resolution by collecting less out-of-focus fluorescence, via a pinhole in the detection path and scanning excitation with focused light. Especially good for thick samples, but it takes time, needs complicated optics, and requires very photostable fluorophores. (Why?) In-focus fluorescence created by the focused excitation reaches the detector, while out-of-focus light is mostly rejected; fluorescence detection is dark-field. Confocal microscopy prevents out-of-focus blur from being detected by placing a pinhole aperture between the objective and the detector, through which only in-focus light rays can pass. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconintro.html

Deconvolution: what if you let light go in with no pinhole and don’t scan, i.e. use wide-field excitation? Can you mathematically discriminate against out-of-focus light? For each focal plane in the specimen, a corresponding image plane is recorded by the detector and stored in a data-analysis computer. During deconvolution analysis, the entire series of optical sections is analyzed to create a three-dimensional montage. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconintro.html

Deconvolution: Deconvolution is a computationally intensive image-processing technique that is increasingly used to improve the contrast and resolution of digital images captured in the microscope. Its foundation is a suite of methods designed to remove or reverse the blurring present in microscope images. In contrast to confocal imaging, widefield microscopy allows blurred light to reach the detector (or image sensor); deconvolution techniques are then applied to the resulting images either to subtract the blurred light or to reassign it back to its source. Even a point emitter (at z = 0) appears as a PSF-like cone. By knowing the (mathematical) transfer function, can you do better? That is what deconvolution techniques attempt. Common approach: take a series of images along the z-axis (a z-stack) and then unscramble them. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconintro.html
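To make the “PSF-like cone” concrete: the recorded widefield stack is, to a good approximation, the true object convolved with the microscope’s 3D PSF plus detection noise. Below is a minimal sketch of that forward model in Python/NumPy; the anisotropic Gaussian blur is only a stand-in for a real measured or calculated PSF, and the sizes and photon scaling are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Toy 3D "object": a single point emitter in the middle of a 64^3 volume.
obj = np.zeros((64, 64, 64))
obj[32, 32, 32] = 1000.0

# Stand-in PSF: anisotropic Gaussian blur, broader along z than in x/y,
# mimicking the elongated widefield PSF. A real PSF would be measured from
# sub-resolution beads or computed from the optical parameters.
blurred = gaussian_filter(obj, sigma=(4.0, 1.5, 1.5))   # (z, y, x)

# Poisson noise stands in for photon-counting noise at the detector.
rng = np.random.default_rng(0)
raw_stack = rng.poisson(blurred * 50).astype(float) / 50

# 'raw_stack' is the kind of blurred z-series that deconvolution tries to unscramble.
```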

Deconvolution: 2 Techniques, deblurring and image restoration. Deblurring algorithms are fundamentally two-dimensional: they operate plane by plane, subtracting blur estimated from the nearest-neighbor planes of the 3D stack. For example, the nearest-neighbor algorithm operates on plane z by blurring the neighboring planes (z + 1 and z - 1) with a digital blurring filter, then subtracting the blurred planes from the z plane. In contrast, image restoration algorithms are properly termed "three-dimensional" because they operate simultaneously on every pixel in a three-dimensional image stack: instead of subtracting blur, they attempt to reassign blurred light to the proper in-focus location. Deblurring is computationally simple, but it adds noise, reduces signal, and can sometimes distort the image. http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconalgorithms.html
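A minimal sketch of the nearest-neighbor idea, usable on a stack like the one simulated above; the Gaussian "digital blurring filter", the subtraction weight c, and the edge handling are illustrative assumptions, not any vendor’s exact algorithm.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def nearest_neighbor_deblur(stack, sigma=2.0, c=0.45):
    """Plane-by-plane deblurring of a 3D stack ordered (z, y, x).

    For each plane z, blur the z-1 and z+1 planes with a 2D filter and
    subtract a fraction c of their average from plane z.
    """
    out = np.empty_like(stack, dtype=float)
    nz = stack.shape[0]
    for z in range(nz):
        above = stack[min(z + 1, nz - 1)]
        below = stack[max(z - 1, 0)]
        neighbor_blur = 0.5 * (gaussian_filter(above, sigma) +
                               gaussian_filter(below, sigma))
        out[z] = stack[z] - c * neighbor_blur
    return np.clip(out, 0, None)   # negative intensities are not physical

# deblurred = nearest_neighbor_deblur(raw_stack)   # raw_stack from the sketch above
```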

Deconvolution: Image Restoration. Instead of subtracting blur (as deblurring methods do), image restoration algorithms attempt to reassign blurred light to the proper in-focus location. This is done by reversing the convolution operation inherent in the imaging system: if the imaging system is modeled as a convolution of the object with the point spread function, then a deconvolution of the raw image should restore the object. However, the PSF is never known perfectly; it varies from point to point, especially as a function of z and of wavelength (color), so in practice you guess it or use some average.

Deconvolution algorithms and Fourier Transforms An advantage of this formulation is that convolution operations on large matrices (such as a three-dimensional image stack) can be computed very simply using the mathematical technique of Fourier transformation. If the image and point spread function are transformed into Fourier space, the convolution of the image by the point spread function can be computed simply by multiplying their Fourier transforms. The resulting Fourier image can then be back-transformed into real three-dimensional coordinates.
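A quick numerical check of the convolution theorem the paragraph relies on; the array sizes, the Gaussian toy PSF, and the circular-boundary handling are implementation details of this sketch, not part of the theory.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)
obj = rng.random((32, 32, 32))            # toy 3D "object"

psf = np.zeros((32, 32, 32))
psf[16, 16, 16] = 1.0
psf = gaussian_filter(psf, sigma=2.0)     # toy PSF, centered in the array
psf /= psf.sum()

# Convolution theorem: multiply the Fourier transforms, then transform back.
# np.fft assumes periodic boundaries, hence the ifftshift that moves the PSF
# center to the array origin.
otf = np.fft.fftn(np.fft.ifftshift(psf))
blurred_fft = np.fft.ifftn(np.fft.fftn(obj) * otf).real

# The same convolution computed directly in real space.
blurred_direct = fftconvolve(obj, psf, mode='same')

# The two agree away from the array edges (circular vs. zero-padded boundaries).
center = (slice(8, 24),) * 3
print(np.abs(blurred_fft[center] - blurred_direct[center]).max())
```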

Deconvolution Algorithms. A three-dimensional image stack containing 70 optical sections was recorded at intervals of 0.2 micrometers through a single XLK2 cell, using a widefield imaging system equipped with a high-numerical-aperture (1.40) oil immersion objective. The Original Data panel is a single focal plane taken from the three-dimensional stack, before the application of any data processing. Deblurring by a nearest-neighbor algorithm produced the result shown in the image labeled Nearest Neighbor (Figure 2(b)). The third image (Figure 2(c), Restored) shows the result of restoration by a commercially marketed constrained iterative deconvolution software product. Both deblurring and restoration improve contrast, but the signal-to-noise ratio is significantly lower in the deblurred image than in the restored image. The scale bar in Figure 2(c) represents a length of 2 micrometers, and the arrow in Figure 2(a) marks the position of the line plot presented in Figure 4.

Contrast: deblurring vs. image restoration. The plot in Figure 4 shows the pixel brightness values along a horizontal line traversing the cell in Figure 2 (the line position is marked by the arrow in Figure 2(a)): original data in green, deblurred image data in blue, and restored image data in red. Deblurring causes a significant loss of pixel intensity over the entire image, whereas restoration results in a gain of intensity in areas of specimen detail. A similar loss of image intensity occurs with the application of any two-dimensional filter.

Estimating the object. First, an initial estimate of the object is made, usually the raw image itself. The estimate is then convolved with the point spread function, and the resulting "blurred estimate" is compared with the original raw image. This comparison is used to compute an error criterion (often referred to as a figure of merit) that represents how similar the blurred estimate is to the raw image. The error criterion is then used to alter the estimate in such a way that the error is reduced, and a new iteration begins: the entire process is repeated until the error criterion is minimized or reaches a defined threshold. The final restored image is the object estimate at the last iteration.
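That estimate/compare/correct loop can be written down in a few lines. This is only a sketch using a simple additive (Van Cittert-style) correction with a non-negativity constraint; the update rule, step size alpha, iteration count, and stopping threshold are illustrative assumptions, and commercial constrained iterative packages use more sophisticated updates and stopping criteria.

```python
import numpy as np
from scipy.signal import fftconvolve

def iterative_deconvolve(raw, psf, n_iter=50, alpha=1.0, tol=1e-6):
    """Constrained iterative restoration (additive, Van Cittert-style sketch)."""
    estimate = raw.astype(float).copy()            # initial estimate = the raw image
    for _ in range(n_iter):
        blurred_estimate = fftconvolve(estimate, psf, mode='same')
        residual = raw - blurred_estimate          # how far the blurred estimate is from the data
        error = np.mean(residual ** 2)             # error criterion ("figure of merit")
        if error < tol:                            # stop once a defined threshold is reached
            break
        estimate = np.clip(estimate + alpha * residual, 0, None)  # correct; keep intensities >= 0
    return estimate

# restored = iterative_deconvolve(raw_stack, psf)  # stack and PSF from the earlier sketches
```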

Blind Deconvolution: an image reconstruction technique in which both the object and the PSF are estimated. Another family of iterative algorithms uses probabilistic error criteria borrowed from statistical theory; likelihood, a reverse variation of probability, is employed in maximum likelihood estimation (MLE). Using this approach, an initial estimate of the object is made and then convolved with a theoretical point spread function calculated from the optical parameters of the imaging system. The resulting blurred estimate is compared with the raw image, a correction is computed, and this correction is used to generate a new estimate, as described above. Blind deconvolution was developed by altering the maximum likelihood estimation procedure so that not only the object but also the point spread function is estimated: the same correction is applied to the point spread function as well, generating a new point spread function estimate, and in further iterations the point spread function estimate and the object estimate are updated together.
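A compact sketch of that alternating update, written here as Richardson-Lucy-style multiplicative corrections applied in turn to the PSF and to the object. The Richardson-Lucy form, the requirement of an initial PSF guess, and the iteration counts are assumptions of this sketch; the MLE formulations used in commercial blind-deconvolution software differ in detail and add many safeguards.

```python
import numpy as np
from scipy.signal import fftconvolve

def _rl_update(estimate, kernel, raw, eps=1e-12):
    """One Richardson-Lucy multiplicative update of `estimate`, holding `kernel` fixed."""
    reblurred = fftconvolve(estimate, kernel, mode='same')
    ratio = raw / (reblurred + eps)
    # Correlation with the kernel = convolution with the flipped kernel.
    return estimate * fftconvolve(ratio, np.flip(kernel), mode='same')

def blind_deconvolve(raw, psf_guess, n_outer=10, n_inner=5):
    """raw and psf_guess: non-negative arrays of the same shape (e.g. a 3D z-stack)."""
    obj = raw.astype(float).copy()
    psf = psf_guess.astype(float).copy()
    for _ in range(n_outer):
        for _ in range(n_inner):          # refine the PSF with the object held fixed
            psf = _rl_update(psf, obj, raw)
        psf = np.clip(psf, 0, None)
        psf /= psf.sum()                  # keep the PSF normalized to unit sum
        for _ in range(n_inner):          # refine the object with the PSF held fixed
            obj = _rl_update(obj, psf, raw)
        obj = np.clip(obj, 0, None)
    return obj, psf
```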

Different Deconvolution Algorithms. The results of applying three different processing algorithms to the same data set are presented in Figure 3. The original three-dimensional data are 192 optical sections of a fruit fly embryo leg, acquired in 0.4-micrometer z-axis steps with a widefield fluorescence microscope (1.25 NA oil objective); the images shown are a single optical section selected from the three-dimensional stack. The original (raw) image is illustrated in Figure 3(a). The result of deblurring by a nearest-neighbor algorithm appears in Figure 3(b), with processing parameters set for 95 percent haze removal. The same image slice is illustrated after deconvolution by an inverse (Wiener) filter (Figure 3(c)) and by iterative blind deconvolution incorporating an adaptive point spread function method (Figure 3(d)). http://micro.magnet.fsu.edu/primer/digitalimaging/deconvolution/deconalgorithms.html

Deconvolution example (figure panels): slice #36 from a wide-field fluorescence Z-stack; the same slice deconvolved using a theoretical PSF; minimum- and maximum-intensity projections of the deconvolved stack.

The End