1
ECEN 4616/5616 Optoelectronic Design Class website with past lectures, various files, and assignments: http://ecee.colorado.edu/ecen4616/Spring2014/ (The first assignment will be posted here by 1/27) To view video recordings of past lectures, go to: http://cuengineeringonline.colorado.edu and select “course login” from the upper right corner of the page.
2
The Magnifier Lens
(Figure: a magnifier of focal length f; the object of height h sits at S1 and its virtual image appears at the near-vision distance S2; a and b are the angular extents without and with the lens.)
With the unaided eye, an object can be no closer than the near limit of vision, S2. To the unaided eye, the angular extent of the object is a = h/S2. With a magnifier lens, we can put the object at S1, such that the virtual image is at S2. With the magnifier lens, the angular extent of the object is b = h/S1. The effective magnification is therefore M = b/a = S2/S1.
3
(Figure: the same magnifier geometry, with the object at S1 and the virtual image at S2.)
The imaging equation, 1/S2 - 1/S1 = 1/f, then gives us M = S2/S1 = 1 - S2/f. Note that, if the object is at the focal distance, then the virtual image is at ∞, and the resulting magnification is M = -S2/f. It is usual to assume that the average near vision distance is 250 mm. Hence M = 1 + 250/f, or M = 250/f with the image at infinity, for f in mm (because S2 < 0 according to the sign rules, so -S2 = 250 mm).
4
The ultimate detection resolution of the human eye (being able to detect a fine thread, for example) is about 0.3 arc min, or roughly 0.1 millirad (1×10^-4 rad). Hence the smallest detectable object (not necessarily resolved) with the unaided eye is 1×10^-4 × 250 mm = 0.025 mm, or 25 µm. (A human hair, by comparison, is ~50–100 µm in diameter.) Therefore a 10 mm focal length lens (M = 26) would allow one to detect ~1 µm objects. 25–30× hand magnifiers can easily be bought. These are not simple lenses, however, but are generally "triplets": three lenses of differing glass types glued together.
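A minimal Python sketch of this arithmetic (not from the slides; the function name magnifier_power, the constants, and the 10 mm example are illustrative assumptions):

# Illustrative sketch: magnifier power and the smallest feature it lets
# the eye detect, using the numbers quoted on the slides.
NEAR_POINT_MM = 250.0      # assumed average near-vision distance (-S2)
EYE_LIMIT_RAD = 1e-4       # ~0.3 arc min detection limit of the eye

def magnifier_power(f_mm, image_at_infinity=False):
    """M = 250/f for a relaxed eye (image at infinity); 250/f + 1 when
    the virtual image is placed at the near point."""
    m = NEAR_POINT_MM / f_mm
    return m if image_at_infinity else m + 1.0

f = 10.0                                                  # 10 mm focal length
M = magnifier_power(f)                                    # about 26x
unaided_um = EYE_LIMIT_RAD * NEAR_POINT_MM * 1000.0       # about 25 um
print(f"M = {M:.0f}x, smallest detectable feature = {unaided_um / M:.1f} um")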
5
Image Resolution and the Wave Nature of Light
Consider a lens producing a real image of an object. A question we need to be able to answer (if we are to design optical systems that work) is: What is the resolution of the image? Equivalently, what is the finest structure on the object that will be visible on the image? Also, what are the variables and construction parameters of the system which determine this resolution?
6
All ray (and wave) paths between an object point and the corresponding image point have the same time delay.* This is what Fermat's Principle implies, but it is also required by the wave nature of light: since the phase of the waves is continuous across each interface, any ray path between S and P will cross exactly the same number of wavefronts.
* From the first lecture.
7
(Figure: two rays from opposite edges of the lens converge on an image point at angle θ to the axis, producing fringes of period x.)
Consider the rays converging on an image point from the edges of the lens, at an angle of θ: the cycle length of the interference pattern created by the locally coherent waves is x = λ/(2·sin θ), and the corresponding spatial frequency (S.F.) is 1/x. Since this is the shortest cycle length that can be created on the image, it represents the resolution limit of the image. sin(θ) is called the "Numerical Aperture" (N.A.) of the lens, so that the "cutoff S.F." of the lens is ν_cutoff = 2·NA/λ.
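A short numerical sketch of this cutoff (the half-angle and wavelength below are assumed for illustration and are not from the lecture):

import math

wavelength_um = 0.55                   # assumed green-light wavelength
theta_deg = 30.0                       # assumed marginal-ray angle
NA = math.sin(math.radians(theta_deg))

x = wavelength_um / (2.0 * NA)         # finest fringe period on the image, um
nu_cutoff_per_mm = 1000.0 / x          # cutoff S.F. = 2*NA/lambda, cycles/mm
print(f"NA = {NA:.2f}, period = {x:.2f} um, cutoff = {nu_cutoff_per_mm:.0f} cycles/mm")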
8
Optical Resolution: In 1873 Ernst Abbe discovered that you couldn't image a diffraction grating with a microscope unless the objective could capture the first-order diffracted light from the grating. The relationship between the period d of the grating and the diffraction angle θ of the first order is d·sin(θ) = λ: along a wavefront at the first-order angle, light from each slot is delayed by 1λ, and hence is in phase. (The same as calculated before.)
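A companion sketch of Abbe's criterion with an assumed objective NA (the standard consequence of d·sin θ = λ is that the finest imageable grating period is d_min = λ/NA; the numbers are illustrative only):

import math

wavelength_um = 0.55                   # assumed wavelength
NA = 0.65                              # assumed objective numerical aperture

d_min_um = wavelength_um / NA          # finest grating period the objective can image
theta_deg = math.degrees(math.asin(wavelength_um / d_min_um))   # first-order angle
print(f"d_min = {d_min_um:.2f} um, first-order angle = {theta_deg:.1f} deg")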
9
An example of a spatial frequency
(Figure: the intensity pattern of a single spatial frequency, i.e. a sinusoidal pattern, with its cycle length and orientation marked.)
The Fourier Transform allows us to decompose any intensity image of N pixels into N/2 spatial frequencies, each with a different cycle length (spatial period) and orientation. However, an optical system acts as a low-pass filter for spatial frequencies: as we have seen, there is a cutoff frequency, and the optical system cannot image higher spatial frequencies.
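This low-pass behaviour can be mimicked numerically; the sketch below (a made-up test pattern and an assumed cutoff in cycles/pixel, not part of the lecture) zeroes every spatial frequency above the cutoff and transforms back:

import numpy as np

N = 256                                   # image size in pixels
cutoff = 0.1                              # assumed cutoff, cycles per pixel

y, x = np.mgrid[0:N, 0:N]
image = (0.5 + 0.25 * np.sin(2 * np.pi * 0.05 * x)      # low S.F. (passes)
             + 0.25 * np.sin(2 * np.pi * 0.20 * y))     # high S.F. (blocked)

fx = np.fft.fftfreq(N)                    # frequency grid, cycles per pixel
FX, FY = np.meshgrid(fx, fx)
passband = np.sqrt(FX**2 + FY**2) <= cutoff

filtered = np.real(np.fft.ifft2(np.fft.fft2(image) * passband))
print("std before / after the 'lens':",
      round(float(image.std()), 3), round(float(filtered.std()), 3))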
10
The Modulation Transfer Function
(Figure: light from the lens edges contributes to the high spatial frequencies; light from the rest of the lens adds background illumination. Example patterns are shown with MTF = 1.0 and MTF = 0.25.)
The highest spatial frequencies are only contributed to by light from the edges of the lens, as that is the only light that converges at a steep enough angle. The light from the rest of the lens simply washes out the contrast of the high S.F.s. For lower S.F.s, more of the lens can contribute, and therefore the contrast of low S.F.s is higher. This leads to the concept of the Modulation Transfer Function (MTF): the modulation (contrast) in the image, relative to that in the object, as a function of spatial frequency.
11
(Figure: the ideal MTF compared to the MTF for a de-focused image.)
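For reference, the ideal (diffraction-limited) curve for a circular pupil follows the standard expression MTF(s) = (2/π)(arccos s − s·sqrt(1 − s²)) with s = ν/ν_cutoff; a small sketch (not from the slides, sample points assumed) evaluates it:

import numpy as np

s = np.linspace(0.0, 1.0, 6)              # normalized frequency nu/nu_cutoff
mtf = (2.0 / np.pi) * (np.arccos(s) - s * np.sqrt(1.0 - s**2))
for si, mi in zip(s, mtf):                # falls from 1.0 at s=0 to 0 at s=1
    print(f"s = {si:.1f}   MTF = {mi:.3f}")

Defocus, as in the slide's comparison, pulls the real curve below this ideal one.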
12
The Point Spread Function of a circular pupil: The Airy Pattern
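A sketch of that pattern from the standard formula I(r) = [2·J1(v)/v]^2 with v = 2π·NA·r/λ (the NA and wavelength below are assumed for illustration; the first dark ring falls at r = 0.61·λ/NA):

import numpy as np
from scipy.special import j1             # first-order Bessel function

wavelength_um = 0.55                     # assumed wavelength
NA = 0.25                                # assumed numerical aperture

r = np.linspace(1e-6, 3.0, 7)            # image-plane radius, um
v = 2.0 * np.pi * NA * r / wavelength_um
intensity = (2.0 * j1(v) / v) ** 2       # normalized so I(0) = 1

print(f"first dark ring at r = {0.61 * wavelength_um / NA:.2f} um")
for ri, Ii in zip(r, intensity):
    print(f"r = {ri:.2f} um   I = {Ii:.4f}")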
13
(Figure: a ray parallel to the axis (u1 = 0) traced at height h1 through two surfaces of powers K1 and K2 separated by d1; the ray heights are h1, h2, the angles are u'1 = u2 and u'2, the indices satisfy n'1 = n2, and the equivalent focal length f and the BFL are marked.)
In the last lecture, we found the combination-of-powers formula by tracing rays through a pair of surfaces:
Refract through the first element: n'1·u'1 = n1·u1 - h1·K1
Transfer to the second element: h2 = h1 + u'1·d1
Refract at the second element: n'2·u'2 = n2·u2 - h2·K2
Eliminating the intermediate ray variables and using K = -n'2·u'2/h1, where K is the equivalent power of the combination, we get the general formula for a combination of two surfaces:
K = K1 + K2 - (d1/n'1)·K1·K2
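A few lines of Python (illustrative powers, spacing, and index; not from the lecture) carry out this trace with u1 = 0 and confirm the closed-form combination:

def combine_by_trace(K1, K2, d1, n1p):
    """Trace a ray parallel to the axis (u1 = 0) at height h1 = 1."""
    h1, nu = 1.0, 0.0            # nu holds the reduced angle n*u
    nu = nu - h1 * K1            # refract at the first element: n'u' = nu - hK
    h2 = h1 + (nu / n1p) * d1    # transfer to the second element
    nu = nu - h2 * K2            # refract at the second element
    return -nu / h1              # equivalent power K = -n'2*u'2 / h1

K1, K2, d1, n1p = 0.02, 0.015, 5.0, 1.5      # assumed values, powers in 1/mm
print(combine_by_trace(K1, K2, d1, n1p))     # 0.034
print(K1 + K2 - (d1 / n1p) * K1 * K2)        # same result from the closed form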
14
The "Lensmaker's" Equation
The 'thin lens' is an approximation, as any real lens will have a finite thickness. In the last lecture (1-17) we saw that, by applying the paraxial assumption to a surface separating materials of indices n, n', the power of a surface is K = (n' - n)·c, where c is the curvature (inverse radius) of the surface, n the index before and n' the index after the surface.
(Figure: a "thick lens" with surface curvatures c1, c2 and center thickness d1.)
To adapt the surface power combination formula to a thick lens in air, let n1 = n'2 = 1 and n'1 = n2 = n, so that K1 = (n - 1)·c1 and K2 = (1 - n)·c2. Substituting for the powers and indices, we get the formula for the power of a thick lens:
K = (n - 1)·(c1 - c2) + ((n - 1)²/n)·c1·c2·d1
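A quick numerical sketch of the thick-lens formula (an assumed BK7-like index and illustrative curvatures and thickness, not values from the lecture):

def thick_lens_power(n, c1, c2, d1):
    # Lensmaker's equation for a thick lens in air (curvatures in 1/mm).
    return (n - 1.0) * (c1 - c2) + ((n - 1.0) ** 2 / n) * c1 * c2 * d1

n = 1.5168                       # assumed BK7-like refractive index
c1, c2 = 1 / 50.0, -1 / 50.0     # biconvex lens with 50 mm radii (assumed)
d1 = 6.0                         # assumed center thickness, mm

K = thick_lens_power(n, c1, c2, d1)
print(f"K = {K:.5f} 1/mm  ->  f = {1.0 / K:.1f} mm")   # ~49 mm focal length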
15
Aside: Void Lenses
From the derivation of the Lensmaker's Equation, notice that a positive curvature gives a positive power if the indices go from low before the surface to high after. However, for indices that go from high to low, a positive curvature gives a negative power. What would a positive lens look like if it were made from a void in a high-index material?
In a DARPA project (ELASTIC*), it was proposed to construct small, "ballistically distributed" optical imaging sensors: surveillance cameras that could be air-dropped from 50,000 feet by encasing the detector, transmitter, and power source in a polymer "super-ball". The camera lens would, in part, be made from voids in the polymer material (a positive void lens surrounding the detector and electronics). Once deployed, the sensors would link up with each other to form an ad hoc network.
* Expendable Local Area Sensors in a Tactically Interconnected Cluster. To see what the deployment of 250,000 of these devices would look like, check out: http://vimeo.com/14504562
19
Combinations of Lenses: Principal Planes
(Figure: a ray parallel to the axis (u1 = 0) traced through two thin lenses of powers K1 and K2 separated by d1, with heights h1, h2 and angles u'1 = u2, u'2; the focal length f and the BFL are marked.)
In the last lecture, we used the results of a Gaussian ray trace through a pair of thin lenses to calculate the lens power combination formula, K = K1 + K2 - d1·K1·K2, by calculating the ray variables and using the relation K = -u'2/h1. The Back Focal Length (BFL) can be similarly calculated using BFL = -h2/u'2. The location of the equivalent lens, measured from the second lens, can then be calculated as BFL - f. This is called the Secondary Principal Plane and labeled P'.
20
Combinations of Lenses: Principal Planes
(Figure: the two-lens system K1, K2 with both principal planes P and P' marked.)
The Primary Principal Plane, P, can likewise be found by tracing a horizontal ray through the system from right to left. It is evident from this construction that even very complicated systems of lenses can be reduced to a pair of Primary and Secondary Principal Planes, P and P'.
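A short sketch (assumed focal lengths and spacing; the helper name two_lens_system is mine, not the lecture's) repeating the Gaussian trace to find f, the BFL, and the location of P' for two thin lenses in air:

def two_lens_system(f1, f2, d):
    K1, K2 = 1.0 / f1, 1.0 / f2
    K = K1 + K2 - d * K1 * K2           # combined power
    f = 1.0 / K
    h, u = 1.0, 0.0                     # ray parallel to the axis, h1 = 1
    u = u - h * K1                      # refract at lens 1
    h2 = h + u * d                      # transfer to lens 2
    u = u - h2 * K2                     # refract at lens 2
    bfl = -h2 / u                       # back focal length, from lens 2
    return f, bfl, bfl - f              # P' location measured from lens 2

f, bfl, z_pp = two_lens_system(f1=100.0, f2=50.0, d=40.0)   # assumed values, mm
print(f"f = {f:.1f} mm, BFL = {bfl:.1f} mm, P' at {z_pp:+.1f} mm from lens 2")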
21
Principal Planes are Virtual Images of Each Other
(Figure: rays 1 and 2 traced through the system, meeting the planes P and P' at points Q and Q'.)
First, draw two rays which determine P and P'. Then, realizing that ray paths are reversible, notice that rays which converge toward point Q on plane P appear to diverge from point Q' on plane P', and vice versa. Thus, for the purpose of tracing rays through the system, the separation distance between planes P and P' can be treated as if it didn't exist.
22
Real Lens or System Replacing a Paraxial (Ideal) Lens:
(Figure: a paraxial lens of power K with object and image distances l, l'; below it, a real system of the same power drawn with its principal planes P, P'.)
Suppose we have a paraxial lens in a system. We wish to replace the ideal lens with a real system of lenses whose principal plane locations have been determined; with l measured from P and l' measured from P', the imaging is unchanged.
23
Principal Planes do not have to be within the system itself
(Figure: a telephoto lens of physical length L; the secondary principal plane P' lies in front of the lens, one focal length f ahead of the image.)
The Telephoto Lens: A "Telephoto Lens" is a lens whose physical length is less than its focal length. The 'telephoto ratio' is the ratio of those lengths, L/f < 1. Most long-focal-length lenses for hand-held cameras are designed as telephotos. Q: Where is the Primary Principal Plane?
24
Inverse Telephoto Lens:
(Figure: an inverse telephoto lens; the secondary principal plane P' lies behind the lens, so the BFL exceeds the focal length f.)
This is the inverse of the telephoto, as one would expect. The goal here is to allow a very short focal-length lens (say a wide-angle or macro lens) to have enough BFL to be compatible with shutters, standard mounts, etc.
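The same two-lens formulas make the telephoto and inverse-telephoto layouts easy to check numerically; the powers and spacings below are assumed purely for illustration:

def two_lens(f1, f2, d):
    K1, K2 = 1.0 / f1, 1.0 / f2
    K = K1 + K2 - d * K1 * K2           # combined power
    bfl = (1.0 - d * K1) / K            # BFL from the same Gaussian trace
    return 1.0 / K, bfl

for name, f1, f2, d in [("telephoto (+ front, - rear)",        100.0, -40.0, 70.0),
                        ("inverse telephoto (- front, + rear)", -40.0,  50.0, 30.0)]:
    f, bfl = two_lens(f1, f2, d)
    length = d + bfl                    # front lens to image plane
    print(f"{name}: f = {f:.0f} mm, BFL = {bfl:.0f} mm, length/f = {length / f:.2f}")

With these illustrative values the telephoto comes out shorter than its focal length, while the inverse telephoto has a BFL larger than its focal length, as described above.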
25
Principal Plane positions for some simple lenses (from page 83 of the text). In general, the distance between P and P' can be positive, negative, or zero.