
1 Depth Estimate and Focus Recovery
Presenter: Wen-Chih Hong
Adviser: Jian-Jiun Ding
Digital Image and Signal Processing Laboratory, Graduate Institute of Communication Engineering, National Taiwan University, Taipei, Taiwan, ROC

2 Outline
- Introduction
- Binocular vision systems
  - Stereo
- Monocular vision systems
  - DFF
  - DFD
- Other methods
- Conclusions
- References

3 Introduction
- Depth is important information for robots and for 3D reconstruction.
- Depth recovery from images is a long-standing topic, with applications such as robot vision and image restoration.
- Most depth recovery methods are based simply on camera focus and defocus.
- Focus recovery helps users see more detail in the originally defocused images.

4 Introduction
- Categories of depth estimation
  - Monocular: depth from focus (DFF), depth from defocus (DFD)
  - Binocular: stereo

5 Introduction
- Categories of depth estimation
  - Active: sending a controlled energy beam and detecting the reflected energy
  - Passive: image-based

6 Introduction
- Imaging geometry (the standard relations are reconstructed below)
- [Figure: imaging through a biconvex lens, with focal length F, aperture D, object distance u, sensor distance s, in-focus image distance v, and a blur circle of radius R > 0 on the sensor.]
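For reference, a minimal reconstruction of the standard thin-lens relations this geometry illustrates (the slide's own equations were not transcribed); the symbols F, D, u, s, v, R follow the figure, and the blur-circle formula is the usual geometric-optics result rather than necessarily the exact form used in the talk.

```latex
% Thin-lens law and geometric blur-circle diameter: a point at depth u focuses
% at distance v; with the sensor at distance s, the point spreads over a
% circle of radius R on the sensor.
\[
\frac{1}{F} = \frac{1}{u} + \frac{1}{v},
\qquad
2R = D\,s\left|\frac{1}{F} - \frac{1}{u} - \frac{1}{s}\right|
\]
```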

7 Binocular vision systems
- The flow chart of binocular depth estimation:
  HVS modeling → edge detection → correspondence → vergence control → gaze control → depth map

8 Binocular vision systems
- Vergence movement:
  - A kind of slow eye movement in which the two eyes rotate in opposite directions.
  - Depth then follows from simple triangulation (see the sketch below), but the correspondence problem remains.
- [Figure: gazing point (corresponding point) at depth u, viewed over a baseline B, with each eye offset B/2 from the midline.]
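A minimal sketch of the symmetric vergence triangulation implied by the figure: if both views fixate a point on the midline at depth u over a baseline B, the tangent of the vergence half-angle equals (B/2)/u. The function and variable names are illustrative, not from the slides.

```python
import math

def depth_from_vergence(baseline_m: float, vergence_half_angle_rad: float) -> float:
    """Depth of the gazing point for the symmetric vergence geometry.

    Assumes the fixation point lies on the midline between the two views,
    so each optical axis turns inward by the same half-angle theta and
    tan(theta) = (B/2) / u  =>  u = (B/2) / tan(theta).
    """
    return (baseline_m / 2.0) / math.tan(vergence_half_angle_rad)

# Example: 6.5 cm baseline, 1.86 degrees of inward rotation per view
print(depth_from_vergence(0.065, math.radians(1.86)))  # ~1.0 m
```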

9 Binocular vision systems
- Complex model
- [Figure 3.3: a more complete triangulation geometry for binocular vision; the corresponding point at depth u is seen at (x_L, y_L) in the left view and (x_R, y_R) in the right view over a baseline B.]
- We have to know how far each viewing direction departs from its optical axis toward the corresponding point.

10 Binocular vision systems
- Correspondence problem
- But higher accuracy

11 Monocular vision systems
- Depth from focus
- Depth from defocus

12 Depth from Focus
- Take pictures at different observer (camera) distances or object distances.
- We need an estimator to measure the degree of focus.
- The Laplacian operator is used.
- Since the Laplacian responds to only a single pixel, a sum of the Laplacian over a window (the sum-modified Laplacian, SML) is needed; a sketch is given below.
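A minimal sketch of a sum-modified-Laplacian style focus measure, assuming a NumPy image; the window, step, and threshold are placeholders rather than the presentation's actual settings.

```python
import numpy as np

def sum_modified_laplacian(img: np.ndarray, step: int = 1,
                           threshold: float = 0.0) -> float:
    """Focus measure in the spirit of the sum-modified Laplacian (SML).

    The modified Laplacian takes absolute values of the second derivatives
    in x and y separately so they cannot cancel; the focus measure is the
    sum of these responses over the image region.
    """
    img = img.astype(np.float64)
    c = img[step:-step, step:-step]
    ml = (np.abs(2 * c - img[:-2 * step, step:-step] - img[2 * step:, step:-step]) +
          np.abs(2 * c - img[step:-step, :-2 * step] - img[step:-step, 2 * step:]))
    return float(ml[ml >= threshold].sum())
```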

13 Depth from Focus
- Gaussian interpolation
- [Figure 4.4: Gaussian interpolation of the measured SML focus-measure curve, with N_k ≥ N_{k-1} and N_k ≥ N_{k+1}; the measured values N_{k-1}, N_k, N_{k+1} at displacements d_{k-1}, d_k, d_{k+1} are interpolated to estimate the peak (N_p, d_p); the measured curve is compared with the ideal condition.]

14 Depth from Focus
- Range from focus
  - Using the focus measure above, take pictures along the optical axis.
  - Find the image with the highest local frequency content.
  - Needs more than 10 images (monocular).

15 Depth from Focus
- We use Gaussian interpolation to form a set of approximations.
- The depth solution d_p follows from the Gaussian above (a sketch of the interpolation is given below).
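A sketch of the three-point Gaussian interpolation of the focus-measure peak, assuming equally spaced displacements; it follows the usual closed form for fitting a Gaussian through (d_{k-1}, N_{k-1}), (d_k, N_k), (d_{k+1}, N_{k+1}) and may differ in notation from the slide's formula.

```python
import math

def gaussian_peak_interpolation(d_prev, d_k, d_next, n_prev, n_k, n_next):
    """Interpolate the focus-measure peak position d_p from three samples.

    Fits a Gaussian through (d_{k-1}, N_{k-1}), (d_k, N_k), (d_{k+1}, N_{k+1})
    with N_k >= N_{k-1} and N_k >= N_{k+1}, and returns the location of its
    maximum. Assumes the three positions are equally spaced.
    """
    delta_d = d_k - d_prev
    a = math.log(n_k) - math.log(n_next)   # ln N_k - ln N_{k+1}
    b = math.log(n_k) - math.log(n_prev)   # ln N_k - ln N_{k-1}
    return d_k + delta_d * (b - a) / (2.0 * (a + b))

# Example: the peak lies between d_k and d_{k+1} since N_{k+1} > N_{k-1}
print(gaussian_peak_interpolation(9.0, 10.0, 11.0, 0.60, 1.00, 0.80))  # ~10.2
```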

16 Depth from Defocus
- By geometric optics, the intensity inside the blur circle should be constant.
- Considering aberration, diffraction, and so on, we commonly assume a Gaussian blurring function (see the form below).
  - σ: diffusion parameter
  - The diffusion parameter is related to the blur radius, as derived from similar triangles in geometric optics.
- For easy computation, we assume the foreground is equally diffused, the background is equally diffused, and so on.
- However, this equal-focal assumption can be a problem.
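The commonly assumed Gaussian blurring function and its link to the blur radius, reconstructed here since the slide's equation images did not survive; k is a camera constant, often taken near 1/sqrt(2).

```latex
% Gaussian point-spread function with diffusion parameter sigma, and the
% usual proportionality between sigma and the geometric blur radius R.
\[
h(x,y) = \frac{1}{2\pi\sigma^{2}}
         \exp\!\left(-\frac{x^{2}+y^{2}}{2\sigma^{2}}\right),
\qquad
\sigma = k\,R, \quad k > 0
\]
```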

17 Depth from Defocus
- Blurring model
  - Blurring radius

18 Depth from Defocus
- Blurring model

19 Depth from Defocus
- Blurring model

20 Depth from Defocus
- Blurring model

21 Depth from Defocus
- Blurring model
  - For a fixed camera setting, the blur radius is independent of the location of the point source on the object plane at a given depth.

22 Depth from Defocus
- Blurring model
  - Using the relations above, we get the blur radius and hence the diffusion parameter (a sketch is given below).
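A sketch of how the diffusion parameter follows from the camera parameters under the thin-lens model, combining σ = kR with the geometric blur radius; the parameter names and the value of k are assumptions for illustration, not values from the slides.

```python
def diffusion_parameter(u, f, aperture_d, sensor_s, k=0.7):
    """Diffusion parameter sigma for an object at depth u (thin-lens model).

    Combines sigma = k * R with the geometric blur radius
    R = (aperture_d * sensor_s / 2) * |1/f - 1/u - 1/sensor_s|.
    All distances are in meters; k is an assumed camera constant.
    """
    r = (aperture_d * sensor_s / 2.0) * abs(1.0 / f - 1.0 / u - 1.0 / sensor_s)
    return k * r

# Example: 50 mm lens, 25 mm aperture, sensor 52 mm behind the lens, object at 2 m
print(diffusion_parameter(u=2.0, f=0.050, aperture_d=0.025, sensor_s=0.052))
```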

23 Depth from Defocus
- Depth recovery
  - Eliminating the depth D from the two camera settings m = 1, 2, we get a linear relation between the two diffusion parameters (of the form σ1 = α·σ2 + β), where α and β depend on the camera parameters.

24 Depth from Defocus
- Depth recovery
  - From the blurred-image model, take the Fourier transform of both observations.
  - The F.T. of a Gaussian is a Gaussian, so the unknown scene spectrum cancels in the ratio of the two spectra (see the relation below).
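The standard Fourier-domain relation behind this step, reconstructed since the slide's own equations were images: with Gaussian PSFs, the scene spectrum F cancels in the ratio of the two blurred-image spectra.

```latex
% g_m = h_m * f for the two camera settings m = 1, 2; G, H, F denote the
% Fourier transforms of the observation, the Gaussian PSF, and the scene.
\[
G_m(\omega,\nu) = H_m(\omega,\nu)\,F(\omega,\nu),
\qquad
\frac{G_1(\omega,\nu)}{G_2(\omega,\nu)}
  = \exp\!\Bigl(-\tfrac{1}{2}\bigl(\omega^{2}+\nu^{2}\bigr)\bigl(\sigma_1^{2}-\sigma_2^{2}\bigr)\Bigr)
\]
```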

25 Depth from Defocus
- Depth recovery
  - Take the logarithm of the spectral ratio.
  - Using the relationship between the two diffusion parameters (σ1 = α·σ2 + β), we can solve for them.

26 Depth from Defocus
- Depth recovery
  - Let α = 1; we then obtain the value of σ2.
  - From σ2, find the depth D (a sketch is given below).
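A sketch of the final step, inverting σ = kR and the geometric blur radius for the depth; it assumes the object lies beyond the in-focus distance, and all parameter names and the constant k are illustrative rather than taken from the slides.

```python
def depth_from_sigma(sigma, f, aperture_d, sensor_s, k=0.7):
    """Invert sigma = k * (D*s/2) * (1/f - 1/u - 1/s) for the depth u.

    Assumes the object is farther than the in-focus distance, so the signed
    blur term is positive; for nearer objects the sign flips.
    """
    inv_u = 1.0 / f - 1.0 / sensor_s - 2.0 * sigma / (k * aperture_d * sensor_s)
    return 1.0 / inv_u

# Round trip against the sigma computed for the 2 m object in the sketch above
print(depth_from_sigma(sigma=1.225e-4, f=0.050, aperture_d=0.025, sensor_s=0.052))  # ~2.0 m
```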

27 Depth from Defocus
- The main sources of range errors in DFD:
  - Inaccurate modeling of the optical system.
  - Windowing for local feature analysis.
  - Low spectral content in the scene being imaged.
  - Improper calibration of camera parameters.
  - Presence of sensor noise.

28 Depth from Defocus
- Block shift-variant blur model
  - Considers the interaction between sub-images.
  - Define a neighborhood function.
  - The observed image is then modeled with this neighborhood function, compared with the simpler block-wise model.

29 Depth from Defocus
- Space-variant filtering models for recovering depth
  - Use the complex spectrogram and the pseudo-Wigner distribution (P.W.D.).
  - Complex spectrogram: a windowed (space-frequency) Fourier representation (a sketch is given below).
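A minimal sketch of a complex spectrogram (windowed 2-D Fourier transform) of an image, the space-frequency representation this slide refers to; the Hann window, window size, and hop are assumptions, not the presentation's choices.

```python
import numpy as np

def complex_spectrogram(img: np.ndarray, win: int = 16, hop: int = 8):
    """Complex spectrogram of an image: windowed 2-D Fourier transforms.

    For each window position the local patch is multiplied by a separable
    Hann window and transformed with a 2-D FFT, giving a space-frequency
    representation suited to space-variant blur analysis.
    """
    img = img.astype(np.float64)
    g = np.outer(np.hanning(win), np.hanning(win))  # separable Hann window
    spec = {}
    for y in range(0, img.shape[0] - win + 1, hop):
        for x in range(0, img.shape[1] - win + 1, hop):
            patch = img[y:y + win, x:x + win] * g
            spec[(y, x)] = np.fft.fft2(patch)   # complex local spectrum
    return spec
```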

30 Depth from Defocus
- Space-variant filtering models for recovering depth
  - C.S.: the ratio g_1/g_2 of the two observations is used, with terms defined analogously to the global Fourier-ratio case.

31 Depth from Defocus
- Space-variant filtering models for recovering depth
  - An objective function is minimized.
  - Drawback: the interaction between neighboring pixels is not considered, so discontinuities appear at block borders.
  - A regularized solution addresses this.

32 Depth from Defocus
- No correspondence problem, but lower accuracy.
- The space-variant (S.V.) model is more accurate than the block shift-variant (B.S.V.) model.
- Block-size trade-off:
  - Too large: lower accuracy.
  - Too small: sensitive to noise.

33 Other methods
- Structure from motion
- Shape from shading
- ML estimation of depth and optimal camera settings
- Recursive computation of depth from multiple images

34 Other methods
- Structure from motion
  - Uses the relative motion between object and camera to recover surface information.
  - Correspondence problem (as in the binocular case).
  - Must also estimate the camera motion.

35 Other methods
- Shape from shading
  - Needs the reflectance to be known.
  - Finds the surface orientation (slant and tilt).

36 Focus recovery
- The focus-recovery flow (a sketch of the imaging-distance step follows below):
  Defocused image pair → SML measurement → maximum-value searching → depth measurement of a point → using the specific depth to retrieve the imaging distance → small-aperture construction → linear canonical transform based on the constructed optical system (focal point) → fully focused image
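A sketch of the "retrieve imaging distance" step under the thin-lens assumption; the presentation's own refocusing is based on the linear canonical transform, and this shows only the elementary geometric part.

```python
def imaging_distance(u: float, f: float) -> float:
    """Imaging distance v for an object at depth u, from the thin-lens law.

    1/f = 1/u + 1/v  =>  v = f * u / (u - f).
    """
    return f * u / (u - f)

# Example: object 2 m away, 50 mm lens -> image forms ~51.3 mm behind the lens
print(imaging_distance(2.0, 0.050))
```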

37 Conclusions
- Binocular stereo method
  - High accuracy
  - Absolute depth information
  - High computational complexity
  - Correspondence problem
- Structure from motion
  - Nonlinear problem
  - Correspondence problem
- Shape from shading
  - Very difficult method
- Active methods

38 Conclusions
- Range from focus:
  - Slow
  - Needs more than 10 images
- Depth from defocus:
  - Simple method
  - Less accurate

39 References and future work
1) Y. C. Lin, Depth Estimation and Focus Recovery, Master thesis, National Taiwan University, Taipei, Taiwan, R.O.C., 2008.
2) S. Chaudhuri and A. N. Rajagopalan, Depth From Defocus: A Real Aperture Imaging Approach, Springer-Verlag, New York, 1999.
3) M. Subbarao, "Parallel depth recovery by changing camera parameters," Second International Conference on Computer Vision, pp. 149-155, Dec. 1988.
4) M. Subbarao and T. C. Wei, "Depth from defocus and rapid autofocusing: a practical approach," IEEE Conference on Computer Vision and Pattern Recognition, pp. 773-776, Jun. 1992.
5) A. N. Rajagopalan and S. Chaudhuri, "A variational approach to recovering depth from defocused images," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, pp. 1158-1164, Oct. 1997.

40 The end

