
1 Design and Perceptual Validation of Performance Measures for Salient Object Segmentation. Vida Movahedi, James H. Elder. Centre for Vision Research, York University, Canada.

2 Evaluation of Salient Object Segmentation (source: Berkeley Segmentation Dataset)

3 Evaluation of Salient Object Segmentation: how do we measure success?

4 Existing literature
- Salient object segmentation [Liu07, Zhang07, Park07, Zhuang09, Achanta09, Pirnog09, ...]
- Evaluation of salient object segmentation algorithms [Ge06, ?]
- Evaluation of segmentation algorithms [Huang95, Zhang96, Martin01, Monteiro06, Goldmann08, Estrada09]

5 Contributions
- Analysis of previously suggested measures
- Contour Mapping Measure (CM): order-preserving matching
- A new dataset of salient objects (SOD)
- Psychophysics experiments: evaluation of the above measures
- Matching paradigm in Precision and Recall measures

6 Evaluation measures in the literature
- Region-based error measures: based on false positive / false negative pixels [Young05], [Ge06], [Goldmann08], ...
- Boundary-based error measures: based on distances between boundaries [Huttenlocher93], [Monteiro06], ...
- Mixed measures: based on distances of misclassified pixels to the boundaries [Young05], [Monteiro06], ...

7 Region-based error measures [Young05], [Ge06], [Goldmann08], ...
- A and B are two boundaries; R_A is the region enclosed by boundary A, and |R_A| is the area of this region
- The error is computed from the false positive pixels (inside R_A but not R_B) and the false negative pixels (inside R_B but not R_A)
- Not sensitive to shape differences
- (Figure: false negative and false positive regions)
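
For concreteness, here is a minimal numpy sketch of the region-based idea, assuming the two boundaries are supplied as filled binary masks; the normalization by the union area is illustrative only, not the exact formula from the cited papers.

```python
import numpy as np

def region_error(R_A, R_B):
    """Region-based comparison of two boundaries given as filled binary masks.

    Returns the false positive area (pixels inside R_A but not R_B),
    the false negative area (pixels inside R_B but not R_A), and a
    symmetric error normalized by the union area (illustrative choice).
    """
    R_A = np.asarray(R_A, dtype=bool)
    R_B = np.asarray(R_B, dtype=bool)
    fp = np.logical_and(R_A, ~R_B).sum()   # false positives
    fn = np.logical_and(~R_A, R_B).sum()   # false negatives
    union = np.logical_or(R_A, R_B).sum()
    return fp, fn, (fp + fn) / max(union, 1)
```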

8 Boundary-based error measures [Huang95], [Huttenlocher93], [Monteiro06], ...
- A and B are two boundaries
- Distance of a point a on A from B: $d(a, B) = \min_{b \in B} \lVert a - b \rVert$
- Hausdorff distance: $H(A, B) = \max\{\max_{a \in A} d(a, B),\ \max_{b \in B} d(b, A)\}$
- Mean distance: the average of $d(a, B)$ over all points $a \in A$ (and, symmetrically, of $d(b, A)$ over $b \in B$)
- Not sensitive to shape differences
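
A minimal numpy sketch of these two distances for boundaries sampled as point arrays, using the textbook definitions above; the symmetrized mean is an assumption, since the slide's exact formula is not reproduced here.

```python
import numpy as np

def boundary_distances(A, B):
    """Hausdorff and mean distances between two boundaries.

    A: (n, 2) and B: (m, 2) arrays of boundary point coordinates.
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    # d[i, j] = Euclidean distance between A[i] and B[j]
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    d_A_to_B = d.min(axis=1)   # distance of each point on A from B
    d_B_to_A = d.min(axis=0)   # distance of each point on B from A
    hausdorff = max(d_A_to_B.max(), d_B_to_A.max())
    mean_dist = 0.5 * (d_A_to_B.mean() + d_B_to_A.mean())  # symmetrized mean
    return hausdorff, mean_dist
```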

9 Mixed error measures [Young05], [Monteiro06], ...
- Penalize the over-detected and under-detected regions by their distances to the intersection
- Not sensitive to shape differences
- (Figure: false negative and false positive regions)
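
A sketch of this idea using SciPy's Euclidean distance transform, assuming filled binary masks; the weighting and normalization are illustrative and not the exact formulas from [Young05] or [Monteiro06].

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def mixed_error(R_A, R_B):
    """Penalize misclassified pixels by their distance to the intersection.

    R_A, R_B: filled binary masks of the two regions.
    """
    R_A = np.asarray(R_A, dtype=bool)
    R_B = np.asarray(R_B, dtype=bool)
    inter = R_A & R_B
    if not inter.any():
        return float("inf")   # no overlap: treat as maximal error
    # Distance of every pixel to the nearest pixel of the intersection.
    dist_to_inter = distance_transform_edt(~inter)
    fp = R_A & ~R_B           # over-detected pixels
    fn = ~R_A & R_B           # under-detected pixels
    penalty = dist_to_inter[fp].sum() + dist_to_inter[fn].sum()
    return penalty / inter.sum()   # illustrative normalization
```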

10 Another example: different shapes with low errors

11 Comparing two boundaries
- The two boundaries need to follow each other
- Thus it is not sufficient to map points to the closest point on the other boundary
- The ordering of mapped points must be preserved
- (Figure: small false negative and small false positive regions)

12 Order-preserving Mapping
- The order of mapped points on the two boundaries must be monotonically non-decreasing
- Allowing for different levels of detail: one-to-one, many-to-one, and one-to-many mappings

13 Contour Mapping Measure
- Given two contours A = a_1 a_2 ... a_n and B = b_1 b_2 ... b_m, find the optimal order-preserving mapping
- Contour mapping error measure: the average distance between matched pairs of points
- Related approaches: bimorphisms [Tagare02], elastic matching [Geiger95, Basri98, Sebastian03, ...]

14 Contour Mapping Measure
- A dynamic programming implementation finds the optimum mapping (sketched below)
- Closed contours: point indices are assigned cyclically
- Based on string correction techniques [Maes90]
- Complexity: O(mn log m), where m < n are the numbers of points on the two boundaries
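
Below is a minimal dynamic-programming sketch of order-preserving matching between two contours. It is not the authors' released implementation (available at http://elderlab.yorku.ca/ContourMapping): it minimizes the total matching cost (DTW-style) and then reports the average distance along that alignment, and it handles closed contours by brute force over cyclic shifts of the shorter contour rather than with Maes' O(mn log m) technique.

```python
import numpy as np

def order_preserving_match(A, B):
    """Order-preserving matching of two open contours.

    A: (n, 2) and B: (m, 2) arrays of ordered boundary points.
    Allows one-to-one, many-to-one, and one-to-many steps, all monotone
    in contour order. Returns the average distance between matched pairs
    along the minimum-total-cost alignment.
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    n, m = len(A), len(B)
    # Pairwise Euclidean distances between contour points.
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    D = np.full((n, m), np.inf)
    D[0, 0] = d[0, 0]
    for i in range(n):
        for j in range(m):
            if i == 0 and j == 0:
                continue
            best = min(
                D[i - 1, j - 1] if i > 0 and j > 0 else np.inf,  # one-to-one
                D[i - 1, j] if i > 0 else np.inf,                 # many-to-one
                D[i, j - 1] if j > 0 else np.inf,                 # one-to-many
            )
            D[i, j] = d[i, j] + best
    # Backtrack to count matched pairs, so the error can be averaged.
    i, j, pairs = n - 1, m - 1, 1
    while i > 0 or j > 0:
        candidates = []
        if i > 0 and j > 0:
            candidates.append((D[i - 1, j - 1], i - 1, j - 1))
        if i > 0:
            candidates.append((D[i - 1, j], i - 1, j))
        if j > 0:
            candidates.append((D[i, j - 1], i, j - 1))
        _, i, j = min(candidates)
        pairs += 1
    return D[n - 1, m - 1] / pairs

def contour_mapping_error(A, B):
    """Approximate CM for closed contours by trying every cyclic shift
    of the shorter contour (much slower than Maes' O(mn log m) method)."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    if len(A) > len(B):
        A, B = B, A
    return min(order_preserving_match(np.roll(A, k, axis=0), B)
               for k in range(len(A)))
```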

15 Contour Mapping Example
- (Figure: ground truth boundary and algorithm boundary, with matched pairs shown as line segments)
- CM = average length of the line segments connecting matched pairs

16 Contour Mapping Measure: order-preserving mapping avoids the problems experienced by the other measures

17 SOD: Salient Object Dataset
- A dataset of salient objects, based on the Berkeley Segmentation Dataset (BSD) [Martin01]
- 300 images, 7 subjects
- (Figure: source image from the Berkeley Segmentation Dataset and the corresponding annotation available in SOD)

18 Psychophysical experiments
- Which error measure is closest to human judgements of shape similarity?
- 9 subjects
- 5 error measures: Regional Intersection (RI), Mean Distance (MD), Hausdorff Distance (HD), Mixed Distance (MM), Contour Mapping (CM)

19 Psychophysical Experiments
- Experiment 1 (SOD): reference and test shapes all from SOD; both reference and test cases are human segmentations
- Experiment 2 (ALG): reference from SOD (human segmentation), test shapes algorithm-generated

20 Agreement with Human Subjects
- The human subject chooses Left or Right
- An error measure M also chooses Left or Right, based on the two test shapes' errors w.r.t. the reference shape
- If M chooses the same shape as the human, it is a case of agreement
- Human-human consistency is defined analogously, based on agreement between human subjects
- (Figure: reference shape with Left and Right test shapes)
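
A short sketch of the agreement computation, assuming each trial is recorded as the measure's error for the left and right test shapes plus the human's choice; this trial format is a hypothetical convention for illustration, not taken from the paper.

```python
def agreement_rate(trials):
    """Fraction of trials on which a measure agrees with the human subject.

    trials: iterable of (error_left, error_right, human_choice) tuples,
            where human_choice is 'L' or 'R' for the test shape the human
            judged more similar to the reference.
    """
    agree, total = 0, 0
    for err_left, err_right, human_choice in trials:
        # The measure picks whichever shape it assigns the lower error.
        measure_choice = 'L' if err_left < err_right else 'R'
        agree += (measure_choice == human_choice)
        total += 1
    return agree / total if total else 0.0
```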

21 Psychophysical Experiments
- Experiment 1 (SOD): reference and test shapes all from SOD
- Experiment 2 (ALG): reference from SOD, test shapes algorithm-generated
- (Figure: agreement with human subjects for each measure; RI: region intersection, MD: mean distance, HD: Hausdorff distance, MM: mixed measure, CM: contour mapping)

22 Precision and Recall measures
- For algorithm boundary A and ground truth boundary B:
  - Precision: proportion of true positives on A
  - Recall: proportion of detected points on B
- Martin's PR (M-PR) [Martin04]: minimum-cost bipartite matching, with cost proportional to distance
- Estrada's PR (E-PR) [Estrada09]: 'no intervening contours' and 'same side' constraints
- Contour Mapping PR (CM-PR): order-preserving mapping (sketched below)
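
A sketch of how precision and recall can be computed once a matching between the two boundaries has been produced by some paradigm (bipartite, constrained, or order-preserving); the pairs argument and the tol threshold are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def pr_from_matching(A, B, pairs, tol=2.0):
    """Precision/recall from a point matching between two boundaries.

    A: (n, 2) algorithm boundary points; B: (m, 2) ground-truth boundary points.
    pairs: iterable of (i, j) index pairs produced by a matching paradigm.
    tol: distance (pixels) below which a matched pair counts as a hit
         (illustrative threshold).
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    hit_A = np.zeros(len(A), dtype=bool)
    hit_B = np.zeros(len(B), dtype=bool)
    for i, j in pairs:
        if np.linalg.norm(A[i] - B[j]) <= tol:
            hit_A[i] = True   # point on A explained by the ground truth
            hit_B[j] = True   # ground-truth point recovered by the algorithm
    precision = hit_A.mean()  # proportion of true positives on A
    recall = hit_B.mean()     # proportion of detected points on B
    return precision, recall
```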

23 Matching paradigm in Precision/Recall
- Experiment 1 (SOD): reference and test shapes all from SOD
- Experiment 2 (ALG): reference from SOD, test shapes algorithm-generated

24 Summary
- Analysis of available measures for evaluation of salient object segmentation algorithms
- A new measure, the Contour Mapping measure (CM); code available online: http://elderlab.yorku.ca/ContourMapping
- A new dataset of salient objects; dataset available online: http://elderlab.yorku.ca/SOD
- Psychophysical experiments: CM has a higher agreement with human subjects
- Order-preserving matching paradigm in Precision/Recall analysis; code available online: http://elderlab.yorku.ca/ContourMapping

25 Thank You!

