1 February 26, 2007 Content-Based Image Retrieval Saint-Petersburg State University Natalia Vassilieva natalia@ntc-it.ru Il’ya Markov ilya.markov@gmail.com Alexander Dolnik alexander.dolnik@gmail.com

2 February 26, 2007 Our team
- Natalia Vassilieva
- Alexander Dolnik
- Ilya Markov
- Maria Teplyh
- Maria Davydova
- Dmitry Shubakov
- Alexander Yaremchuk

3 February 26, 2007 General problems
- Semantic gap between the system's and the human's modes of image analysis
- Specifics of human visual perception
- How to capture the semantics of an image
- Signature calculation and response time
- Combining different features and metrics

4 February 26, 2007 Image retrieval system
General goal: an image retrieval system
- that is able to process natural-language queries
- that is able to search among annotated and non-annotated images
- that takes human visual perception into account
- that processes various features (color, texture, shapes)
- that uses relevance feedback for query refinement and adaptive search
How to minimize the "semantic gap" between semantics and low-level features?

5 February 26, 2007 CBIR: Traditional approach
[Diagram: indexing (signatures are calculated for database images and stored) and retrieval (the signature of the query image is calculated and compared against the database to produce the result)]
- Relevance feedback: query refinement
- Fusion of results: independent search by different features
- Color space partition according to human perception
- Auto-annotation, annotation refinement
- Multidimensional indexing (vp-tree)

6 February 26, 2007 Research directions
- Color space partition according to human visual perception
- Correspondence between low-level features and semantics: auto-annotation
- Fusion of retrieval result sets
- Adaptive search: color and texture fusion
- Using relevance feedback

7 February 26, 2007 Human visual perception: colors
Experiments with color partition, HSV space:
- (H=9; S=2; V=3) – 72%
- (H=11; S=2; V=3) – 66%
- (H=13; S=2; V=3) – 63%
- (H=15; S=2; V=3) – 60%
Compare partitions of different spaces (RGB, HSV, Lab)
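Below is a minimal sketch of how such a partition could be used in practice: pixels are quantized into H x S x V bins (9 x 2 x 3 by default, matching the best-scoring partition above) to build a normalized color histogram. Uniform bin boundaries and the plain-Python implementation are illustrative assumptions, not the exact partition used in the experiments.

import colorsys

def hsv_histogram(rgb_pixels, h_bins=9, s_bins=2, v_bins=3):
    """rgb_pixels: iterable of (r, g, b) tuples with values in 0..255."""
    hist = [0] * (h_bins * s_bins * v_bins)
    n = 0
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        hi = min(int(h * h_bins), h_bins - 1)  # hue bin
        si = min(int(s * s_bins), s_bins - 1)  # saturation bin
        vi = min(int(v * v_bins), v_bins - 1)  # value bin
        hist[(hi * s_bins + si) * v_bins + vi] += 1
        n += 1
    return [c / n for c in hist] if n else hist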

8 February 26, 2007 Research directions
- Color space partition according to human visual perception
- Correspondence between low-level features and semantics: auto-annotation
- Fusion of retrieval result sets
- Adaptive search: color and texture fusion
- Using relevance feedback

9 February 26, 2007 Auto-annotation
Natalia Vassilieva, Boris Novikov. Establishing a correspondence between low-level features and semantics of fixed images. In Proceedings of the Seventh National Russian Research Conference RCDL'2005, Yaroslavl, October 04-06, 2005
- Training set selection
- Color feature extraction for every image in the set
- Similarity calculation for every pair of images in the set
- Training set clustering
- Basis color feature calculation: one per cluster
- Definition of basis lexical features
- Correspondence between basis color features and basis lexical features
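As a rough illustration of the clustering and basis-feature steps listed above, here is a minimal sketch assuming color features are fixed-length numeric vectors. The greedy clustering rule, the distance threshold, and the example keyword sets are assumptions made for illustration, not the procedure from the cited paper.

from math import dist  # Euclidean distance (Python 3.8+)

def cluster_training_set(features, threshold=0.5):
    """Greedy clustering: put each feature vector into the first cluster
    whose centroid is closer than `threshold`, else start a new cluster."""
    clusters = []  # each cluster is a list of feature vectors
    for f in features:
        for c in clusters:
            centroid = [sum(x) / len(c) for x in zip(*c)]
            if dist(f, centroid) < threshold:
                c.append(f)
                break
        else:
            clusters.append([f])
    return clusters

def basis_color_features(clusters):
    """One basis color feature per cluster: the cluster centroid."""
    return [[sum(x) / len(c) for x in zip(*c)] for c in clusters]

# Basis lexical features: keyword sets attached to clusters by hand
# (hypothetical example labels).
basis_lexical = {0: ["city", "night"], 1: ["snow", "winter", "mountain"]}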

10 February 26, 2007 Examples
- city, night, road, river
- snow, winter, sky, mountain

11 February 26, 2007 Retrieve by textual query
N. Vassilieva and B. Novikov. A Similarity Retrieval Algorithm for Natural Images. Proc. of the Baltic DB&IS'2004, Riga, Latvia, Scientific Papers University of Latvia, June 2004
- Image database is divided into clusters
- Search for the appropriate cluster by textual query using the clusters' annotations
- Browse the images from the appropriate cluster
- Use relevance feedback to refine the query
- Use relevance feedback to reorganize the clusters and assign new annotations
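A minimal sketch of the cluster-selection step, assuming each cluster carries a set of annotation keywords: the textual query is matched against the annotations and the best-overlapping cluster is selected for browsing. The scoring rule and the example annotations are illustrative assumptions.

def best_cluster(query, cluster_annotations):
    """cluster_annotations: dict cluster_id -> set of annotation keywords."""
    terms = set(query.lower().split())
    return max(cluster_annotations,
               key=lambda cid: len(terms & cluster_annotations[cid]))

clusters = {0: {"city", "night", "road", "river"},
            1: {"snow", "winter", "sky", "mountain"}}
print(best_cluster("winter mountain photos", clusters))  # -> 1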

12 February 26, 2007 Feature extraction: color
- Color: histograms
- Color: statistical approach (first moments of the color distribution for every channel, plus covariances)
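A sketch of the statistical color feature, assuming an RGB image given as a list of pixels: per-channel means (first moments) plus the channel covariances, flattened into one feature vector. Plain Python is used for illustration.

def color_moments(rgb_pixels):
    """rgb_pixels: non-empty list of (r, g, b) tuples."""
    n = len(rgb_pixels)
    means = [sum(p[c] for p in rgb_pixels) / n for c in range(3)]
    cov = [[sum((p[i] - means[i]) * (p[j] - means[j]) for p in rgb_pixels) / n
            for j in range(3)] for i in range(3)]
    # Feature vector: 3 means + 6 unique covariance entries
    return means + [cov[i][j] for i in range(3) for j in range(i, 3)]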

13 February 26, 2007 Feature extraction: texture
- Texture: use independent component filters that result from ICA
  H. Borgne, A. Guerin-Dugue, A. Antoniadis, "Representation of images for classification with independent features"
- Images I_1 and I_2 are passed through N filters; H_1i and H_2i are their response histograms for filter i
- dist(I_1, I_2) = Σ_{i=1..N} KL_H(H_1i, H_2i)
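A minimal sketch of that texture distance, assuming each image is represented by N normalized filter-response histograms. The slide only names the per-filter divergence KL_H; a symmetrized KL divergence is used here as an assumption.

from math import log

def kl(p, q, eps=1e-12):
    """KL divergence between two normalized histograms (with smoothing)."""
    return sum(pi * log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def texture_dist(hists1, hists2):
    """hists1, hists2: lists of N histograms, one per ICA filter."""
    return sum(kl(h1, h2) + kl(h2, h1) for h1, h2 in zip(hists1, hists2))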

14 February 26, 2007 Research directions
- Color space partition according to human visual perception
- Correspondence between low-level features and semantics: auto-annotation
- Fusion of retrieval result sets
- Adaptive search: color and texture fusion
- Using relevance feedback

15 February, 26 2007 Fusion of retrieval result sets  How to merge fairly?  How to merge efficiently?  How to merge effectively? Fusion of weighted lists with ranked elements: (x 1 1, r 1 1 ), (x 1 2, r 1 2 ), …, (x 1 n, r 1 n ) ω1ω1 (x 2 1, r 2 1 ), (x 2 2, r 2 2 ), …, (x 2 k, r 2 n ) ω2ω2 (x m 1, r m 1 ), (x m 2, r m 2 ), …, (x m l, r m l ) ωmωm … ?

16 February, 26 2007  Supplement fusion –union textual results (textual viewpoints )  Collage fusion –combine texture (texture viewpoint) & color results (color viewpoint) –different color methods (different color viewpoints) Ranked lists fusion: application area

17 February, 26 2007  Search by textual query in partly annotated image database Ranked lists fusion: application area Textual query TextResult 1, textrank 1 TR 2, tr 2,... … tr 1 … tr 2 … by annotations content-based Result

18 February, 26 2007  commutative property  associative property  value of result object's rank independent of another object's ranks Examples: COMBSUM, COMBMIN, COMBMAX merge functions Three main native fusion properties

19 February, 26 2007  normalization & delimitation property  conic property  attraction of current object for mix result depend on value of function g(rank, weight) ≥ 0 ;  snare condition: Additional native fusion properties

20 February, 26 2007  g monotonically decreases with fixed weight parameter  g monotonically decreases with fixed rank parameter  g must satisfy boundaries conditions:  g( 0, w ) > 0 if w != 0  g( r, 0 ) = 0 Conic properties, function g

21 February 26, 2007 Ranked lists fusion: Formulas
[The fusion formula and the definitions of its terms were shown as images on the slide]

22 February, 26 2007  All lists are sorted by object id  Using step by step lists merging (object id priory)  If object_id1 not equal object_id2 => some object is absent in one of the lists Ranked lists fusion: Algorithm List 1 List 2 Result list Current object_id2 Current object_id1

23 February, 26 2007  Viewpoint should provide some “valuable” information. Retrieval system's performance at least should be better than a random system.  Information is not fully duplicated. There should be partial disagreement among viewpoints. Ranked lists fusion: Experiments Necessary conditions:

24 February, 26 2007  R overlap && N overlap conditions  Intercomparison of methods –Classical methods: COMBSUM, COMBMIN, COMBMAX –Probability methods: probFuse –Random method: random values that satisfied to merge properties. Ranked lists fusion: Experiments Parameters:

25 February 26, 2007 Research directions
- Color space partition according to human visual perception
- Correspondence between low-level features and semantics: auto-annotation
- Fusion of retrieval result sets
- Adaptive search: color and texture fusion
- Using relevance feedback

26 February 26, 2007 Adaptive merge: color and texture
Dist(I, Q) = α*C(I, Q) + (1 - α)*T(I, Q), 0 ≤ α ≤ 1
- C(I, Q): color distance between I and Q
- T(I, Q): texture distance between I and Q
Hypothesis: the optimal α depends on the features of the query Q. It is possible to distinguish common features among images that share the same "best" α.
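A direct sketch of that combined distance; how α should be predicted from the query's features is exactly the open hypothesis above, so a fixed default value is used here as a placeholder assumption.

def combined_dist(color_dist, texture_dist, alpha=0.5):
    """color_dist, texture_dist: distances between image I and query Q;
    0 <= alpha <= 1 weights color against texture."""
    assert 0.0 <= alpha <= 1.0
    return alpha * color_dist + (1.0 - alpha) * texture_dist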

27 February 26, 2007 Adaptive merge: experiments

28 February 26, 2007 Estimation tool
- Web application
- Provides interfaces for developers of search methods
- Uses common measures to estimate search methods:
  – Precision
  – Pseudo-recall
- Collects users' opinions -> builds a test database
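A minimal sketch of the two measures, assuming relevance judgments collected from users. Precision is standard; "pseudo-recall" is computed here against the pooled set of relevant images found by any evaluated method, which is an assumption about what the slide means by the term.

def precision(retrieved, relevant):
    """Fraction of retrieved images that are relevant."""
    return len(set(retrieved) & set(relevant)) / len(retrieved) if retrieved else 0.0

def pseudo_recall(retrieved, pooled_relevant):
    """pooled_relevant: relevant images found by any evaluated method."""
    return (len(set(retrieved) & set(pooled_relevant)) / len(pooled_relevant)
            if pooled_relevant else 0.0)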

29 February 26, 2007 Datasets
- Own photo collection (~2000 images)
- Subset of the own photo collection (150 images)
- Flickr collection (~15000, ~1.5 mln images)
- Corel photoset (1100 images)

30 February 26, 2007 Research directions
- Color space partition according to human visual perception
- Correspondence between low-level features and semantics: auto-annotation
- Fusion of retrieval result sets
- Adaptive search: color and texture fusion
- Using relevance feedback

