PMLAB

Finding Similar Image Quickly Using Object Shapes
Heng Tao Shen, Dept. of Computer Science,
National University of Singapore
Presented by Chin-Yi Tsai
Outline

Motivation
Related Work
A Hierarchical Partitioning Framework Via Angle Mapping
– Hierarchical Partitioning With Shape Representations
– Hierarchical Partitioning With Angle Vectors
Experiments
Conclusion
Introduction

There are two requirements for a content-based image retrieval system: effectiveness and efficiency
Identifying relevant images quickly from a large image collection
A framework for fast image retrieval based on object shapes extracted from the objects within images
Introduction (cont'd)

[Figure: images are mapped to logical representations and then to level-N, level-(N-1), level-(N-2), ... representations; each step is coarser, with fewer partitions and lower dimensionality]
Introduction (cont'd)

Angle mapping (AM) replaces a sequence of connected edges by a smaller number of edges
– angle > ?
– length < ?
– Dimensionality
Introduction (cont'd)

Two hierarchical structures to facilitate speedy retrieval
Hierarchical Partitioning on Shape Representation (HPSR)
– shape representation as the indexing key
Hierarchical Partitioning on Angle Vector (HPAV)
– angle information as the indexing key
Related Work

For a content-based retrieval system, physical objects must be mapped into logical representations
– Color histogram, 2D-strings, symbolic image
Decomposing an image into its individual objects
Several indexing structures (by dimensionality)
– R-tree, R+-tree, R*-tree, TV-tree, NR-tree
– HPSR, HPAV ( Angle Mapping )
A Hierarchical Partitioning Framework Via Angle Mapping

The framework maps a high-dimensional representation into multiple levels of low-dimensional ones
AM approximates a shape based on the angles between edges
Sharper angles and longer edges carry more important information about a shape
Angle Interval (AI)
– AI[i] = (90 + 90 * (i - 1) / N, 90 + 90 * i / N] or
– AI[i] = (270 - 90 * (i - 1) / N, 270 - 90 * i / N]
– If N = 3: (150, 180), (120, 150), (90, 120)
Prune Length Threshold (PLT)
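As a quick sanity check, the first AI formula can be computed directly (a minimal sketch; the helper name `angle_intervals` is mine, not the paper's):

```python
def angle_intervals(n):
    """Return AI[1..N] as (low, high] tuples from the first formula:
    AI[i] = (90 + 90*(i-1)/N, 90 + 90*i/N]."""
    return [(90 + 90 * (i - 1) / n, 90 + 90 * i / n) for i in range(1, n + 1)]

# For N = 3 this reproduces the slide's intervals, listed AI[1]..AI[3]:
print(angle_intervals(3))  # [(90.0, 120.0), (120.0, 150.0), (150.0, 180.0)]
```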
A Hierarchical Partitioning Framework Via Angle Mapping

[Figure: a logical shape simplified into level-3, level-2, and level-1 representations, using AI[3] = ( 150, 180 ), AI[2] = ( 120, 150 ), AI[1] = ( 90, 120 )]
A Hierarchical Partitioning Framework

Dimension Reduction Ratio
– DRR(i, i-1) = [Dim(R(i)) - Dim(R(i-1))] / Dim(R(i))
– Example: (9 - 3) / 9 = 2/3
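The ratio is a one-liner; this sketch just restates the slide's 9-dimension to 3-dimension example (the function name is mine):

```python
def drr(dim_i, dim_prev):
    """DRR(i, i-1) = [Dim(R(i)) - Dim(R(i-1))] / Dim(R(i))."""
    return (dim_i - dim_prev) / dim_i

# The slide's example: a 9-dimensional representation reduced to 3 dimensions.
print(drr(9, 3))  # 0.666... = 2/3
```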
Algorithm: A Hierarchical Partitioning Framework ( N = 2 )

1. Find the shape's dominating outline and initialize it as the level-N representation
2. for level i from N to 1 do
– 2.1 check if the angle lies in AI[i]
– 2.2 check if the edge < PLT
– 2.3 go back to 2.1 or 2.2
– 2.4 get the level-i representation
– 2.5 if there are too many shapes at level i
  2.5.1 cluster similar shapes
  2.5.2 identify a representative from the shapes
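Steps 2.1-2.3 can be sketched as a single pruning pass, assuming a simplified shape encoding as (turn_angle, edge_length) pairs; this is an illustrative reading, not the paper's data structure:

```python
def angle_map_pass(poly, ai, plt):
    """One level of Angle Mapping: a vertex whose angle lies in
    AI[i] = (ai[0], ai[1]] and whose edge is shorter than PLT carries
    little information; it is pruned and its edge merged into the
    previous one."""
    result = []
    for angle, length in poly:
        if ai[0] < angle <= ai[1] and length < plt and result:
            prev_angle, prev_len = result[-1]
            result[-1] = (prev_angle, prev_len + length)  # merge the edges
        else:
            result.append((angle, length))
    return result

# A short, nearly straight edge (angle 170 in AI[3], length 0.5 < PLT) is
# absorbed; the sharper vertices survive to the coarser level.
print(angle_map_pass([(90, 5.0), (170, 0.5), (90, 5.0)], (150, 180), 1.0))
# [(90, 5.5), (90, 5.0)]
```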
Hierarchical Partitioning With Shape Representations

Two shapes that are similar are grouped as a cluster
Representation Reduction Ratio
– RRR(i, i-1) = [NumberOfNodesAtLevel(i) - NumberOfNodesAtLevel(i-1)] / NumberOfNodesAtLevel(i)
– Example: RRR(N, N-1) = (6 - 4) / 6 = 1/3
[Figure: 6 shapes at level N clustered into 4 nodes at level N-1]
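A one-line sketch matching the figure's numbers (note the formula is written to agree with the (6 - 4) / 6 example; the function name is mine):

```python
def rrr(nodes_i, nodes_prev):
    """RRR(i, i-1) = [Nodes(i) - Nodes(i-1)] / Nodes(i)."""
    return (nodes_i - nodes_prev) / nodes_i

# The figure's example: 6 shapes at level N clustered into 4 nodes at level N-1.
print(rrr(6, 4))  # 0.333... = 1/3
```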
Algorithm: HPSR ( 2-level HPSR Indexing )

1. Initialize n to N
2. For all logical shapes, apply AM to get their level-n representations
3. Merge identical representations into partitions
4. If the number of partitions is too big, cluster similar representations given a distance threshold
5. For each partition, produce the partition's representative as a node at level n
6. Build connections between level-n nodes and their children
7. If n > 1, map level-n nodes to level n-1 representations, decrease n by 1, then go to step 3
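The build loop can be sketched as follows, with step 4 (clustering similar representations) elided and only the exact merge of step 3 shown; `angle_map` is an assumed stand-in for the AM procedure, and encoding the tree as a list of dicts, finest level first, is my choice:

```python
def build_hpsr(logical_shapes, n_levels, angle_map):
    """Bottom-up HPSR build: levels[0] is level N (finest); each level
    maps a representative to the list of children it summarizes."""
    levels = []
    children = list(logical_shapes)
    for n in range(n_levels, 0, -1):
        partitions = {}
        for shape in children:
            rep = angle_map(shape, n)            # level-n representation
            partitions.setdefault(rep, []).append(shape)  # step 3: merge
        levels.append(partitions)                # steps 5-6: nodes + links
        children = list(partitions)              # step 7: inputs for n-1
    return levels

# Toy run: "shapes" are numbers and angle_map coarsens by rounding.
levels = build_hpsr([1.23, 1.24, 2.5], 2, lambda s, n: round(s, n - 1))
print(levels[0])  # {1.2: [1.23, 1.24], 2.5: [2.5]}
```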
Algorithm: Query Processing in HPSR ( 2-level HPSR Indexing )

1. Generate the query shape's N multi-level representations
2. Initialize level n as 1
3. Compute the similarity between the query shape's level-n representation and the level-n nodes
4. If no similar node is found, return null
5. If the leaf level is reached, get each shape's logical representation and compute its similarity with the query shape. Return the similar shapes
6. Increase n by 1 and retrieve the similar nodes' children as the level-n nodes. Go back to step 3
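A sketch of the search, assuming the tree is a list of dicts with the finest level first (each dict maps a node's representation to its children) and a caller-supplied `similar` predicate; both encodings are illustrative assumptions, and step 5's final comparison against full logical representations is elided:

```python
def query_hpsr(query_reps, levels, similar):
    """query_reps[n-1] is the query's level-n representation; the walk
    starts at level 1 (the back of the list) and descends, returning
    the logical shapes under the matching leaf nodes."""
    candidates = list(levels[-1])                # level-1 nodes
    for n in range(1, len(levels) + 1):
        level = levels[len(levels) - n]
        matches = [c for c in candidates if similar(query_reps[n - 1], c)]
        if not matches:
            return []                            # step 4: nothing similar
        children = [s for m in matches for s in level[m]]
        if n == len(levels):                     # leaf level reached
            return children
        candidates = children                    # step 6: descend
    return []

levels = [{1.2: [1.23, 1.24], 2.5: [2.5]},      # level 2 (leaves)
          {1.0: [1.2], 2.0: [2.5]}]             # level 1 (root)
print(query_hpsr([1.0, 1.2], levels, lambda a, b: abs(a - b) < 0.15))
# [1.23, 1.24]
```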
HPSR Matching

To get the similarity between two dominating outlines, this paper applies the turning function:

D(a, b) = ∫₀¹ (a(x) - b(x))² dx

[Figure: the turning function a(x) of an outline, plotted against the normalized arc length x]
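Numerically, the distance can be approximated by sampling both turning functions at the same arc-length positions over [0, 1] (a minimal sketch; uniform sampling is my assumption):

```python
def turning_distance(a, b):
    """Riemann-sum approximation of D(a, b) = the integral over [0, 1]
    of (a(x) - b(x))^2 dx, with a and b sampled at len(a) equal steps."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) / len(a)

# Identical outlines have distance 0; a constant offset of 1 gives 1.
print(turning_distance([0.0, 90.0, 180.0], [0.0, 90.0, 180.0]))  # 0.0
print(turning_distance([1.0, 1.0], [0.0, 0.0]))                  # 1.0
```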
Hierarchical Partitioning With Angle Vectors

AV for level-1 representation: ( 3 )
AV for level-2 representation: ( 4 2 )
AV for level-3 representation: ( 3 3 1 )
AV for logical representation: ( 5 3 0 1 )

AI[3] = ( 150, 180 ), AI[2] = ( 120, 150 ), AI[1] = ( 90, 120 ), AI[0] = ( 0, 90 )
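Reading the example as histograms of vertex angles over the angle intervals, an AV can be computed as follows (the function name and the sample angle list are mine, chosen to reproduce the logical-representation AV):

```python
def angle_vector(angles, intervals):
    """Count a representation's vertex angles per (low, high] interval."""
    return [sum(1 for a in angles if lo < a <= hi) for lo, hi in intervals]

ai = [(0, 90), (90, 120), (120, 150), (150, 180)]   # AI[0]..AI[3]
# A hypothetical logical shape with 9 vertex angles:
print(angle_vector([30, 45, 60, 80, 85, 95, 100, 110, 160], ai))
# [5, 3, 0, 1] -- the slide's AV for the logical representation
```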
[Figure: the HPSR tree and the corresponding HPAV tree]

1. To construct the HPAV structure, we need to build the HPSR structure first
2. The only difference between HPSR and HPAV is the node representation
3. In the HPAV tree, each level has a much lower, fixed dimensionality
4. HPAV has a smaller storage requirement
Similarity Measure for AV Comparison

Definition:

Sim(AV1, AV2) = |AV1 ∩ AV2| / |AV1 ∪ AV2|
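One concrete reading of the definition, treating the angle vectors as histograms (generalized Jaccard: componentwise min over componentwise max); this interpretation is an assumption, since the operators in the formula were lost in extraction:

```python
def av_similarity(av1, av2):
    """|AV1 ∩ AV2| / |AV1 ∪ AV2| for count vectors, read as the sum of
    componentwise minima over the sum of componentwise maxima."""
    inter = sum(min(x, y) for x, y in zip(av1, av2))
    union = sum(max(x, y) for x, y in zip(av1, av2))
    return inter / union if union else 1.0

print(av_similarity([5, 3, 0, 1], [5, 3, 0, 1]))  # 1.0 (identical AVs)
print(av_similarity([2, 0], [1, 1]))              # 0.333... = 1/3
```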
Experiments

Set up
– P-III 700 MHz with 128 MB of RAM
Evaluate the two structures, HPSR and HPAV, on their effectiveness and efficiency, as well as their storage requirements
Tuning Mapping Level and PLT

During the Angle Mapping process, two parameters may affect the results
– Mapping level N, Prune Length Threshold (PLT)
Choose the optimal values for N and PLT
[Figure: the tuning trade-off: more angle intervals yield lower-dimensional levels, but too many levels yield a coarser approximation]
Effectiveness and Efficiency of HPSR and HPAV
Storage Space for IMLR and IAV Tree
Conclusion

A framework for partitioning an image database based on the shapes of objects
Uses a hierarchy of shape approximations that reduces the dimensionality of shapes . . . (AM)
Meets the user's performance requirements
Two indexing structures based on the framework
– HPSR, HPAV