# Segmentation using eigenvectors

Posted 29-Dec-2015

### Transcript

Segmentation using eigenvectors

Papers:

- "Normalized Cuts and Image Segmentation." Jianbo Shi and Jitendra Malik, IEEE, 2000.
- "Segmentation using eigenvectors: a unifying view." Yair Weiss, ICCV 1999.

Presenter: Carlos Vallespi (cvalles@cs.cmu.edu)

Image Segmentation

Image segmentation

How do you pick the right segmentation?

- Bottom-up segmentation: tokens belong together because they are locally coherent.
- Top-down segmentation: tokens are grouped because they lie on the same object.

Correct segmentation

- There may not be a single correct answer.
- Partitioning is inherently hierarchical.
- One approach, used in this presentation: use the low-level coherence of brightness, color, texture, or motion attributes to come up with partitions.

Outline

- Introduction
- Graph terminology and representation
- Min cuts and normalized cuts
- Other segmentation methods using eigenvectors
- Conclusions


Graph-based Image Segmentation

Pipeline: Image (I) → Graph affinities (W) → Eigenvector X(W) → Discretization

The affinities are built from intensity, color, edge, and texture cues.

Slide from Timothee Cour (http://www.seas.upenn.edu/~timothee)


Graph-based Image Segmentation

- G = {V, E}
- V: graph nodes (the pixels)
- E: edges connecting nodes (weighted by pixel similarity)

Slides from Jianbo Shi

Graph terminology

Similarity matrix: W = [w(i,j)], where w(i,j) is the similarity between nodes i and j.

Slides from Jianbo Shi

Affinity matrix

Similarity of all image pixels to a selected pixel (brighter means more similar). An N×M-pixel image is reshaped into a vector of N·M pixels, so W has size N·M × N·M.

Warning: the size of W is quadratic in the number of pixels!

Graph terminology

Degree of a node: d(i) = Σ_j w(i,j)

Slides from Jianbo Shi

Graph terminology

Volume of a set A ⊆ V: vol(A) = Σ_{i∈A} d(i)

Slides from Jianbo Shi

Graph terminology

Cuts in a graph: cut(A,B) = Σ_{i∈A, j∈B} w(i,j)

Slides from Jianbo Shi
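These definitions can be checked on a tiny made-up graph (the 4-node affinity matrix below is an illustrative assumption, not from the slides):

```python
import numpy as np

# Toy 4-node graph: {0, 1} and {2, 3} are strongly connected internally
# and only weakly connected to each other.
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.0, 0.1],
              [0.1, 0.0, 0.0, 1.0],
              [0.0, 0.1, 1.0, 0.0]])

d = W.sum(axis=1)                  # degree: d(i) = sum_j w(i, j)
A = np.array([True, True, False, False])   # candidate set A = {0, 1}
vol_A = d[A].sum()                 # volume of A: sum of degrees of its nodes
cut_AB = W[np.ix_(A, ~A)].sum()    # cut(A, B): total weight crossing the partition
print(d, vol_A, cut_AB)
```

Here the cut is small (0.2) relative to the volumes (2.2 each), which is exactly the situation the normalized-cut criterion later rewards.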

Representation

- Partition matrix X: X(i,k) = 1 if node i belongs to segment k, and 0 otherwise.
- Pair-wise similarity matrix W: W(i,j) = affinity between nodes i and j.
- Degree matrix D: D = diag(d(1), ..., d(N)).
- Laplacian matrix L: L = D − W.

Pixel similarity functions

- Intensity: aff(x,y) = exp(−|I(x) − I(y)|² / (2σ_I²))
- Distance: aff(x,y) = exp(−‖x − y‖² / (2σ_d²))
- Texture: aff(x,y) = exp(−‖c(x) − c(y)‖² / (2σ_t²)), where c(x) is a vector of filter outputs. A natural thing to do is to square the outputs of a range of different filters at different scales and orientations, smooth the result, and stack these into a vector.
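A minimal sketch of the intensity and distance affinities above (the σ values are arbitrary choices for illustration):

```python
import numpy as np

def intensity_affinity(I_x, I_y, sigma=0.1):
    # Gaussian affinity on the brightness difference of two pixels.
    return np.exp(-((I_x - I_y) ** 2) / (2 * sigma ** 2))

def distance_affinity(p_x, p_y, sigma=2.0):
    # Gaussian affinity on the spatial distance between two pixel positions.
    diff = np.asarray(p_x, float) - np.asarray(p_y, float)
    return np.exp(-np.dot(diff, diff) / (2 * sigma ** 2))

# Identical intensities give affinity 1; distant pixels get affinities near 0.
print(intensity_affinity(0.5, 0.5))
print(distance_affinity((0, 0), (0, 1)))
```

The scale parameters σ control how quickly affinity falls off, which is the "scale factor" sensitivity mentioned in the conclusions.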

Definitions

- Methods that use the spectrum of the affinity matrix to cluster are known as spectral clustering.
- Normalized cuts, average cuts, and average association all make use of the eigenvectors of the affinity matrix.
- Why do these methods work?

Spectral Clustering

From data points to a similarity matrix.

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group, Stanford University

Eigenvectors and blocks

Block matrices have block eigenvectors:

    W = | 1  1  0  0 |    eigenvalues: λ1 = 2, λ2 = 2, λ3 = 0, λ4 = 0
        | 1  1  0  0 |    eigenvectors for λ = 2:
        | 0  0  1  1 |        [.71  .71   0    0 ]
        | 0  0  1  1 |        [ 0    0   .71  .71]

Near-block matrices have near-block eigenvectors:

    W = | 1    1   .2    0 |    eigenvalues: λ1 = 2.02, λ2 = 2.02, λ3 = −0.02, λ4 = −0.02
        | 1    1    0  −.2 |    eigenvectors for λ ≈ 2.02:
        | .2   0    1    1 |        [.71  .69  .14   0 ]
        | 0   −.2   1    1 |        [ 0  −.14  .69  .71]

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group, Stanford University
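This behavior is easy to reproduce numerically with the matrices from the slide:

```python
import numpy as np

# Ideal block matrix: two groups with no cross-affinity.
W = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)
vals = np.linalg.eigvalsh(W)          # ascending order

# The block indicator vector is an exact eigenvector with eigenvalue 2.
v1 = np.array([1, 1, 0, 0]) / np.sqrt(2)
print(np.allclose(W @ v1, 2 * v1))    # True

# Near-block matrix: the same example perturbed by +/-0.2 cross terms.
Wn = np.array([[1.0,  1.0,  0.2,  0.0],
               [1.0,  1.0,  0.0, -0.2],
               [0.2,  0.0,  1.0,  1.0],
               [0.0, -0.2,  1.0,  1.0]])
vals_n = np.linalg.eigvalsh(Wn)
print(np.round(vals_n, 2))            # eigenvalues barely move: ~[-0.02 -0.02 2.02 2.02]
```

Small perturbations of the matrix only perturb the eigenvalues and eigenvectors slightly, which is why the spectrum still reveals the block structure.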

Spectral Space

Can put items into blocks by eigenvectors, and the clusters are clear regardless of row ordering: permuting the rows and columns of the matrix permutes the entries of the eigenvectors e1 and e2 the same way, so plotting each item at coordinates (e1(i), e2(i)) reveals the same clusters.

    W = | 1    1   .2    0 |    e1 = [.71  .69  .14   0 ]
        | 1    1    0  −.2 |    e2 = [ 0  −.14  .69  .71]
        | .2   0    1    1 |
        | 0   −.2   1    1 |

    W' = | 1   .2   1    0 |    e1 = [.71  .14  .69   0 ]
         | .2   1   0    1 |    e2 = [ 0   .69 −.14  .71]
         | 1    0   1  −.2 |
         | 0    1  −.2   1 |

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group, Stanford University

How do we extract a good cluster?

Simplest idea: we want a vector x giving the association between each element and a cluster, and we want the elements within this cluster to, on the whole, have strong affinity with one another. We could maximize

    xᵀ W x

but we need the constraint

    xᵀ x = 1

This is an eigenvalue problem: choose the eigenvector of W with the largest eigenvalue.
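A tiny numeric check of this eigenvalue view, using power iteration to maximize xᵀWx over unit vectors (the 3-node affinity matrix is a made-up example):

```python
import numpy as np

# Made-up affinity: nodes 0 and 1 are strongly associated, node 2 weakly.
W = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.0],
              [0.1, 0.0, 1.0]])

# Power iteration converges to the eigenvector of W with the largest
# eigenvalue, i.e. the unit vector x maximizing x^T W x.
x = np.ones(3)
for _ in range(200):
    x = W @ x
    x /= np.linalg.norm(x)

# Compare against a direct eigendecomposition.
vals, vecs = np.linalg.eigh(W)
lead = vecs[:, -1]
print(np.allclose(np.abs(x), np.abs(lead), atol=1e-6))   # True
```

The resulting x has its largest entries on the strongly associated nodes, which is the "cluster association" reading of the leading eigenvector.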

Minimum cut

Criterion for partition:

    min cut(A,B) = min Σ_{i∈A, j∈B} w(i,j)

First proposed by Wu and Leahy. Problem: the weight of the cut is directly proportional to the number of edges in the cut, so minimum cut favors cutting off small, isolated sets of nodes.

Normalized Cut

The normalized cut (or balanced cut) finds a better cut.

Volume of a set (or association):

    assoc(A,V) = Σ_{i∈A, j∈V} w(i,j)

Define the normalized cut as a fraction of the total edge connections to all the nodes in the graph:

    Ncut(A,B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V)

Define the normalized association, which measures how tightly on average the nodes within each cluster are connected to each other:

    Nassoc(A,B) = assoc(A,A)/assoc(A,V) + assoc(B,B)/assoc(B,V)

Observations (I)

Maximizing Nassoc is the same as minimizing Ncut, since they are related:

    Ncut(A,B) = 2 − Nassoc(A,B)
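The relation follows in one line from the definitions, using cut(A,B) = assoc(A,V) − assoc(A,A) (and symmetrically for B):

```latex
\begin{aligned}
\mathrm{Ncut}(A,B)
  &= \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(A,V)} + \frac{\mathrm{cut}(A,B)}{\mathrm{assoc}(B,V)} \\
  &= \frac{\mathrm{assoc}(A,V)-\mathrm{assoc}(A,A)}{\mathrm{assoc}(A,V)}
   + \frac{\mathrm{assoc}(B,V)-\mathrm{assoc}(B,B)}{\mathrm{assoc}(B,V)} \\
  &= 2 - \left(\frac{\mathrm{assoc}(A,A)}{\mathrm{assoc}(A,V)}
   + \frac{\mathrm{assoc}(B,B)}{\mathrm{assoc}(B,V)}\right)
   \;=\; 2 - \mathrm{Nassoc}(A,B)
\end{aligned}
```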

How to minimize Ncut?

Transform the Ncut equation into matrix form. After simplifying:

    min_y Ncut = min_y  yᵀ(D − W)y / yᵀDy      (a Rayleigh quotient)

subject to yᵀD1 = 0, with the values of y quantized to two discrete levels. The quantized problem is NP-hard!

Observations (II)

Instead, relax y into the continuous domain by solving the generalized eigenvalue system:

    (D − W) y = λ D y

Note that (D − W)·1 = 0, so the first eigenvector is y0 = 1, with eigenvalue 0. The second smallest eigenvector is the real-valued solution to this problem!

Algorithm

1. Define a similarity function between two nodes, e.g.:

       w(i,j) = exp(−‖F(i) − F(j)‖² / σ_I²) · exp(−‖X(i) − X(j)‖² / σ_X²)

2. Compute the affinity matrix (W) and the degree matrix (D).
3. Solve (D − W) y = λ D y.
4. Use the eigenvector with the second smallest eigenvalue to bipartition the graph.
5. Decide whether the current partitions should be re-partitioned.

Note: since the precision requirements are low, W is very sparse, and only a few eigenvectors are required, the eigenvectors can be extracted very quickly using the Lanczos algorithm.
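A minimal sketch of one bipartition step in Python (a dense eigensolver stands in here for the Lanczos iteration used on large sparse problems, and the 4-node W is a toy example):

```python
import numpy as np

def ncut_bipartition(W):
    # One level of the normalized-cut recursion (a sketch).
    d = W.sum(axis=1)
    Dm12 = np.diag(1.0 / np.sqrt(d))
    # Solve (D - W) y = lambda * D y via the equivalent symmetric system
    # D^{-1/2} (D - W) D^{-1/2} z = lambda * z, with y = D^{-1/2} z.
    L_sym = Dm12 @ (np.diag(d) - W) @ Dm12
    _, vecs = np.linalg.eigh(L_sym)
    y = Dm12 @ vecs[:, 1]          # second smallest eigenvector
    return y > 0                   # bipartition by thresholding at 0

# Toy affinity: nodes {0, 1} and {2, 3} form two loosely coupled groups.
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.0, 0.1],
              [0.1, 0.0, 0.0, 1.0],
              [0.0, 0.1, 1.0, 0.0]])
labels = ncut_bipartition(W)
print(labels)
```

The second-smallest eigenvector separates the two weakly coupled groups, recovering the partition the Ncut criterion prefers.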

Discretization

Sometimes there is no clear threshold to binarize on, since the eigenvectors take on continuous values. How to choose the splitting point?

- Pick a constant value (0, or 0.5).
- Pick the median value as the splitting point.
- Look for the splitting point that has the minimum Ncut value:
  1. Choose n possible splitting points.
  2. Compute the Ncut value for each.
  3. Pick the minimum.
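The minimum-Ncut splitting-point search can be sketched as follows (the toy W and the eigenvector values y are illustrative assumptions):

```python
import numpy as np

def ncut_value(W, mask):
    # Ncut(A, B) = cut(A,B)/assoc(A,V) + cut(A,B)/assoc(B,V)
    d = W.sum(axis=1)
    cut = W[np.ix_(mask, ~mask)].sum()
    return cut / d[mask].sum() + cut / d[~mask].sum()

def best_split(W, y, n=50):
    # Scan n candidate thresholds over the range of the eigenvector y
    # and keep the one whose induced partition has the smallest Ncut.
    best_t, best_v = None, np.inf
    for t in np.linspace(y.min(), y.max(), n + 2)[1:-1]:
        mask = y > t
        if mask.all() or not mask.any():
            continue                      # skip degenerate splits
        v = ncut_value(W, mask)
        if v < best_v:
            best_t, best_v = t, v
    return best_t, best_v

# Toy graph: {0, 1} and {2, 3} loosely coupled; y mimics a second eigenvector.
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.0, 0.1],
              [0.1, 0.0, 0.0, 1.0],
              [0.0, 0.1, 1.0, 0.0]])
y = np.array([0.5, 0.4, -0.4, -0.5])
t, v = best_split(W, y)
print(t, v)
```

Any threshold between the two groups of eigenvector values yields the same partition, so the search settles on the split with cut weight 0.2 against volumes of 2.2.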

Use k eigenvectors

Recursive 2-way Ncut is slow. We can use more eigenvectors to re-partition the graph; however, not all eigenvectors are useful for partitioning (because of their degree of smoothness). Procedure: compute k-means with a high k, then follow one of these procedures:

- Merge segments that minimize the k-way Ncut criterion.
- Use the k segments and find the partitions there using exhaustive search.
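A sketch of partitioning with several eigenvectors at once: embed each node using the rows of the smallest nontrivial eigenvectors of the normalized Laplacian, then run k-means on the embedded points (the minimal inline k-means stands in for a library call, and the three-pair W is a toy example):

```python
import numpy as np

def spectral_embed(W, k):
    # Each node gets a k-dimensional coordinate from the k smallest
    # nontrivial eigenvectors of the normalized Laplacian.
    d = W.sum(axis=1)
    Dm12 = np.diag(1.0 / np.sqrt(d))
    L = Dm12 @ (np.diag(d) - W) @ Dm12
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:k + 1]

def kmeans(X, k, iters=50, seed=0):
    # Minimal k-means on the embedded points (a library would do).
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Three pairs of nodes with strong intra-pair and weak cross-pair affinity.
eps = 0.01
W = np.full((6, 6), eps)
np.fill_diagonal(W, 0.0)
for a, b in [(0, 1), (2, 3), (4, 5)]:
    W[a, b] = W[b, a] = 1.0

E = spectral_embed(W, 2)       # the 2-D embedding separates the three pairs
labels = kmeans(E, 3)
```

In the embedding, nodes from the same pair land on (nearly) the same point, so any reasonable clustering of the embedded points recovers the three groups.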


Toy examples

Images from Matthew Brand (TR-2002-42)

Example (I)

(Figure: eigenvectors and the resulting segments.)

Example (II)

(Figure: original image and the resulting segments.)

* Slide from Khurram Hassan-Shafique, CAP5415 Computer Vision, 2003

Other methods

Average association: use the eigenvector of W associated with the biggest eigenvalue for partitioning. It tries to maximize:

    assoc(A,A)/|A| + assoc(B,B)/|B|

It has a bias toward finding tight clusters, and is useful for Gaussian distributions.

Other methods

Average cut: tries to minimize:

    cut(A,B)/|A| + cut(A,B)/|B|

Very similar to normalized cuts, but we cannot ensure that the partitions will have tight within-group similarity, since this equation does not have the nice properties of the normalized-cut equation.


Other methods

Comparison on a toy example: 20 points randomly distributed from 0.0 to 0.5, and 12 points randomly distributed from 0.65 to 1.0, partitioned by normalized cut, average cut, and average association.

Other methods

Scott and Longuet-Higgins (1990):

- V contains the first eigenvectors of W.
- Normalize V by rows.
- Compute Q = VᵀV.
- Values close to 1 belong to the same cluster.

(Figure: data, W, Q, and the first two eigenvectors.)
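The Scott and Longuet-Higgins construction on a toy two-block W might look like the sketch below (eigenvectors are stored as columns here, so the slide's VᵀV becomes VVᵀ; the block matrix is an idealized assumption):

```python
import numpy as np

# Idealized two-block affinity matrix.
W = np.array([[1, 1, 0, 0],
              [1, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]], dtype=float)

vals, vecs = np.linalg.eigh(W)
V = vecs[:, -2:]                  # first (largest-eigenvalue) eigenvectors, as columns
Vn = V / np.linalg.norm(V, axis=1, keepdims=True)   # normalize V by rows
Q = Vn @ Vn.T                     # the slide's Q, with the column convention
print(np.round(np.abs(Q), 1))
```

Entries of |Q| are 1 for pairs of points in the same cluster and 0 across clusters, which is the clustering signal the method reads off.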

Other applications

Costeira and Kanade (1995): used to segment points in motion.

- Compute M = (X Y). The affinity matrix W is computed as W = MᵀM; this trick computes the affinity of every pair of points as an inner product.
- Compute Q = VᵀV.
- Values close to 1 belong to the same cluster.

(Figure: data, M, and Q.)

Other applications

Face clustering in meetings:

- Grab faces from video in real time (using a face detector + face tracker).
- Compare all faces using a distance metric (e.g., projection error onto a representative basis).
- Use normalized cuts to find the best clustering.

Conclusions

Good news:

- Simple and powerful methods to segment images.
- Flexible and easy to apply to other clustering problems.

Bad news:

- High memory requirements (use sparse matrices).
- Very dependent on the scale factor chosen for a specific problem.

Thank you!

The End!

Examples: Spectral Clustering

Images from Matthew Brand (TR-2002-42)

Spectral clustering

Makes use of the spectrum of the similarity matrix of the data to cluster the points: solve the clustering problem for the affinity matrix W, where w(i,j) is the similarity between node i and node j.

Graph terminology (recap)

- Similarity matrix: W = [w(i,j)]
- Degree of a node: d(i) = Σ_j w(i,j)
- Volume of a set: vol(A) = Σ_{i∈A} d(i)
- Graph cuts: cut(A,B) = Σ_{i∈A, j∈B} w(i,j)
