EE 566 - Pattern Recognition Project
7/28/2019 EE 566 - Pattern Recognition Project
Comparison of Dimensionality Reduction Techniques for Face Recognition
Berkay Topu 6950, Sabanci University
Outline
Motivation
Face Detection
Database (M2VTS)
Different Dimensionality Reduction Techniques: PCA, LDA, aPAC, Normalized PCA, Normalized LDA
Classification Results
Conclusion
Motivation
Face recognition: an active research area focused on recognizing faces in images or videos.
Dimensionality reduction: representing and classifying faces with less data.
Linear transforms are used for the reduction.
Overall System
[Diagram: the pattern recognition pipeline — Face Detection, then Dimensionality Reduction (e.g. PCA), then Classification.]
Face Detection
Automatic face detection with the OpenCV library, using Haar-like features.
Detected faces are resized to 64x48 pixels.
Database
The M2VTS database, originally collected for audio-visual speech recognition and lip detection, is also suitable for face recognition.
Database
40 pictures of each of 37 subjects; per subject, 32 pictures for training and 8 for testing.
Each image is 64x48 = 3072 pixels. It is unnecessary to use the whole image in the recognition system: is it possible to represent the faces with less information?
PCA (Principal Component Analysis)
Weaknesses: variant to translation, scale, background, and lighting.
Advantages: fast and requires little memory.
PCA (Principal Component Analysis)
Principal component analysis (PCA) seeks a computational model that best describes a face by extracting the most relevant information contained in that face.
It finds a lower-dimensional subspace whose basis vectors correspond to the maximum-variance directions in the original image space.
The solution is given by the eigenvectors of the scatter matrix.
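A minimal NumPy sketch of this eigen-decomposition view of PCA (the slides use MATLAB/PrTools; this Python version, with assumed names `pca_fit`/`pca_project`, just illustrates the math). Note that for 3072-pixel images the scatter matrix is 3072x3072, so a direct `eigh` is feasible but slow.

```python
import numpy as np

def pca_fit(X, k):
    """PCA via eigen-decomposition of the scatter matrix.
    X: (n_samples, n_pixels), rows are flattened face images."""
    mean = X.mean(axis=0)
    Xc = X - mean
    S = Xc.T @ Xc                       # scatter matrix (n_pixels x n_pixels)
    evals, evecs = np.linalg.eigh(S)    # symmetric eigendecomposition, ascending
    order = np.argsort(evals)[::-1]     # sort by variance, descending
    W = evecs[:, order[:k]]             # top-k maximum-variance directions
    return mean, W

def pca_project(X, mean, W):
    """Project centered samples onto the k principal directions."""
    return (X - mean) @ W
```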
LDA (Linear Discriminant Analysis)
Finds the vectors in the underlying space that best discriminate among classes.
The goal is to maximize the between-class scatter (covariance) while minimizing the within-class scatter, i.e. maximize the ratio det(S_B) / det(S_W) in the projected space.
The solution is given by the eigenvectors of S_W^-1 S_B.
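The criterion above reduces to an eigenproblem; a NumPy sketch (assumed name `lda_fit`; assumes S_W is invertible, which in practice requires a PCA step first when pixels outnumber samples):

```python
import numpy as np

def lda_fit(X, y, k):
    """LDA: top-k eigenvectors of S_W^{-1} S_B."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    S_W = np.zeros((d, d))              # within-class scatter
    S_B = np.zeros((d, d))              # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_W += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean)[:, None]
        S_B += len(Xc) * diff @ diff.T
    # eigenvectors of S_W^{-1} S_B (solve avoids forming the inverse)
    evals, evecs = np.linalg.eig(np.linalg.solve(S_W, S_B))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:k]].real
```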
aPAC (Approximate Pairwise Accuracy Criterion)
Drawbacks of LDA:
By maximizing the squared distances between pairs of classes, outliers dominate the eigenvalue decomposition.
So LDA tends to over-weight the influence of classes that are already well separated.
The solution is a generalization of LDA that weights the contribution of each class pair according to the Mahalanobis distance between the classes.
aPAC (Approximate Pairwise Accuracy Criterion)
K-class LDA can be decomposed into K(K-1)/2 two-class LDA problems.
aPAC introduces a weighting of the contributions of the individual class pairs to the overall criterion.
The weighting function depends on the Bayes error rate* between the classes.
Although it is a generalization of LDA, it adds no computational complexity.
* Bayes error rate: the theoretical minimum error any classifier can make.
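A sketch of the pairwise weighting idea, assuming the weight function w(Δ) = erf(Δ / (2√2)) / (2Δ²) from the weighted pairwise Fisher criteria literature (the slides do not show the formula, so treat it as an assumption); the between-class scatter is rebuilt pair by pair with these weights:

```python
import numpy as np
from math import erf

def apac_weight(delta):
    """Pair weight, decreasing with Mahalanobis distance delta, so
    already well-separated class pairs contribute less (assumption)."""
    return erf(delta / (2 * np.sqrt(2))) / (2 * delta ** 2)

def apac_fit(X, y, k):
    """Weighted pairwise criterion: eigenvectors of S_W^{-1} S_B_weighted."""
    classes = np.unique(y)
    n, d = X.shape
    S_W = np.zeros((d, d))
    means, priors = [], []
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        S_W += (Xc - mc).T @ (Xc - mc)
        means.append(mc)
        priors.append(len(Xc) / n)
    S_W /= n
    S_Winv = np.linalg.inv(S_W)
    S_B = np.zeros((d, d))
    for i in range(len(classes)):           # all K(K-1)/2 class pairs
        for j in range(i + 1, len(classes)):
            diff = means[i] - means[j]
            delta = np.sqrt(diff @ S_Winv @ diff)   # Mahalanobis distance
            S_B += priors[i] * priors[j] * apac_weight(delta) * np.outer(diff, diff)
    evals, evecs = np.linalg.eig(S_Winv @ S_B)
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:k]].real
```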
nPCA (Normalized PCA)
PCA computes the projection that maximizes the preservation of pairwise distances in the projected space.
nPCA weights this sum of squared distances by introducing symmetric pairwise dissimilarities.
Proposed weights: d_ij = 1 / dist_ij, where dist_ij is the Euclidean distance in the original space.
nPCA (Normalized PCA)
The solution is given by the eigenvectors of X L_d X^T, where L_d is a matrix containing the pairwise dissimilarities.
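A NumPy sketch of this weighted variant, under the assumption that L_d is the graph Laplacian of the dissimilarity weights d_ij = 1/dist_ij (the slides only say L_d "contains the pairwise dissimilarities"). With samples in rows, X L_d X^T becomes X^T L_d X:

```python
import numpy as np

def npca_fit(X, k):
    """nPCA sketch: eigenvectors of X^T L_d X with d_ij = 1/dist_ij.
    X: (n_samples, n_features), rows are samples."""
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    with np.errstate(divide="ignore"):          # diagonal of dist is zero
        D = np.where(dist > 0, 1.0 / dist, 0.0)  # dissimilarity weights
    L = np.diag(D.sum(axis=1)) - D               # Laplacian of the weights (assumption)
    S = X.T @ L @ X                              # weighted scatter
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1]
    return evecs[:, order[:k]]
```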
nLDA (Normalized LDA)
The drawbacks of LDA can be overcome by:
appropriately chosen weights that reduce the dominance of large distances;
pairwise similarities together with the pairwise dissimilarities;
attraction between elements of the same class and repulsion between elements of different classes.
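One way to read the attraction/repulsion idea is as a graph-embedding generalization of LDA; the sketch below is an assumption about the construction (the slides give no formula): same-class pairs attract with unit weight, different-class pairs repel with weight 1/dist to damp large distances.

```python
import numpy as np

def nlda_fit(X, y, k):
    """nLDA sketch (assumed graph-embedding form): maximize cross-class
    spread (Laplacian of damped repulsion weights) over within-class
    spread (Laplacian of attraction weights)."""
    n, d = X.shape
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    same = (y[:, None] == y[None, :]) & ~np.eye(n, dtype=bool)
    with np.errstate(divide="ignore"):
        inv = np.where(dist > 0, 1.0 / dist, 0.0)
    W_sim = np.where(same, 1.0, 0.0)                 # attraction within a class
    W_dis = np.where(~same & (dist > 0), inv, 0.0)   # damped repulsion across classes
    L_sim = np.diag(W_sim.sum(axis=1)) - W_sim
    L_dis = np.diag(W_dis.sum(axis=1)) - W_dis
    A = X.T @ L_dis @ X                              # spread to maximize
    B = X.T @ L_sim @ X + 1e-8 * np.eye(d)           # spread to minimize (regularized)
    evals, evecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(evals.real)[::-1]
    return evecs[:, order[:k]].real
```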
Classification (Training & Testing)
Classification in MATLAB with PrTools (Pattern Recognition Toolbox).
Nearest Mean Classifier (nmc) & Linear Classifier (ldc).
40 images of each of 37 subjects: 1480 images in total.
32 x 37 = 1184 images for training; 8 x 37 = 296 images for testing.
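The nearest mean classifier used above (PrTools' `nmc`) is simple enough to sketch in NumPy for reference; this Python version is an illustrative equivalent, not the project's MATLAB code:

```python
import numpy as np

def nmc_fit(X, y):
    """Store one mean vector per class (the whole 'training')."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, means

def nmc_predict(X, classes, means):
    """Assign each sample to the class with the nearest mean."""
    d = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]
```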
Training and Testing
[Diagram:
Training: detected faces from different people -> dimension reduction -> classifier training -> statistical data for face images.
Testing: unknown detected faces -> dimension reduction -> score calculation for each method -> recognition rates.]
Test Results
Recognition rate prior to dimension reduction (using all 3072 pixels): 79.05 %.

Reduced dimension = 32:
      PCA(128)   LDA       aPAC      nPCA      nLDA
nmc   77.7 %     89.19 %   87.84 %   71.62 %   88.85 %
ldc   61.15 %    88.85 %   87.84 %   86.15 %   88.85 %

Reduced dimension = 16:
      PCA(128)   LDA       aPAC      nPCA      nLDA
nmc   77.7 %     83.11 %   85.47 %   66.89 %   86.15 %
ldc   61.15 %    84.46 %   85.47 %   79.73 %   84.46 %
Conclusion
Face recognition can be performed in a lower-dimensional space.
Several dimensionality reduction techniques improved the recognition rates.
Further work:
Analysis of the low recognition rates in some cases.
Block PCA and LDA.