We have the data set ORLFACEDATABASE.mat, consisting of 400 images (10 images each of 40 people). Each image is 48 x 48 (greyscale) and is reshaped into a 2304 x 1 column vector, so the data matrix in the MAT file is of size 2304 x 400.
I will use LDA (as a discriminative technique) followed by a nearest-mean rule to form a classifier.
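For completeness, a minimal loading sketch; that the MAT file stores the data matrix under the name C (the 2304 x 400 matrix used throughout below) is my assumption:
% load the ORL data; assuming the file stores the 2304 x 400 matrix as C
load('ORLFACEDATABASE.mat');
% sanity-check the dimensions
assert(isequal(size(C), [2304 400]));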
- First, initialize all the necessary variables:
% Number of classes
k = 40;
% number of vectors in the ith class (for all i in {1, ..., k})
ni = 10;
% number of features - pixels
n = 2304;
- For each class (person) we can now compute the "mean vector", which is simply the mean image of that person's 10 images.
% collection of class means
means = zeros(k,n);
for i = 1:1:k
means(i,:) = ((1/ni)*sum(C(:,(1+(ni*(i-1))):i*ni),2))';
end
- Here the ith row of 'means' is the ith person's mean image vector.
We pass 2 as the dimension argument to sum so that it adds along the second dimension, i.e. across the columns, giving one value per pixel.
By default (or when passing 1) it instead sums down each column. It's better to have each person's mean vector as a column vector, like how C is arranged, so we transpose:
means = means';
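As a side note, the same class means can be computed without the loop; a minimal sketch, assuming C is arranged class by class as above (the name meansAlt is mine):
% vectorized equivalent: reshape C into an n x ni x k array and average
% over the second dimension; meansAlt is n x k, matching means above
meansAlt = squeeze(mean(reshape(double(C), n, ni, k), 2));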
- Now we can also find the data-set mean, the average face image, which I shall call meanStar.
% Total mean (over all data vectors)
meanStar = (1/(k*ni))*sum(C,2);
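As a quick sanity check: since every class has the same number of images, the overall mean must equal the mean of the class means.
% with equal class sizes, meanStar equals the mean of the class means
assert(norm(meanStar - mean(means,2)) < 1e-6);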
- As discussed for assignment 2, I will employ multi-class LDA and first find the between-class scatter matrix.
% Construct Sb - between-class scatter matrix
Sb = zeros(n,n);
for i = 1:1:k
Sb = Sb + ni*(means(:,i) - meanStar)*(means(:,i) - meanStar)';
end
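For reference, this loop implements the standard between-class scatter

$$S_b = \sum_{i=1}^{k} n_i \,(\mu_i - \mu^\ast)(\mu_i - \mu^\ast)^T,$$

where $\mu_i$ is the ith class mean and $\mu^\ast$ is meanStar.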
- Similarly, we can construct the within-class scatter matrix.
% Construct Sw - within-class scatter matrix
Sw = zeros(n,n);
for i = 1:1:k
% Accumulates the within-class scatter of the ith class
Stemp = zeros(n,n);
% The 10 vectors of the ith class arranged as columns
classVectors = C(:,(1+(ni*(i-1))):i*ni);
for j = 1:1:ni
Stemp = Stemp + (double(classVectors(:,j)) - means(:,i))*(double(classVectors(:,j)) - means(:,i))';
end
Sw = Sw + Stemp;
end
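This corresponds to

$$S_w = \sum_{i=1}^{k} \sum_{j=1}^{n_i} \big(x^{(i)}_j - \mu_i\big)\big(x^{(i)}_j - \mu_i\big)^T,$$

where $x^{(i)}_j$ is the jth image vector of the ith class.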
In practice, $S_w$ is often singular, since the data are image vectors of large dimensionality while the size of the data set is much smaller. Running cond on the obtained $S_w$ shows us how ill-conditioned the matrix is.
To alleviate this problem, we can perform two projections:
1) PCA is first applied to the data set to reduce its dimensionality.
2) LDA is then applied to further reduce the dimensionality.
But for now I am using discrete inverse theory: adding a small value c to the diagonal of a matrix A that is about to be inverted is called damping the inversion, and the small value c is called the Marquardt-Levenberg coefficient. Sometimes A has zero or near-zero eigenvalues, which make it singular; adding a small damping coefficient to the diagonal elements makes it stable. The bigger the value of c, the bigger the damping and the more stable the matrix inversion:
% check how ill-conditioned Sw is
cond(Sw)
% damped version of Sw that is safe to invert
SwDamped = Sw/(10^6) + 2*eye(n,n);
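To verify that the damping actually helped, we can compare condition numbers; cond of the damped matrix should come out far smaller than cond(Sw):
% the damped matrix should be much better conditioned than Sw itself
cond(SwDamped)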
- Now that we have got both $S_b$ and $S_w$, we can find the optimal projection $w$ (the leading eigenvector of $S_w^{-1} S_b$) and use it to project the data to a lower dimension.
% solve the eigenvalue problem for the (damped) S_w^{-1} * S_b
[V, D] = eig(SwDamped\Sb);
% sort the eigenvectors by descending eigenvalue magnitude
[~, idx] = sort(abs(diag(D)), 'descend');
V = V(:, idx);
% the first eigen vector
w = V(:,1);
% the projected points
y = zeros((k*ni),1);
for i = 1:1:(k*ni)
y(i) = real(w)'*double(C(:,i));
end
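As a side note, multi-class LDA yields up to $k-1$ useful discriminant directions, so we could keep the top two instead of one and obtain a 2-D embedding; a minimal sketch (the names W2 and Y2 are mine):
% optional: project onto the top two discriminant directions
W2 = real(V(:,1:2));
% each column of Y2 is the 2-D projection of one image
Y2 = W2' * double(C);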
- I just plotted all the points below.
I was unable to find the code to plot with 40 different colors, so for now all the points look the same, but each point represents an image. Using this we can define boundaries for each person and use them to decide which person a new image shows; see the sketch below.
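For the record, here is one way to get 40 distinct colors (repelem requires MATLAB R2015a or newer), followed by a minimal nearest-mean sketch in the projected space; the test index 25 is an arbitrary example of mine:
% class label (1..k) for each of the k*ni columns of C
labels = repelem((1:k)', ni);
% one hue per person
scatter((1:(k*ni))', y, 36, labels, 'filled');
colormap(hsv(k)); colorbar;
xlabel('image index'); ylabel('y = w^T x');

% nearest-mean classification in the projected space
yMeans = real(w)' * means; % projected class means, 1 x k
x = double(C(:,25)); % classify the 25th image as an example
[~, pred] = min(abs(real(w)'*x - yMeans));
fprintf('predicted person: %d\n', pred);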