Calculating eigenvalues and eigenvectors for S_W^-1 S_B

Just as we did in PCA, we rely on an eigenvalue decomposition of a specific matrix. In the case of LDA, we will be decomposing the matrix S_W^-1 S_B, the product of the inverse of the within-class scatter matrix and the between-class scatter matrix:

# calculate eigenvalues and eigenvectors of S_W^-1 * S_B
import numpy as np

eig_vals, eig_vecs = np.linalg.eig(np.dot(np.linalg.inv(S_W), S_B))
# keep only the real parts; any tiny imaginary components are numerical noise
eig_vecs = eig_vecs.real
eig_vals = eig_vals.real

for i in range(len(eig_vals)):
    eigvec_sc = eig_vecs[:, i]
    print('Eigenvector {}: {}'.format(i + 1, eigvec_sc))
    print('Eigenvalue {}: {}'.format(i + 1, eig_vals[i]))
    print()

Eigenvector 1: [-0.2049 -0.3871 0.5465 0.7138]
Eigenvalue 1: 32.2719577997

Eigenvector 2: [ 0.009 0.589 -0.2543 0.767 ]
Eigenvalue 2: 0.27756686384

Eigenvector 3: [ 0.2771 -0.3863 -0.4388 0.6644]
Eigenvalue 3: -6.73276389619e-16  # basically 0

Eigenvector 4: [ 0.2771 -0.3863 -0.4388 0.6644]
Eigenvalue 4: -6.73276389619e-16  # basically 0

Note that the third and fourth eigenvalues are basically zero. This is because LDA works by drawing decision boundaries between our classes. Because the iris dataset has only three classes, we can draw at most two decision boundaries, so only two of the eigenpairs carry any discriminative information. In general, fitting LDA to a dataset with n classes will produce at most n - 1 useful components.
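Since only the first two eigenvalues are meaningful here, a natural next step is to sort the eigenpairs by eigenvalue and keep the top n - 1 of them as the columns of a projection matrix. The following is a minimal sketch of that step, continuing from the eig_vals and eig_vecs arrays above; the names eig_pairs, linear_discriminants, and the feature matrix X are our own assumptions, not from the original text:

# pair each eigenvalue with its eigenvector and sort from largest to smallest
eig_pairs = [(np.abs(eig_vals[i]), eig_vecs[:, i]) for i in range(len(eig_vals))]
eig_pairs = sorted(eig_pairs, key=lambda pair: pair[0], reverse=True)

# keep the top n_classes - 1 = 2 eigenvectors as columns of the projection matrix
linear_discriminants = np.hstack([pair[1].reshape(4, 1) for pair in eig_pairs[:2]])
print(linear_discriminants.shape)  # (4, 2): one column per discriminant

# projecting a feature matrix X (assumed to be defined earlier, shape (n_samples, 4))
# onto the discriminants would then be:
# X_lda = np.dot(X, linear_discriminants)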
