Matrix Decomposition

Preparation

We only need NumPy.

import numpy as np

Eigen Decomposition

Let us start with the definition of an Eigen value and its corresponding Eigen vector.

\[\mathbf{A}\vec{x} = \lambda \vec{x}\]

where:

  • \(\mathbf{A}\) is a square matrix

  • \(\lambda\) is an Eigen value of \(\mathbf{A}\)

  • \(\vec{x}\) is the corresponding Eigen vector

In general, \(\mathbf{A}\) has more than one Eigen value. Collecting all of its Eigen pairs gives the factorization:

\[\mathbf{A} = \mathbf{Q} \, \mathbf{\Lambda} \, \mathbf{Q}^{-1}\]

where:

  • \(\mathbf{A}\) is a square matrix

  • \(\mathbf{Q}\) is the matrix whose columns are the Eigen vectors of \(\mathbf{A}\) (orthogonal when \(\mathbf{A}\) is symmetric)

  • \(\mathbf{\Lambda}\) is the diagonal matrix whose entries \(\text{diag}(\mathbf{\Lambda})\) are the Eigen values.

This factorization is known as the Eigen decomposition.

Example:

Take matrix \(\mathbf{A}\) and find the Eigen values and the Eigenvector matrix.

A = np.array([[4, 1],
              [2, 1]])
print(A)
[[4 1]
 [2 1]]
lambdas, Q = np.linalg.eig(A) # columns of Q are the Eigen vectors

print("Eigen values = ")
print(lambdas)

print("Eigenvector matrix = ")
print(Q)
Eigen values = 
[4.56155281 0.43844719]
Eigenvector matrix = 
[[ 0.87192821 -0.27032301]
 [ 0.48963374  0.96276969]]
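
With \(\mathbf{Q}\) and the Eigen values in hand, we can check the defining equation \(\mathbf{A}\vec{x} = \lambda \vec{x}\) for each Eigen pair; a minimal sketch:

# each column of Q pairs with the corresponding entry of lambdas
for i in range(len(lambdas)):
    print(np.allclose(A@Q[:, i], lambdas[i]*Q[:, i])) # True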

Compute \(\mathbf{Q} \mathbf{\Lambda} \mathbf{Q}^{-1}\). Show that the result is \(\mathbf{A}\).

Q@np.diag(lambdas)@np.linalg.inv(Q)
array([[4., 1.],
       [2., 1.]])
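
As an extra check, np.allclose confirms that the reconstruction matches \(\mathbf{A}\) up to floating-point rounding:

print(np.allclose(Q@np.diag(lambdas)@np.linalg.inv(Q), A)) # True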

Singular Value Decomposition (SVD)

\[ \mathbf{A}=\mathbf{U} \, \mathbf{\Sigma} \, \mathbf{V}^T \]

where:

  • \(\mathbf{U}\) is an orthogonal matrix

  • \(\mathbf{\Sigma}\) is a rectangular diagonal matrix whose diagonal entries are the singular values

  • \(\mathbf{V}\) is an orthogonal matrix

The SVD of \(\mathbf{A}\) is closely related to the eigendecompositions of \(\mathbf{A} \mathbf{A}^T\) and \(\mathbf{A}^T \mathbf{A}\):

  • The left singular vectors are the eigenvectors of \(\mathbf{A A}^T\)

  • The right singular vectors are the eigenvectors of \(\mathbf{A}^T \mathbf{A}\)

  • The singular values are the square roots of the eigenvalues of both \(\mathbf{A A}^T\) and \(\mathbf{A}^T \mathbf{A}\)
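
These relationships are easy to verify numerically. The sketch below uses the same example matrix as before and compares the SVD factors with the Eigen decompositions of \(\mathbf{A} \mathbf{A}^T\) and \(\mathbf{A}^T \mathbf{A}\) (np.linalg.eigh returns Eigen values in ascending order, so we flip them to match the SVD's descending order):

A = np.array([[4, 1],
              [2, 1]])
U, S, Vt = np.linalg.svd(A) # NumPy returns the transpose of V

w1, P1 = np.linalg.eigh(A@A.T) # Eigen pairs of A A^T, ascending order
w2, P2 = np.linalg.eigh(A.T@A) # Eigen pairs of A^T A, ascending order

print(np.allclose(S, np.sqrt(w1[::-1])))              # True: singular values
print(np.allclose(np.abs(U), np.abs(P1[:, ::-1])))    # True: left singular vectors, up to sign
print(np.allclose(np.abs(Vt.T), np.abs(P2[:, ::-1]))) # True: right singular vectors, up to sign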

Example:

Take matrix \(\mathbf{A}\) and find its SVD.

A = np.array([[4, 1],
              [2, 1]])
print(A)
[[4 1]
 [2 1]]
U, S, Vt = np.linalg.svd(A) # NumPy returns the transpose of V
print("U = ")
print(U)

print("S = ") # S = diag(Sigma)
print(S)

print("V^T = ")
print(Vt)
U = 
[[-0.8816746  -0.47185793]
 [-0.47185793  0.8816746 ]]
S = 
[4.6708301  0.42818941]
V^T = 
[[-0.95709203 -0.28978415]
 [-0.28978415  0.95709203]]
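
Multiplying the factors back together recovers \(\mathbf{A}\); note that np.linalg.svd already returns \(\mathbf{V}^T\), so no extra transpose is needed:

print(np.allclose(U@np.diag(S)@Vt, A)) # True: A = U Sigma V^T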

SVD of a Square Symmetric and Positive Definite Matrix

A square matrix is called positive definite if it is symmetric and all its eigenvalues \(\lambda\) are positive, that is, \(\lambda > 0\).

Additionally, if \(\mathbf{A}\) is positive definite, then it is invertible and \(\det(\mathbf{A}) > 0\).
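
Both properties are easy to check numerically; a quick sketch using the matrix from the next example:

A = np.array([[4, 1],
              [1, 1]])
e = np.linalg.eigvalsh(A)   # Eigen values of a symmetric matrix
print((e > 0).all())        # True: all Eigen values are positive
print(np.linalg.det(A) > 0) # True: the determinant is positive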

For a square symmetric and positive definite matrix, the singular values coincide with the Eigen values:

\[\mathbf{\Sigma} = \text{diag}(\text{eig}(\mathbf{A})) = \mathbf{\Lambda}\]

Example: take matrix \(\mathbf{A}\) and show that its SVD is equal to its Eigen decomposition.

A = np.array([[4, 1],
              [1, 1]])

U, S, Vt = np.linalg.svd(A) # NumPy returns the transpose of V
e, Q = np.linalg.eig(A)

Let us show that \(\mathbf{\Sigma} = \mathbf{\Lambda}\), i.e. that the singular values equal the Eigen values:

print(S, '=', e)
[4.30277564 0.69722436] = [4.30277564 0.69722436]

Now, let us take \(\mathbf{U}\) from the SVD and show that the following equation holds.

\[\mathbf{A} = \mathbf{U} \mathbf{\Lambda} \mathbf{U}^{-1}\]
U@np.diag(e)@np.linalg.inv(U)
array([[4., 1.],
       [1., 1.]])

Next, let us take \(\mathbf{V}\) from the SVD and show that the same equation holds. It does because, for a symmetric positive definite matrix, the right singular vectors coincide with the left singular vectors.

\[\mathbf{A} = \mathbf{V} \mathbf{\Lambda} \mathbf{V}^{-1}\]
V = Vt.T # recover V from the transpose returned by NumPy
V@np.diag(e)@np.linalg.inv(V)
array([[4., 1.],
       [1., 1.]])

Hence, for a covariance matrix, which is symmetric and positive semi-definite (and positive definite whenever it has full rank), we can use the SVD in place of the Eigen decomposition:

  • Both \(\mathbf{U}\) and \(\mathbf{V}\) from the SVD are the Eigen-vector matrix \(\mathbf{Q}\)

  • \(\text{diag}(\mathbf{\Sigma})\) from the SVD and \(\text{diag}(\mathbf{\Lambda})\) from the Eigen decomposition are both the Eigen values.
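
As a final check, the sketch below builds a covariance matrix from random data (generated here purely for illustration) and confirms that its SVD and Eigen decomposition agree:

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) # 100 samples, 3 variables
C = np.cov(X.T)               # 3x3 covariance matrix, symmetric

U, S, Vt = np.linalg.svd(C)
e, Q = np.linalg.eigh(C)      # Eigen values in ascending order

print(np.allclose(S, e[::-1]))                    # True: singular values = Eigen values
print(np.allclose(np.abs(U), np.abs(Q[:, ::-1]))) # True: same Eigen vectors up to sign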
