Singular Value Decomposition (SVD)

Linear Algebra

Author: Quasar
Published: July 24, 2024

Rectangular matrices do not have eigenvalues. However, we may instead look at the eigenvalues of the symmetric, positive semidefinite square Gram matrix \(K=AA^T\). It turns out that the eigenvalues of \(K\) play an important role for general matrices. They were first studied by the German mathematician Erhard Schmidt in the early days of the 20th century.

Definition 1 (Singular Values) The singular values \(\sigma_1,\ldots,\sigma_r\) of a rectangular matrix \(A\in \mathbf{R}^{m \times n}\) are the positive square roots, \(\sigma_i = \sqrt{\lambda_i} > 0\), of the non-zero eigenvalues of the Gram matrix \(K = AA^T\). The corresponding eigenvectors of \(K\) are known as the singular vectors of \(A\).

Since \(K=AA^T\) is positive semi-definite, its eigenvalues are necessarily non-negative, \(\lambda_i \geq 0\), which justifies the positivity of the singular values of \(A\), independently of whether \(A\) itself has positive, negative, or even complex eigenvalues, or is rectangular and has no eigenvalues at all. I will follow the standard convention and always label the singular values in decreasing order, so that \(\sigma_1 \geq \sigma_2 \geq \ldots \geq \sigma_r\).
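As a quick numerical sanity check of the definition, here is a minimal NumPy sketch; the \(3 \times 2\) matrix \(A\) is an arbitrary example of my choosing, not one from the text. It computes the singular values as square roots of the nonzero eigenvalues of \(K = AA^T\) and compares them with NumPy's built-in SVD routine.

```python
import numpy as np

# An arbitrary 3x2 rectangular matrix, chosen purely for illustration.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Gram matrix K = A A^T: symmetric positive semidefinite by construction.
K = A @ A.T

# Eigenvalues of a symmetric matrix are real; sort in decreasing order.
eigvals = np.sort(np.linalg.eigvalsh(K))[::-1]

# Singular values are the positive square roots of the nonzero eigenvalues.
# Clip tiny negative values that arise from floating-point round-off.
sigma = np.sqrt(np.clip(eigvals, 0.0, None))
sigma = sigma[sigma > 1e-10]   # keep only the nonzero singular values

print(sigma)                                # approx [9.5255, 0.5143]
print(np.linalg.svd(A, compute_uv=False))   # agrees with the direct SVD
```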

In the special case of symmetric matrices, there is a direct connection between their singular values and their (necessarily real) eigenvalues.

Proposition 1 If \(A = A^T\) is a symmetric matrix, its singular values are the absolute values of its nonzero eigenvalues: \(\sigma_i = |\lambda_i| > 0\); its singular vectors coincide with the eigenvectors associated with those nonzero eigenvalues.

Proof.

When \(A\) is symmetric, \(K = AA^T = A^2\), since \(A^T = A\). So, if

\[ A \mathbf{v} = \lambda \mathbf{v} \]

then

\[ K \mathbf{v} = A^2 \mathbf{v} = A(A \mathbf{v}) = A(\lambda \mathbf{v}) = \lambda A \mathbf{v} = \lambda^2 \mathbf{v} \]

Thus, every eigenvector \(\mathbf{v}\) of \(A\) is also an eigenvector of \(K\) with eigenvalue \(\lambda^2\), so the singular values of \(A\) are \(\sigma_i = \sqrt{\lambda_i^2} = |\lambda_i|\). Moreover, the eigenvector basis of \(A\) is also an eigenvector basis for \(K\), and hence forms a complete system of singular vectors for \(A\). \(\blacksquare\)
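To see the proposition in action, here is a short numerical check; the symmetric matrix below is a hypothetical example, picked so that one eigenvalue is negative.

```python
import numpy as np

# A hypothetical symmetric matrix with one negative eigenvalue.
A = np.array([[2.0, 1.0],
              [1.0, -1.0]])

eigvals = np.linalg.eigvalsh(A)             # real eigenvalues of A
sigma = np.linalg.svd(A, compute_uv=False)  # singular values of A

# Proposition 1: the singular values are the sorted absolute eigenvalues.
print(np.sort(np.abs(eigvals))[::-1])   # approx [2.3028, 1.3028]
print(sigma)                            # the same values
```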

SVD Factorization

The generalization of the spectral theorem to non-symmetric matrices is known as the singular value decomposition, commonly abbreviated SVD. Unlike the former, which applies only to symmetric matrices, every nonzero matrix possesses an SVD factorization.

Theorem 1 (SVD Factorization) Every non-zero real \(m \times n\) matrix \(A\) of rank \(r > 0\) can be factored:

\[ A = U \Sigma V^T \]

into the product of an \(m \times r\) matrix \(U\), the \(r \times r\) diagonal matrix \(\Sigma = \operatorname{diag}(\sigma_1,\ldots,\sigma_r)\), and an \(r \times n\) matrix \(V^T\), where \(U\) and \(V\) have orthonormal columns, i.e. \(U^T U = V^T V = I_r\).
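Before turning to the proof, here is a minimal NumPy illustration of the rank-\(r\) factorization in the theorem; the rank-one matrix below is a hypothetical example. NumPy's `full_matrices=False` option returns the reduced ("thin") SVD, which is trimmed further to the nonzero singular values to obtain the rank-\(r\) form above.

```python
import numpy as np

# A hypothetical rank-1 example: an outer product, so r = 1 < min(m, n).
A = np.outer([1.0, 2.0, 3.0], [4.0, 5.0])   # 3x2 matrix of rank 1

# Reduced SVD, then trim to the nonzero singular values (numerical rank r).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = np.sum(s > 1e-10)
U, s, Vt = U[:, :r], s[:r], Vt[:r, :]

print(np.allclose(A, U @ np.diag(s) @ Vt))   # True: A = U Sigma V^T
print(np.allclose(U.T @ U, np.eye(r)))       # True: U has orthonormal columns
print(np.allclose(Vt @ Vt.T, np.eye(r)))     # True: V has orthonormal columns
```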

Proof.

Let’s begin by writing the desired factorization as \(AQ = P \Sigma\). The individual columns