In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix, with many useful applications in signal processing and statistics.

Formally, the singular value decomposition of an m×n real or complex matrix M is a factorization of the form

    M = UΣV*

where U is an m×m real or complex unitary matrix, Σ is an m×n rectangular diagonal matrix with non-negative real numbers on the diagonal, and V* (the conjugate transpose of V) is an n×n real or complex unitary matrix. The diagonal entries Σi,i of Σ are known as the singular values of M. The m columns of U and the n columns of V are called the left-singular vectors and right-singular vectors of M, respectively.
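As a minimal concrete check, the factorization can be computed with NumPy's np.linalg.svd; the matrix below is made up for illustration:

```python
import numpy as np

# An illustrative 4×3 real matrix (m = 4, n = 3).
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0],
              [4.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])

# full_matrices=True returns U as m×m and Vh (= V*) as n×n.
U, s, Vh = np.linalg.svd(M, full_matrices=True)

# Embed the singular values in an m×n rectangular diagonal matrix Σ.
Sigma = np.zeros(M.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# The factorization reproduces M.
assert np.allclose(U @ Sigma @ Vh, M)
```

NumPy returns the singular values s in descending order and Vh already transposed, so the product U @ Sigma @ Vh corresponds directly to UΣV*.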

The singular value decomposition and the eigendecomposition are closely related. Namely:


  • The left-singular vectors of M are eigenvectors of MM*.
  • The right-singular vectors of M are eigenvectors of M*M.
  • The non-zero singular values of M (found on the diagonal entries of Σ) are the square roots of the non-zero eigenvalues of both M*M and MM*.
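
These relationships can be verified numerically; a sketch with NumPy, using an arbitrary illustrative real matrix (so M* = Mᵀ):

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])  # illustrative 3×2 real matrix

U, s, Vh = np.linalg.svd(M)

# The eigenvalues of M*M are the squared singular values.
eigvals = np.linalg.eigvalsh(M.T @ M)        # returned in ascending order
assert np.allclose(np.sort(s**2), eigvals)

# The right-singular vectors (rows of Vh) are eigenvectors of M*M.
for i, v in enumerate(Vh):
    assert np.allclose(M.T @ M @ v, (s[i]**2) * v)
```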


Applications which employ the SVD include computing the pseudoinverse, least squares fitting of data, matrix approximation, and determining the rank, range and null space of a matrix.
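
For example, the pseudoinverse M⁺ = VΣ⁺U* (where Σ⁺ inverts the non-zero singular values) yields least-squares solutions of Mx ≈ b. A minimal sketch with made-up data:

```python
import numpy as np

# Fitting a line through three made-up data points via the SVD pseudoinverse.
M = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

U, s, Vh = np.linalg.svd(M, full_matrices=False)
M_pinv = Vh.T @ np.diag(1.0 / s) @ U.T   # valid here since all s > 0

x = M_pinv @ b                            # least-squares solution to Mx ≈ b
assert np.allclose(M_pinv, np.linalg.pinv(M))
```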


Intuitive interpretations
Rotation → Scaling → Rotation

In the special but common case in which M is an m×m square matrix of real numbers with positive determinant, U, V*, and Σ are m×m real matrices as well; Σ can be regarded as a scaling matrix, and U and V* can be viewed as rotation matrices.

If the above conditions are met, the expression UΣV* can thus be intuitively interpreted as a composition (or sequence) of three geometrical transformations: a rotation, a scaling, and another rotation. For instance, the figure above shows how a shear matrix can be described as such a sequence.
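
This decomposition of a shear can be checked directly; a sketch using the unit shear matrix:

```python
import numpy as np

# A 2×2 shear matrix with positive determinant (det = 1).
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])

U, s, Vh = np.linalg.svd(M)

# U and Vh are orthogonal; since det(Σ) > 0 and det(M) > 0,
# their determinants have the same sign, so both factors can be
# taken as rotations (after a consistent sign choice).
assert np.allclose(U @ U.T, np.eye(2))
assert np.allclose(Vh @ Vh.T, np.eye(2))
assert np.isclose(np.linalg.det(U) * np.linalg.det(Vh), 1.0)

# The shear equals rotation → scaling → rotation.
assert np.allclose(U @ np.diag(s) @ Vh, M)
```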


Singular values as semiaxes of an ellipse or ellipsoid

As shown in the figure, the singular values can be interpreted as the semi-axes of an ellipse in 2D. This concept can be generalized to n-dimensional Euclidean space, with the singular values of any n×n square matrix being viewed as the semi-axes of an n-dimensional ellipsoid. See below for further details.
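
Numerically, the image of the unit circle under M is an ellipse whose semi-axes have lengths equal to the singular values; a sketch with an arbitrary illustrative matrix:

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 3.0]])

U, s, Vh = np.linalg.svd(M)

# Each right-singular vector maps onto a semi-axis direction,
# scaled by the corresponding singular value: M v_i = s_i u_i.
for i in range(2):
    assert np.allclose(M @ Vh[i], s[i] * U[:, i])

# No unit vector is stretched by more than s[0] or less than s[-1].
theta = np.linspace(0, 2 * np.pi, 200)
circle = np.vstack([np.cos(theta), np.sin(theta)])
lengths = np.linalg.norm(M @ circle, axis=0)
assert lengths.max() <= s[0] + 1e-9
assert lengths.min() >= s[-1] - 1e-9
```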


The columns of U and V are orthonormal bases

Since U and V* are unitary, the columns of each of them form a set of orthonormal vectors, which can be regarded as basis vectors. By the definition of a unitary matrix, the same is true for their conjugate transposes U* and V. In short, the columns of U, U*, V, and V* each form an orthonormal basis.
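
A quick numerical check of this orthonormality, using an arbitrary random real matrix:

```python
import numpy as np

M = np.random.default_rng(0).normal(size=(4, 3))
U, s, Vh = np.linalg.svd(M)

# Columns of U (and of V = Vh.T) are orthonormal: UᵀU = I and VᵀV = I.
assert np.allclose(U.T @ U, np.eye(4))
assert np.allclose(Vh @ Vh.T, np.eye(3))
```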



Method

http://cs.fit.edu/~dmitra/SciComp/Resources/singular-value-decomposition-fast-track-tutorial.pdf

Sequence of matrix computations: MᵀM → Σ → Vᵀ → U

  1. Compute the transpose Mᵀ and the product MᵀM.
  2. Determine the eigenvalues of MᵀM and sort them in descending order of absolute value. Take the square roots of these to obtain the singular values of M.
  3. Construct the diagonal matrix Σ by placing the singular values in descending order along its diagonal, and compute its inverse, Σ⁻¹.
  4. Using the ordered eigenvalues from step 2, compute the corresponding eigenvectors of MᵀM. Place these eigenvectors along the columns of V and compute its transpose, Vᵀ.
  5. Compute U as U = MVΣ⁻¹. To verify the result, check that the full SVD reconstructs M = UΣVᵀ.
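
The steps above can be sketched as a hypothetical teaching implementation, assuming a real matrix with full column rank so that the n×n block of Σ is invertible (in practice one would call a library SVD instead):

```python
import numpy as np

def svd_via_eig(M):
    """Teaching sketch of the steps above; assumes real M with full column rank."""
    # Steps 1-2: eigendecompose MᵀM and sort eigenvalues in descending order.
    eigvals, eigvecs = np.linalg.eigh(M.T @ M)   # returned in ascending order
    order = np.argsort(eigvals)[::-1]
    eigvals, V = eigvals[order], eigvecs[:, order]
    s = np.sqrt(eigvals)                          # singular values
    # Step 3: Σ (square here, by the full-column-rank assumption).
    Sigma = np.diag(s)
    # Steps 4-5: U = M V Σ⁻¹.
    U = M @ V @ np.diag(1.0 / s)
    return U, Sigma, V.T

M = np.array([[4.0, 0.0],
              [3.0, -5.0]])
U, Sigma, Vt = svd_via_eig(M)

# Verification step: the factors reconstruct M.
assert np.allclose(U @ Sigma @ Vt, M)
```

Note that this route squares the condition number of M (it works with MᵀM), which is why production SVD routines use more stable algorithms.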



Applications

http://en.wikipedia.org/wiki/Singular_value_decomposition#Applications_of_the_SVD



  • Pseudoinverse
  • Solving homogeneous linear equations
  • Total least squares minimization
  • Range, null space and rank
  • Low-rank matrix approximation
  • Separable models
  • Nearest orthogonal matrix
  • The Kabsch algorithm
  • and other examples
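
As one example from the list, low-rank matrix approximation truncates the SVD to the k largest singular values; by the Eckart–Young theorem this is the best rank-k approximation in the Frobenius norm. A sketch with arbitrary random data:

```python
import numpy as np

rng = np.random.default_rng(42)
M = rng.normal(size=(6, 5))     # illustrative data matrix

U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Keep only the k = 2 largest singular values and their vectors.
k = 2
M_k = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

assert np.linalg.matrix_rank(M_k) == k
# The Frobenius error equals the energy of the discarded singular values.
err = np.linalg.norm(M - M_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:]**2)))
```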



References