Singular value decomposition

I don't know how I had not learned this until today, because it is very important.
Given any real square matrix $A$, we can write

$$A = U \Sigma V^T$$

where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix with non-negative entries.
The meaning of this is easy to picture: any linear transformation of a vector space can be obtained as a rigid transformation (a rotation or reflection), followed by a change of scale along the principal axes (possibly with a different scale factor for each axis), followed by another rigid transformation.
![[Pasted image 20220413160357.png]]
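As a concrete sketch of this (pure Python, no libraries), the SVD of a $2\times 2$ matrix can be computed in closed form from the eigendecomposition of $A^T A$: the eigenvalues of $A^T A$ are the squared singular values, its eigenvectors are the columns of $V$, and the columns of $U$ are the unit images $Av_i/\sigma_i$. `svd_2x2` is a made-up helper name, and the input matrix is an arbitrary example:

```python
import math

def svd_2x2(A):
    """SVD of a nonzero 2x2 matrix: A = U diag(s) V^T, via eigenvalues of A^T A."""
    (a, b), (c, d) = A
    # A^T A = [[e, f], [f, g]] is symmetric; its eigenvalues are
    # the squared singular values.
    e, f, g = a*a + c*c, a*b + c*d, b*b + d*d
    half_tr = (e + g) / 2
    disc = math.sqrt(max(half_tr**2 - (e*g - f*f), 0.0))
    l1, l2 = half_tr + disc, half_tr - disc
    s1, s2 = math.sqrt(max(l1, 0.0)), math.sqrt(max(l2, 0.0))
    # First column of V: unit eigenvector of A^T A for the larger eigenvalue.
    if abs(f) > 1e-12:
        vx, vy = f, l1 - e
    else:
        vx, vy = (1.0, 0.0) if e >= g else (0.0, 1.0)
    n = math.hypot(vx, vy)
    v1 = (vx / n, vy / n)
    v2 = (-v1[1], v1[0])  # second column: the orthogonal complement
    # Columns of U are the unit images u_i = A v_i / s_i.
    u1 = ((a*v1[0] + b*v1[1]) / s1, (c*v1[0] + d*v1[1]) / s1)
    if s2 > 1e-12:
        u2 = ((a*v2[0] + b*v2[1]) / s2, (c*v2[0] + d*v2[1]) / s2)
    else:
        u2 = (-u1[1], u1[0])  # rank-deficient: any unit vector orthogonal to u1
    U = [[u1[0], u2[0]], [u1[1], u2[1]]]
    V = [[v1[0], v2[0]], [v1[1], v2[1]]]
    return U, (s1, s2), V

U, s, V = svd_2x2([[2.0, 1.0], [1.0, 2.0]])
print(s)  # -> (3.0, 1.0)
```

The returned factors satisfy $A = U\,\mathrm{diag}(s)\,V^T$, which can be checked by multiplying them back together.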

It is related to the polar decomposition.
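Concretely, the polar decomposition $A = QP$ (orthogonal $Q$ times symmetric positive-semidefinite $P$) falls out of the SVD by regrouping the factors:

$$A = U \Sigma V^T = (U V^T)(V \Sigma V^T) = QP,$$

where $Q = UV^T$ is orthogonal (a product of orthogonal matrices) and $P = V \Sigma V^T$ is symmetric with eigenvalues $\sigma_i \geq 0$, hence positive-semidefinite.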

Visualization: see this web page.

Non-square matrices

This also works for non-square matrices. Suppose $A$ is the $m \times n$ matrix of a transformation $T: \mathbb{R}^n \to \mathbb{R}^m$. Then again $A = U \Sigma V^T$, where $U$ and $V$ are square orthogonal matrices of dimension $m$ and $n$ respectively. But now $\Sigma$ is a rectangular $m \times n$ diagonal matrix with non-negative entries.
The $m$ columns of $U$ form an orthonormal basis $B_U$ of $\mathbb{R}^m$, and the $n$ columns of $V$ form an orthonormal basis $B_V$ of $\mathbb{R}^n$. The transformation $T$ can be understood as the one that sends the $i$-th element of $B_V$ to the $i$-th element of $B_U$ multiplied by the (non-negative) diagonal entry $\sigma_i$ of $\Sigma$. This happens for every linear transformation!
![[Pasted image 20220413184919.png|700]]
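A hand-worked sketch of this basis-mapping picture, for a small $3 \times 2$ example (so $T: \mathbb{R}^2 \to \mathbb{R}^3$). The matrices below are an illustrative example written out by hand, not computed by any library; the check confirms that $A$ sends each column of $V$ to $\sigma_i$ times the corresponding column of $U$:

```python
# A = U Σ V^T with the factors below; Σ is the rectangular 3x2
# diagonal matrix with diagonal (3, 2).
A = [[0.0, 2.0],
     [3.0, 0.0],
     [0.0, 0.0]]
U = [[0.0, 1.0, 0.0],   # 3x3 orthogonal: basis B_U of R^3
     [1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0]]
V = [[1.0, 0.0],        # 2x2 orthogonal: basis B_V of R^2
     [0.0, 1.0]]
sigmas = [3.0, 2.0]

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

# T sends the i-th column of V to sigma_i times the i-th column of U.
for i, s in enumerate(sigmas):
    v_i = [row[i] for row in V]
    u_i = [row[i] for row in U]
    image = matvec(A, v_i)
    assert all(abs(p - s * q) < 1e-12 for p, q in zip(image, u_i))
```

Here $v_1 = e_1$ is stretched by 3 and lands on the second axis of $\mathbb{R}^3$, while $v_2 = e_2$ is stretched by 2 and lands on the first axis.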

It is related to matrix diagonalization. Indeed, the two decompositions coincide when the matrix is symmetric positive-semidefinite.