Symmetric matrix

A symmetric matrix $A$ can always be written in the form

$$A = V^t D V,$$

where $V$ is an orthogonal matrix and $D$ is a diagonal one. This is a version of the spectral theorem.
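This decomposition can be checked numerically. A minimal numpy sketch (the matrix `A` below is a made-up example): `np.linalg.eigh` returns the eigenvalues and an orthogonal matrix of eigenvectors, from which the symmetric matrix is reconstructed.

```python
import numpy as np

# a small symmetric matrix (made-up example)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# eigh is specialized to symmetric matrices: it returns real
# eigenvalues w and an orthogonal matrix of eigenvectors V
w, V = np.linalg.eigh(A)
D = np.diag(w)

# A = V D V^T  (the same statement as A = V^t D V, with V renamed)
assert np.allclose(A, V @ D @ V.T)

# V is orthogonal: V^T V = I
assert np.allclose(V.T @ V, np.eye(2))
```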

The meaning is that symmetric matrices correspond to scale changes. While diagonal matrices encode scale changes along the main axes, a general symmetric matrix represents a scale change along another set of (orthogonal) axes, and $V$ is the orthogonal transformation between these sets of axes. In this sense, symmetric matrices are a generalization of diagonal matrices.
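This picture can be made concrete: building $A = V^t D V$ from a chosen rotation $V$ and diagonal scaling $D$ (both made up for illustration) gives a symmetric matrix that acts as a pure scaling along the rotated axes.

```python
import numpy as np

# a rotation by 45 degrees (orthogonal V) and a diagonal scale D
theta = np.pi / 4
V = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
D = np.diag([2.0, 0.5])

# A = V^t D V is symmetric...
A = V.T @ D @ V
assert np.allclose(A, A.T)

# ...and along the rotated axes it is a pure scaling:
# u is the image of the first standard axis under V^t
u = V.T @ np.array([1.0, 0.0])
assert np.allclose(A @ u, 2.0 * u)  # stretched by the factor 2 from D
```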

Symmetric matrices are, therefore, always diagonalizable and have orthogonal eigenvectors. This generalizes to self-adjoint operators on finite-dimensional Hilbert spaces or, better said, to self-adjoint matrices.

Keep an eye: a self-adjoint operator is not the same as a symmetric operator, but they are related.

Generalization

We can regard a matrix $A$ as a $(1,1)$-tensor, with components $A^i_j$ with respect to a particular basis $\{e_i\}$ given by the matrix entries. Given an inner product $g$, in the sense of a rank $(0,2)$-tensor, we say that $A$ is symmetric with respect to $g$ if the components of

$$A_{ij} := g_{im} A^m_j$$

are symmetric, i.e., $A_{ij} = A_{ji}$.

Observe that given a vector $v = v^j e_j$, we have $Av = A^i_j v^j e_i$ and

$$g(Av, w) = g_{ab}(Av)^a w^b = g_{ab} A^a_j v^j w^b = g_{ba} A^a_j v^j w^b = A_{bj} v^j w^b = A_{jb} v^j w^b = g_{jk} A^k_b v^j w^b = g(v, Aw),$$

which is the defining property of symmetric operators.
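The computation above can be verified numerically. In matrix language, with $G$ the matrix of $g$, the lowered components $g_{im}A^m_j$ are the entries of $GA$, and $g(Av,w) = (Av)^t G\, w$. A sketch with made-up data: choosing a symmetric $S$ for the lowered components and raising one index with $g^{-1}$ produces an $A$ that is symmetric with respect to $g$.

```python
import numpy as np

rng = np.random.default_rng(0)

# a made-up inner product g: a symmetric positive definite matrix G
M = rng.standard_normal((3, 3))
G = M @ M.T + 3 * np.eye(3)

# build A symmetric with respect to g: pick symmetric lowered
# components S_{ij}, then raise the first index with g^{-1}
S = rng.standard_normal((3, 3))
S = (S + S.T) / 2           # S_{ij} = S_{ji}
A = np.linalg.solve(G, S)   # A^m_j with g_{im} A^m_j = S_{ij}

# the lowered components G A = (A_{ij}) are symmetric
assert np.allclose(G @ A, (G @ A).T)

# defining property: g(Av, w) = g(v, Aw) for arbitrary v, w
v, w = rng.standard_normal(3), rng.standard_normal(3)
assert np.isclose((A @ v) @ G @ w, v @ G @ (A @ w))
```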