Symmetric matrix

In linear algebra, a **symmetric matrix** is a square matrix that is equal to its transpose. Formally,

*A* is symmetric ⟺ *A* = *A*^{T}.

Because equal matrices have equal dimensions, only square matrices can be symmetric.

The entries of a symmetric matrix are symmetric with respect to the main diagonal. So if *a*_{ij} denotes the entry in the *i*th row and *j*th column, then *A* is symmetric if and only if

*a*_{ij} = *a*_{ji}

for all indices *i* and *j*.

Every square diagonal matrix is symmetric, since all off-diagonal elements are zero. Similarly, in characteristic different from 2, each diagonal element of a skew-symmetric matrix must be zero, since each is its own negative.

In linear algebra, a real symmetric matrix represents a self-adjoint operator^{[1]} over a real inner product space. The corresponding object for a complex inner product space is a Hermitian matrix with complex-valued entries, which is equal to its conjugate transpose. Therefore, in linear algebra over the complex numbers, it is often assumed that a symmetric matrix refers to one which has real-valued entries. Symmetric matrices appear naturally in a variety of applications, and typical numerical linear algebra software makes special accommodations for them.

The following 3 × 3 matrix is symmetric:

*A* = \begin{bmatrix}
1 & 7 & 3 \\
7 & 4 & 5 \\
3 & 5 & 0
\end{bmatrix}
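As an illustrative sketch (not part of the original article), the defining property can be checked numerically with NumPy on the example matrix above:

```python
import numpy as np

# The example matrix from the text: symmetric because A[i, j] == A[j, i].
A = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 0]])

# A matrix is symmetric exactly when it equals its transpose.
assert np.array_equal(A, A.T)
```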

- The sum and difference of two symmetric matrices is again symmetric.
- This is not always true for the product: given symmetric matrices *A* and *B*, the product *AB* is symmetric if and only if *A* and *B* commute, i.e., if *AB* = *BA*.
- For any integer *n*, *A*^{n} is symmetric if *A* is symmetric.
- If *A*^{-1} exists, it is symmetric if and only if *A* is symmetric.
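The product rule can be sketched numerically: a hedged NumPy illustration (the matrices below are arbitrary examples, not from the article) showing that the product of non-commuting symmetric matrices need not be symmetric, while a matrix always commutes with its own powers:

```python
import numpy as np

# Two symmetric matrices that do NOT commute: their product is not symmetric.
A = np.array([[1, 2], [2, 3]])
B = np.array([[0, 1], [1, 0]])
AB = A @ B
assert not np.array_equal(AB, AB.T)   # AB != BA, so AB is not symmetric

# A commutes with its own powers, so A @ (A @ A) = A^3 is symmetric.
A3 = A @ A @ A
assert np.array_equal(A3, A3.T)
```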

Any square matrix can uniquely be written as the sum of a symmetric and a skew-symmetric matrix. This decomposition is known as the Toeplitz decomposition. Let Mat_{n} denote the space of *n* × *n* matrices. If Sym_{n} denotes the space of *n* × *n* symmetric matrices and Skew_{n} the space of *n* × *n* skew-symmetric matrices, then Mat_{n} = Sym_{n} + Skew_{n} and Sym_{n} ∩ Skew_{n} = {0}, i.e.

Mat_{n} = Sym_{n} ⊕ Skew_{n},

where ⊕ denotes the direct sum. Let *X* ∈ Mat_{n}; then

*X* = ½(*X* + *X*^{T}) + ½(*X* − *X*^{T}).

Notice that ½(*X* + *X*^{T}) ∈ Sym_{n} and ½(*X* − *X*^{T}) ∈ Skew_{n}. This holds for every square matrix *X* with entries from any field whose characteristic is different from 2.
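A minimal NumPy sketch of the Toeplitz decomposition (the random matrix is just an illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4))   # an arbitrary square matrix

S = (X + X.T) / 2   # symmetric part
K = (X - X.T) / 2   # skew-symmetric part

assert np.allclose(S, S.T)        # S is symmetric
assert np.allclose(K, -K.T)       # K is skew-symmetric
assert np.allclose(S + K, X)      # the two parts sum back to X
```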

A symmetric *n* × *n* matrix is determined by ½*n*(*n* + 1) scalars (the number of entries on or above the main diagonal). Similarly, a skew-symmetric matrix is determined by ½*n*(*n* − 1) scalars (the number of entries above the main diagonal).

Any matrix congruent to a symmetric matrix is again symmetric: if *X* is a symmetric matrix, then so is *AXA*^{T} for any matrix *A*.
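This congruence property is easy to verify numerically; a short NumPy check (with arbitrary random matrices as an illustration, not from the article):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((3, 3))
X = (X + X.T) / 2                # make X symmetric
A = rng.standard_normal((3, 3))  # any matrix (square here for simplicity)

Y = A @ X @ A.T                  # a matrix congruent to X
assert np.allclose(Y, Y.T)       # congruence preserves symmetry
```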

A (real-valued) symmetric matrix is necessarily a normal matrix.

Denote by ⟨·, ·⟩ the standard inner product on R^{n}. The real *n* × *n* matrix *A* is symmetric if and only if

⟨*Ax*, *y*⟩ = ⟨*x*, *Ay*⟩ for all *x*, *y* ∈ R^{n}.

Since this definition is independent of the choice of basis, symmetry is a property that depends only on the linear operator A and a choice of inner product. This characterization of symmetry is useful, for example, in differential geometry, since each tangent space to a manifold may be endowed with an inner product, giving rise to what is called a Riemannian manifold. Another area where this formulation is used is in Hilbert spaces.

The finite-dimensional spectral theorem says that any symmetric matrix whose entries are real can be diagonalized by an orthogonal matrix. More explicitly: for every real symmetric matrix *A* there exists a real orthogonal matrix *Q* such that *D* = *Q*^{T}*AQ* is a diagonal matrix.
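A hedged NumPy sketch of this diagonalization, using `numpy.linalg.eigh` (NumPy's routine for symmetric/Hermitian matrices) on the example matrix from earlier in the article:

```python
import numpy as np

A = np.array([[1., 7., 3.],
              [7., 4., 5.],
              [3., 5., 0.]])

# eigh returns real eigenvalues and an orthogonal matrix Q of eigenvectors.
eigvals, Q = np.linalg.eigh(A)

D = Q.T @ A @ Q                          # D = Q^T A Q is diagonal
assert np.allclose(D, np.diag(eigvals))
assert np.allclose(Q @ Q.T, np.eye(3))   # Q is orthogonal
```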

If *A* and *B* are *n* × *n* real symmetric matrices that commute, then they can be simultaneously diagonalized by an orthogonal matrix: there exists a basis of R^{n} such that every element of the basis is an eigenvector for both *A* and *B*.

Every real symmetric matrix is Hermitian, and therefore all its eigenvalues are real. (In fact, the eigenvalues are the entries in the diagonal matrix *D* above, and therefore *D* is uniquely determined by *A* up to the order of its entries.)

A complex symmetric matrix can be 'diagonalized' using a unitary matrix: thus if *A* is a complex symmetric matrix, there is a unitary matrix *U* such that *UAU*^{T} is a real diagonal matrix with non-negative entries. To see this, note that *B* = *A*^{\dagger}*A* is Hermitian and positive semi-definite, so there is a unitary matrix *V* such that *V*^{\dagger}*BV* is diagonal with non-negative real entries. Thus *C* = *V*^{T}*AV* is a complex symmetric matrix with *C*^{\dagger}*C* real and diagonal. Writing *C* = *X* + *iY* with *X* and *Y* real symmetric matrices gives

*C*^{\dagger}*C* = *X*^{2} + *Y*^{2} + *i*(*XY* − *YX*).

Since *C*^{\dagger}*C* is real and diagonal, it follows that *XY* = *YX*. Since *X* and *Y* commute, there is a real orthogonal matrix *W* such that both *WXW*^{T} and *WYW*^{T} are diagonal. Setting *U* = *WV*^{T} (a unitary matrix), the matrix *UAU*^{T} is complex diagonal. Pre-multiplying *U* by a suitable diagonal unitary matrix (which preserves the unitarity of *U*), the diagonal entries of *UAU*^{T} can be made real and non-negative as desired. To construct this matrix, express the diagonal matrix as

*UAU*^{T} = diag(*r*_{1}e^{iθ_{1}}, *r*_{2}e^{iθ_{2}}, ..., *r*_{n}e^{iθ_{n}}).

The matrix we seek is simply given by

*D* = diag(e^{−iθ_{1}/2}, e^{−iθ_{2}/2}, ..., e^{−iθ_{n}/2}).

Clearly *DUAU*^{T}*D* = diag(*r*_{1}, *r*_{2}, ..., *r*_{n}), so we make the modification *U*′ = *DU*. Since the squares of these diagonal entries are the eigenvalues of *A*^{\dagger}*A*, they coincide with the singular values of *A*. (Note that, concerning the eigendecomposition of a complex symmetric matrix *A*, the Jordan normal form of *A* may not be diagonal, and therefore *A* may not be diagonalizable by any similarity transformation.)

Using the Jordan normal form, one can prove that every square real matrix can be written as a product of two real symmetric matrices, and every square complex matrix can be written as a product of two complex symmetric matrices.^{[4]}

Every real non-singular matrix can be uniquely factored as the product of an orthogonal matrix and a symmetric positive definite matrix, which is called a polar decomposition. Singular matrices can also be factored, but not uniquely.

Cholesky decomposition states that every real positive-definite symmetric matrix *A* is a product of a lower-triangular matrix *L* and its transpose,

*A* = *LL*^{T}.
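A minimal NumPy sketch of the Cholesky factorization (the positive-definite matrix is constructed for illustration, since M^{T}M + I is positive definite for any real M):

```python
import numpy as np

# Build a positive-definite symmetric matrix: M^T M + I works for any M.
rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
A = M.T @ M + np.eye(3)

L = np.linalg.cholesky(A)             # lower-triangular Cholesky factor
assert np.allclose(L, np.tril(L))     # L is lower triangular
assert np.allclose(L @ L.T, A)        # A = L L^T
```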

If the matrix is symmetric indefinite, it may still be decomposed as

*PAP*^{T} = *LDL*^{T},

where *P* is a permutation matrix (arising from the need to pivot), *L* a lower unit triangular matrix, and *D* a direct sum of symmetric 1 × 1 and 2 × 2 blocks.

A general (complex) symmetric matrix may be defective and thus not be diagonalizable. If *A* is diagonalizable it may be decomposed as

*A* = *Q*Λ*Q*^{T},

where *Q* is an orthogonal matrix (*QQ*^{T} = *I*) and Λ is a diagonal matrix of the eigenvalues of *A*. In the special case that *A* is real symmetric, then *Q* and Λ are also real. To see orthogonality, suppose x and y are eigenvectors corresponding to distinct eigenvalues λ_{1} and λ_{2}. Then

λ_{1}⟨x, y⟩ = ⟨*A*x, y⟩ = ⟨x, *A*y⟩ = λ_{2}⟨x, y⟩.

Since λ_{1} and λ_{2} are distinct, we have ⟨x, y⟩ = 0.

Symmetric *n* × *n* matrices of real functions appear as the Hessians of twice differentiable functions of *n* real variables.

Every quadratic form *q* on R^{n} can be uniquely written in the form *q*(x) = x^{T}*A*x with a symmetric *n* × *n* matrix *A*. Because of the above spectral theorem, one can then say that every quadratic form, up to the choice of an orthonormal basis of R^{n}, 'looks like'

*q*(*x*_{1}, ..., *x*_{n}) = Σ_{i=1}^{n} λ_{i}*x*_{i}^{2}

with real numbers λ_{i}. This considerably simplifies the study of quadratic forms, as well as the study of the level sets {x : *q*(x) = 1}, which are generalizations of conic sections.
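This change of basis can be sketched numerically; a hedged NumPy illustration (the matrix and test vector are arbitrary examples) checking that x^{T}Ax equals Σ λ_{i}y_{i}^{2} in the eigenbasis y = Q^{T}x:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])
eigvals, Q = np.linalg.eigh(A)   # A = Q diag(eigvals) Q^T

x = np.array([0.3, -1.2])
q = x @ A @ x                    # the quadratic form q(x) = x^T A x

# In the orthonormal eigenbasis y = Q^T x, the form becomes sum of λ_i y_i^2.
y = Q.T @ x
assert np.isclose(q, np.sum(eigvals * y**2))
```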

This is important partly because the second-order behavior of every smooth multi-variable function is described by the quadratic form belonging to the function's Hessian; this is a consequence of Taylor's theorem.

An *n* × *n* matrix *A* is said to be **symmetrizable** if there exists an invertible diagonal matrix *D* and symmetric matrix *S* such that *A* = *DS*.

The transpose of a symmetrizable matrix is symmetrizable, since *A*^{T} = (*DS*)^{T} = *SD* = *D*^{-1}(*DSD*) and *DSD* is symmetric.
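A short NumPy sketch of this argument (with an arbitrary example pair *D*, *S*, chosen purely for illustration):

```python
import numpy as np

D = np.diag([2.0, -1.0, 3.0])    # invertible diagonal matrix
S = np.array([[1., 4., 0.],
              [4., 2., 5.],
              [0., 5., 3.]])     # symmetric matrix
A = D @ S                        # A is symmetrizable by construction

# The transpose is symmetrizable too: A^T = D^{-1} (D S D), with D S D symmetric.
DSD = D @ S @ D
assert np.allclose(DSD, DSD.T)
assert np.allclose(A.T, np.linalg.inv(D) @ DSD)
```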

A matrix *A* = (*a*_{ij}) is symmetrizable if and only if the following conditions are met:

- *a*_{ij} = 0 implies *a*_{ji} = 0 for all 1 ≤ *i* ≤ *j* ≤ *n*.
- *a*_{i₁i₂}*a*_{i₂i₃}...*a*_{iₖi₁} = *a*_{i₂i₁}*a*_{i₃i₂}...*a*_{i₁iₖ} for any finite sequence (*i*_{1}, *i*_{2}, ..., *i*_{k}).

Other types of symmetry or pattern in square matrices have special names; see for example:

- Antimetric matrix
- Centrosymmetric matrix
- Circulant matrix
- Covariance matrix
- Coxeter matrix
- Hankel matrix
- Hilbert matrix
- Persymmetric matrix
- Skew-symmetric matrix
- Sylvester's law of inertia
- Toeplitz matrix

See also symmetry in mathematics.

- Jesús Rojo García, *Álgebra lineal* (in Spanish), 2nd ed., Editorial AC, 1986. ISBN 84-7288-120-2.
- R. A. Horn and C. R. Johnson, *Matrix Analysis*, 2nd ed., Cambridge University Press, 2013, pp. 263, 278.
- See: A. J. Bosch, "The factorization of a square matrix into two symmetric matrices", *The American Mathematical Monthly* **93** (1986), no. 6, 462–464. doi:10.2307/2323471.
- G. H. Golub and C. F. van Loan, *Matrix Computations*, The Johns Hopkins University Press, Baltimore, London, 1996.