Antisymmetric Matrix


An antisymmetric matrix, also known as a skew-symmetric or antimetric matrix, is a square matrix that satisfies the identity

 A=-A^(T)
(1)

where A^(T) is the matrix transpose. For example,

 A=[0 -1; 1 0]
(2)

is antisymmetric.

A matrix m may be tested to see if it is antisymmetric in the Wolfram Language using AntisymmetricMatrixQ[m].
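
Outside the Wolfram Language, the same test amounts to comparing a matrix with the negative of its transpose. A minimal NumPy sketch (the helper name is_antisymmetric and the tolerance are illustrative choices, not part of this entry):

```python
import numpy as np

def is_antisymmetric(m, tol=1e-12):
    """Return True if the square matrix m satisfies m == -m^T (within tol)."""
    m = np.asarray(m)
    if m.ndim != 2 or m.shape[0] != m.shape[1]:
        return False
    return bool(np.allclose(m, -m.T, atol=tol))

A = np.array([[0, -1],
              [1,  0]])
print(is_antisymmetric(A))  # True
```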

In component notation, this becomes

 a_(ij)=-a_(ji).
(3)

Letting k=i=j, the requirement becomes

 a_(kk)=-a_(kk),
(4)

so an antisymmetric matrix must have zeros on its diagonal. The general 3×3 antisymmetric matrix is of the form

 [0 a_(12) a_(13); -a_(12) 0 a_(23); -a_(13) -a_(23) 0].
(5)
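
As a quick numerical illustration of form (5), one can assemble such a matrix from its three independent upper-triangular entries and confirm the defining identity; the function name below is an arbitrary choice:

```python
import numpy as np

def antisymmetric_3x3(a12, a13, a23):
    """Build the general 3x3 antisymmetric matrix from its independent entries."""
    return np.array([[ 0.0,   a12,  a13],
                     [-a12,   0.0,  a23],
                     [-a13,  -a23,  0.0]])

M = antisymmetric_3x3(1.0, 2.0, 3.0)
print(np.allclose(M, -M.T))  # True
```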

If A is invertible, applying A^(-1) to both sides of the antisymmetry condition gives

 -A^(-1)A^(T)=I.
(6)

Any square matrix can be expressed as the sum of a symmetric part and an antisymmetric part. Write

 A=1/2(A+A^(T))+1/2(A-A^(T)).
(7)

But

 A=[a_(11) a_(12) ... a_(1n); a_(21) a_(22) ... a_(2n); ⋮ ⋮ ⋱ ⋮; a_(n1) a_(n2) ... a_(nn)]
(8)
 A^(T)=[a_(11) a_(21) ... a_(n1); a_(12) a_(22) ... a_(n2); ⋮ ⋮ ⋱ ⋮; a_(1n) a_(2n) ... a_(nn)],
(9)

so

 A+A^(T)=[2a_(11) a_(12)+a_(21) ... a_(1n)+a_(n1); a_(12)+a_(21) 2a_(22) ... a_(2n)+a_(n2); ⋮ ⋮ ⋱ ⋮; a_(1n)+a_(n1) a_(2n)+a_(n2) ... 2a_(nn)],
(10)

which is symmetric, and

 A-A^(T)=[0 a_(12)-a_(21) ... a_(1n)-a_(n1); -(a_(12)-a_(21)) 0 ... a_(2n)-a_(n2); ⋮ ⋮ ⋱ ⋮; -(a_(1n)-a_(n1)) -(a_(2n)-a_(n2)) ... 0],
(11)

which is antisymmetric.
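
This decomposition is easy to check numerically. The following NumPy sketch forms the two halves of equation (7) for a random matrix and verifies their symmetry properties:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # arbitrary square matrix

S = 0.5 * (A + A.T)               # symmetric part, half of equation (10)
K = 0.5 * (A - A.T)               # antisymmetric part, half of equation (11)

print(np.allclose(S, S.T))        # True: S is symmetric
print(np.allclose(K, -K.T))       # True: K is antisymmetric
print(np.allclose(S + K, A))      # True: the parts sum back to A
```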

All n×n antisymmetric matrices of odd dimension are singular. This follows from the fact that

 A^(T)=-A.
(12)

So, by the properties of determinants,

det(A^(T))=det(-A)
(13)
=(-1)^ndet(A).
(14)

Since det(A^(T))=det(A), this gives det(A)=(-1)^ndet(A). Therefore, if n is odd, then

 det(A)=-det(A),
(15)

so det(A)=0, thus proving that all antisymmetric matrices of odd dimension are singular.
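
A numerical spot check of this fact, using a random odd-dimensional antisymmetric matrix built as in equation (11):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
K = 0.5 * (B - B.T)                  # a random 5x5 (odd-dimensional) antisymmetric matrix

print(np.linalg.matrix_rank(K))      # 4: rank-deficient, as expected for odd dimension
print(abs(np.linalg.det(K)) < 1e-10) # True: the determinant vanishes up to rounding
```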

The set of n×n antisymmetric matrices is denoted o(n). o(n) is a vector space, and the commutator

 [A,B]=AB-BA
(16)

of two antisymmetric matrices is again antisymmetric. Hence, the antisymmetric matrices form a Lie algebra, which is related to the Lie group of orthogonal matrices. In particular, suppose A(t) is a path of orthogonal matrices through A(0)=I, i.e., A(t)A^(T)(t)=I for all t. Differentiating both sides of this identity at t=0 gives dA/dt(0)+dA^(T)/dt(0)=0. That is, the derivative of A(t) at the identity must be an antisymmetric matrix.
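
A quick numerical check that the commutator (16) of two antisymmetric matrices is again antisymmetric:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((3, 3))
Y = rng.standard_normal((3, 3))
A = 0.5 * (X - X.T)          # antisymmetric
B = 0.5 * (Y - Y.T)          # antisymmetric

C = A @ B - B @ A            # commutator [A, B]
print(np.allclose(C, -C.T))  # True: [A, B] is again antisymmetric
```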

The matrix exponential map of an antisymmetric matrix is an orthogonal matrix.
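
For instance, assuming SciPy's expm is available, a short sketch confirming this:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
X = rng.standard_normal((3, 3))
K = 0.5 * (X - X.T)                       # antisymmetric generator

Q = expm(K)                               # matrix exponential
print(np.allclose(Q @ Q.T, np.eye(3)))    # True: Q is orthogonal
print(np.isclose(np.linalg.det(Q), 1.0))  # True: det = +1, so Q is a rotation
```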


See also

Antihermitian Matrix, Antisymmetric Part, Bisymmetric Matrix, Diagonal Matrix, Hankel Matrix, Symmetric Matrix, Transpose

Portions of this entry contributed by Todd Rowland

Cite this as:

Rowland, Todd and Weisstein, Eric W. "Antisymmetric Matrix." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/AntisymmetricMatrix.html
