Eigenvectors are a special set of vectors associated with a linear system of equations (i.e., a matrix equation) that are sometimes also known as characteristic vectors, proper vectors, or latent vectors (Marcus and Minc 1988, p. 144).
The determination of the eigenvectors and eigenvalues of a system is extremely important in physics and engineering, where it is equivalent to matrix diagonalization and arises in such common applications as stability analysis, the physics of rotating bodies, and small oscillations of vibrating systems, to name only a few. Each eigenvector is paired with a corresponding so-called eigenvalue. Mathematically, two different kinds of eigenvectors need to be distinguished: left eigenvectors and right eigenvectors. However, for many problems in physics and engineering, it is sufficient to consider only right eigenvectors. The term "eigenvector" used without qualification in such applications can therefore be understood to refer to a right eigenvector.
The decomposition of a square matrix A into eigenvalues and eigenvectors is known in this work as eigen decomposition, and the fact that this decomposition is always possible as long as the matrix consisting of the eigenvectors of A is square is known as the eigen decomposition theorem.
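As a brief illustration (the example matrix below is an arbitrary choice, not taken from the text above), the eigen decomposition can be checked in the Wolfram Language by assembling the eigenvector matrix and the diagonal eigenvalue matrix:

  a = N[{{2, 1}, {1, 3}}];           (* arbitrary example matrix with a full set of eigenvectors *)
  {vals, vecs} = Eigensystem[a];     (* eigenvectors are returned as the rows of vecs *)
  x = Transpose[vecs];               (* columns of x are the right eigenvectors *)
  d = DiagonalMatrix[vals];
  Chop[a - x . d . Inverse[x]]       (* zero matrix: a equals x.d.Inverse[x] up to round-off *)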
Define a right eigenvector as a column vector X_R satisfying

   A X_R = \lambda_R X_R,   (1)

where A is a matrix, so

   (A - \lambda_R I) X_R = 0,   (2)

which means that a nonzero X_R can exist only if A - \lambda_R I has zero determinant, i.e., the right eigenvalues \lambda_R must satisfy

   \det(A - \lambda_R I) = 0.   (3)
Similarly, define a left eigenvector as a row vector X_L satisfying

   X_L A = \lambda_L X_L.   (4)

Taking the transpose of each side gives

   (X_L A)^T = \lambda_L X_L^T,   (5)

which can be rewritten as

   A^T X_L^T = \lambda_L X_L^T.   (6)

Rearrange again to obtain

   (A^T - \lambda_L I) X_L^T = 0,   (7)

which means

   \det(A^T - \lambda_L I) = 0.   (8)
Rewriting gives

   0 = \det(A^T - \lambda_L I)   (9)
     = \det(A^T - \lambda_L I^T)   (10)
     = \det(A - \lambda_L I),   (11)

where the last step follows from the identity

   \det(A^T) = \det(A),   (12)

applied here to A - \lambda_L I, since A^T - \lambda_L I^T = (A - \lambda_L I)^T.
Equating equations (3) and (11), which are both equal to 0 for arbitrary A and X, therefore requires that \lambda_R = \lambda_L \equiv \lambda, i.e., left and right eigenvalues are equivalent, a statement that is not true for eigenvectors.
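This equivalence is easy to check (a sketch; the non-symmetric matrix below is an arbitrary choice): a matrix and its transpose share the same characteristic polynomial, and hence the same eigenvalues.

  a = {{1, 2, 0}, {0, 3, 4}, {5, 0, 6}};
  Simplify[CharacteristicPolynomial[a, x] == CharacteristicPolynomial[Transpose[a], x]]
  (* True, so the left (transpose) and right eigenvalues coincide *)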
Let X_R be a matrix formed by the columns of the right eigenvectors and X_L be a matrix formed by the rows of the left eigenvectors. Let

   D \equiv \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n).   (13)
Then

   A X_R = X_R D   (14)
   X_L A = D X_L   (15)

and

   X_L A X_R = X_L X_R D   (16)
   X_L A X_R = D X_L X_R,   (17)

so

   X_L X_R D = D X_L X_R.   (18)
But this equation is of the form

   C D = D C,   (19)

where D is a diagonal matrix, so it must be true that C \equiv X_L X_R is also diagonal. In particular, if A is a symmetric matrix, then the left and right eigenvectors are simply each other's transpose, and if A is a self-adjoint matrix (i.e., it is Hermitian), then the left and right eigenvectors are adjoint matrices.
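A small numerical sketch of this fact (arbitrary non-symmetric example matrix; here the left eigenvectors are obtained as the right eigenvectors of the transpose, and the eigenvalues are distinct and well separated so the two orderings agree):

  a  = N[{{2, 1, 0}, {1, 3, 1}, {0, 0, 5}}];   (* arbitrary non-symmetric matrix with distinct eigenvalues *)
  xr = Transpose[Eigenvectors[a]];             (* columns are right eigenvectors *)
  xl = Eigenvectors[Transpose[a]];             (* rows are left eigenvectors *)
  Chop[xl . xr]                                (* diagonal, up to round-off *)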
Eigenvectors may not be equal to the zero vector. A nonzero scalar multiple of an eigenvector is equivalent to the original eigenvector. Hence, without loss of generality, eigenvectors are often normalized to unit length.
While an n \times n matrix always has n eigenvalues, some or all of which may be degenerate, such a matrix may have between 1 and n linearly independent eigenvectors. For example, the matrix \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix} has only the single eigenvector (1, 0)^T.
Eigenvectors may be computed in the Wolfram Language using Eigenvectors[matrix]. This command always returns a list of length n, so any eigenvectors that are not linearly independent are returned as zero vectors. Eigenvectors and eigenvalues can be returned together using the command Eigensystem[matrix].
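For instance, applying these commands to the defective matrix mentioned above gives (a usage sketch):

  Eigenvectors[{{1, 1}, {0, 1}}]
  (* {{1, 0}, {0, 0}}: the missing independent eigenvector is returned as a zero vector *)
  Eigensystem[{{1, 1}, {0, 1}}]
  (* {{1, 1}, {{1, 0}, {0, 0}}}: the repeated eigenvalue 1 with its single independent eigenvector *)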
Given a 3 \times 3 matrix A with eigenvectors x_1, x_2, and x_3 and corresponding eigenvalues \lambda_1, \lambda_2, and \lambda_3, an arbitrary vector y can be written

   y = b_1 x_1 + b_2 x_2 + b_3 x_3.   (20)
Applying the matrix A,

   A y = b_1 A x_1 + b_2 A x_2 + b_3 A x_3   (21)
       = \lambda_1 \left( b_1 x_1 + b_2 \frac{\lambda_2}{\lambda_1} x_2 + b_3 \frac{\lambda_3}{\lambda_1} x_3 \right),   (22)

so

   A^n y = \lambda_1^n \left( b_1 x_1 + b_2 \left( \frac{\lambda_2}{\lambda_1} \right)^n x_2 + b_3 \left( \frac{\lambda_3}{\lambda_1} \right)^n x_3 \right).   (23)
If \lambda_1 > \lambda_2, \lambda_3 and b_1 \neq 0, it therefore follows that

   \lim_{n \to \infty} A^n y = \lambda_1^n b_1 x_1,   (24)
so repeated application of the matrix to an arbitrary vector amazingly results in a vector proportional to the eigenvector with largest eigenvalue.
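The repeated-application idea above (essentially the power method) can be sketched in the Wolfram Language; the matrix, starting vector, and power count below are arbitrary choices:

  a = N[{{2, 1}, {1, 3}}];
  y = {1., 0.};
  Normalize[MatrixPower[a, 50] . y]   (* direction of the eigenvector with the largest eigenvalue *)
  Normalize[First[Eigenvectors[a]]]   (* the same direction, up to sign *)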