
Eigenvalue


Eigenvalues are a special set of scalars associated with a linear system of equations (i.e., a matrix equation) that are sometimes also known as characteristic roots, characteristic values (Hoffman and Kunze 1971), proper values, or latent roots (Marcus and Minc 1988, p. 144).

The determination of the eigenvalues and eigenvectors of a system is extremely important in physics and engineering, where it is equivalent to matrix diagonalization and arises in such common applications as stability analysis, the physics of rotating bodies, and small oscillations of vibrating systems, to name only a few. Each eigenvalue is paired with a corresponding so-called eigenvector (or, in general, a corresponding right eigenvector and a corresponding left eigenvector; there is no analogous distinction between left and right for eigenvalues).

The decomposition of a square matrix A into eigenvalues and eigenvectors is known in this work as eigen decomposition, and the fact that this decomposition is always possible as long as the matrix consisting of the eigenvectors of A is square is known as the eigen decomposition theorem.

The Lanczos algorithm is an algorithm for computing the eigenvalues and eigenvectors of large sparse symmetric matrices.

Let A be a linear transformation represented by a matrix A. If there is a nonzero vector X in R^n such that

 AX=lambdaX
(1)

for some scalar lambda, then lambda is called an eigenvalue of A with corresponding (right) eigenvector X.
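As a concrete check of definition (1), the following sketch (plain Python rather than the Wolfram Language, using a hypothetical matrix A = [[2, 1], [1, 2]] that is not from the text) verifies that AX = lambdaX holds componentwise for lambda = 3 and X = (1, 1):

```python
# Hypothetical example (not from the text): A = [[2, 1], [1, 2]]
# has eigenvalue lambda = 3 with right eigenvector X = (1, 1).
A = [[2, 1], [1, 2]]
X = [1, 1]
lam = 3

# Compute AX by plain matrix-vector multiplication.
AX = [sum(A[i][j] * X[j] for j in range(2)) for i in range(2)]

# AX equals lambda * X componentwise, as equation (1) requires.
print(AX)                    # [3, 3]
print([lam * x for x in X])  # [3, 3]
```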

Letting A be a k×k square matrix

 [a_(11) a_(12) ... a_(1k); a_(21) a_(22) ... a_(2k); | | ... |; a_(k1) a_(k2) ... a_(kk)]
(2)

with eigenvalue lambda, then the corresponding eigenvectors satisfy

 [a_(11) a_(12) ... a_(1k); a_(21) a_(22) ... a_(2k); | | ... |; a_(k1) a_(k2) ... a_(kk)][x_1; x_2; |; x_k]=lambda[x_1; x_2; |; x_k],
(3)

which is equivalent to the homogeneous system

 [a_(11)-lambda a_(12) ... a_(1k); a_(21) a_(22)-lambda ... a_(2k); | | ... |; a_(k1) a_(k2) ... a_(kk)-lambda][x_1; x_2; |; x_k]=[0; 0; |; 0].
(4)

Equation (4) can be written compactly as

 (A-lambdaI)X=0,
(5)

where I is the identity matrix. By Cramer's rule, a homogeneous linear system of equations has nontrivial solutions iff the determinant vanishes, so the solutions of equation (5) are given by

 det(A-lambdaI)=0.
(6)

This equation is known as the characteristic equation of A, and the left-hand side is known as the characteristic polynomial.

For example, for a 2×2 matrix, the eigenvalues are

 lambda_+/-=1/2[(a_(11)+a_(22))+/-sqrt(4a_(12)a_(21)+(a_(11)-a_(22))^2)],
(7)

which arises as the solutions of the characteristic equation

 x^2-x(a_(11)+a_(22))+(a_(11)a_(22)-a_(12)a_(21))=0.
(8)
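Equation (7) can be implemented directly; the sketch below (plain Python, using cmath.sqrt so that a negative discriminant yields the complex-conjugate pair, applied to the hypothetical matrix [[2, 1], [1, 2]]) computes the two roots and checks them against the characteristic equation (8):

```python
import cmath

def eigenvalues_2x2(a11, a12, a21, a22):
    # Equation (7): lambda_+/- = (1/2)[(a11+a22) +/- sqrt(4 a12 a21 + (a11-a22)^2)].
    # cmath.sqrt handles a negative discriminant (complex eigenvalue pair).
    disc = cmath.sqrt(4 * a12 * a21 + (a11 - a22) ** 2)
    return (a11 + a22 + disc) / 2, (a11 + a22 - disc) / 2

lp, lm = eigenvalues_2x2(2, 1, 1, 2)  # hypothetical example matrix [[2, 1], [1, 2]]
print(lp, lm)  # (3+0j) (1+0j)

# Both roots satisfy the characteristic equation (8).
for lam in (lp, lm):
    assert abs(lam**2 - lam * (2 + 2) + (2 * 2 - 1 * 1)) < 1e-12
```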

If all k eigenvalues are different, then plugging these back in gives k-1 independent equations for the k components of each corresponding eigenvector, and the system is said to be nondegenerate. If the eigenvalues are n-fold degenerate, then the system is said to be degenerate and the eigenvectors are not linearly independent. In such cases, the additional constraint that the eigenvectors be orthogonal,

 X_i·X_j=|X_i||X_j|delta_(ij),
(9)

where delta_(ij) is the Kronecker delta, can be applied to yield n additional constraints, thus allowing solution for the eigenvectors.
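For instance (a hypothetical example, not from the text), for the two-fold degenerate matrix A = [[2, 0], [0, 2]] every nonzero vector is an eigenvector for lambda = 2, so the eigenvectors are not determined individually and constraint (9) is what pins down a usable basis. A single Gram-Schmidt step sketches how the orthogonality condition is enforced:

```python
# For the 2-fold degenerate A = [[2, 0], [0, 2]], both X1 and X2 below
# are eigenvectors for lambda = 2, but they are not orthogonal.
X1 = [1, 1]
X2 = [1, 2]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Gram-Schmidt: subtract from X2 its component along X1.
coef = dot(X2, X1) / dot(X1, X1)
X2_orth = [b - coef * a for a, b in zip(X1, X2)]

# The constraint (9) with i != j now holds: X1 . X2_orth = 0.
print(dot(X1, X2_orth))  # 0.0
```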

Eigenvalues may be computed in the Wolfram Language using Eigenvalues[matrix]. Eigenvectors and eigenvalues can be returned together using the command Eigensystem[matrix].

Assume we know the eigenvalue for

 AX=lambdaX.
(10)

Adding a constant times the identity matrix to A,

 (A+cI)X=(lambda+c)X=lambda^'X,
(11)

so the new eigenvalues equal the old plus c. Multiplying A by a constant c gives

 (cA)X=c(lambdaX)=lambda^'X,
(12)

so the new eigenvalues are the old multiplied by c.
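Both facts can be spot-checked numerically with the 2×2 formula (7); the sketch below (plain Python, using the hypothetical matrix [[2, 1], [1, 2]] with shift c = 5 and scale c = 3, none of which appear in the text) confirms that the shifted spectrum is the old plus c and the scaled spectrum is c times the old:

```python
import cmath

def eig2(a, b, c, d):
    # Eigenvalues of [[a, b], [c, d]] via equation (7), sorted for comparison.
    s = cmath.sqrt(4 * b * c + (a - d) ** 2)
    return sorted([(a + d - s) / 2, (a + d + s) / 2], key=lambda z: z.real)

base = eig2(2, 1, 1, 2)                    # eigenvalues of A: 1 and 3
shifted = eig2(2 + 5, 1, 1, 2 + 5)         # eigenvalues of A + 5I
scaled = eig2(3 * 2, 3 * 1, 3 * 1, 3 * 2)  # eigenvalues of 3A

# Shift property (11): new eigenvalues = old + 5.
assert all(abs(s - (b + 5)) < 1e-12 for s, b in zip(shifted, base))
# Scale property (12): new eigenvalues = 3 * old.
assert all(abs(s - 3 * b) < 1e-12 for s, b in zip(scaled, base))
```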

Now consider a similarity transformation of A. Letting |A| denote the determinant of A, we have

|Z^(-1)AZ-lambdaI|=|Z^(-1)(A-lambdaI)Z|
(13)
=|Z^(-1)||A-lambdaI||Z|
(14)
=|A-lambdaI|,
(15)

so the eigenvalues are the same as for A.
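The invariance (13)-(15) can be confirmed numerically; assuming a hypothetical 2×2 example (A = [[2, 1], [1, 2]] transformed by an arbitrary invertible Z, neither taken from the text), the sketch below forms Z^(-1)AZ with hand-rolled 2×2 arithmetic and recovers the same eigenvalues:

```python
import cmath

def matmul(P, Q):
    # 2x2 matrix product.
    return [[sum(P[i][k] * Q[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(Z):
    # Inverse of a 2x2 matrix via the adjugate formula.
    (a, b), (c, d) = Z
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def eig2(M):
    # Eigenvalues of a 2x2 matrix via equation (7), sorted for comparison.
    (a, b), (c, d) = M
    s = cmath.sqrt(4 * b * c + (a - d) ** 2)
    return sorted([(a + d - s) / 2, (a + d + s) / 2], key=lambda z: z.real)

A = [[2, 1], [1, 2]]
Z = [[1, 2], [3, 4]]               # any invertible matrix
B = matmul(matmul(inv2(Z), A), Z)  # similarity transform Z^(-1) A Z

# B has the same spectrum as A, as (13)-(15) predict.
assert all(abs(x - y) < 1e-9 for x, y in zip(eig2(A), eig2(B)))
```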


See also

Brauer's Theorem, Characteristic Equation, Characteristic Polynomial, Complex Matrix, Condition Number, Eigen Decomposition, Eigen Decomposition Theorem, Eigenfunction, Eigenvector, Frobenius Theorem, Gershgorin Circle Theorem, Lanczos Algorithm, Lyapunov's First Theorem, Lyapunov's Second Theorem, Matrix Diagonalization, Ostrowski's Theorem, Perron's Theorem, Perron-Frobenius Theorem, Poincaré Separation Theorem, Random Matrix, Real Matrix, Schur's Inequalities, Similarity Transformation, Sturmian Separation Theorem, Sylvester's Inertia Law, Wielandt's Theorem
References

Arfken, G. "Eigenvectors, Eigenvalues." §4.7 in Mathematical Methods for Physicists, 3rd ed. Orlando, FL: Academic Press, pp. 229-237, 1985.

Hoffman, K. and Kunze, R. "Characteristic Values." §6.2 in Linear Algebra, 2nd ed. Englewood Cliffs, NJ: Prentice-Hall, p. 182, 1971.

Kaltofen, E. "Challenges of Symbolic Computation: My Favorite Open Problems." J. Symb. Comput. 29, 891-919, 2000.

Marcus, M. and Minc, H. Introduction to Linear Algebra. New York: Dover, p. 145, 1988.

Nash, J. C. "The Algebraic Eigenvalue Problem." Ch. 9 in Compact Numerical Methods for Computers: Linear Algebra and Function Minimisation, 2nd ed. Bristol, England: Adam Hilger, pp. 102-118, 1990.

Press, W. H.; Flannery, B. P.; Teukolsky, S. A.; and Vetterling, W. T. "Eigensystems." Ch. 11 in Numerical Recipes in FORTRAN: The Art of Scientific Computing, 2nd ed. Cambridge, England: Cambridge University Press, pp. 449-489, 1992.
Cite this as:

Weisstein, Eric W. "Eigenvalue." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/Eigenvalue.html