Eigenvalues are a special set of scalars associated with a linear system of equations (i.e., a matrix equation) that are sometimes also known as characteristic roots, characteristic values (Hoffman and Kunze 1971), proper values, or latent roots (Marcus and Minc 1988, p. 144).
The determination of the eigenvalues and eigenvectors of a system is extremely important in physics and engineering, where it is equivalent to matrix diagonalization and arises in such common applications as stability analysis, the physics of rotating bodies, and small oscillations of vibrating systems, to name only a few. Each eigenvalue is paired with a corresponding so-called eigenvector (or, in general, a corresponding right eigenvector and a corresponding left eigenvector; there is no analogous distinction between left and right for eigenvalues).
The decomposition of a square matrix $A$ into eigenvalues and eigenvectors is known in this work as eigen decomposition, and the fact that this decomposition is always possible as long as the matrix consisting of the eigenvectors of $A$ is square is known as the eigen decomposition theorem.
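The theorem can be illustrated concretely. Below is a minimal Python sketch (matrix, eigenvalues, and eigenvectors all hand-picked for the example) that rebuilds a symmetric $2 \times 2$ matrix from its eigenvectors and eigenvalues via $A = P D P^{-1}$:

```python
# Hand-picked 2x2 example of the eigen decomposition A = P D P^-1: the
# columns of P are eigenvectors of A, and D carries the matching
# eigenvalues on its diagonal.

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [1.0, 2.0]]    # eigenvalues 3 and 1
P = [[1.0, 1.0], [1.0, -1.0]]   # eigenvectors (1, 1) and (1, -1) as columns
D = [[3.0, 0.0], [0.0, 1.0]]    # matching eigenvalues on the diagonal

reconstructed = mat_mul(mat_mul(P, D), inv2(P))
print(reconstructed)  # [[2.0, 1.0], [1.0, 2.0]] -- recovers A
```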
The Lanczos algorithm is an algorithm for computing the eigenvalues and eigenvectors of large sparse symmetric matrices.
Let $T$ be a linear transformation represented by a matrix $A$. If there is a vector $X \in \mathbb{R}^n \neq 0$ such that

$$A X = \lambda X \qquad (1)$$

for some scalar $\lambda$, then $\lambda$ is called the eigenvalue of $A$ with corresponding (right) eigenvector $X$.
Letting $A$ be a $k \times k$ square matrix

$$\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} \end{pmatrix} \qquad (2)$$

with eigenvalue $\lambda$, then the corresponding eigenvectors satisfy

$$\begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix} = \lambda \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix}, \qquad (3)$$

which is equivalent to the homogeneous system

$$\begin{pmatrix} a_{11} - \lambda & a_{12} & \cdots & a_{1k} \\ a_{21} & a_{22} - \lambda & \cdots & a_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ a_{k1} & a_{k2} & \cdots & a_{kk} - \lambda \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_k \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 0 \end{pmatrix}. \qquad (4)$$
Equation (4) can be written compactly as

$$(A - \lambda I) X = 0, \qquad (5)$$

where $I$ is the identity matrix. As shown in Cramer's rule, a linear system of equations has nontrivial solutions iff the determinant vanishes, so the solutions of equation (5) are given by

$$\det(A - \lambda I) = 0. \qquad (6)$$

This equation is known as the characteristic equation of $A$, and the left-hand side is known as the characteristic polynomial.
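As a quick numerical check (example matrix chosen by hand), the characteristic polynomial of a $2 \times 2$ matrix vanishes exactly at its eigenvalues:

```python
def char_poly_2x2(M, lam):
    """det(M - lam*I) for a 2x2 matrix M given as nested lists."""
    (a, b), (c, d) = M
    return (a - lam) * (d - lam) - b * c

A = [[2.0, 1.0], [1.0, 2.0]]     # eigenvalues 3 and 1
print(char_poly_2x2(A, 3.0))     # 0.0 -- lambda = 3 is an eigenvalue
print(char_poly_2x2(A, 1.0))     # 0.0 -- lambda = 1 is an eigenvalue
print(char_poly_2x2(A, 2.0))     # -1.0 -- lambda = 2 is not
```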
For example, for a $2 \times 2$ matrix, the eigenvalues are

$$\lambda_\pm = \tfrac{1}{2}\left[(a_{11} + a_{22}) \pm \sqrt{4 a_{12} a_{21} + (a_{11} - a_{22})^2}\,\right], \qquad (7)$$

which arise as the solutions of the characteristic equation

$$\lambda^2 - \lambda (a_{11} + a_{22}) + (a_{11} a_{22} - a_{12} a_{21}) = 0. \qquad (8)$$
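Formula (7) can be transcribed directly into code. A minimal Python sketch (example matrix chosen by hand), using `cmath.sqrt` so complex eigenvalue pairs are handled as well:

```python
import cmath

def eigenvalues_2x2(a11, a12, a21, a22):
    """Eigenvalues of a 2x2 matrix via equation (7):
    lambda_pm = ((a11 + a22) +/- sqrt(4 a12 a21 + (a11 - a22)^2)) / 2."""
    trace = a11 + a22
    disc = cmath.sqrt(4 * a12 * a21 + (a11 - a22) ** 2)
    return (trace + disc) / 2, (trace - disc) / 2

# Symmetric example: [[2, 1], [1, 2]] has eigenvalues 3 and 1.
lam_plus, lam_minus = eigenvalues_2x2(2, 1, 1, 2)
print(lam_plus, lam_minus)  # 3 and 1 (returned as complex numbers)
```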
If all $k$ eigenvalues are different, then plugging these back in gives $k-1$ independent equations for the $k$ components of each corresponding eigenvector, and the system is said to be nondegenerate. If the eigenvalues are $n$-fold degenerate, then the system is said to be degenerate and the eigenvectors are not linearly independent. In such cases, the additional constraint that the eigenvectors be orthogonal,

$$\hat{X}_i \cdot \hat{X}_j = \delta_{ij}, \qquad (9)$$

where $\delta_{ij}$ is the Kronecker delta, can be applied to yield $n$ additional constraints, thus allowing solution for the eigenvectors.
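One way to impose constraint (9) in practice is Gram-Schmidt orthonormalization. A sketch for the $2 \times 2$ identity matrix, whose eigenvalue 1 is 2-fold degenerate (starting vectors chosen by hand):

```python
import math

def gram_schmidt_2d(u, v):
    """Orthonormalize two independent 2D vectors (classical Gram-Schmidt)."""
    nu = math.hypot(u[0], u[1])
    e1 = [u[0] / nu, u[1] / nu]
    proj = v[0] * e1[0] + v[1] * e1[1]      # component of v along e1
    w = [v[0] - proj * e1[0], v[1] - proj * e1[1]]
    nw = math.hypot(w[0], w[1])
    return e1, [w[0] / nw, w[1] / nw]

# lambda = 1 is 2-fold degenerate for the 2x2 identity matrix, so any two
# independent vectors are eigenvectors; constraint (9) picks an orthonormal pair.
e1, e2 = gram_schmidt_2d([1.0, 0.0], [1.0, 1.0])
print(e1, e2)                       # [1.0, 0.0] [0.0, 1.0]
print(e1[0]*e2[0] + e1[1]*e2[1])    # 0.0, as required by (9) for i != j
```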
Eigenvalues may be computed in the Wolfram Language using Eigenvalues[matrix]. Eigenvectors and eigenvalues can be returned together using the command Eigensystem[matrix].
Assume we know the eigenvalue for

$$A X = \lambda X. \qquad (10)$$

Adding a constant $c$ times the identity matrix to $A$ gives

$$(A + c I) X = (\lambda + c) X \equiv \lambda' X, \qquad (11)$$

so the new eigenvalues equal the old plus $c$. Multiplying $A$ by a constant $c$ gives

$$(c A) X = c (\lambda X) \equiv \lambda' X, \qquad (12)$$

so the new eigenvalues are the old multiplied by $c$.
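Both properties are easy to check numerically. A small self-contained sketch (example matrix and constants chosen arbitrarily) that reads $2 \times 2$ eigenvalues off the characteristic polynomial:

```python
import cmath

def eig2(M):
    # Eigenvalues of a 2x2 matrix [[a, b], [c, d]], read off from the
    # characteristic polynomial l^2 - (a + d) l + (a d - b c) = 0.
    (a, b), (c, d) = M
    disc = cmath.sqrt((a + d) ** 2 - 4 * (a * d - b * c))
    return sorted([(a + d - disc) / 2, (a + d + disc) / 2], key=lambda z: z.real)

A = [[2.0, 1.0], [1.0, 2.0]]          # eigenvalues 1 and 3
shifted = [[7.0, 1.0], [1.0, 7.0]]    # A + 5 I
scaled = [[8.0, 4.0], [4.0, 8.0]]     # 4 A

print(eig2(A))        # eigenvalues 1 and 3
print(eig2(shifted))  # 6 and 8: the old eigenvalues plus 5
print(eig2(scaled))   # 4 and 12: the old eigenvalues times 4
```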
Now consider a similarity transformation of $A$. Let $|A|$ be the determinant of $A$; then

$$|Z^{-1} A Z - \lambda I| = |Z^{-1} (A - \lambda I) Z| \qquad (13)$$
$$= |Z^{-1}| \, |A - \lambda I| \, |Z| \qquad (14)$$
$$= |A - \lambda I|, \qquad (15)$$

so the eigenvalues of $Z^{-1} A Z$ are the same as those of $A$.
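For a $2 \times 2$ matrix the eigenvalues are fixed by the trace and determinant (the coefficients of the characteristic polynomial), so checking that both survive the similarity transformation confirms the result. A minimal sketch with a hand-picked invertible $Z$:

```python
# For a 2x2 matrix the eigenvalues are fixed by the trace and determinant,
# so it suffices to check that both are unchanged by B = Z^-1 A Z.
# Example matrices chosen by hand.

def mat_mul(X, Y):
    """Product of two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2.0, 1.0], [0.0, 3.0]]   # triangular: eigenvalues 2 and 3
Z = [[1.0, 2.0], [1.0, 1.0]]   # any invertible matrix

B = mat_mul(mat_mul(inv2(Z), A), Z)   # similarity transform Z^-1 A Z

trace = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]

print(trace(A), det(A))  # 5.0 6.0
print(trace(B), det(B))  # 5.0 6.0 -- same characteristic polynomial
```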