Matrix

A matrix is a concise and useful way of uniquely representing and working with linear transformations. In particular, every linear transformation can be represented by a matrix, and every matrix corresponds to a unique linear transformation. The matrix, and its close relative the determinant, are extremely important concepts in linear algebra, and were first formulated by Sylvester (1851) and Cayley.

In his 1851 paper, Sylvester wrote, "For this purpose we must commence, not with a square, but with an oblong arrangement of terms consisting, suppose, of m lines and n columns. This will not in itself represent a determinant, but is, as it were, a Matrix out of which we may form various systems of determinants by fixing upon a number p, and selecting at will p lines and p columns, the squares corresponding of pth order." Because Sylvester was interested in the determinant formed from the rectangular array of numbers and not the array itself (Kline 1990, p. 804), Sylvester used the term "matrix" in its conventional usage to mean "the place from which something else originates" (Katz 1993). Sylvester (1851) subsequently used the term matrix informally, stating "Form the rectangular matrix consisting of n rows and (n+1) columns.... Then all the n+1 determinants that can be formed by rejecting any one column at pleasure out of this matrix are identically zero." However, it remained up to Sylvester's collaborator Cayley to use the terminology in its modern form in papers of 1855 and 1858 (Katz 1993).

In his 1867 treatise on determinants, C. L. Dodgson (Lewis Carroll) objected to the use of the term "matrix," stating, "I am aware that the word 'Matrix' is already in use to express the very meaning for which I use the word 'Block'; but surely the former word means rather the mould, or form, into which algebraical quantities may be introduced, than an actual assemblage of such quantities...." However, Dodgson's objections have passed unheeded and the term "matrix" has stuck.

The transformation given by the system of equations

x_1^' = a_(11)x_1 + a_(12)x_2 + ... + a_(1n)x_n    (1)
x_2^' = a_(21)x_1 + a_(22)x_2 + ... + a_(2n)x_n    (2)
  ⋮                                                (3)
x_m^' = a_(m1)x_1 + a_(m2)x_2 + ... + a_(mn)x_n    (4)

is represented as a matrix equation by

 [x_1^'; x_2^'; ⋮; x_m^'] = [a_(11) a_(12) ... a_(1n); a_(21) a_(22) ... a_(2n); ⋮ ⋮ ⋱ ⋮; a_(m1) a_(m2) ... a_(mn)][x_1; x_2; ⋮; x_n],    (5)

where the a_(ij) are called matrix elements.

[Figure: a 3×4 matrix, with rows indexed 1 to m=3 and columns 1 to n=4; its last element is a_(34).]

An m×n matrix consists of m rows and n columns, and the set of m×n matrices with real coefficients is sometimes denoted R^(m×n). To remember which index refers to which direction, identify the indices of the last (i.e., lower right) term, so the indices m,n of the last element a_(34) in the above matrix identify it as a 3×4 matrix. Note that while this convention matches the one used for expressing measurements of a painting on canvas (where height comes first, then width), it is opposite that used to measure paper, room dimensions, and windows (in which the width is listed first, followed by the height; e.g., 8 1/2 inch by 11 inch paper is 8 1/2 inches wide and 11 inches high).
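The rows-then-columns convention can be checked directly in code; the following NumPy sketch (an illustration, not part of the original article) builds the 3×4 example above:

```python
import numpy as np

# A 3x4 matrix: m = 3 rows, n = 4 columns.
A = np.array([[1,  2,  3,  4],
              [5,  6,  7,  8],
              [9, 10, 11, 12]])

print(A.shape)   # (3, 4) -- rows first, then columns
print(A[2, 3])   # 12 -- the last element, a_(34) in one-based notation
```

NumPy's `shape` follows the same rows-then-columns order, with zero-based indices in place of the one-based a_(ij) convention.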

A matrix is said to be square if m=n, and rectangular if m!=n. An m×1 matrix is called a column vector, and a 1×n matrix is called a row vector. Special types of square matrices include the identity matrix I, with a_(ij)=delta_(ij) (where delta_(ij) is the Kronecker delta), and the diagonal matrix, with a_(ij)=c_idelta_(ij) (where the c_i are a set of constants).
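Both special matrices are easy to construct; as an illustrative NumPy sketch (not from the original text, with arbitrarily chosen constants c_i):

```python
import numpy as np

# Identity matrix I: entries a_(ij) = delta_(ij), the Kronecker delta.
I = np.eye(3)

# Diagonal matrix: a_(ij) = c_i * delta_(ij) for a chosen set of constants c_i.
c = [2.0, 5.0, 7.0]
D = np.diag(c)

# In both cases, every off-diagonal entry is zero.
```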

In this work, matrices are represented using square brackets as delimiters, but in the general literature, they are more commonly delimited using parentheses. This latter convention introduces the unfortunate notational ambiguity between matrices of the form (a; b) and the binomial coefficient

 (a; b)=(a!)/(b!(a-b)!).
(6)

When referenced symbolically in this work, matrices are denoted in a sans serif font, e.g., A, B, etc. In this concise notation, the transformation given in equation (5) can be written

 x^'=Ax,
(7)

where x^' and x are vectors and A is a matrix. A number of other notational conventions also exist, with some authors preferring an italic typeface.
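The compact form x^'=Ax is just the system of equations (1)-(4) evaluated all at once, as a small numerical sketch shows (the specific matrix and vector here are illustrative, not from the original article):

```python
import numpy as np

# A maps R^2 -> R^3, so m = 3 and n = 2.
A = np.array([[1, 2],
              [3, 4],
              [5, 6]])
x = np.array([1, 1])

# Each component x'_i = a_(i1)*x_1 + a_(i2)*x_2, as in equations (1)-(4).
x_prime = A @ x
print(x_prime)   # [ 3  7 11]
```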

It is sometimes convenient to represent an entire matrix in terms of its matrix elements. For example, the (i,j)th element of the matrix A could be written a_(ij), and the matrix composed of entries a_(ij) could be written as A=(a_(ij))_(ij), or simply A=(a)_(ij) for short.

Two matrices may be added (matrix addition) or multiplied (matrix multiplication) together to yield a new matrix. Other common operations on a single matrix are matrix diagonalization, matrix inversion, and transposition.
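These operations can be sketched as follows (a NumPy illustration with hypothetical example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

S = A + B                  # matrix addition: elementwise sums
P = A @ B                  # matrix multiplication: rows of A against columns of B
T = A.T                    # transposition: swap rows and columns
A_inv = np.linalg.inv(A)   # matrix inversion (A must be square and nonsingular)
```

Matrix diagonalization corresponds to an eigendecomposition (e.g., `np.linalg.eig`) and applies only to diagonalizable square matrices.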

The determinant det(A) or |A| of a matrix A is a very important quantity which appears in many diverse applications. The sum of the diagonal elements of a square matrix is known as the matrix trace Tr(A) and is also an important quantity in many sorts of computations.
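Both quantities are one-line computations in most linear-algebra libraries; a NumPy sketch (with illustrative numbers, not from the article):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

d = np.linalg.det(A)   # determinant: 1*4 - 2*3 = -2
t = np.trace(A)        # trace: sum of the diagonal elements, 1 + 4 = 5
```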


See also

Adjacency Matrix, Adjoint, Alternating Sign Matrix, Antisymmetric Matrix, Block Matrix, Bohr Matrix, Bourque-Ligh Conjecture, Cartan Matrix, Circulant Matrix, Condition Number, Cramer's Rule, Determinant, Diagonal Matrix, Dirac Matrices, Eigen Decomposition Theorem, Eigenvector, Elementary Matrix, Elementary Row and Column Operations, Equivalent Matrix, Fourier Matrix, Gram Matrix, Hilbert Matrix, Hypermatrix, Identity Matrix, Ill-Conditioned Matrix, Incidence Matrix, Irreducible Matrix, Kac Matrix, Least Common Multiple Matrix, LU Decomposition, Matrix Addition, Matrix Inverse, Matrix Multiplication, Matrix Trace, McCoy's Theorem, Minimal Matrix, Normal Matrix, Pauli Matrices, Permutation Matrix, Positive Definite Matrix, Random Matrix, Rational Canonical Form, Reducible Matrix, Roth's Removal Rule, Shear Matrix, Singular Matrix, Smith Normal Form, Sparse Matrix, Special Matrix, Square Matrix, Stochastic Matrix, Submatrix, Symmetric Matrix, Tournament Matrix
References

Arfken, G. "Matrices." §4.2 in Mathematical Methods for Physicists, 3rd ed. Orlando, FL: Academic Press, pp. 176-191, 1985.
Bapat, R. B. Linear Algebra and Linear Models, 2nd ed. New York: Springer-Verlag, 2000.
Dodgson, C. L. An Elementary Treatise on Determinants, with Their Application to Simultaneous Linear Equations and Algebraical Geometry. London: Macmillan, 1867.
Frazer, R. A.; Duncan, W. J.; and Collar, A. R. Elementary Matrices and Some Applications to Dynamics and Differential Equations. Cambridge, England: Cambridge University Press, 1955.
Katz, V. J. A History of Mathematics: An Introduction. New York: HarperCollins, 1993.
Kline, M. Mathematical Thought from Ancient to Modern Times. Oxford, England: Oxford University Press, 1990.
Lütkepohl, H. Handbook of Matrices. New York: Wiley, 1996.
Meyer, C. D. Matrix Analysis and Applied Linear Algebra. Philadelphia, PA: SIAM, 2000.
Sylvester, J. J. "Additions to the Articles 'On a New Class of Theorems' and 'On Pascal's Theorem.'" Philos. Mag., 363-370, 1850. Reprinted in J. J. Sylvester's Collected Mathematical Papers, Vol. 1. Cambridge, England: At the University Press, pp. 145-151, 1904.
Sylvester, J. J. An Essay on Canonical Forms, Supplement to a Sketch of a Memoir on Elimination, Transformation and Canonical Forms. London, 1851. Reprinted in J. J. Sylvester's Collected Mathematical Papers, Vol. 1. Cambridge, England: At the University Press, p. 209, 1904.
Wolfram, S. A New Kind of Science. Champaign, IL: Wolfram Media, p. 1168, 2002.
Zhang, F. Matrix Theory: Basic Results and Techniques. New York: Springer-Verlag, 1999.

Cite this as:

Weisstein, Eric W. "Matrix." From MathWorld--A Wolfram Web Resource. https://mathworld.wolfram.com/Matrix.html