Matrix


A matrix is a concise and useful way of uniquely representing and working with linear transformations. In particular, every linear transformation can be represented by a matrix, and every matrix corresponds to a unique linear transformation. The matrix, and its close relative the determinant, are extremely important concepts in linear algebra, and were first formulated by Sylvester (1851) and Cayley.

In his 1851 paper, Sylvester wrote, "For this purpose we must commence, not with a square, but with an oblong arrangement of terms consisting, suppose, of m lines and n columns. This will not in itself represent a determinant, but is, as it were, a Matrix out of which we may form various systems of determinants by fixing upon a number p, and selecting at will p lines and p columns, the squares corresponding of pth order." Because Sylvester was interested in the determinant formed from the rectangular array of numbers and not the array itself (Kline 1990, p. 804), Sylvester used the term "matrix" in its conventional usage to mean "the place from which something else originates" (Katz 1993). Sylvester (1851) subsequently used the term matrix informally, stating "Form the rectangular matrix consisting of n rows and (n+1) columns.... Then all the n+1 determinants that can be formed by rejecting any one column at pleasure out of this matrix are identically zero." However, it remained up to Sylvester's collaborator Cayley to use the terminology in its modern form in papers of 1855 and 1858 (Katz 1993).

In his 1867 treatise on determinants, C. L. Dodgson (Lewis Carroll) objected to the use of the term "matrix," stating, "I am aware that the word 'Matrix' is already in use to express the very meaning for which I use the word 'Block'; but surely the former word means rather the mould, or form, into which algebraical quantities may be introduced, than an actual assemblage of such quantities...." However, Dodgson's objections have passed unheeded and the term "matrix" has stuck.

The transformation given by the system of equations

$$x_1' = a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n \qquad (1)$$
$$x_2' = a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n \qquad (2)$$
$$\vdots \qquad (3)$$
$$x_m' = a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n \qquad (4)$$

is represented as a matrix equation by

$$\begin{bmatrix} x_1' \\ x_2' \\ \vdots \\ x_m' \end{bmatrix} =
\begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}, \qquad (5)$$

where the a_(ij) are called matrix elements.
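The system above can be sketched directly in code. The following is a minimal illustration of computing x' = Ax by explicit sums over the matrix elements a_(ij); the 2×3 matrix and the vector values are hypothetical examples, not taken from the text.

```python
# Apply the linear transformation x'_i = sum_j a_ij * x_j
# using plain Python lists (no external libraries).

def apply_transformation(A, x):
    """Return x' for an m x n matrix A and a length-n vector x."""
    m, n = len(A), len(A[0])
    assert len(x) == n, "vector length must match the number of columns"
    return [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]

A = [[1, 2, 3],
     [4, 5, 6]]   # an illustrative 2x3 (m=2 rows, n=3 columns) matrix
x = [1, 0, -1]

x_prime = apply_transformation(A, x)
print(x_prime)  # [1*1 + 2*0 + 3*(-1), 4*1 + 5*0 + 6*(-1)] = [-2, -2]
```

Note that the result has m entries while the input has n, mirroring how an m×n matrix maps n-dimensional vectors to m-dimensional ones.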

(Figure: an m×n matrix with elements a_(ij).)

An m×n matrix consists of m rows and n columns, and the set of m×n matrices with real coefficients is sometimes denoted R^(m×n). To remember which index refers to which direction, identify the indices of the last (i.e., lower right) term, so the indices m,n of the last element in the above matrix identify it as an m×n matrix.
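The row-then-column convention can be checked mechanically. Here is a small sketch with a hypothetical 2×3 matrix whose entries are chosen to read as their own (row, column) positions.

```python
# For an m x n matrix, the first index runs over the m rows and the
# second over the n columns. Entry values like 23 are mnemonic for
# "row 2, column 3" (this matrix is purely illustrative).

A = [[11, 12, 13],
     [21, 22, 23]]

m = len(A)      # number of rows -> 2
n = len(A[0])   # number of columns -> 3
print(m, n)             # 2 3
print(A[m - 1][n - 1])  # the last (lower-right) element: 23
```

The indices of that last element, (2, 3) in 1-based counting, identify the matrix as 2×3.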

A matrix is said to be square if m=n, and rectangular if m!=n. An m×1 matrix is called a column vector, and a 1×n matrix is called a row vector. Special types of square matrices include the identity matrix I, with a_(ij)=delta_(ij) (where delta_(ij) is the Kronecker delta) and the diagonal matrix a_(ij)=c_idelta_(ij) (where c_i are a set of constants).
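The two special square matrices just defined can be built directly from the Kronecker delta. This is a minimal sketch; the constants c = [2, 5, 7] are illustrative values.

```python
# Identity matrix: a_ij = delta_ij.
# Diagonal matrix: a_ij = c_i * delta_ij for a set of constants c_i.

def kronecker_delta(i, j):
    """The Kronecker delta: 1 if i == j, else 0."""
    return 1 if i == j else 0

def identity(n):
    return [[kronecker_delta(i, j) for j in range(n)] for i in range(n)]

def diagonal(c):
    n = len(c)
    return [[c[i] * kronecker_delta(i, j) for j in range(n)] for i in range(n)]

print(identity(3))          # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(diagonal([2, 5, 7]))  # [[2, 0, 0], [0, 5, 0], [0, 0, 7]]
```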

In this work, matrices are represented using square brackets as delimiters, but in the general literature, they are more commonly delimited using parentheses. This latter convention introduces the unfortunate notational ambiguity between matrices of the form (a; b) and the binomial coefficient

$$\binom{a}{b} = \frac{a!}{b!\,(a-b)!}. \qquad (6)$$

When referenced symbolically in this work, matrices are denoted in a sans serif font, e.g., A, B, etc. In this concise notation, the transformation given in equation (5) can be written

$$\mathbf{x}' = \mathsf{A}\mathbf{x}, \qquad (7)$$

where x^' and x are vectors and A is a matrix. A number of other notational conventions also exist, with some authors preferring an italic typeface.

It is sometimes convenient to represent an entire matrix in terms of its matrix elements. For example, the (i,j)th element of the matrix A could be written a_(ij), and the matrix composed of entries a_(ij) could be written as A=(a_(ij))_(ij), or simply A=(a)_(ij) for short.
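Defining a matrix by a rule for its (i,j)th element can be sketched as follows; the rule a_(ij) = i + j is a hypothetical example, and indices start at 1 as in the text's notation.

```python
# Build the m x n matrix A = (a_ij) from a function a(i, j),
# with 1-based indices to match the mathematical convention.

def matrix_from_elements(m, n, a):
    """Return the m x n matrix whose (i,j)th entry is a(i, j)."""
    return [[a(i, j) for j in range(1, n + 1)] for i in range(1, m + 1)]

A = matrix_from_elements(2, 3, lambda i, j: i + j)
print(A)  # [[2, 3, 4], [3, 4, 5]]
```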

Two matrices may be added (matrix addition) or multiplied (matrix multiplication) together to yield a new matrix. Other common operations on a single matrix are matrix diagonalization, matrix inversion, and transposition.
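Matrix addition, matrix multiplication, and transposition can each be written in a few lines; this sketch uses plain Python lists and small illustrative 2×2 matrices.

```python
# Elementwise addition: (A + B)_ij = a_ij + b_ij (same shapes).
def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

# Multiplication: (AB)_ik = sum_j a_ij * b_jk, for A m x n and B n x p.
def multiply(A, B):
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

# Transposition swaps rows and columns: (A^T)_ij = a_ji.
def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(add(A, B))       # [[6, 8], [10, 12]]
print(multiply(A, B))  # [[19, 22], [43, 50]]
print(transpose(A))    # [[1, 3], [2, 4]]
```

Note that multiplication requires the inner dimensions to agree (the n columns of A must match the n rows of B), unlike addition, which requires identical shapes.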

The determinant det(A) or |A| of a matrix A is a very important quantity which appears in many diverse applications. The sum of the diagonal elements of a square matrix is known as the matrix trace Tr(A) and is also an important quantity in many sorts of computations.
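Both quantities can be computed directly from the matrix elements. The sketch below implements the determinant by cofactor expansion along the first row (one standard method among several) and the trace as the sum of diagonal elements; the 2×2 example matrix is illustrative.

```python
# Determinant via recursive cofactor expansion along the first row.
def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j, then alternate signs.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

# Trace: the sum of the diagonal elements of a square matrix.
def trace(A):
    return sum(A[i][i] for i in range(len(A)))

A = [[1, 2], [3, 4]]
print(det(A))    # 1*4 - 2*3 = -2
print(trace(A))  # 1 + 4 = 5
```

Cofactor expansion is exponential in n and serves only to illustrate the definition; practical computations use LU decomposition or similar factorizations.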
