Matrices
The language of linear transformations
From Vectors to Grids of Numbers
In the previous chapters, we worked with vectors—columns of numbers representing magnitude and direction. Now we meet a new object: the matrix.
A matrix is a rectangular grid of numbers arranged in rows and columns. But here is the key insight: each column of a matrix is itself a vector. A matrix is simply several vectors placed side by side.
Consider the 2×2 matrix whose rows are (2, -1) and (1, 1.5). Its first column is the vector (2, 1). Its second column is the vector (-1, 1.5). This column-centric view will be essential for understanding what matrices actually do.
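As a quick sketch of this "vectors side by side" idea (assuming NumPy is available), we can build the example matrix by stacking its two columns:

```python
import numpy as np

# The two column vectors from the example above
col1 = np.array([2, 1])
col2 = np.array([-1, 1.5])

# Placing them side by side produces the matrix
A = np.column_stack([col1, col2])
print(A)
# [[ 2.  -1. ]
#  [ 1.   1.5]]
```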
Interactive: See the columns as vectors
Hover over each column of the matrix to highlight the corresponding column vector
Reading a Matrix
We describe the size of a matrix by its dimensions: rows × columns. A matrix with 2 rows and 3 columns is called a 2×3 matrix (read "two by three"). The matrix above is 2×2, also called a square matrix.
Individual entries are referenced by their row and column position. The entry in row i and column j is written aᵢⱼ. For our matrix above, a₁₁ = 2, a₁₂ = -1, a₂₁ = 1, and a₂₂ = 1.5.
The transpose of a matrix, written Aᵀ, flips rows and columns. The first row becomes the first column, the second row becomes the second column, and so on.
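A short NumPy illustration of these conventions (shape as rows × columns, entry lookup, and transpose), using the same example matrix; note that NumPy indexes from 0 rather than 1:

```python
import numpy as np

A = np.array([[2, -1],
              [1, 1.5]])

print(A.shape)   # (2, 2): 2 rows and 2 columns, a square matrix
print(A[0, 0])   # 2.0  -> entry a11 (row 1, column 1)
print(A[0, 1])   # -1.0 -> entry a12
print(A.T)       # transpose: rows become columns
# [[ 2.   1. ]
#  [-1.   1.5]]
```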
Matrix-Vector Multiplication
Here is where matrices become powerful. When we multiply a matrix by a vector, we are computing a linear combination of the matrix's columns, with the vector's entries as weights.
Given a matrix A with columns a₁ and a₂, and a vector x = (x₁, x₂):

Ax = x₁a₁ + x₂a₂

Take x₁ copies of the first column, add x₂ copies of the second column. The result is where the input vector lands.
Interactive: Matrix-vector multiplication as linear combination
Matrix-vector multiplication is a linear combination of the matrix columns, weighted by the input vector
This is exactly the linear combination concept from earlier chapters, now packaged in matrix form. The matrix holds the vectors; the input vector holds the weights.
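Here is a minimal NumPy sketch of that equivalence, using the example matrix and an arbitrary input vector (3, 2) chosen for illustration: the built-in product and the explicit column combination give the same answer.

```python
import numpy as np

A = np.array([[2, -1],
              [1, 1.5]])
x = np.array([3, 2])   # arbitrary input vector for this example

# Direct matrix-vector product
direct = A @ x

# The same result as a linear combination of the columns,
# weighted by the entries of x
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]

print(direct)      # [4. 6.]
print(by_columns)  # [4. 6.]
```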
Columns as Destinations
There is another way to see this. The standard basis vectors in 2D are î = (1, 0) and ĵ = (0, 1). What happens when we multiply the matrix by these vectors?
The first column of the matrix is where î lands. The second column is where ĵ lands. The columns are the destinations of the basis vectors.
This is the geometric heart of matrices: a matrix encodes a transformation by telling you where the basis vectors go. Every other vector follows along, because every vector is a linear combination of the basis vectors.
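The same fact in code, again assuming the example matrix from above: multiplying by the standard basis vectors simply reads off the columns.

```python
import numpy as np

A = np.array([[2, -1],
              [1, 1.5]])

e1 = np.array([1, 0])   # î
e2 = np.array([0, 1])   # ĵ

print(A @ e1)   # [2. 1.]    -> the first column: where î lands
print(A @ e2)   # [-1.  1.5] -> the second column: where ĵ lands
```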
Interactive: Build a matrix by choosing where basis vectors land
Drag the blue and red points to choose where î and ĵ land. The matrix updates automatically. The green point (1.5, 1) transforms to the purple point.
Drag the destinations of î and ĵ, and the matrix updates to match. The transformation of any point follows automatically from these two choices.
Special Matrices
Some matrices have special structure and meaning. Understanding them provides useful reference points.
The identity matrix I leaves every vector unchanged. Its columns are the standard basis vectors, so Iv = v for any vector v. It is the matrix equivalent of multiplying by 1.
The zero matrix sends every vector to the origin. All its entries are zero, so every linear combination produces zero. It is the matrix equivalent of multiplying by 0.
A diagonal matrix has non-zero entries only on the main diagonal (top-left to bottom-right). Diagonal matrices scale each axis independently without mixing them. They represent pure stretching or compression along the coordinate axes.
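A small NumPy comparison of the three, applied to the point (1.5, 1) from the earlier interactive; the diagonal entries 2 and 0.5 are arbitrary choices for illustration.

```python
import numpy as np

v = np.array([1.5, 1])

I = np.eye(2)           # identity matrix
Z = np.zeros((2, 2))    # zero matrix
D = np.diag([2, 0.5])   # diagonal matrix: stretch x by 2, compress y by half

print(I @ v)   # [1.5 1. ]  -> unchanged
print(Z @ v)   # [0. 0.]    -> collapsed to the origin
print(D @ v)   # [3.  0.5]  -> each axis scaled independently
```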
Interactive: Compare special matrix types
The identity matrix leaves every vector unchanged. It's the 'do nothing' transformation.
Looking Ahead
We have now established what a matrix is: a grid of numbers whose columns are vectors, and whose action on a vector is a linear combination weighted by that vector's components.
In the next chapter, we will explore linear transformations—the geometric perspective on matrices. We will see rotation, shearing, scaling, and reflection, and understand why the columns-as-destinations view makes matrices so natural for describing transformations of space.
Later chapters will reveal how multiplying matrices corresponds to composing transformations, how the determinant measures how much a matrix scales area, and how special vectors called eigenvectors stay pointed in the same direction under transformation.
Key Takeaways
- A matrix is a rectangular array of numbers; each column is a vector
- Matrix dimensions are given as rows × columns (e.g., a 2×3 matrix has 2 rows and 3 columns)
- Matrix-vector multiplication computes a linear combination of the matrix's columns, weighted by the input vector's entries
- The columns of a matrix tell you where the basis vectors î and ĵ land after transformation
- The identity matrix leaves vectors unchanged; the zero matrix collapses everything to the origin
- Diagonal matrices scale each axis independently