Eigenvalues and Eigenvectors

An intuitive introduction to eigenvalues and eigenvectors, explained through how a transformation changes a vector's direction and length, plus a real-world application: PCA.

Linear Algebra · Eigenvalues · Eigenvectors · PCA

What Are Eigenvalues and Eigenvectors?

A matrix transforms vectors — it can rotate, stretch, squash, or flip them.
Most vectors change both their direction and length when multiplied by a matrix.

But there are special vectors that behave differently:
Their direction stays the same, and only their length changes.

These special vectors are called eigenvectors, and
the factor by which their length changes is called the eigenvalue.


The Mathematical Definition

An eigenvalue–eigenvector pair satisfies:

$$A\mathbf{v} = \lambda \mathbf{v}$$

  • $A$: the matrix (the transformation)
  • $\mathbf{v}$: the eigenvector
  • $\lambda$: the eigenvalue

Here, $\mathbf{v} \neq \mathbf{0}$.
The equation says: applying $A$ to $\mathbf{v}$ simply scales it by $\lambda$.
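This is easy to see numerically. A minimal sketch in NumPy, using a diagonal matrix chosen purely for illustration (it is not one of the matrices discussed later in this article):

```python
import numpy as np

# A diagonal matrix, chosen purely for illustration:
# it stretches the x-axis by 2 and the y-axis by 3.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

# A "regular" vector: multiplying by A changes its direction.
u = np.array([1.0, 1.0])
print(A @ u)          # [2. 3.] -- no longer parallel to u

# An eigenvector: only the length changes (here it doubles).
v = np.array([1.0, 0.0])
print(A @ v)          # [2. 0.] = 2 * v, i.e. A v = λ v with λ = 2
```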


How to Find Them

  1. Start with the equation:

$$(A - \lambda I)\mathbf{v} = \mathbf{0}$$

  2. For a non-trivial solution ($\mathbf{v} \neq \mathbf{0}$), we require:

$$\det(A - \lambda I) = 0$$

This is called the characteristic equation.
Solving it gives the eigenvalues $\lambda$.

  3. Substitute each eigenvalue back into $(A - \lambda I)\mathbf{v} = \mathbf{0}$ to find the corresponding eigenvectors.
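The steps above can be sketched numerically with NumPy. For a 2×2 matrix the characteristic polynomial works out to $\lambda^2 - \operatorname{tr}(A)\,\lambda + \det(A)$; `np.linalg.eig` then performs steps 2 and 3 in a single call:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Step 2: for a 2x2 matrix, det(A - λI) = λ^2 - tr(A)·λ + det(A).
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
eigenvalues = np.roots(char_poly)
print(np.sort(eigenvalues))        # [1. 3.]

# Steps 2 + 3 together: np.linalg.eig returns the eigenvalues and
# unit-length eigenvectors (one eigenvector per column).
vals, vecs = np.linalg.eig(A)
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)   # each pair satisfies A v = λ v
```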

Example: A 2×2 Matrix

Let

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$$

  1. Characteristic equation:

$$\det\left( \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \right) = 0$$

That is:

$$\det\begin{pmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{pmatrix} = 0$$

  2. Expanding:

$$(2-\lambda)^2 - 1 = 0 \quad\Longrightarrow\quad (2-\lambda)^2 = 1$$

So:

$$\lambda = 3 \quad \text{or} \quad \lambda = 1$$

  3. For $\lambda = 3$, solving $(A - 3I)\mathbf{v} = \mathbf{0}$ gives:

$$\mathbf{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$$

  4. For $\lambda = 1$, we get:

$$\mathbf{v} = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$$

Geometric Meaning

  • $\lambda = 3$ with eigenvector $(1, 1)$: the transformation triples the vector's length but keeps its direction.
  • $\lambda = 1$ with eigenvector $(1, -1)$: the length and direction both stay the same, so the vector is unchanged.
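Both eigenpairs from the worked example can be verified in a few lines of NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v1 = np.array([1.0, 1.0])    # eigenvector for λ = 3
v2 = np.array([1.0, -1.0])   # eigenvector for λ = 1

print(A @ v1)   # [3. 3.] -- tripled in length, direction unchanged
print(A @ v2)   # [ 1. -1.] -- completely unchanged, since λ = 1

# The length scales by exactly |λ|:
print(np.linalg.norm(A @ v1) / np.linalg.norm(v1))   # 3.0
print(np.linalg.norm(A @ v2) / np.linalg.norm(v2))   # 1.0
```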

Real-World Application: PCA

Eigenvalues and eigenvectors are not just theoretical — they are key tools in data science and machine learning.
A famous example is Principal Component Analysis (PCA).

PCA analyzes high-dimensional data to find the directions where the data varies the most.
These directions are:

  • Eigenvectors → directions of maximum variance (principal components)
  • Eigenvalues → amount of variance along those directions

In PCA, we compute the eigenvectors and eigenvalues of the covariance matrix of the data,
and choose the top eigenvectors (with the largest eigenvalues) to reduce dimensionality while keeping most of the information.
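The whole PCA recipe above fits in a short NumPy sketch. The synthetic data-generating setup here (500 points stretched and rotated so the main variance direction is roughly $(1, 1)$) is purely illustrative, not from this article:

```python
import numpy as np

# Synthetic 2-D data for illustration: stretched along x, then rotated
# by 45 degrees, so the direction of maximum variance is roughly (1, 1).
rng = np.random.default_rng(0)
theta = np.pi / 4
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
data = rng.normal(size=(500, 2)) * np.array([3.0, 0.5])  # scale the axes
data = data @ rotation.T                                 # rotate each point

# PCA: eigen-decompose the covariance matrix of the centered data.
centered = data - data.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)  # eigh: symmetric matrices

# Sort by descending eigenvalue (variance); columns are the principal components.
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Keep only the top component to reduce the 2-D data to 1-D.
reduced = centered @ eigenvectors[:, :1]
print(eigenvalues)          # variance along each principal component
print(eigenvectors[:, 0])   # close to ±(1, 1)/sqrt(2)
```

Using `np.linalg.eigh` (rather than `eig`) is the usual choice here, since a covariance matrix is always symmetric.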


Interactive Demo

Eigenvalues and Eigenvectors Demo

Watch how regular vectors change both direction and length, while eigenvectors only change length!

Vectors in the demo: v1 = (2, 0), v2 = (0, 2), v3 = (-1, 1), v4 = (1, -2); eigenvectors: e1 = (1, 1), e2 = (1, -1).

Legend:

Regular vectors (direction and length change)
Eigenvectors (only length changes)

Transformation Matrix:

[3, 1]
[1, 3]

Eigenvalues:

λ₁ = 4 (for eigenvector (1, 1))
λ₂ = 2 (for eigenvector (1, -1))

🧮 Try calculating by hand:

Matrix A × vector v₁ = [3,1; 1,3] × [1,0]
= [3×1+1×0, 1×1+3×0] = [3,1]
Matrix A × eigenvector e₁ = [3,1; 1,3] × [1,1]
= [3×1+1×1, 1×1+3×1] = [4,4] = 4×[1,1]
→ Direction unchanged, scaled by λ₁=4!
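The hand calculation above can be reproduced in a couple of lines (a sketch using NumPy):

```python
import numpy as np

# The demo's transformation matrix.
A = np.array([[3, 1],
              [1, 3]])

print(A @ np.array([1, 0]))    # [3 1] -- regular vector: direction changes
print(A @ np.array([1, 1]))    # [4 4] = 4 * [1, 1] -- eigenvector, λ₁ = 4
print(A @ np.array([1, -1]))   # [ 2 -2] = 2 * [1, -1] -- eigenvector, λ₂ = 2
```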

🔑 Key Insight:

When you apply the transformation, notice that the red eigenvectors keep pointing in the same direction but change length by their eigenvalue factor. The gray regular vectors change both direction and length.

Demo idea:

  • Show multiple arrows (vectors) on a plane.
  • When a matrix is applied, all arrows move — except eigenvectors, which only change in length.
  • PCA mode: show a scatter plot of data points, highlight the principal components (eigenvectors), and scale them according to their eigenvalues.