13. Eigenvalues

The last stop on our whirlwind tour of linear algebra is, for now, the hardest to motivate. However, the topic is important to the differential equations we study in future chapters, in the sense that oxygen is important to breathing.

We are still operating with square matrices only.

Definition 13.1 (Eigenvalue and eigenvector)

Suppose \(\bfA\in\cmn{n}{n}\). If there exist a scalar \(\lambda\) and a nonzero vector \(\bfv\) such that

\[\bfA \bfv = \lambda \bfv,\]

then \(\lambda\) is an eigenvalue of \(\bfA\) with associated eigenvector \(\bfv\).

If you think of \(\bfA\) as acting on vectors, then an eigenvector is a direction in which the action of \(\bfA\) is the same as multiplication by a scalar; we have found a little one-dimensional oasis in which the behavior of \(\bfA\) is easy to comprehend.
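For instance, if

\[\bfA = \begin{bmatrix} 1 & 1 \\ 4 & 1 \end{bmatrix}, \qquad \bfv = \begin{bmatrix} 1 \\ 2 \end{bmatrix},\]

then

\[\bfA\bfv = \begin{bmatrix} 1+2 \\ 4+2 \end{bmatrix} = \begin{bmatrix} 3 \\ 6 \end{bmatrix} = 3\bfv,\]

so \(\lambda=3\) is an eigenvalue of \(\bfA\) with associated eigenvector \(\bfv\).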

13.1. Eigenspaces

An eigenvalue is a clean, well-defined target. Eigenvectors are a little slipperier. For starters, if \(\bfA\bfv=\lambda\bfv\), then

\[\bfA(c\bfv) = c(\bfA\bfv)=c(\lambda\bfv)=\lambda(c\bfv).\]

Note

Every nonzero multiple of an eigenvector is also an eigenvector for the same eigenvalue.

But there can be even more ambiguity than scalar multiples.

Example

Let \(\meye\) be an identity matrix. Then \(\meye\bfx=\bfx\) for any vector \(\bfx\), so every nonzero vector is an eigenvector!

Fortunately we already have the tools we need to describe a more robust target, based on the very simple reformulation

\[\bfzero=\bfA\bfv-\lambda\bfv=(\bfA-\lambda\meye)\bfv.\]
Definition 13.2 (Eigenspace)

Let \(\lambda\) be an eigenvalue of \(\bfA\). The eigenspace associated with \(\lambda\) is the general solution of \((\bfA-\lambda\meye)\bfx = \bfzero\).

Eigenspaces, unlike eigenvectors, are unique. We have to be a bit careful, though, because we usually express such spaces using basis vectors, and those bases are not themselves unique. It’s also not unusual for problems and discussions to use eigenvectors and just put up with the nonuniqueness.

13.2. Computing eigenvalues and eigenvectors

Note that if \(\lambda\) is not an eigenvalue, then by definition the only solution of \((\bfA-\lambda\meye)\bfv=\bfzero\) is \(\bfv=\bfzero\). That is equivalent to saying that \(\bfA-\lambda\meye\) is invertible.

Theorem 13.3

\(\lambda\) is an eigenvalue of \(\bfA\) if and only if \(\bfA-\lambda\meye\) is singular.

In practice the most common way to find eigenvalues by hand is through the equivalent condition \(\det(\bfA-\lambda\meye)=0\). This determinant has a particular form and name.

Definition 13.4 (Characteristic polynomial of a matrix)

Suppose \(\bfA\) is an \(n\times n\) matrix. The function \(p(z) = \det(\bfA-z\meye)\) is a polynomial of degree \(n\) in \(z\), known as the characteristic polynomial of \(\bfA\).
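In the \(2\times 2\) case the characteristic polynomial can be written out once and for all. If

\[\bfA = \twomat{a}{b}{c}{d},\]

then

\[p(z) = \det \twomat{a-z}{b}{c}{d-z} = (a-z)(d-z) - bc = z^2 - (a+d)z + (ad-bc),\]

so the coefficients are (up to sign) just the trace and the determinant of \(\bfA\).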

Algorithm 13.5 (Eigenvalues and eigenspaces)

Given an \(n\times n\) matrix \(\bfA\):

  1. Find the characteristic polynomial \(p\) of \(\bfA\).

  2. Let \(\lambda_1,\ldots,\lambda_k\) be the distinct roots of \(p\). These are the eigenvalues. (If \(k<n\), it’s because one or more roots have multiplicity greater than 1.)

  3. For each \(\lambda_j\), find the general solution of \((\bfA-\lambda_j\meye)\bfv=\bfzero\). This is the eigenspace associated with \(\lambda_j\).

Example

Find the eigenvalues and eigenspaces of

\[\begin{split}\bfA = \begin{bmatrix} 1 & 1 \\ 4 & 1 \end{bmatrix}.\end{split}\]
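Here is how Algorithm 13.5 plays out for this matrix. The characteristic polynomial is

\[p(z) = \det \twomat{1-z}{1}{4}{1-z} = (1-z)^2 - 4 = z^2 - 2z - 3 = (z-3)(z+1),\]

so the eigenvalues are \(\lambda_1=3\) and \(\lambda_2=-1\). For \(\lambda_1=3\),

\[\bfA - 3\meye = \twomat{-2}{1}{4}{-2},\]

and the general solution of \((\bfA-3\meye)\bfv=\bfzero\) is spanned by \([1;\,2]\). For \(\lambda_2=-1\),

\[\bfA + \meye = \twomat{2}{1}{4}{2},\]

and the eigenspace is spanned by \([1;\,-2]\).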

13.3. MATLAB

MATLAB computes eigenvalues (through an entirely different process) with the eig command. From the preceding example, for instance,

A = [ 1 1; 4 1 ];
lambda = eig(A)
lambda =

   3.000000000000000
  -1.000000000000000

If you want eigenvectors as well, use an alternate form for the output:

[V,D] = eig(A)
V =

   0.447213595499958  -0.447213595499958
   0.894427190999916   0.894427190999916


D =

   3.000000000000000                   0
                   0  -1.000000000000000

In most cases, column V(:,k) is an eigenvector for the eigenvalue D(k,k). (For eigenvalues of multiplicity greater than 1, the interpretation can be more complicated.) Keep in mind that any scalar multiple of an eigenvector is equally valid.

13.4. Eigenvectors for \(2\times 2\)

Finding the exact roots of a cubic or higher-degree polynomial is not an easy matter unless the polynomial is special. Thus most of our hand computations will be with \(2\times 2\) matrices. Suppose \(\lambda\) is known to be an eigenvalue of \(\bfA\). Then \(\bfA-\lambda\meye\) must be singular, and its RREF has at least one free column. Hence row elimination will zero out one of the two rows entirely, and we can ignore it. That allows us to deduce the following.

Algorithm 13.6 (Eigenvectors for \(2\times 2\))
  1. Let \(\lambda\) be an eigenvalue of \(\bfA\).

  2. Let the first row of \(\bfA-\lambda\meye\) be designated by \([\alpha,\beta]\).

    • If \(\alpha\) and \(\beta\) are not both zero, then the vector \([\beta;\,-\alpha]\) is a basis of the eigenspace of \(\lambda\).

    • If \(\alpha=\beta=0\) but the second row \([\gamma,\delta]\) of \(\bfA-\lambda\meye\) is nonzero, then \([\delta;\,-\gamma]\) is a basis of the eigenspace instead.

    • If \(\bfA-\lambda\meye\) is the zero matrix, then all of \(\complex^2\) is the eigenspace of \(\lambda\).
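Combined with the quadratic formula for the characteristic polynomial, the \(2\times 2\) recipe is short enough to sketch as a program. Here is one possible version in Python (the function name `eig2x2` is ours, not from the text); for robustness it checks both rows of \(\bfA-\lambda\meye\) for a nonzero row, since the first row alone may vanish.

```python
import cmath  # complex square root, so complex eigenvalues come out automatically

def eig2x2(A):
    """Eigenvalues and eigenspace basis vectors of a 2x2 matrix A = [[a, b], [c, d]]."""
    (a, b), (c, d) = A
    # Steps 1-2: characteristic polynomial z^2 - (a+d) z + (ad - bc),
    # solved by the quadratic formula.
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    lams = [(tr + disc) / 2, (tr - disc) / 2]
    result = []
    for lam in lams:
        # Step 3: find a nonzero row [alpha, beta] of A - lam*I;
        # then [beta, -alpha] spans its null space.
        for alpha, beta in [(a - lam, b), (c, d - lam)]:
            if alpha != 0 or beta != 0:
                result.append((lam, [beta, -alpha]))
                break
        else:
            # A - lam*I is the zero matrix: the eigenspace is all of C^2;
            # we report only one of the two standard basis vectors here.
            result.append((lam, [1, 0]))
    return result
```

Running `eig2x2([[1, 1], [4, 1]])` reproduces the eigenvalues \(3\) and \(-1\) found by MATLAB's `eig` above, with eigenvectors proportional to \([1;\,2]\) and \([1;\,-2]\).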

Example

Find the eigenstuff of

\[\bfA = \twomat{1}{1}{-1}{1}.\]
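Following the same steps, the characteristic polynomial is

\[p(z) = (1-z)^2 + 1 = z^2 - 2z + 2,\]

whose roots are \(\lambda = 1 \pm i\). For \(\lambda = 1+i\), the first row of \(\bfA-\lambda\meye\) is \([-i,\,1]\), so \([1;\,i]\) spans the eigenspace; for \(\lambda = 1-i\), the first row is \([i,\,1]\), and \([1;\,-i]\) spans the eigenspace. Note that the eigenvalues and eigenvectors are complex even though \(\bfA\) is real.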