5. Constant-coefficient problems
While the theory of homogeneous linear ODE systems is straightforward, the good news for the general case pretty much ends there. In most problems we cannot get a reasonably simple expression for a fundamental matrix. But there is a very important special case in which we can say a lot: when the coefficient matrix is constant.
Accordingly, we now specialize to

$$
\mathbf{x}' = \mathbf{A}\mathbf{x},
$$

where \(\bfA\) is an \(n\times n\) constant matrix. We are finally ready to see the significance of the eigenvalue condition

$$
\mathbf{A}\mathbf{v} = \lambda \mathbf{v}.
$$
If we seek a solution in the form \(\mathbf{x}(t)=g(t)\mathbf{v}\) for this eigenvector, then

$$
\mathbf{x}' = g'(t)\,\mathbf{v} = \mathbf{A}\bigl[g(t)\mathbf{v}\bigr] = \lambda g(t)\,\mathbf{v},
$$

which can be satisfied if \(g'=\lambda g\). That is, we have a solution

$$
\mathbf{x}(t) = e^{\lambda t} \mathbf{v}.
$$
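As a quick numerical sanity check, we can confirm with NumPy that \(e^{\lambda t}\mathbf{v}\) satisfies \(\mathbf{x}'=\mathbf{A}\mathbf{x}\) for an eigenpair \((\lambda,\mathbf{v})\). The matrix here is an illustrative choice, not one from the text:

```python
import numpy as np

# Illustrative matrix (chosen for this sketch, not from the text).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

# One eigenpair of A: A @ [1, 1] = [3, 3], so lambda = 3, v = (1, 1).
lam = 3.0
v = np.array([1.0, 1.0])

# The proposed solution x(t) = e^{lambda t} v and its exact derivative.
t = 0.7
x = np.exp(lam * t) * v
xprime = lam * np.exp(lam * t) * v

print(np.allclose(A @ x, xprime))   # True: x' = A x holds
```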
5.1. General solution
Counting algebraic multiplicities, we know that \(\mathbf{A}\) has eigenvalues \(\lambda_1,\ldots,\lambda_n\). Say that we have eigenvectors \(\mathbf{v}_1,\ldots,\mathbf{v}_n\) to go with them. Then we have \(n\) homogeneous solutions

$$
\mathbf{x}_j(t) = e^{\lambda_j t} \mathbf{v}_j, \qquad j=1,\ldots,n.
$$
Our next move is to form the matrix

$$
\mathbf{X}(t) = \begin{bmatrix} e^{\lambda_1 t} \mathbf{v}_1 & \cdots & e^{\lambda_n t} \mathbf{v}_n \end{bmatrix}
$$

and determine whether it is a fundamental matrix. According to Abel’s theorem, we can ask that question at any value of \(t\), including \(t=0\). So the key issue is whether the eigenvector matrix

$$
\bfV = \begin{bmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{bmatrix}
$$

is invertible. If so, then we have the ingredients of the general homogeneous solution.
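Abel’s theorem is easy to see in this setting: each column of \(\mathbf{X}(t)\) is a column of the eigenvector matrix scaled by \(e^{\lambda_j t}\), so \(\det \mathbf{X}(t) = e^{(\lambda_1+\cdots+\lambda_n)t}\det \bfV\), which vanishes either everywhere or nowhere. A short NumPy sketch (using an illustrative matrix, not one from the text) confirms this:

```python
import numpy as np

# Abel/Liouville: det X(t) = det X(0) * exp(trace(A) * t), so invertibility
# can be checked at the single time t = 0.  Illustrative matrix:
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, V = np.linalg.eig(A)   # columns of V are eigenvectors

def X(t):
    # Scale each eigenvector column by its exponential factor.
    return V * np.exp(lam * t)

for t in [0.0, 0.4, 1.3]:
    lhs = np.linalg.det(X(t))
    rhs = np.linalg.det(V) * np.exp(np.trace(A) * t)
    print(np.isclose(lhs, rhs))   # True each time
```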
Let \(\mathbf{A}\) have eigenvalues \(\lambda_1,\ldots,\lambda_n\) and corresponding eigenvectors \(\mathbf{v}_1,\ldots,\mathbf{v}_n\). If the eigenvector matrix \(\bfV=\begin{bmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{bmatrix}\) is invertible, then

$$
\mathbf{X}(t) = \begin{bmatrix} e^{\lambda_1 t} \mathbf{v}_1 & \cdots & e^{\lambda_n t} \mathbf{v}_n \end{bmatrix}
$$

is a fundamental matrix for \(\mathbf{x}'=\mathbf{A}\mathbf{x}\). Hence the general solution can be expressed as

$$
\mathbf{x}(t) = \mathbf{X}(t)\mathbf{c} = c_1 e^{\lambda_1 t} \mathbf{v}_1 + \cdots + c_n e^{\lambda_n t} \mathbf{v}_n.
$$
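This recipe translates directly into code. Here is a minimal NumPy sketch, assuming an illustrative matrix with distinct eigenvalues; it builds \(\mathbf{X}(t)\) from `numpy.linalg.eig` and checks both defining properties of a fundamental matrix:

```python
import numpy as np

# Build X(t) = [e^{l1 t} v1, ..., e^{ln t} vn] for an illustrative matrix
# (any matrix with distinct eigenvalues works).
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lam, V = np.linalg.eig(A)   # columns of V are eigenvectors

def fundamental(t):
    # Scale each eigenvector column by its exponential factor (broadcasting).
    return V * np.exp(lam * t)

# X(t) is a fundamental matrix iff X' = A X and X(0) = V is invertible.
t, h = 0.5, 1e-6
Xdot = (fundamental(t + h) - fundamental(t - h)) / (2 * h)  # centered difference
print(np.allclose(Xdot, A @ fundamental(t), atol=1e-5))     # True: X' = A X
print(abs(np.linalg.det(fundamental(0.0))) > 1e-12)         # True: X(0) invertible
```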
Example
Given that
has eigenpairs
find the general solution of \(\bfx'=\bfA\bfx\).
Solution
The determinant of the eigenvector matrix
is \(-4\), so this matrix is not singular. Hence a general solution is
We can often skip the singularity check made in the previous example, thanks to the following fact.
If the eigenvalues of \(\bfA\) are distinct, then their corresponding eigenvectors are the columns of an invertible matrix.
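We can observe this fact numerically. In the sketch below a generic random matrix serves as the illustration (almost every such matrix has distinct eigenvalues), and the eigenvector matrix returned by `numpy.linalg.eig` is checked for invertibility:

```python
import numpy as np

# A generic matrix almost surely has distinct eigenvalues, so its
# eigenvector matrix V should be invertible.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
lam, V = np.linalg.eig(A)

# Pairwise gaps between eigenvalues: all nonzero means distinct.
n = len(lam)
gaps = [abs(lam[i] - lam[j]) for i in range(n) for j in range(i + 1, n)]
print(min(gaps) > 1e-8)                 # True: eigenvalues are distinct
print(abs(np.linalg.det(V)) > 1e-12)    # True: V is invertible
```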
We will return later to the situation of a repeated eigenvalue.
5.2. Complex eigenvalues
Example
Find the general solution of

$$
\mathbf{x}' = \begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix} \mathbf{x}.
$$
Solution
In the first chapter we found the eigenvalues \(1\pm i\) and corresponding eigenvectors \(\twovec{1}{\pm i}\). This leads to the solution

$$
\mathbf{x}(t) = c_1 e^{(1+i)t} \twovec{1}{i} + c_2 e^{(1-i)t} \twovec{1}{-i}.
$$
If we expect a real solution (because of real initial values, say), then \(c_2 = \overline{c_1}\), and after absorbing a factor of 2 into \(c_1\) we can instead write

$$
\mathbf{x}(t) = \operatorname{Re}\left[ c_1 e^{(1+i)t} \twovec{1}{i} \right].
$$
Heading toward a completely real form, decompose \(c_1= a_1 - i a_2\) and note that

$$
e^{(1+i)t} = e^{t}\bigl(\cos t + i \sin t\bigr).
$$
Then

$$
c_1 e^{(1+i)t} \twovec{1}{i} = e^{t} \twovec{(a_1 \cos t + a_2 \sin t) + i\,(\cdots)}{(a_2 \cos t - a_1 \sin t) + i\,(\cdots)},
$$
where everything omitted is purely imaginary. Since we only want the real part, we get

$$
\mathbf{x}(t) = a_1 e^{t} \twovec{\cos t}{-\sin t} + a_2 e^{t} \twovec{\sin t}{\cos t}
$$
for arbitrary real \(a_1\) and \(a_2\).
The essence of the example is that to convert the solution to an entirely real expression, we decompose the complex eigensolution into real and imaginary parts. Suppose that

$$
\lambda = \alpha + i\beta, \qquad \mathbf{v} = \mathbf{u} + i\,\mathbf{w},
$$

where \(\alpha\), \(\beta\) are real numbers and \(\mathbf{u}\), \(\mathbf{w}\) are real vectors.
Then

$$
e^{\lambda t} \mathbf{v} = e^{\alpha t}\bigl(\cos \beta t + i \sin \beta t\bigr)\bigl(\mathbf{u} + i\,\mathbf{w}\bigr)
= e^{\alpha t}\bigl[\cos(\beta t)\,\mathbf{u} - \sin(\beta t)\,\mathbf{w}\bigr] + i\,e^{\alpha t}\bigl[\sin(\beta t)\,\mathbf{u} + \cos(\beta t)\,\mathbf{w}\bigr].
$$
The real and imaginary parts of this product form the basis of the real solution. Specifically,

$$
\mathbf{x}(t) = a_1 e^{\alpha t}\bigl[\cos(\beta t)\,\mathbf{u} - \sin(\beta t)\,\mathbf{w}\bigr] + a_2 e^{\alpha t}\bigl[\sin(\beta t)\,\mathbf{u} + \cos(\beta t)\,\mathbf{w}\bigr]
$$
for arbitrary real \(a_1\) and \(a_2\).
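Here is a numerical check of the real form, using \(\alpha=\beta=1\), \(\mathbf{u}=(1,0)\), \(\mathbf{w}=(0,1)\) as in the example; the matrix \(\mathbf{A}\) below is stated explicitly as an assumption consistent with the eigenpairs \(1\pm i\), \((1,\pm i)\):

```python
import numpy as np

# A matrix whose eigenpairs are 1±i with eigenvectors (1, ±i),
# matching the example (written out here as an assumption).
A = np.array([[1.0, 1.0],
              [-1.0, 1.0]])

alpha, beta = 1.0, 1.0            # lambda = alpha + i*beta
u = np.array([1.0, 0.0])          # v = u + i*w with v = (1, i)
w = np.array([0.0, 1.0])

def x_real(t, a1, a2):
    # Real general solution built from Re and Im of e^{lambda t} v.
    c, s = np.cos(beta * t), np.sin(beta * t)
    return np.exp(alpha * t) * (a1 * (c * u - s * w) + a2 * (s * u + c * w))

# Check x' = A x by a centered difference at one point.
t, h, a1, a2 = 0.3, 1e-6, 2.0, -1.5
xdot = (x_real(t + h, a1, a2) - x_real(t - h, a1, a2)) / (2 * h)
print(np.allclose(xdot, A @ x_real(t, a1, a2), atol=1e-5))   # True
```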
5.3. The oscillator reloaded
In an earlier section we showed that the harmonic oscillator

$$
u'' + b u' + k u = 0
$$

can be converted to the constant-coefficient system

$$
\mathbf{x}' = \begin{bmatrix} 0 & 1 \\ -k & -b \end{bmatrix} \mathbf{x}
$$

via \(x_1=u\), \(x_2=u'\). The eigenvalues of the coefficient matrix are the roots of the characteristic polynomial

$$
\lambda^2 + b\lambda + k.
$$
This is also what we called the characteristic polynomial of the oscillator ODE! That is, the characteristic values we used there are the eigenvalues here. The exponential solutions we saw before also come from the first component of the general vector solution. To repeat, all second-order problems are really first-order systems in a cheap disguise.
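To see the correspondence concretely, here is a short sketch for an illustrative damped oscillator \(u'' + b u' + k u = 0\) (the coefficient values are assumptions for the demo): the eigenvalues of the companion matrix match the roots of \(\lambda^2 + b\lambda + k\):

```python
import numpy as np

# Damped oscillator u'' + b u' + k u = 0 (illustrative coefficients).
b, k = 0.5, 4.0

# Companion (first-order system) form via x1 = u, x2 = u'.
A = np.array([[0.0, 1.0],
              [-k, -b]])

# Eigenvalues of A...
eig = np.sort_complex(np.linalg.eigvals(A))

# ...equal the roots of the characteristic polynomial s^2 + b s + k.
roots = np.sort_complex(np.roots([1.0, b, k]))
print(np.allclose(eig, roots))   # True
```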