ATTENTION / DISCLAIMER: This document was authored by Prof. Dr. Ali Özgür Kişisel (METU Mathematics Department), not by Selim Kaan Ozsoy. This .tex version was generated from the original PDF lecture notes using AI.

MATH 219 Spring 2025 Lecture 10 Lecture notes by Özgür Kişisel

Content: Constant coefficient systems. Complex eigenvalues.

Suggested Problems: (Boyce, Di Prima, 10th edition)

  • §7.6: 3a, 5a, 9, 13, 26

1. Complex eigenvalues

Definition 1.1

Let $A$ be an $n\times n$ matrix. The polynomial $\det(A-\lambda I)$ (which is of degree $n$ in $\lambda$) is called the characteristic polynomial of the matrix $A$.

Roots of the characteristic polynomial are the eigenvalues of $A$. Even if the matrix $A$ has real entries, the characteristic polynomial may have non-real roots, hence we may have to work with complex numbers.

Example 1.1

Let $A=\begin{bmatrix}0 & -1 \\ 1 & 0\end{bmatrix}$. Then

$$ \det(A-\lambda I)=\begin{vmatrix}-\lambda & -1 \\ 1 & -\lambda\end{vmatrix}=\lambda^{2}+1. $$

Therefore the eigenvalues of $A$ are $+i$ and $-i$. $\square$
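As a quick sanity check, these eigenvalues can be confirmed numerically. The following sketch (not part of the original notes) uses NumPy's `eigvals`, which returns the roots of $\det(A-\lambda I)$:

```python
import numpy as np

# Matrix from Example 1.1; its characteristic polynomial is lambda^2 + 1.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

# np.linalg.eigvals returns the roots of det(A - lambda I).
# Sorting by imaginary part puts -i before +i.
eigenvalues = sorted(np.linalg.eigvals(A), key=lambda z: z.imag)
```

Up to rounding error, `eigenvalues` is $[-i, +i]$, matching the computation above.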

Eigenvalues and eigenvectors of a real matrix $A$ always appear in complex conjugate pairs:

Lemma 1.1

Say $\lambda=a+ib$ is an eigenvalue of a real matrix $A$ with corresponding eigenvector $v$. Then $\overline{\lambda}=a-ib$ is also an eigenvalue of $A$ with corresponding eigenvector $\overline{v}$.

Proof: We have $Av=\lambda v$. Take complex conjugates of both sides.

$$ A\overline{v}=\overline{A}\overline{v}=\overline{Av}=\overline{\lambda v}=\overline{\lambda}\overline{v}, $$

hence the claim follows. $\square$
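Lemma 1.1 is easy to verify numerically for a sample real matrix. In the sketch below (an illustration only; the $4\times 4$ random matrix is arbitrary), we conjugate each computed eigenpair and check that the result is again an eigenpair:

```python
import numpy as np

rng = np.random.default_rng(seed=219)
A = rng.standard_normal((4, 4))      # an arbitrary real 4x4 matrix

lams, V = np.linalg.eig(A)           # columns of V are eigenvectors
for lam, v in zip(lams, V.T):
    # Conjugating an eigenpair of a real matrix gives another eigenpair:
    # A conj(v) = conj(lam) conj(v), exactly as in the proof of Lemma 1.1.
    assert np.allclose(A @ np.conj(v), np.conj(lam) * np.conj(v))
```

In particular, the multiset of eigenvalues of a real matrix is closed under complex conjugation.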

Now, let us consider the system $x^{\prime}=Ax$ where $A$ is a constant matrix with complex eigenvalues. If $\lambda=a+ib$ is an eigenvalue with eigenvector $v$, then we can still write down a complex-valued solution $x=e^{\lambda t}v$: the analysis of such solutions in the previous lecture is essentially unchanged, regardless of whether the eigenvalue is real or complex.

However, we are usually interested in real-valued solutions, so a way to extract real-valued solutions from the complex-valued ones would be very helpful at this point. In order to do this, we should first discuss the meaning of $e^{\lambda t}=e^{at}e^{ibt}$.

Theorem 1.1 (Euler’s identity)

Let $b$ be a real number. Then

$$ e^{ibt}=\cos(bt)+i\sin(bt). $$

Sketch of Proof: (We assume that the theorems about Taylor series from calculus remain valid over the complex numbers.) Compute the Taylor series for $e^{ibt}$. Since both sides of the equation are analytic functions, equality of their Taylor series implies equality of the functions themselves:

$$ \begin{aligned} e^{ibt} &= \sum_{n=0}^{\infty}\frac{(ibt)^{n}}{n!} \\ &= \sum_{k=0}^{\infty}\frac{(ibt)^{2k}}{(2k)!}+\sum_{k=0}^{\infty}\frac{(ibt)^{2k+1}}{(2k+1)!} \\ &= \sum_{k=0}^{\infty}(-1)^{k}\frac{(bt)^{2k}}{(2k)!}+i\sum_{k=0}^{\infty}(-1)^{k}\frac{(bt)^{2k+1}}{(2k+1)!} \\ &= \cos(bt)+i\sin(bt) \end{aligned} $$

This finishes the proof. $\square$
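The identity can also be checked numerically by summing the Taylor series directly. The sketch below (the sample value $bt = 1.7$ is arbitrary) compares a partial sum against $\cos(bt)+i\sin(bt)$:

```python
import cmath
import math

bt = 1.7  # an arbitrary sample value of b*t

# Partial sum of the Taylor series for e^{ibt}; 40 terms is far more
# than enough for double-precision accuracy at this argument.
series = sum((1j * bt) ** n / math.factorial(n) for n in range(40))

euler = complex(math.cos(bt), math.sin(bt))   # cos(bt) + i sin(bt)
assert abs(series - euler) < 1e-12
assert abs(series - cmath.exp(1j * bt)) < 1e-12
```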

Therefore, we have a method for finding complex solutions of a constant coefficient system and we can express the result in terms of familiar trigonometric functions. How can we use these to find the real solutions of the system?

After all, the system that we started with is real, therefore by the basic theory, it must have real solutions. Actually, if the system is $n\times n$, then it must have a fundamental set of $n$ linearly independent real solutions. For any real-life application, the real solutions will be the relevant ones, therefore this passage from complex solutions to real solutions will be necessary at some point.

Now if $\lambda=a+ib$ is an eigenvalue of $A$ with $b\ne0$ and $v$ a corresponding (complex) eigenvector, then $Av=\lambda v.$ We have

$$ A\overline{v}=\overline{\lambda}\overline{v}, \quad \quad e^{\overline{\lambda}t}=e^{at}e^{-ibt}=\overline{e^{\lambda t}} $$

therefore $x^{(2)}=e^{\overline{\lambda}t}\overline{v}=\overline{e^{\lambda t}v}=\overline{x^{(1)}}.$ If $x^{(1)}=y^{(1)}+iy^{(2)}$ with $y^{(1)}$ and $y^{(2)}$ real, then $x^{(2)}=y^{(1)}-iy^{(2)}$. Therefore,

$$ \begin{aligned} y^{(1)} &= \frac{1}{2}x^{(1)}+\frac{1}{2}x^{(2)}, \\ y^{(2)} &= \frac{1}{2i}x^{(1)}-\frac{1}{2i}x^{(2)}. \end{aligned} $$

By the principle of superposition, since $x^{(1)}$ and $x^{(2)}$ are solutions of the system, their linear combinations $y^{(1)}$ and $y^{(2)}$ must also be solutions. Therefore we have produced two real solutions corresponding to the conjugate pair of eigenvalues $\lambda_{1}=a+ib$ and $\lambda_{2}=a-ib$.

One can check that if $x^{(1)}$ and $x^{(2)}$ are linearly independent to start with, then $y^{(1)}$ and $y^{(2)}$ obtained in this way are also linearly independent. Therefore, by applying this exchange process to every conjugate pair of eigenvalues, we will obtain as many linearly independent real solutions as linearly independent complex solutions.

There is a slight shortcut for obtaining $y^{(1)}$ and $y^{(2)}$ above from $x^{(1)}$ alone. Since $x^{(2)}$ is the complex conjugate of $x^{(1)}$, it carries essentially the same information as $x^{(1)}$. A quick check shows that $y^{(1)}$ is the real part of $x^{(1)}$ and $y^{(2)}$ is the imaginary part of $x^{(1)}$.
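This shortcut can be illustrated on the matrix of Example 1.1. For $\lambda=i$ one may take the eigenvector $v=(1,-i)^{T}$ (a choice made here for illustration, easily verified by hand); the sketch below checks that the real and imaginary parts of $e^{it}v$ each solve $x^{\prime}=Ax$:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # matrix of Example 1.1
v = np.array([1.0, -1j])             # eigenvector for lambda = i

t = 0.9                              # an arbitrary sample time
x  = np.exp(1j * t) * v              # complex solution e^{it} v
dx = 1j * np.exp(1j * t) * v         # its exact derivative

y1, y2 = x.real, x.imag              # real and imaginary parts
# Each part separately satisfies y' = A y.
assert np.allclose(dx.real, A @ y1)
assert np.allclose(dx.imag, A @ y2)
```

Here $y^{(1)}=(\cos t,\ \sin t)^{T}$ and $y^{(2)}=(\sin t,\ -\cos t)^{T}$, which one can also verify by substitution.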

Example 1.2

Solve the system $$ \begin{aligned} x_{1}^{\prime} &= -4x_{1}+10x_{2} \\ x_{2}^{\prime} &= -5x_{1}+6x_{2} \end{aligned} $$

Solution: The coefficient matrix is $A=\begin{bmatrix}-4 & 10 \\ -5 & 6\end{bmatrix}$. Then

$$ \begin{aligned} \det(A-\lambda I) &= \begin{vmatrix}-4-\lambda & 10 \\ -5 & 6-\lambda\end{vmatrix} \\ &= (-4-\lambda)(6-\lambda)+50 \\ &= \lambda^{2}-2\lambda+26 \\ &= (\lambda-1)^{2}+25 \end{aligned} $$

The eigenvalues of $A$ are $\lambda_{1,2}=1\pm5i.$ Let us find the eigenvectors for $\lambda_{1}=1+5i.$

$$ \begin{bmatrix}-5-5i & 10 \\ -5 & 5-5i\end{bmatrix} \xrightarrow{R_{1}/(-5)\rightarrow R_{1}} \begin{bmatrix}1+i & -2 \\ -5 & 5-5i\end{bmatrix} \xrightarrow{R_{2}+\frac{5(1-i)}{2}R_{1}\rightarrow R_{2}} \begin{bmatrix}1+i & -2 \\ 0 & 0\end{bmatrix} $$


Hence $v=\begin{bmatrix}1-i \\ 1\end{bmatrix}$ is an eigenvector. Then

$$ \begin{aligned} x^{(1)} &= \begin{bmatrix}1-i \\ 1\end{bmatrix}e^{(1+5i)t} \\ &= \left(\begin{bmatrix}1 \\ 1\end{bmatrix}-i\begin{bmatrix}1 \\ 0\end{bmatrix}\right)e^{t}(\cos 5t+i\sin 5t) \\ &= e^{t}\left(\begin{bmatrix}\cos 5t \\ \cos 5t\end{bmatrix}+i\begin{bmatrix}\sin 5t \\ \sin 5t\end{bmatrix}-i\begin{bmatrix}\cos 5t \\ 0\end{bmatrix}-i^2\begin{bmatrix}\sin 5t \\ 0\end{bmatrix}\right) \\ &= e^{t}\left(\begin{bmatrix}\cos 5t+\sin 5t \\ \cos 5t\end{bmatrix}+i\begin{bmatrix}\sin 5t-\cos 5t \\ \sin 5t\end{bmatrix}\right) \end{aligned} $$

Looking at the real and imaginary parts of the solution, we find that

$$ y^{(1)}=\begin{bmatrix}e^{t}\cos 5t+e^{t}\sin 5t \\ e^{t}\cos 5t\end{bmatrix} $$

$$ y^{(2)}=\begin{bmatrix}e^{t}\sin 5t-e^{t}\cos 5t \\ e^{t}\sin 5t\end{bmatrix} $$

are solutions. Since $\lambda_{1}\ne\lambda_{2}$, the solutions $x^{(1)}$ and $x^{(2)}$ are linearly independent. Hence $y^{(1)}$ and $y^{(2)}$ are linearly independent.

All solutions of the system are

$$ x=c_{1}y^{(1)}+c_{2}y^{(2)} $$

with $c_{1}, c_{2}\in\mathbb{R}.$ $\square$
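The computation in Example 1.2 can be double-checked numerically. The sketch below (not part of the original notes) confirms the eigenvalues $1\pm 5i$ and tests $y^{\prime}=Ay$ for both real solutions via a central finite difference at an arbitrary sample time:

```python
import numpy as np

A = np.array([[-4.0, 10.0],
              [-5.0,  6.0]])

# Eigenvalues should be 1 - 5i and 1 + 5i.
lams = sorted(np.linalg.eigvals(A), key=lambda z: z.imag)
assert np.allclose(lams, [1 - 5j, 1 + 5j])

def y1(t):
    return np.exp(t) * np.array([np.cos(5*t) + np.sin(5*t), np.cos(5*t)])

def y2(t):
    return np.exp(t) * np.array([np.sin(5*t) - np.cos(5*t), np.sin(5*t)])

# Check y' = A y with a central finite difference at a sample time.
t, h = 0.3, 1e-6
for y in (y1, y2):
    dy = (y(t + h) - y(t - h)) / (2 * h)
    assert np.allclose(dy, A @ y(t), atol=1e-4)
```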