This is the second post on eigenvectors and eigenvalues. Here is the first. Before I start, though, I want to emphasize something I think a lot of people struggle to visualize: the operation a matrix performs on a vector.

## Multiplying a matrix by a vector

When you multiply a vector \(x\) by a matrix \(A\), a few different things can happen. The first is a change of dimension. If \(A\) is an \(m \times n\) matrix and \(x\) is an \(n \times 1\) vector (notice how the \(n\)-dimension has to line up), then the vector \(b\) given by \(Ax = b\) will be \(m \times 1\). So unless \(A\) is a square \(n \times n\) matrix, the resulting vector won’t live in the same dimension as \(x\) (this hints that only square matrices have eigenvalues…why?). Here’s a concrete example:

$$\begin{bmatrix} 1 & 2 \\ 3 & 7 \\ 4 & 9 \end{bmatrix} \begin{bmatrix} 1 \\ 3 \end{bmatrix} = \begin{bmatrix} (1 \cdot 1) + (2 \cdot 3) \\ (3 \cdot 1) + (7 \cdot 3) \\ (4 \cdot 1) + (9 \cdot 3) \end{bmatrix} = \begin{bmatrix} 7 \\ 24 \\ 31 \end{bmatrix} $$

Recall that when we multiply a matrix by a vector, we take the dot product of each row of the matrix with the vector. The vector \(x\) lies in \(\mathbb{R}^2\), but the result \(Ax\) lies in \(\mathbb{R}^3\).
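As a quick sanity check (assuming NumPy is available), here is the same computation in code, showing the change of dimension:

```python
import numpy as np

# A is 3x2, x lives in R^2, so Ax lands in R^3.
A = np.array([[1, 2],
              [3, 7],
              [4, 9]])
x = np.array([1, 3])

b = A @ x
print(b)        # [ 7 24 31]
print(x.shape)  # (2,)  -- x is in R^2
print(b.shape)  # (3,)  -- Ax is in R^3
```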

So, applying a matrix \(A\) to a vector \(x\) can do a number of things—change the dimension, rotate it, etc. We also established that if \(x\) is an eigenvector of \(A\), then \(A\) is simply scaling it (not changing the direction or dimension, only the length of the vector). You can refer back to the previous article for a more precise definition.

I discussed in the previous article how to find the eigenvalues of a matrix A, but not the eigenvectors. That’s what I want to explore here. Furthermore, you’ll see that we need the eigenvalues in order to find the eigenvectors.

## How to find the eigenvectors of a matrix

Recall that we can manipulate the definition of an eigenvector (and eigenvalue) in order to arrive at \((A-\lambda I)x = 0\) .

$$\Large{\begin{align} &1.\ Ax = \lambda x \\ &2.\ Ax = \lambda Ix \\ &3.\ Ax - \lambda Ix = 0 \\ &4.\ (A - \lambda I)x = 0 \end{align}}$$

From equation 4, it’s clear that the matrix \((A - \lambda I)\) sends the vector \(x\) to \(0\). That is, \(x\) is in the null space of \((A - \lambda I)\). Note: \(0\) in this case is the zero vector \(\vec{0}\), not just the scalar 0.

Side note about the null space (skip this if you’re comfortable with linear algebra) –

If we have some matrix \(A\) and some vector \(x\) with \(Ax = 0\), then \(x\) is said to be in the null space of \(A\). Since multiplying a matrix by a vector represents a system of equations, this condition can be written out as the system below.

$$Ax = 0 \text{ is equivalent to } \\ \begin{align} & a_{11}x_1 + a_{12}x_2 + … + a_{1n}x_n = 0 \\ & a_{21}x_1 + a_{22}x_2 + … + a_{2n}x_n = 0 \\ & \hspace{15pt} \vdots \\ & a_{m1}x_1 + a_{m2}x_2 + … + a_{mn}x_n = 0 \end{align}$$

Or, for a concrete example,

$$A = \begin{bmatrix} 2 & 2 & 3 \\ 6 & 6 & 9\\ 1 & 4 & 8 \end{bmatrix}, \text{ and}$$

$$\begin{bmatrix} 2 & 2 & 3 \\ 6 & 6 & 9\\ 1 & 4 & 8 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2\\ x_3 \end{bmatrix} = \vec{0}, \text{ so}$$

$$\begin{align} & 2x_1 + 2x_2 + 3x_3 = 0 \\ & 6x_1 + 6x_2 + 9x_3 = 0 \\ & 1x_1 + 4x_2 + 8x_3 = 0 \end{align}$$
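For this particular \(A\), the null space is nontrivial (notice the second row is 3 times the first). One nontrivial solution, found by back-substitution, is \(v = (4, -13, 6)\), and it’s easy to check in code (assuming NumPy):

```python
import numpy as np

A = np.array([[2, 2, 3],
              [6, 6, 9],
              [1, 4, 8]])

# A nontrivial solution of Ax = 0, found by hand via back-substitution.
v = np.array([4, -13, 6])
print(A @ v)  # [0 0 0] -- so v is in the null space of A
```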

This relationship as well as Gaussian Elimination form the whole basis (?) for Linear Algebra.

Back to Eigenvectors…

SO, we now know that \(x\) is in the null space of \((A - \lambda I)\), and we have \(\lambda\), because we solved for the eigenvalues first (I posted in this order for a reason…). The problem is now simply: find a basis for the null space of \((A - \lambda I)\), which is just a matrix. All we have to do is solve a system of equations, which is done through Gaussian Elimination. It almost always boils down to Gaussian Elimination.

Alright, so let’s do an example.

$$A = \begin{bmatrix} 4 & 2 & 3 \\ -1 & 1 & -3\\ 2 & 4 & 9 \end{bmatrix}, \text{ and } \lambda = 3 \text{. Find an eigenvector of A.}$$

$$A - 3I = \begin{bmatrix} 1 & 2 & 3 \\ -1 & -2 & -3\\ 2 & 4 & 6 \end{bmatrix}, \text{ then we reduce the matrix through Gaussian Elimination…}$$

$$\rightarrow \begin{bmatrix} 1 & 2 & 3 \\ 0 & 0 & 0\\ 0 & 0 & 0 \end{bmatrix}$$

Ok good, so we have two free variables. If we didn’t, we wouldn’t get an eigenvector (meaning, if there were no free variables, the columns would be linearly independent, and the null space would contain only the trivial solution).
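If you want to double-check the row reduction, SymPy (assuming it’s installed) can compute the reduced row echelon form directly:

```python
from sympy import Matrix

# (A - 3I) from the example above
M = Matrix([[ 1,  2,  3],
            [-1, -2, -3],
            [ 2,  4,  6]])

rref, pivots = M.rref()
print(rref)    # Matrix([[1, 2, 3], [0, 0, 0], [0, 0, 0]])
print(pivots)  # (0,) -- one pivot column, so two free variables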

$$\begin{align} & x_1 = -2s – 3t \\ & x_2 = s \\ & x_3 = t \end{align}$$

$$\vec{x} = \begin{bmatrix} -2s-3t \\ s\\ t \end{bmatrix} = s \begin{bmatrix} -2 \\ 1\\ 0 \end{bmatrix} + t \begin{bmatrix} -3 \\ 0 \\ 1 \end{bmatrix},\ \ \ s,t \in \mathbb{R}$$

These two vectors form a basis for the null space of \((A - \lambda I)\), so they are eigenvectors of \(A\). In fact, any nonzero vector that lies in the null space of \((A - \lambda I)\) is an eigenvector of \(A\), so any nonzero linear combination of those two basis vectors is also an eigenvector.


$$\text{so every eigenvector of } A \text{ for } \lambda = 3 \text{ is a nonzero linear combination of} \\ \Bigg\{ \begin{bmatrix} -2 \\ 1\\ 0 \end{bmatrix}, \begin{bmatrix} -3 \\ 0 \\ 1 \end{bmatrix} \Bigg\}$$

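We can verify all of this numerically (assuming NumPy): each basis vector, and any nonzero combination of them, satisfies \(Av = 3v\).

```python
import numpy as np

A = np.array([[ 4, 2,  3],
              [-1, 1, -3],
              [ 2, 4,  9]])

v1 = np.array([-2, 1, 0])
v2 = np.array([-3, 0, 1])

print(A @ v1)  # [-6  3  0] == 3 * v1
print(A @ v2)  # [-9  0  3] == 3 * v2

# Any nonzero linear combination is also an eigenvector for lambda = 3:
w = 2 * v1 - 5 * v2
print(np.allclose(A @ w, 3 * w))  # True
```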

One more question. *What happens if 0 is an eigenvalue?*

Then we can prove the matrix \(A\) is not invertible.

$$\begin{align} & A\vec{v} = \lambda \vec{v} \text{, and } \lambda = 0, \\ & A\vec{v} = 0 \vec{v} \\ & A\vec{v} = \vec{0}, \text{ and } \vec{v} \neq \vec{0} \text{ by definition} \end{align}$$

So there is a non-zero vector \(\vec{v}\) that \(A\) sends to \(\vec{0}\). In other words, there is a non-trivial solution in \(Null(A)\). Therefore (by the invertible matrix theorem), \(A\) is not invertible.
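A quick numerical illustration (assuming NumPy): the matrix \(A - 3I\) from our example has a nontrivial null space, so its determinant is 0, which is exactly what "not invertible" looks like in code.

```python
import numpy as np

# (A - 3I) from the example has a nontrivial null space, so it is singular.
M = np.array([[ 1,  2,  3],
              [-1, -2, -3],
              [ 2,  4,  6]])

det = np.linalg.det(M)
print(np.isclose(det, 0))  # True -- determinant is 0 up to floating-point error
```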

Well, that’s all for today! I hope you learned something about linear algebra. Also, if you picked up on my basis pun, I appreciate you. Haha