So far, we have been working with many matrices. We have also learned that this compact notation, the “matrix”, can describe a lot of things: linear equations, transformations, rotations, changes of basis… We can rely on matrices to do calculations that used to be hard to describe or impossible to carry out. However, in real life, even if we use matrices to cheat, the questions can still be really hard and tedious. In this chapter, we will focus on ONE way to make our lives easier. It also turns out that this specific method has a huge impact on many areas.

Our main character has a German name: Eigen. That is not because the person who invented the idea was German, but because David Hilbert, a German mathematician, gave it this classic German name. Since then, everyone has called it that.

Although this idea first arose in the study of quadratic forms and differential equations, we are not going to take that approach. Instead, let’s first define what it is, and then see what this bad boy can do.

So let’s look at an example matrix:

$$A = \begin{bmatrix} 1 & 0\\ 1 & 2 \\ \end{bmatrix}$$

If we treat this matrix as a transformation, applying it to any 2d vector gives you another 2d vector. But this is not a new story; we have done this a billion times before. What is interesting is that for some vectors in 2d space, their direction does not change after the transformation. What does that mean algebraically? Assume that for some vector \(\pmb{x}\), we have the relation:

$$A\pmb{x} = \lambda\pmb{x}$$

where \(\lambda\) is some scalar value and \(\pmb{x}\) is a non-zero vector. Notice that this value just tells us how the vector is being “stretched” or “compressed”. If \(\lambda\) turns out to be negative, it just means the direction is reversed.

Let’s look at one such vector. In our case, if we input the vector

$$\begin{bmatrix} -1\\ 1 \end{bmatrix}$$

and calculate the vector after the transformation, we get:

$$A\pmb{x} = \begin{bmatrix} 1 & 0\\ 1 & 2 \\ \end{bmatrix}\begin{bmatrix} -1\\ 1 \end{bmatrix} = \begin{bmatrix} -1\\ 1 \end{bmatrix}$$

What a coincidence! If we multiply both sides by 2, we get:

$$2\times A\pmb{x} = 2\begin{bmatrix} 1 & 0\\ 1 & 2 \\ \end{bmatrix}\begin{bmatrix} -1\\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0\\ 1 & 2 \\ \end{bmatrix}\begin{bmatrix} -2\\ 2 \end{bmatrix} = \begin{bmatrix} -2\\ 2 \end{bmatrix}$$

What does the above equation mean? It means that not only the vector

$$\begin{bmatrix} -1\\ 1 \end{bmatrix}$$

can do that, but any multiple of the vector

$$\begin{bmatrix} -1\\ 1 \end{bmatrix}$$

also satisfies the equation:

$$A\pmb{x} = \lambda\pmb{x}$$

For this specific vector, the output is exactly the same as the input, so in this case \(\lambda\) is just \(1\).
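
If you want to double-check the arithmetic, here is a minimal NumPy sketch; it only uses the matrix \(A\) and the vector from this example, so nothing new is assumed:

```python
import numpy as np

# The example matrix A and the vector x = [-1, 1]
A = np.array([[1, 0],
              [1, 2]])
x = np.array([-1, 1])

print(A @ x)        # [-1  1]  -> the vector comes back unchanged, so lambda = 1
print(A @ (2 * x))  # [-2  2]  -> any multiple of x behaves the same way
```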

If we can find such a vector and a corresponding \(\lambda\) that satisfy the equation \(A\pmb{x} = \lambda\pmb{x}\), then we call the vector \(\pmb{x}\) an eigenvector of our matrix \(A\), and we call the value \(\lambda\) the corresponding eigenvalue of the matrix.

Great, now we know what that means algebraically. Can we visualize it?

If you take the transformation and graph how it changes different input vectors, you will get something that looks like this:

(Figure: vectors of the 2d plane before and after the transformation \(A\))

As we expected, the vector

$$\begin{bmatrix} -1\\ 1 \end{bmatrix}$$

and all the vectors along that line stay on that line after the transformation. But if you pay closer attention, the vectors along the y-axis do not change their direction either. We can check this by multiplying any vector along the y-axis:

$$A\pmb{x}' = \begin{bmatrix} 1 & 0\\ 1 & 2 \\ \end{bmatrix}\begin{bmatrix} 0\\ 1 \end{bmatrix} =\begin{bmatrix} 0\\ 2 \end{bmatrix}$$

So any vector on the y-axis gets doubled after the transformation. What does that mean? Well, that vector did not change its direction (it is still pointing along the y-axis), so it is clearly another eigenvector of our matrix \(A\). What is the eigenvalue corresponding to that eigenvector? Since the vectors get doubled, the eigenvalue must be \(2\).
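
Again, a quick check along the lines of the earlier sketch (same matrix \(A\), a few vectors picked on the y-axis just for illustration) shows the doubling:

```python
import numpy as np

A = np.array([[1, 0],
              [1, 2]])

# A few arbitrary vectors along the y-axis
for y in ([0, 1], [0, 3], [0, -5]):
    print(A @ np.array(y))  # [0 2], [0 6], [0 -10]: each output is 2 times the input
```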

Wonderful! We have found two different eigenvectors and eigenvalues for a single matrix. Can we find more? Unfortunately, we cannot in this case. Later on, we will see why that is.

Now, you might be wondering: does that mean eigenvectors and eigenvalues exist for any matrix? If so, how could we find them without guessing? We will leave those questions to a future chapter.
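
As a small preview (this is not the method we will develop later, just a numerical shortcut), NumPy’s np.linalg.eig can already compute both for our example matrix:

```python
import numpy as np

A = np.array([[1, 0],
              [1, 2]])

# eig returns the eigenvalues and a matrix whose columns are unit-length eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

print(eigenvalues)   # the values 1 and 2 (the order is not guaranteed)
print(eigenvectors)  # columns proportional to [-1, 1] and [0, 1]
```

Up to scaling, these are exactly the two directions we found by hand.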
