14 Pre-Class Assignment: Diagonalization and Powers
Contents
14 Pre-Class Assignment: Diagonalization and Powers#
Readings for this topic (Recommended in bold)#
Goals for today’s pre-class assignment#
1. Eigenvalues and eigenvectors review#
Definition: A non-zero vector \(x\) in \(\mathbb R^n\) is called an eigenvector of an \(n\times n\) matrix \(A\) if \(Ax\) is a scalar multiple of \(x\) (i.e., \(Ax\) has the same direction as \(x\)). If \(Ax = \lambda x\), then \(\lambda\) is called the eigenvalue of \(A\) corresponding to \(x\).
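As a quick numerical sanity check of this definition (using a made-up \(2\times 2\) matrix), we can confirm that \(Ax\) is a scalar multiple of \(x\):

```python
import numpy as np

# A made-up 2x2 example: check that x = [1, 1] is an eigenvector of A
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
x = np.array([1.0, 1.0])

Ax = A @ x          # A maps x to [3, 3], which is 3 * x
lam = Ax[0] / x[0]  # the scalar multiple, i.e. the eigenvalue

print(lam)                       # 3.0
print(np.allclose(Ax, lam * x))  # True: Ax has the same direction as x
```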
Steps for finding the eigenvalues and eigenvectors#
We want to find a scalar \(\lambda\) and a non-zero vector \(x\) such that \(Ax=\lambda x\) for an \(n\times n\) matrix \(A\).
We introduce the \(n\times n\) identity matrix \(I\). Then the equation becomes
$$Ax = \lambda I x$$
$$Ax - \lambda I x = 0$$
$$(A-\lambda I)x = 0$$
This suggests that we want to find \(\lambda\) such that \((A-\lambda I)x=0\) has a non-trivial solution. This is equivalent to the matrix \(A-\lambda I\) being singular, i.e., having a determinant of \(0\):
$$|A-\lambda I|=0$$
The determinant is a polynomial in \(\lambda\) (called the characteristic polynomial of \(A\)) with degree \(n\). We solve this equation (called the characteristic equation) for all possible \(\lambda\) (eigenvalues).
After finding the eigenvalues, we substitute them back into \((A-\lambda I)x=0\) and find the eigenvectors \(x\).
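The whole recipe above can be sketched on a small made-up example with sympy (the matrix here is chosen only for illustration):

```python
import sympy as sym

# Steps 1-4 of the recipe on a small 2x2 example matrix
lam = sym.symbols('lambda')
A = sym.Matrix([[2, 1],
                [1, 2]])
I = sym.eye(2)

char_poly = (A - lam * I).det()          # characteristic polynomial in lambda
eigenvalues = sym.solve(char_poly, lam)  # roots of the characteristic equation
print(eigenvalues)                       # [1, 3]

# Substitute each eigenvalue back: the eigenvectors span the
# nullspace of (A - lambda*I)
for ev in eigenvalues:
    print(ev, (A - ev * I).nullspace())
```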
Note:#
A matrix with real entries can have complex eigenvalues/eigenvectors that come in conjugate pairs. Below is a good read on imaginary eigenvalues.
Short Reading on Imaginary Eigenvalues#
Read pages 223-224 of Prof. Gilbert Strang’s Introduction to Linear Algebra book on Imaginary Eigenvalues.
Let’s calculate eigenvalues for the following matrix:
Find eigenvalues#
Looking at the above recipe, let’s solve the problem symbolically using sympy
. First, let’s create the matrix \(B = A - \lambda I\):
%matplotlib inline
import matplotlib.pylab as plt
import numpy as np
import sympy as sym
sym.init_printing()
# Most sympy operations require defining the variables as "symbols".
# Once we do this we can use the variables in place of numbers.
lam = sym.symbols('lambda')
A = sym.Matrix([[0, 0, -2], [1, 2, 1], [1, 0, 3]])
I = sym.eye(3)
B = A - lam*I
B
Now, per step 2, the determinant of \(B\) must be zero. Note that sympy
calculates the determinant symbolically as follows:
B.det()
Note that the above was done for illustrative purposes; using A.charpoly()
produces a similar result (up to an overall sign).
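To see how the two approaches relate, the sketch below compares the two computations for the matrix above; since sympy’s charpoly computes \(\det(\lambda I - A)\) rather than \(\det(A - \lambda I)\), the results for a \(3\times 3\) matrix differ only by an overall sign:

```python
import sympy as sym

lam = sym.symbols('lambda')
A = sym.Matrix([[0, 0, -2],
                [1, 2, 1],
                [1, 0, 3]])

det_form = (A - lam * sym.eye(3)).det()    # det(A - lambda*I)
charpoly_form = A.charpoly(lam).as_expr()  # det(lambda*I - A), monic

# For a 3x3 matrix the two differ only by the sign (-1)**3
print(sym.simplify(det_form + charpoly_form))  # 0
```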
✅ Do This: Use the sympy.roots
function on the determinant of \(B\) to solve the characteristic equation for \(\lambda\); the roots are the eigenvalues of \(A\). Verify that they are consistent with the determinant computed above.
# Put your code to solve for det(B) = 0 here
✅ Do This: To do this entire procedure in one step, the .eigenvals()
method can be used on \(A\) to calculate the eigenvalues.
# Put your code here
✅ Do This: Since this example has a cubic characteristic polynomial, we should expect three roots from the solution, but here we only see two. What is the explanation for this? .eigenvals()
outputs a dictionary object with the eigenvalues as keys. What do the values associated with these keys indicate in the sympy
output?
Put your answer to these questions here.
Find eigenvectors#
Now that we know the eigenvalues, we can substitute them back into the equation to find the eigenvectors.
We solve this symbolically using sympy
. First let’s make a vector of our eigenvalues (from above):
eig = sorted(A.eigenvals().keys())
Now (per step 4 above) we need to solve the equation \((A-\lambda I)x=0\). One way to do this in sympy
is as follows:
x1,x2,x3 = sym.symbols(['x_1','x_2','x_3'])
x = sym.Matrix([[x1],[x2],[x3]])
x
for lam in eig:
    vec = sym.solve((A - lam*I)*x, x)
    print(lam, vec)
✅ QUESTION: Try to explain the output here. (Hint: eigenvectors can be scaled arbitrarily, so x_3 will always be a free variable, but what about x_2 for the second eigenvalue?)
Put your answer here
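An alternative to sym.solve that makes the structure of the solution explicit is to compute the nullspace of \(A - \lambda I\) directly; the number of basis vectors returned for each eigenvalue is the dimension of its eigenspace:

```python
import sympy as sym

# The eigenvectors for each eigenvalue span the nullspace of (A - lambda*I)
A = sym.Matrix([[0, 0, -2],
                [1, 2, 1],
                [1, 0, 3]])

for ev in sorted(A.eigenvals().keys()):
    basis = (A - ev * sym.eye(3)).nullspace()
    print(ev, basis)
```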
✅ Do This: Next, let’s use the .eigenvects()
method from sympy
to find three linearly independent eigenvectors for the matrix \(A\).
# Put your answer to the above question here
✅ QUESTION: Compare this answer to our calculation above. Does this answer make sense? What does the syntax tell us?
Put your answer here
✅ DO THIS: Find the eigenvalues and eigenvectors of the following matrix:
$$A2=\begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix}$$
✅ QUESTION: What are the eigenvalues for the matrix \(A2\)?
Put your answer to the above question here
✅ QUESTION: What are the eigenvectors for the matrix \(A2\)?
Put your answer to the above question here
2. Diagonalizable Matrix#
In class we will be using matrix diagonalization to solve some problems.
Definition: If \(A\) and \(B\) are \(n\times n\) (square) matrices, then \(A\) is said to be similar to \(B\) if there exists an invertible matrix, \(P\), such that \(A = P^{-1}BP\).
Definition: Matrix \(A\) is diagonalizable if there exists a diagonal matrix \(D\) that is similar to \(A\):
$$D = C^{-1}AC$$
If matrix \(A\) has \(n\) linearly independent eigenvectors (\(v_1, \ldots, v_n\)) then \(A\) is diagonalizable with those eigenvectors forming a basis:
$$A = CDC^{-1}$$
Each column of \(C\) is one of the linearly independent eigenvectors of \(A\). The diagonal matrix \(D\) is composed of those eigenvectors’ corresponding eigenvalues:
$$D = \begin{bmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \lambda_n \end{bmatrix}$$
One can use the fact that \(\mbox{det}(A)=\mbox{det}(P^{-1}BP)= \mbox{det}(P^{-1})\mbox{det}(B)\mbox{det}(P)\) to prove the following theorem.
Theorem: If the matrices \(A\) and \(B\) are similar to each other, then \(A\) and \(B\) have the same characteristic polynomial, and hence have the same eigenvalues.
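A quick numerical illustration of this theorem (using a made-up triangular matrix, so the eigenvalues can be read straight off the diagonal):

```python
import numpy as np

# A triangular matrix with known eigenvalues 2, 3, 5
B = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])
# Any invertible P gives a matrix A = P^{-1} B P that is similar to B
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
A = np.linalg.inv(P) @ B @ P

# Similar matrices share eigenvalues (sorted, up to floating-point error)
print(np.sort(np.linalg.eigvals(A).real))  # approximately [2. 3. 5.]
```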
✅ DO THIS: Look at the two matrices used in the earlier section. Which of them are diagonalizable?
Put your answer here.
%matplotlib inline
import matplotlib.pylab as plt
import numpy as np
import sympy as sym
sym.init_printing(use_unicode=True)
✅ DO THIS: Using numpy
, diagonalize (i.e., calculate \(C\) and \(D\) for) the following matrix (Hint: consider np.diag
):
A = np.matrix([[5, -2, 2], [4, -3, 4], [4,-6,7]])
sym.Matrix(A)
# Put your answer here
from answercheck import checkanswer
checkanswer.matrix(D,'56821475223b52e0b6e751da444a1441');
✅ DO THIS: Verify that \(A\) was diagonalized by confirming that \(A\) is equal to \(CDC^{-1}\), using np.allclose
. (Note that np.linalg.eig
returns the eigenvectors as the columns of \(C\), so the reconstruction is \(A = CDC^{-1}\); equivalently \(D = C^{-1}AC\), which fits our original definition of similar matrices.)
# Put your verification code here: compute A2 = C D C^{-1}, then check
np.allclose(A,A2)
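As a pattern for this kind of verification, here is the complete workflow on a smaller made-up matrix (not the graded one above):

```python
import numpy as np

# Diagonalization workflow on a small made-up matrix
M = np.array([[4.0, 1.0],
              [2.0, 3.0]])

evals, C = np.linalg.eig(M)  # columns of C are the eigenvectors
D = np.diag(evals)           # eigenvalues on the diagonal

# Reconstruct M as C D C^{-1} and verify the diagonalization
M2 = C @ D @ np.linalg.inv(C)
print(np.allclose(M, M2))    # True
```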
Benefits of Diagonalization#
Consider taking the square of a diagonalizable matrix \(A\):
$$A^2 = (CDC^{-1})(CDC^{-1}) = CD(C^{-1}C)DC^{-1} = CD^2C^{-1}$$
As can be seen in this brief derivation, the square of a diagonalizable matrix reduces to the square of the diagonal matrix \(D\), with \(C\) and \(C^{-1}\) on either side as before. Squaring a diagonal matrix is particularly easy, and as we will see, using diagonalization can lead to significant gains in computation speed when it comes to matrix powers.
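As a sketch of this idea in practice (using a made-up \(2\times 2\) matrix), we can compute \(A^k\) both through the diagonalization and by repeated multiplication and confirm they agree:

```python
import numpy as np

# Powers via diagonalization: A^k = C D^k C^{-1}
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
evals, C = np.linalg.eig(A)
C_inv = np.linalg.inv(C)

k = 10
Ak_diag = C @ np.diag(evals**k) @ C_inv   # power of the diagonal matrix
Ak_direct = np.linalg.matrix_power(A, k)  # repeated matrix multiplication

print(np.allclose(Ak_diag, Ak_direct))    # True
```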
✅ QUESTION: Why is it particularly easy to take the power of a diagonal matrix? If the reason is not clear, try out an example.
Put the answer to the above question here.
Congratulations, we’re done!#
Written by Dr. Dirk Colbry, Michigan State University
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.