In order to successfully complete this assignment you need to participate both individually and in groups during class. If you attend class in-person then have one of the instructors check your notebook and sign you out before leaving class. If you are attending asynchronously, turn in your assignment using D2L no later than 11:59pm on the day of class. See links at the end of this document for access to the class timeline for your section.
Image from: https://en.wikipedia.org/wiki/Diagonalizable_matrix
%matplotlib inline
import matplotlib.pylab as plt
import numpy as np
import sympy as sym
sym.init_printing()
Reminder: The eigenvalues of triangular (upper and lower) and diagonal matrices are easy to find: they are the entries on the main diagonal.
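A quick check of this fact with sympy (the matrix below is just an illustrative example, not part of the assignment):

```python
import sympy as sym

# An upper triangular matrix: its eigenvalues are the diagonal entries.
T = sym.Matrix([[2, 5, 1],
                [0, 3, 7],
                [0, 0, 4]])

# eigenvals() returns a dictionary {eigenvalue: algebraic multiplicity}.
print(T.eigenvals())  # {2: 1, 3: 1, 4: 1}
```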
Definition: A square matrix $A$ is said to be diagonalizable if there exists an invertible matrix $C$ such that $D=C^{-1}AC$ is a diagonal matrix.
Definition: $B$ is similar to $A$ if there exists an invertible matrix $C$ such that $B=C^{-1}AC$.
Given an $n\times n$ matrix $A$, can we find an invertible $n \times n$ matrix $C$ such that $D=C^{-1}AC$ is diagonal? That is, is $A$ diagonalizable?
Then we check the corresponding columns of both sides. We have $$Ax_1 = \lambda_1 x_1,\quad \ldots,\quad Ax_n = \lambda_n x_n.$$
$A$ has $n$ linearly independent eigenvectors.
$A$ is said to be similar to the diagonal matrix $D$, and the transformation of $A$ into $D$ is called a similarity transformation.
Consider the following: $$ A = \begin{bmatrix}7& -10\\3& -4\end{bmatrix},\quad C = \begin{bmatrix}2& 5\\1& 3\end{bmatrix}$$
✅ Do this: Find the similar matrix $D = C^{-1}AC$ of $A$.
#Put your answer to the above question here.
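One possible way to carry out the computation with sympy (a sketch; the matrices are the ones given above):

```python
import sympy as sym

A = sym.Matrix([[7, -10], [3, -4]])
C = sym.Matrix([[2, 5], [1, 3]])

# Similarity transformation: for this choice of C the result is diagonal.
D = C.inv() * A * C
print(D)  # Matrix([[2, 0], [0, 1]])
```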
from answercheck import checkanswer
checkanswer.matrix(D, '8313fe0f529090d6a8cdb36248cfdd6c');
✅ Do this: Find the eigenvalues and eigenvectors of $A$. Set the variables e1 and vec1 to the smallest eigenvalue and its associated eigenvector, and e2 and vec2 to the largest.
#Put your answer to the above question here.
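One way to inspect the eigenvalues and eigenvectors with sympy (a sketch of the relevant call, not necessarily the intended solution):

```python
import sympy as sym

A = sym.Matrix([[7, -10], [3, -4]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, [eigenvectors])
# tuples; from these you can pick out the smallest and largest eigenvalue.
for val, mult, vecs in A.eigenvects():
    print(val, vecs)
```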
from answercheck import checkanswer
checkanswer.float(e1, "e4c2e8edac362acab7123654b9e73432");
from answercheck import checkanswer
checkanswer.float(e2, "d1bd83a33f1a841ab7fda32449746cc4");
from answercheck import checkanswer
checkanswer.eq_vector(vec1, "d28f0a721eedb3d5a4c714744883932e", decimal_accuracy = 4)
from answercheck import checkanswer
checkanswer.eq_vector(vec2, "09d9df5806bc8ef975074779da1f1023", decimal_accuracy = 4)
Theorem: Similar matrices have the same eigenvalues.
Proof: Assume $B=C^{-1}AC$ is similar to $A$, and $\lambda$ is an eigenvalue of $A$ with corresponding eigenvector $x$. That is, $$Ax=\lambda x.$$ Then we have $$B(C^{-1}x) = C^{-1}AC(C^{-1}x) = C^{-1}Ax = C^{-1}(\lambda x)= \lambda (C^{-1}x).$$ That is, $C^{-1}x$ is an eigenvector of $B$ with eigenvalue $\lambda$.
✅ Do this: Consider
$$ A = \begin{bmatrix}-4& -6\\3& 5\end{bmatrix}.$$
Find a matrix $C$ such that $C^{-1}AC$ is diagonal. (Hint: use the function diagonalize in sympy.)
#Put your answer to the above question here.
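A sketch of the hinted approach (sympy's diagonalize returns the pair $(C, D)$ with $A = CDC^{-1}$):

```python
import sympy as sym

A = sym.Matrix([[-4, -6], [3, 5]])

# diagonalize() returns (C, D) such that A = C * D * C**-1.
C, D = A.diagonalize()
print(C)
print(D)
```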
#Check the output type
assert(type(C)==sym.Matrix)
from answercheck import checkanswer
checkanswer.matrix(C,'ba963b7fef354b4a7ddd880ca4bac071')
✅ Do this: Consider
$$ A = \begin{bmatrix}5& -3\\3& -1\end{bmatrix}.$$
Can we find a matrix $C$ such that $C^{-1}AC$ is diagonal? (Hint: find the eigenvalues and eigenvectors using sympy.)
#Put your answer to the above question here.
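A sketch of what the hinted computation reveals: $\lambda = 2$ is a double root of the characteristic equation, but its eigenspace is only one-dimensional, so $A$ has too few independent eigenvectors to be diagonalized:

```python
import sympy as sym

A = sym.Matrix([[5, -3], [3, -1]])

# Only one eigenvector for the double eigenvalue 2, so A is not diagonalizable.
print(A.eigenvects())
print(A.is_diagonalizable())  # False
```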
Definition: The set of all eigenvectors of an $n\times n$ matrix corresponding to an eigenvalue $\lambda$, together with the zero vector, is a subspace of $R^n$. This subspace is called the eigenspace of $\lambda$.
The dimension of an eigenspace of a matrix is less than or equal to the multiplicity of the corresponding eigenvalue as a root of the characteristic equation.
A matrix is diagonalizable if and only if the dimension of every eigenspace is equal to the multiplicity of the corresponding eigenvalue as a root of the characteristic equation.
✅ Do this: Consider $$ A = \begin{bmatrix}2& -1\\1& 2\end{bmatrix}.$$ Can we find a matrix $C$ such that $C^{-1}AC$ is diagonal?
#Put your answer to the above question here.
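A sketch of the key observation for this matrix: its eigenvalues are complex ($2 \pm i$), so it can be diagonalized over the complex numbers, but not with a real matrix $C$:

```python
import sympy as sym

A = sym.Matrix([[2, -1], [1, 2]])

# Complex eigenvalues 2 - I and 2 + I; diagonalization works over C.
print(A.eigenvals())
C, D = A.diagonalize()
print(D)
```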
# Here are some libraries you may need to use
%matplotlib inline
import numpy as np
import sympy as sym
import networkx as nx
import matplotlib.pyplot as plt
sym.init_printing(use_unicode=True)
**DO THIS**: Using matrix algebra, show that $\frac{1}{2}(I + AD^{-1})$ is similar to $I-\frac{1}{2}N$, where $N=D^{-\frac{1}{2}}(D-A)D^{-\frac{1}{2}}$ is the normalized graph Laplacian.
Your answer goes here (follow along after attempting)
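One possible route (a sketch, assuming every vertex has positive degree so that $D^{\pm 1/2}$ exist): conjugate by $C = D^{1/2}$ and expand both sides.

```latex
% Conjugating the lazy-walk matrix by C = D^{1/2}:
D^{-1/2}\left[\tfrac{1}{2}\bigl(I + AD^{-1}\bigr)\right]D^{1/2}
  = \tfrac{1}{2}\bigl(I + D^{-1/2}AD^{-1/2}\bigr).
% Expanding the normalized-Laplacian side gives the same matrix:
I - \tfrac{1}{2}N
  = I - \tfrac{1}{2}D^{-1/2}(D-A)D^{-1/2}
  = \tfrac{1}{2}\bigl(I + D^{-1/2}AD^{-1/2}\bigr).
```

Since both expressions reduce to the same matrix, $\frac{1}{2}(I + AD^{-1})$ and $I-\frac{1}{2}N$ are similar via $C = D^{1/2}$.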
To generate the barbell graph, run the following cell.
n = 60 # number of nodes
B = nx.Graph() # initialize graph
## initialize empty edge lists
edge_list_complete_1 = []
edge_list_complete_2 = []
edge_list_path = []
## generate node lists
node_list_complete_1 = np.arange(int(n/3))
node_list_complete_2 = np.arange(int(2*n/3),n)
node_list_path = np.arange(int(n/3)-1,int(2*n/3))
## generate edge sets for barbell graph
for u in node_list_complete_1:
    for v in np.arange(u+1, int(n/3)):
        edge_list_complete_1.append((u, v))
for u in node_list_complete_2:
    for v in np.arange(u+1, n):
        edge_list_complete_2.append((u, v))
for u in node_list_path:
    edge_list_path.append((u, u+1))
## add edges
B.add_edges_from(edge_list_complete_1)
B.add_edges_from(edge_list_complete_2)
B.add_edges_from(edge_list_path)
## draw graph
pos=nx.spring_layout(B) # positions for all nodes
### nodes
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_complete_1),
                       node_color='c',
                       node_size=400,
                       alpha=0.8)
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_path),
                       node_color='g',
                       node_size=200,
                       alpha=0.8)
nx.draw_networkx_nodes(B, pos,
                       nodelist=list(node_list_complete_2),
                       node_color='b',
                       node_size=400,
                       alpha=0.8)
### edges
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_complete_1,
                       width=2, alpha=0.5, edge_color='c')
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_path,
                       width=3, alpha=0.5, edge_color='g')
nx.draw_networkx_edges(B, pos,
                       edgelist=edge_list_complete_2,
                       width=2, alpha=0.5, edge_color='b')
plt.axis('off')
plt.show() # display
✅ Do this: Generate the lazy random walk matrix, $W$, for the above graph.
A = nx.adjacency_matrix(B)
A = A.todense()
d = np.sum(A,0) # Make a vector of the sums.
D = np.diag(np.asarray(d)[0])
#Put your answer to the above question here.
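A sketch of the construction on a small self-contained example (a path graph on 3 nodes, chosen here only for illustration); in the notebook, the same formula applies to the A and D computed above:

```python
import numpy as np

# Toy adjacency matrix for a 3-node path graph.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
D = np.diag(A.sum(axis=0))  # diagonal degree matrix

# Lazy random walk matrix: W = (1/2)(I + A D^{-1}).
# Each column of W sums to 1 (it is column-stochastic).
W = 0.5 * (np.eye(A.shape[0]) + A @ np.linalg.inv(D))
print(W.sum(axis=0))  # [1. 1. 1.]
```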
from answercheck import checkanswer
checkanswer.matrix(W, "7af4a5b11892da6e1a605c8239b62093")
✅ Do this: Compute the eigenvalues and eigenvectors of $W$. Make a diagonal matrix $A$ with the eigenvalues on the diagonal. Name the matrix of eigenvectors $V$ (each column is an eigenvector).
#Put your answer to the above question here.
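A sketch of the relevant numpy calls on a small stand-in matrix (M below is an arbitrary example; in the notebook apply the same calls to W):

```python
import numpy as np

M = np.array([[0.5, 0.25],
              [0.5, 0.75]])

# eig returns the eigenvalues and a matrix whose columns are eigenvectors.
vals, V = np.linalg.eig(M)
A = np.diag(vals)  # diagonal matrix of eigenvalues

# Verify the eigendecomposition: M = V A V^{-1}.
print(np.allclose(M, V @ A @ np.linalg.inv(V)))  # True
```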
Now we make sure we constructed $V$ and $A$ correctly by double-checking that $W = VAV^{-1}$:
np.allclose(W, V @ A @ np.linalg.inv(V))
✅ Do this: Let $p_{0}=[1,0,0,\ldots,0]^T$. Compute $p_{t}$ for $t=1,2,\ldots,100$, and plot $||v_{1}-p_{t}||_{1}$ versus $t$, where $v_{1}$ is the eigenvector associated with the largest eigenvalue $\lambda_{1}=1$, scaled so that its entries sum to 1. (Note: $||\cdot||_{1}$ may be computed using np.linalg.norm(v_1 - p_t, 1).)
#Put your answer to the above question here.
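A sketch of the iteration on a tiny lazy walk (the 3-node path graph is again just an illustrative stand-in; in the notebook, use the W and v_1 computed above and plot the distances with matplotlib):

```python
import numpy as np

# Toy lazy random walk on a 3-node path graph.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
D = np.diag(A.sum(axis=0))
W = 0.5 * (np.eye(3) + A @ np.linalg.inv(D))

# Stationary distribution: eigenvector for eigenvalue 1, i.e. the
# degree vector normalized to sum to 1.
v1 = np.diag(D) / np.diag(D).sum()

p = np.array([1., 0., 0.])  # p_0 concentrated on the first node
dists = []
for t in range(100):
    p = W @ p
    dists.append(np.linalg.norm(v1 - p, 1))
# dists[t] decays toward 0 as the walk mixes.
```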
If you complete the above, do the same for a complete graph on the same number of nodes.
✅ Question: What do you notice about the graph that is different from that above?
Put your answer to the above question here.
If you attend class in-person then have one of the instructors check your notebook and sign you out before leaving class. If you are attending remotely, turn in your assignment using D2L.
Written by Dirk Colbry, Michigan State University
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.