In order to successfully complete this assignment, you need to participate both individually and in groups during class. If you attend class in person, have one of the instructors check your notebook and sign you out before leaving class. If you are attending asynchronously, turn in your assignment using D2L no later than 11:59 pm on the day of class. See the links at the end of this document for access to the class timeline for your section.
Let $Ax = y$ be a system of $m$ linear equations in $n$ variables. A least squares solution of $Ax = y$ is a vector $\hat{x}$ in $R^n$ that achieves
$$ \min_{x}\|y - Ax\|.$$Note that we substitute $y$ for our typical variable $b$ here because we will use $b$ later to represent the intercept of a line, and we want to avoid confusion in notation. It is also consistent with the picture above.
In other words, $\hat{x}$ is a value of $x$ for which $Ax$ is as close as possible to $y$. From previous lectures, we know this to be true if the vector $$y - A\hat{x}$$ is orthogonal (perpendicular) to the column space of $A$.
We also know that the dot product of two vectors is zero if they are orthogonal. So we have
$$a \cdot (A\hat{x} - y) = 0, $$
for every vector $a$ in the column space of $A$.
The columns of $A$ span the column space of $A$. Denote the columns of $A$ as $$A = [a_1, \cdots, a_n].$$ Then we have $$a_1^\top (A\hat{x} - y) = 0, \\ a_2^\top(A\hat{x}-y)=0,\\\vdots \\a_n^\top(A\hat{x}-y)=0.$$ This is the same as taking the transpose of $A$ and doing a matrix multiplication: $$A^\top (A\hat{x} - y) = 0.$$
That is:
$$A^\top A\hat{x} = A^\top y$$
The above equation (known as the normal equation) characterizes the least squares solution of the original system $Ax=y$. The matrix $A^\top A$ is symmetric, and it is invertible whenever the columns of $A$ are linearly independent. In that case, $\hat{x}$ can be calculated as follows:
$$\hat{x} = (A^\top A)^{-1}A^\top y$$The matrix $(A^\top A)^{-1}A^\top$ is also called the left inverse of $A$.
Example: A researcher has conducted experiments with a particular hormone dosage in a population of rats. The table shows the number of fatalities at each dosage level tested. Determine the least squares line and use it to predict the number of rat fatalities at a hormone dosage of 22.
Hormone level | 20 | 25 | 30 | 35 | 40 | 45 | 50 |
---|---|---|---|---|---|---|---|
Fatalities | 101 | 115 | 92 | 64 | 60 | 50 | 49 |
%matplotlib inline
import matplotlib.pylab as plt
import numpy as np
import sympy as sym
import time
sym.init_printing(use_unicode=True)
H = [20,25,30,35,40,45,50]     # hormone levels from the table above
f = [101,115, 92,64,60,50,49]  # fatalities at each level
plt.scatter(H,f)
plt.xlabel('Hormone Level')
plt.ylabel('Fatalities')
f = np.matrix(f).T             # convert fatalities to a column vector for the linear algebra below
We want to determine a line expressed by the equation
$$f = aH + b,$$to approximate the relationship between hormone dosage ($H$) and fatalities ($f$). That is, we want to find $a$ (slope) and $b$ (y-intercept) for this line. First we define the variable $ x = \left[ \begin{matrix} a \\ b \end{matrix} \right] $ as the column vector that needs to be solved for.
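One way to see the structure of the problem: writing $f_i = aH_i + b$ once for each of the seven data points gives a stacked linear system,
$$\begin{bmatrix} H_1 & 1 \\ H_2 & 1 \\ \vdots & \vdots \\ H_7 & 1 \end{bmatrix}\left[ \begin{matrix} a \\ b \end{matrix} \right] = \begin{bmatrix} f_1 \\ f_2 \\ \vdots \\ f_7 \end{bmatrix}, \qquad \text{e.g., } 20a + b = 101,\quad 25a + b = 115,\quad \ldots$$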
✅DO THIS: Rewrite the system of equations in the form $Ax=y$ by defining your numpy matrices `A` and `y` using the data from above:
#put your code here
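For reference, here is one possible sketch (not the only way) that builds these matrices from the lists `H` and `f` defined earlier; it assumes `f` may already have been converted to a column matrix in the plotting cell above.
# One possible sketch: each row of A is [H_i, 1], so A*[a, b]^T gives a*H_i + b
A = np.matrix(np.vstack([H, np.ones(len(H))]).T)
y = np.matrix(f).reshape((len(H), 1))   # column vector of fatalities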
✅ **QUESTION:** Calculate the square matrix $C = A^\top A$ and the modified right-hand-side vector $A^\top y$ (call it `Aty`):
#put your code here
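A minimal sketch, assuming `A` and `y` were defined as in the previous step:
# Square 2x2 matrix and modified right-hand side for the normal equations
C = A.T * A
Aty = A.T * y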
✅QUESTION: Find the least squares solution by solving $Cx=A^\top y$ for $x$.
# Put your code here
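One possible sketch, assuming `C` and `Aty` from the previous step:
# Solve the 2x2 system C x = A^T y for x = [a, b]^T
x = np.linalg.solve(C, Aty)
x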
✅QUESTION: Given the solution above, define the two scalars slope `a` and y-intercept `b`.
#put your code here
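One possible sketch, assuming the solution vector `x` from the previous step (its first entry is the slope and its second the intercept):
# Extract the scalars so they can be used in the plotting code below
a = float(x[0, 0])   # slope
b = float(x[1, 0])   # y-intercept
a, b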
The following code will plot the original data and the line estimated by the coefficients found above.
H = [20,25,30,35,40,45,50]
f = [101,115, 92,64,60,50,49]
plt.scatter(H,f)
H2 = np.linspace(np.min(H), np.max(H))
f2 = a * H2 + b
plt.plot(H2, f2)
(As written, this cell raises a `NameError` until the scalars `a` and `b` have been defined above.)
✅QUESTION: Repeat the above analysis, but now with an eighth-order polynomial.
✅QUESTION: Check the rank of $C=A^\top A$ for the previous case. What do you get? Why?
Put your answer to the above question here
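A minimal sketch of one way to set this up is shown below; it rebuilds the data locally, uses `np.vander` to form the degree-8 design matrix, and uses `np.linalg.matrix_rank` to inspect $C$. The exact numerical rank reported can depend on conditioning.
# One possible sketch for an eighth-order polynomial fit
H8 = np.array([20,25,30,35,40,45,50])
y8 = np.matrix([101,115,92,64,60,50,49]).T
A8 = np.matrix(np.vander(H8, 9))   # columns H^8, H^7, ..., H, 1 (7 rows, 9 columns)
C8 = A8.T * A8                     # 9x9
print(np.linalg.matrix_rank(C8))   # at most 7, since A8 has only 7 rows, so C8 is singular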
In this class we often talk about solving problems of the form:
$$Ax = b$$So far we have seen that this problem becomes very nice when the $n \times n$ matrix $A$ has an inverse. We can simply multiply each side by the inverse:
$$A^{-1}Ax = A^{-1}b$$Since $A^{-1}A = I$, the solution for $x$ is simply:
$$x = A^{-1}b$$Now let us consider a more general problem where the $m \times n$ matrix $A$ is not square, i.e. $m \neq n$, and its rank $r$ may be less than $m$ and/or $n$. In this case we want to find a pseudoinverse (which we denote as $A^+$) that acts like an inverse for a non-square matrix. In other words, we want to find an $A^+$ for $A$ such that:
$$A^+A \approx I$$Assuming we can find the $n \times m$ matrix $A^+$, we should then be able to solve for $x$ as follows:
$$Ax = b$$$$A^+Ax = A^+b$$$$x \approx A^+b$$Now assume the general case of an $m\times n$ matrix $A$ whose rank $r$ may be less than $m$ and/or $n$ ($r\leq m$ and $r\leq n$). The row space and the column space of $A$ then both have dimension $r$.
Because the row space of $A$ and the column space of $A$ have the same dimension $r$, $A$ is a one-to-one mapping from the row space onto the column space. In other words, every vector in the column space of $A$ is the image of exactly one vector in the row space of $A$.
The above is not really a proof but hopefully there is sufficient information to convince yourself that this is true.
We want to find the $n\times m$ matrix $A^+$ that maps from the column space of $A$ back to its row space, so that $x=A^+Ax$ whenever $x$ is in the row space.
Let's apply the SVD to $A$: $$A= U\Sigma V^\top,$$ where $U$ is an $m\times m$ matrix, $V^\top$ is an $n\times n$ matrix, and $\Sigma$ is a diagonal $m\times n$ matrix. We can partition the matrices as $$A = \begin{bmatrix}\vdots & \vdots \\ U_1 & U_2 \\ \vdots &\vdots\end{bmatrix} \begin{bmatrix}\Sigma_1 & 0 \\ 0 & 0\end{bmatrix} \begin{bmatrix}\cdots & V_1^\top & \cdots \\ \cdots & V_2^\top &\cdots \end{bmatrix}.$$ Here $U_1$ is $m\times r$, $U_2$ is $m\times (m-r)$, $\Sigma_1$ is $r\times r$, $V_1^\top$ is $r\times n$, and $V_2^\top$ is $(n-r)\times n$.
If $x$ is in the row space of $A$, then $V_2^\top x=0$ (the rows of $V_2^\top$ span the null space of $A$), so $Ax = U_1\Sigma_1 V_1^\top x$.
Define $A^+ = V_1\Sigma_1^{-1}U_1^\top$. For $x$ in the row space of $A$ we then have $$A^+Ax = V_1\Sigma_1^{-1}U_1^\top U_1\Sigma_1 V_1^\top x = V_1V_1^\top x = x,$$ so this matrix is the pseudoinverse of $A$. In terms of the full SVD factors it can be written as $$A^+ = \begin{bmatrix}\vdots & \vdots \\ V_1 & V_2 \\ \vdots &\vdots\end{bmatrix} \begin{bmatrix}\Sigma_1^{-1} & 0 \\ 0 & 0\end{bmatrix} \begin{bmatrix}\cdots & U_1^\top & \cdots \\ \cdots & U_2^\top &\cdots \end{bmatrix}.$$
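As an illustration of this construction (using a small example matrix chosen here, not one from the assignment), the following sketch builds the pseudoinverse from the reduced SVD and checks it against `numpy.linalg.pinv`:
# Illustrative example: build the pseudoinverse from the SVD factors
M = np.array([[1., 2., 3.],
              [4., 5., 6.]])                        # 2x3 with rank r = 2
U, s, Vt = np.linalg.svd(M, full_matrices=False)    # reduced SVD: U is 2x2, Vt is 2x3
M_plus = Vt.T @ np.diag(1.0/s) @ U.T                # V1 * Sigma1^{-1} * U1^T
print(np.allclose(M_plus, np.linalg.pinv(M)))       # expected to print True
For a rank-deficient matrix you would keep only the nonzero singular values (the $\Sigma_1$ block) before inverting.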
Example 1: Let $$A=[1,2].$$ We know that $r=m=1$ and $n=2$.
A = np.matrix([[1,2]])
✅TODO: Calculate the pseudoinverse $A^+$ of $A$ using the `numpy.linalg` function `pinv`:
#put your code here
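A minimal sketch, assuming `A` is the matrix defined in the cell above:
# Pseudoinverse via numpy.linalg.pinv
Apinv = np.linalg.pinv(A)
Apinv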
✅DO THIS: Compute $AA^+$ and $A^+A$
#put your code here
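One possible sketch, assuming `Apinv` was computed with `np.linalg.pinv` as above:
# A A+ is a 1x1 matrix; A+ A is a 2x2 matrix
print(A * Apinv)
print(Apinv * A)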
✅QUESTION: If $x$ is in the nullspace of $A$ what is the effect of $A^+Ax$?
Put your answer to the above question here
✅QUESTION: If $x$ is in the rowspace of $A$ what is the effect of $A^+Ax$?
Put your answer to the above question here
We can compute the left inverse of $A$ if $r=n\leq m$. In this case, we may have more rows than columns, and the matrix $A$ has full column rank.
In this case, the reduced SVD of $A$ is $$A = U\Sigma V^\top .$$ Here $U$ is $m\times n$ with orthonormal columns, $\Sigma$ is $n\times n$ and nonsingular, and $V^\top$ is $n\times n$. The pseudoinverse of $A$ is $$A^+ = V\Sigma^{-1}U^\top$$
The left inverse of $A$ is $$(A^\top A)^{-1}A^\top = (V\Sigma U^\top U\Sigma V^\top)^{-1}V\Sigma U^\top = V(\Sigma\Sigma)^{-1}V^\top V\Sigma U^\top = V\Sigma^{-1}U^\top = A^+,$$ using $U^\top U = I$ and $V^\top V = I$.
Example 2: Let $$A=\begin{bmatrix}1\\2\end{bmatrix}.$$ We know that $r=n=1$ and $m=2$, so the left inverse exists.
A = np.matrix([[1],[2]])
A
matrix([[1], [2]])
✅DO THIS: Calculate the pseudoinverse $A^+$ of $A$.
Put your answer to the above question here
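A minimal sketch, assuming `A` is the column vector defined above:
# Pseudoinverse of the 2x1 matrix A
Apinv = np.linalg.pinv(A)
Apinv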
✅DO THIS: Calculate the left inverse of $A$, and verify that it is the same as $A^+$.
Put your answer to the above question here
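One possible sketch, assuming `A` and `Apinv` from the previous steps:
# Left inverse (A^T A)^{-1} A^T, compared against the pseudoinverse
left_inv = np.linalg.inv(A.T * A) * A.T
print(left_inv)
print(np.allclose(left_inv, np.linalg.pinv(A)))   # expected to print True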
If you attend class in person, have one of the instructors check your notebook and sign you out before leaving class. If you are attending remotely, turn in your assignment using D2L.
Written by Dr. Dirk Colbry, Michigan State University
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.