
Ch 3.1: Linear Regression

Lecture 4 - CMSE 381
Michigan State University
Dept of Computational Mathematics, Science & Engineering
Wed Jan 21, 2026
Announcements

Last time:

Covered in this lecture

Section 1

Simple Linear Regression
Setup

Predict a quantitative response \(Y\) on the basis of a single predictor variable \(X\): \begin {equation*} Y \approx \beta _0 + \beta _1 X \end {equation*} where ”\(\approx \)” reads as “is approximately modeled as”.
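To make the model concrete, here is a minimal NumPy sketch that draws synthetic data from a linear model with additive noise. The specific values \(\beta_0 = 2\), \(\beta_1 = 3\), and the noise scale are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

# Draw synthetic data from Y = beta_0 + beta_1 * X + noise.
# beta_0 = 2.0, beta_1 = 3.0, and the noise scale are made-up
# illustrative values, not from the lecture.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, size=100)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=100)
```

Because the noise is small relative to the slope, a scatterplot of \(x\) against \(y\) looks nearly linear, which is the setting the rest of this section assumes.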

Example

Least squares criterion: Setup

How do we estimate the coefficients?

Least squares criterion: RSS

\begin {equation*} \textrm {sales} \approx \beta _0 + \beta _1 \textrm {TV} \end {equation*}
The \(i\)-th residual is \(e_i = y_i - \hat y_i\), and the residual sum of squares (RSS) is \begin {align*} RSS &= e_1^2 + \cdots + e_n^2\\ &= \sum _i (y_i - \hat \beta _0 - \hat \beta _1 x_i)^2 \end {align*}
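The RSS is easy to compute directly from the definition above. The following sketch evaluates it for a candidate intercept and slope on a small made-up data set (the data values are illustrative, not from the lecture):

```python
import numpy as np

def rss(x, y, b0, b1):
    """Residual sum of squares for candidate intercept b0 and slope b1."""
    residuals = y - (b0 + b1 * x)
    return np.sum(residuals ** 2)

# Tiny illustrative data set (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.1, 5.9, 8.2])

good_fit = rss(x, y, 0.0, 2.0)   # near the data-generating line
bad_fit = rss(x, y, 0.0, 1.0)    # too-shallow slope, larger RSS
```

Comparing `good_fit` and `bad_fit` shows the criterion in action: coefficients closer to the data yield a smaller RSS, which is exactly the quantity least squares minimizes.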
Least squares criterion
Find \(\beta _0\) and \(\beta _1\) that minimize the RSS.
Least squares coefficient estimates

\begin {equation*} \min _{\hat \beta _0,\hat \beta _1} \sum _i (y_i - \hat \beta _0 - \hat \beta _1 x_i)^2 \end {equation*}

\begin {align*} \frac {\partial RSS}{\partial \hat \beta _0} & = -2 \sum _i (y_i - \hat {\beta }_0 - \hat {\beta }_1x_i) = 0 \\ \frac {\partial RSS}{\partial \hat \beta _1} & = -2 \sum _i x_i(y_i - \hat \beta _0 - \hat \beta _1x_i) = 0 \end {align*}

\begin {align*} \hat \beta _1 &= \frac {\sum _{i=1}^n(x_i-\overline x) (y_i - \overline y)} {\sum _{i=1}^n (x_i - \overline x)^2}\\ \hat \beta _0 & = \overline y - \hat \beta _1 \overline x \end {align*}
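The closed-form estimates above translate directly into code. This is a minimal sketch of the two formulas on a small made-up data set (the data values and the helper name `least_squares_fit` are illustrative assumptions):

```python
import numpy as np

def least_squares_fit(x, y):
    """Closed-form simple linear regression estimates.

    b1_hat = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2)
    b0_hat = ybar - b1_hat * xbar
    """
    xbar, ybar = x.mean(), y.mean()
    b1_hat = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
    b0_hat = ybar - b1_hat * xbar
    return b0_hat, b1_hat

# Small illustrative data set (made up for this sketch)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.2, 3.8, 5.1])
b0_hat, b1_hat = least_squares_fit(x, y)
```

A quick sanity check is that these estimates agree with a degree-1 polynomial fit from `np.polyfit`, and that the residuals sum to zero, which is exactly the first of the two derivative conditions above.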

Coding group work

Next time
