# Lecture 8-1: Convex Optimization: Part 1

Download the original slides: [CMSE382-Lec8_1.pdf](CMSE382-Lec8_1.pdf)

```{warning}
This is an AI-generated transcript of the lecture slides and may contain errors or inaccuracies. Please refer to the original course materials for authoritative content.
```

---

## Convex Optimization Definition

### General definition

**Convex Optimization: General Definition**

$$
\begin{aligned}
& \text{minimize}  &  & f(\mathbf{x}) \\
& \text{such that} & & \mathbf{x} \in C
\end{aligned}
$$

where $C$ is a convex set, and $f$ is a convex function over $C$.

![](../../../figures/smart_grid_illustration.png)
Minimize cost. Control: Production, Demand, Transmission.

![](../../../figures/portfolio_optimization.png)
Minimize risk. Control: asset allocation.

![](../../../figures/robot_balancing.png)
Minimize tracking errors. Control: joint positions.
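The general definition above can be sketched numerically. The following is a minimal illustration (not from the slides, and the objective and set are made up for the example): minimizing the convex function $f(x) = (x-3)^2$ over the convex set $C = [0, 2]$ with `scipy.optimize.minimize`.

```python
# Minimal sketch: minimize a convex f over a convex set C (here an interval),
# using bound constraints to represent C = {x : 0 <= x <= 2}.
from scipy.optimize import minimize

f = lambda x: (x[0] - 3.0) ** 2               # convex objective
res = minimize(f, x0=[1.0], bounds=[(0.0, 2.0)])

# The unconstrained minimizer x = 3 lies outside C, so the constrained
# minimizer sits on the boundary of C.
print(res.x)
```

Because $f$ decreases toward $3$ but $C$ ends at $2$, the solver returns $x^* = 2$, the boundary point of $C$ closest to the unconstrained minimum.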

---

### Functional definition

**Convex Optimization: Functional Definition**

$$
\begin{aligned}
& \text{minimize}  & & f(\mathbf{x}) \\
& \text{such that} & & g_i(\mathbf{x}) \le 0, \quad i=1,2,\ldots, m, \\
&                  & & h_j(\mathbf{x}) = 0, \quad j=1,2,\ldots, p,
\end{aligned}
$$

where $f,g_1,\ldots,g_m:\mathbb{R}^n \to \mathbb{R}$ are convex functions and $h_1,h_2,\ldots, h_p:\mathbb{R}^n\to \mathbb{R}$ are affine functions.

![](../../../figures/convex_optimization_functional_definition.png)

---

### Reasons we like convex optimization: Local minima are global minima

**Theorem (local minimum is global minimum in convex optimization)**

Let $f: C \to \mathbb{R}$ be defined on the convex set $C$.
Let $\mathbf{x}^* \in C$ be a local minimum of $f$ over $C$.

- If $f$ is **convex**, then $\mathbf{x}^*$ is a global minimum of $f$ over $C$.
- If $f$ is **strictly convex**, then $\mathbf{x}^*$ is a strict global minimum of $f$ over $C$.

![](../../../figures/local_min_is_global.png)
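One way to see the theorem in action (a sketch with a made-up convex objective): run a local solver from several different starting points. For a convex function, every run lands on the same minimizer, since any local minimum it finds is the global one.

```python
import numpy as np
from scipy.optimize import minimize

# Convex objective with global minimizer (1, -2).
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

starts = [np.array(s) for s in ([5.0, 5.0], [-10.0, 3.0], [0.0, 0.0])]
sols = [minimize(f, x0=s).x for s in starts]

# Every starting point converges to the same global minimizer.
for x in sols:
    print(x)
```

For a non-convex objective, the same experiment would generally return different local minima depending on the start.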

---

### Reasons we like convex optimization: Convexity of the optimal set

**Theorem (Convexity of the optimal set in convex optimization)**

Let $f: C \to \mathbb{R}$ be a convex function defined over the convex set $C \subseteq \mathbb{R}^n$.
Let $X^*$ be the set of optimal solutions of the problem,

$$
X^* = \operatorname{argmin}\{f(\mathbf{x}): \mathbf{x} \in C\}.
$$

Then:

- If $f$ is **convex**, then $X^*$ is convex.
- If $f$ is **strictly convex**, then $X^*$ contains at most one optimal solution.

![](../../../figures/argmin_convex_functions.png)
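A quick numerical check of the first claim, using a made-up convex (but not strictly convex) function: $f(\mathbf{x}) = (x_1 + x_2 - 1)^2$, whose optimal set is the whole line $x_1 + x_2 = 1$. Any convex combination of two optimal points stays optimal, so $X^*$ is convex.

```python
import numpy as np

# Convex but not strictly convex; X* is the line x1 + x2 = 1.
f = lambda x: (x[0] + x[1] - 1.0) ** 2

x_a = np.array([1.0, 0.0])   # one optimal solution, f = 0
x_b = np.array([0.0, 1.0])   # another optimal solution, f = 0

for t in np.linspace(0.0, 1.0, 5):
    x_t = t * x_a + (1 - t) * x_b      # convex combination of optima
    assert abs(f(x_t)) < 1e-12         # still optimal: X* is convex
```

Replacing $f$ with the strictly convex $x_1^2 + x_2^2$ collapses $X^*$ to the single point $(0,0)$, matching the second claim.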

---

## Convex Optimization Example: Linear Programming

### Motivation

**Linear Optimization (Linear Programming)**

Find production ($P$), demand ($D$), and transmission ($T$) that maximize efficiency $c_1 P + c_2 D + c_3 T$,

subject to constraints:

- production = demand
- transmission $\leq$ grid capacity
- energy stored $\leq$ storage capacity
- $\vdots$

**Linear optimization**

$$
\begin{aligned}
\text{minimize} \quad & \mathbf{c}^{\top} \mathbf{x} \\
\text{subject to} \quad & A \mathbf{x} \leq \mathbf{b}, \\
& B \mathbf{x} = \mathbf{g}, \\
& \mathbf{x} \geq \mathbf{0}
\end{aligned}
$$

$$
\begin{aligned}
\text{maximize} \quad & \mathbf{c}^{\top} \mathbf{x} \\
\text{subject to} \quad & A \mathbf{x} \leq \mathbf{b}, \\
& \mathbf{x} \geq \mathbf{0}
\end{aligned}
$$

(Standard form)
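A minimal sketch of the standard form, with hypothetical data: maximize $x_1 + 2x_2$ subject to $x_1 + x_2 \le 4$ and $\mathbf{x} \ge \mathbf{0}$, using `scipy.optimize.linprog` (which minimizes, so the objective is negated).

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical LP: maximize x1 + 2*x2  s.t.  x1 + x2 <= 4,  x >= 0.
# linprog minimizes, so negate c to turn the maximization into a minimization.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0]])
b_ub = np.array([4.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub)   # default bounds are x >= 0
print(res.x, -res.fun)                    # optimum (0, 4), maximal value 8
```

All of the available capacity goes to $x_2$, the variable with the larger objective coefficient, which is why the optimum sits at the corner $(0, 4)$.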

---

### Recall: Feasible region

![](../../../figures/linear_programming_idea.png)

**Definition**

The **feasible region** of an optimization problem is the set of all points that satisfy the problem's constraints.

- It represents all the possible candidates for the optimization solution.
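In code, checking membership in a feasible region is just checking every constraint. A small sketch with made-up constraints $A\mathbf{x} \le \mathbf{b}$, $\mathbf{x} \ge \mathbf{0}$:

```python
import numpy as np

# Hypothetical constraints A x <= b, x >= 0 defining a feasible region.
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
b = np.array([4.0, 3.0])

def is_feasible(x, tol=1e-9):
    """A point is a candidate solution iff it satisfies every constraint."""
    return bool(np.all(A @ x <= b + tol) and np.all(x >= -tol))

print(is_feasible(np.array([1.0, 1.0])))   # inside the region
print(is_feasible(np.array([5.0, 0.0])))   # violates x1 + x2 <= 4
```

The small tolerance guards against floating-point round-off when points sit exactly on a constraint boundary.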

---

### Recall: Guarantees related to linear optimization

Linear programming problems seek to optimize a linear function $f(\mathbf{x}) = \mathbf{c}^{\top} \mathbf{x}$, which is both convex and concave.

**Theorem (Existence of maximizers at extreme points)**

Let $f: C \to \mathbb{R}$ be a convex and continuous function over the convex and compact set $C \subseteq \mathbb{R}^n$.
Then there exists at least one maximizer of $f$ over $C$ that is an extreme point of $C$.

**Theorem (Equivalence between extreme points and basic feasible solutions)**

Let $P = \{\mathbf{x} \in \mathbb{R}^n: A\mathbf{x} = \mathbf{b},\ \mathbf{x} \geq \mathbf{0}\}$, where $A \in \mathbb{R}^{m\times n}$ has linearly independent rows and $\mathbf{b} \in \mathbb{R}^m$.
Then $\bar{\mathbf{x}}$ is a basic feasible solution of $P$ if and only if it is an extreme point of $P$.

**Theorem (Fundamental theorem of linear programming)**

If the problem has an optimal solution, then it necessarily has an optimal basic feasible solution.
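These guarantees can be checked on a tiny example (hypothetical data): for a 2-D polygon whose extreme points we can list by hand, the best objective value over the vertices coincides with the LP optimum returned by a solver.

```python
import numpy as np
from scipy.optimize import linprog

# Feasible region: x1 + x2 <= 4, x1 <= 3, x >= 0 -- a polygon whose
# extreme points (vertices) can be enumerated by hand.
vertices = np.array([[0.0, 0.0], [3.0, 0.0], [3.0, 1.0], [0.0, 4.0]])

c = np.array([1.0, 2.0])                  # objective to maximize: c^T x
best_vertex_value = max(vertices @ c)     # best value over extreme points

# Solve the same LP with a solver (negate c since linprog minimizes).
res = linprog(-c, A_ub=[[1.0, 1.0], [1.0, 0.0]], b_ub=[4.0, 3.0])

# The solver's optimum matches the best extreme-point value.
print(best_vertex_value, -res.fun)
```

This is the fundamental theorem in miniature: an optimal solution exists, and one of the basic feasible solutions (vertices) attains it, so enumerating vertices or letting the simplex method walk between them both find the optimum.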
