## Class Schedule

Class meets Mondays and Wednesdays from 9:30-10:45am.

Bluejeans link: https://bluejeans.com/985094704

Mark Davenport

Email: mdav (at) gatech (dot) edu

Magnus Egerstedt

Email: magnus (dot) egerstedt (at) gatech (dot) edu

Office Hours: Fridays 1-2pm

Bluejeans link: https://bluejeans.com/8869708739

Nauman Ahad

Email: nauman (dot) ahad (at) gatech (dot) edu

Office Hours: Mondays 11am-noon

Bluejeans link: https://bluejeans.com/703770912

Namrata Nadagouda

Email: namrata (dot) nadagouda (at) gatech (dot) edu

Office Hours: Wednesdays 3-4pm

Bluejeans link: https://bluejeans.com/455910717

This course will cover the fundamentals of convex optimization. We will talk about mathematical fundamentals, modeling (i.e., how to set up optimization problems in different applications), and algorithms.

Download the syllabus.

Students should be familiar with linear algebra (e.g., solving systems of equations, matrix factorizations including SVD, QR, LU, Cholesky, and least squares), basic probability (e.g., you should be comfortable with multivariate probability densities), and have good programming skills.
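As a rough gauge of the expected linear-algebra fluency, here is an illustrative sketch (not a course assignment) that solves a least squares problem two ways and confirms the answers agree; the random data is made up for illustration.

```python
import numpy as np

# Solve min_x ||Ax - b||_2 two ways and confirm they agree.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 3))  # tall matrix: 20 equations, 3 unknowns
b = rng.standard_normal(20)

# via numpy's SVD-based least squares routine
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

# via the normal equations A^T A x = A^T b
x_normal = np.linalg.solve(A.T @ A, A.T @ b)

print(np.allclose(x_lstsq, x_normal))
```

If both approaches (and why they agree, and when the normal equations are numerically risky) are comfortable territory, the prerequisites should not be a problem.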

There is no required text. Course notes will be posted on the course website as they become available. These notes draw on material from several texts, chiefly:

*Convex Optimization* by Boyd and Vandenberghe (2004). (amazon, also available as a free PDF from Boyd).

*Convex Analysis and Optimization* by Bertsekas, Nedic, and Ozdaglar (2003). (amazon).

*Numerical Optimization* by Nocedal and Wright (2006). (amazon).

*Lectures on Modern Convex Optimization* by Ben-Tal and Nemirovski (2001). (amazon).

*Optimization by Vector Space Methods* by Luenberger (1969). (amazon).

**Additional online resources**

A short review of matrix calculus

*If you find anything else useful, let me know and I will post it here.*

- Introduction to optimization, basic geometric and algebraic concepts
- Convexity
  - convex sets
  - convex functions
  - convexity, gradients, and optimization
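To preview the kind of property the convexity unit studies, here is an illustrative sketch (not course material) that numerically spot-checks Jensen's inequality for the log-sum-exp function, a standard convex function:

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    """log-sum-exp, a canonical smooth convex function."""
    return np.log(np.sum(np.exp(x)))

# Convexity means f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y)
# for all x, y and t in [0, 1]; spot-check on random pairs.
ok = True
for _ in range(1000):
    x, y = rng.standard_normal(5), rng.standard_normal(5)
    t = rng.uniform()
    ok &= f(t * x + (1 - t) * y) <= t * f(x) + (1 - t) * f(y) + 1e-12
print(bool(ok))
```

A numerical check like this is of course no proof; the course develops the calculus (second-order conditions, composition rules) that establishes convexity rigorously.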

- Unconstrained minimization
  - gradient descent
  - line search methods
  - convergence analysis
  - accelerated first-order methods (heavy ball, Nesterov)
  - incremental and stochastic gradients
  - Newton's method
  - quasi-Newton methods
  - subgradient descent
  - proximal methods
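As a taste of the first topic in this unit, here is a minimal illustrative sketch of gradient descent on a strongly convex quadratic (the matrix and step-size rule are chosen for illustration, not taken from the course notes):

```python
import numpy as np

# Minimize f(x) = (1/2) x^T A x - b^T x for symmetric positive definite A.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

def grad(x):
    return A @ x - b  # gradient of the quadratic

# Fixed step 1/L, where L (the largest eigenvalue of A) is the
# Lipschitz constant of the gradient -- enough to guarantee convergence.
step = 1.0 / np.linalg.norm(A, 2)

x = np.zeros(2)
for _ in range(500):
    x = x - step * grad(x)

# For a quadratic, the minimizer solves Ax = b exactly.
x_star = np.linalg.solve(A, b)
print(np.allclose(x, x_star, atol=1e-6))
```

The convergence-analysis lectures make the step-size choice precise: with step at most 1/L, the iterates contract toward the minimizer at a rate governed by the eigenvalues of A.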

- Theory for constrained optimization
  - optimality conditions
  - Fenchel duality
  - Lagrange duality
  - Karush-Kuhn-Tucker (KKT) conditions
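To give a flavor of the KKT conditions, here is a small worked example (chosen for illustration, not from the course notes): minimize $x^2$ subject to $x \ge 1$, writing the constraint as $g(x) = 1 - x \le 0$ with multiplier $\lambda \ge 0$.

```latex
\begin{aligned}
\mathcal{L}(x,\lambda) &= x^2 + \lambda(1 - x) \\
\text{stationarity:}\quad & 2x - \lambda = 0 \\
\text{complementary slackness:}\quad & \lambda(1 - x) = 0 \\
\text{primal/dual feasibility:}\quad & 1 - x \le 0, \qquad \lambda \ge 0
\end{aligned}
```

If $\lambda = 0$, stationarity forces $x = 0$, which is infeasible; so $\lambda > 0$, complementary slackness gives $x = 1$, and stationarity gives $\lambda = 2$. The unique KKT point $(x^\star, \lambda^\star) = (1, 2)$ is the global minimizer, as convexity guarantees.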

- Methods for constrained optimization
  - barrier techniques
  - projected gradient descent
  - splitting methods and the alternating direction method of multipliers (ADMM)
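As a preview of projected gradient descent, here is an illustrative sketch (the target point and box constraint are made up for illustration): each iteration takes a gradient step and then projects back onto the feasible set.

```python
import numpy as np

# Minimize ||x - c||^2 subject to x in the box [0, 1]^3.
c = np.array([1.5, -0.3, 0.4])

def proj(x):
    """Euclidean projection onto the box [0, 1]^3: just clip."""
    return np.clip(x, 0.0, 1.0)

x = np.zeros(3)
for _ in range(100):
    g = 2 * (x - c)          # gradient of ||x - c||^2
    x = proj(x - 0.25 * g)   # gradient step, then project

# For this problem the solution is simply the projection of c onto the box.
print(np.allclose(x, proj(c)))
```

The key structural requirement, covered in this unit, is that the projection itself be cheap to compute; for a box it is a coordinatewise clip.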

- Applications/extensions
  - convex relaxation and nonconvex optimization
  - optimization for robotics
  - optimization for control
  - optimization for statistical inference
  - optimization for machine learning
  - optimization for inverse problems

Throughout the course, we will be using different applications to motivate the theory. These will cover some well-known (and not so well-known) problems in signal and image processing, communications, control, machine learning, and statistical estimation (among other things).