Topics

Basic Methods

  • Least Squares, Gradient Descent, Newton’s Method, Line Search
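The basic methods above can be sketched together on a small least-squares problem: gradient descent with a fixed step, and Newton's method with a backtracking (Armijo) line search. This is a minimal illustration, not the post's exact code; the matrix `A` and vector `b` are an assumed toy problem.

```python
import numpy as np

# Toy least-squares objective f(x) = 0.5 * ||A x - b||^2 (assumed example data).
A = np.array([[2.0, 0.0], [1.0, 3.0], [0.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])

def f(x):
    r = A @ x - b
    return 0.5 * r @ r

def grad(x):
    return A.T @ (A @ x - b)

def hess(x):
    return A.T @ A  # constant Hessian for a least-squares objective

def gradient_descent(x, lr=0.05, steps=500):
    # Fixed-step gradient descent: x <- x - lr * grad(x).
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

def backtracking_line_search(x, d, alpha=1.0, beta=0.5, c=1e-4):
    # Shrink the step until the Armijo sufficient-decrease condition holds.
    g_dot_d = grad(x) @ d
    while f(x + alpha * d) > f(x) + c * alpha * g_dot_d and alpha > 1e-10:
        alpha *= beta
    return alpha

def newton(x, steps=10):
    # Newton step: solve H d = -grad, then line-search along d.
    for _ in range(steps):
        d = np.linalg.solve(hess(x), -grad(x))
        x = x + backtracking_line_search(x, d) * d
    return x

# Closed-form least-squares solution for comparison.
x_star = np.linalg.lstsq(A, b, rcond=None)[0]
```

Since the objective is quadratic, Newton's method reaches the least-squares solution in a single step, while gradient descent converges linearly.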

Constrained Optimization

  • Lagrange Multipliers, KKT Conditions
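For an equality-constrained quadratic, the Lagrange-multiplier stationarity conditions reduce to a linear KKT system. A minimal sketch (the specific objective x^2 + y^2 and constraint x + y = 1 are an assumed worked example):

```python
import numpy as np

# Minimize 0.5 x^T Q x  subject to  a^T x = 1.
# The Lagrangian L(x, lam) = 0.5 x^T Q x + lam (a^T x - 1) is stationary when
#   Q x + lam a = 0   and   a^T x = 1,
# i.e. the linear KKT system  [Q  a; a^T  0] [x; lam] = [0; 1].
Q = np.array([[2.0, 0.0], [0.0, 2.0]])  # f(x, y) = x^2 + y^2
a = np.array([1.0, 1.0])                # constraint x + y = 1

K = np.block([[Q, a[:, None]], [a[None, :], np.zeros((1, 1))]])
rhs = np.array([0.0, 0.0, 1.0])
sol = np.linalg.solve(K, rhs)
x, lam = sol[:2], sol[2]  # optimum at x = y = 1/2 with multiplier lam = -1
```

With only equality constraints, these stationarity conditions are the whole KKT system; inequality constraints would add the dual-feasibility and complementary-slackness conditions.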

Convexity

  • Convex Function, Convex Set, Duality
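The defining inequality of a convex function, f(t x + (1 - t) y) <= t f(x) + (1 - t) f(y) for t in [0, 1], can be spot-checked numerically. A small sketch (sample ranges and the test functions exp and sin are assumptions for illustration):

```python
import numpy as np

def violates_convexity(f, xs, ys, ts):
    """Return True if any sampled triple (x, y, t) violates
    f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y)."""
    for x, y, t in zip(xs, ys, ts):
        if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-9:
            return True
    return False

# Random sample points in [-3, 3] and weights t in [0, 1] (fixed seed).
rng = np.random.default_rng(0)
xs = rng.uniform(-3, 3, 1000)
ys = rng.uniform(-3, 3, 1000)
ts = rng.uniform(0, 1, 1000)
```

Sampling can only refute convexity, never prove it, but exp passes every sampled check while sin (concave on part of the interval) is caught violating the inequality.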

Study Resources & Scope

[1] Mathematics for Machine Learning - Deisenroth, Faisal, and Ong

  • Continuous Optimization
    • 7.1 Optimization Using Gradient Descent
    • 7.2 Constrained Optimization and Lagrange Multipliers
    • 7.3 Convex Optimization

[2] Convex Optimization - Boyd and Vandenberghe

  • Convex Sets
    • 2.1 Affine and Convex Sets
    • 2.2 Some Important Examples
    • 2.3 Operations that Preserve Convexity
    • 2.4 Generalized Inequalities
    • 2.5 Separating and Supporting Hyperplanes
    • 2.6 Dual Cones and Generalized Inequalities
  • Convex Functions
    • 3.1 Basic Properties and Examples
    • 3.2 Operations that Preserve Convexity
    • 3.3 The Conjugate Function
    • 3.4 Quasiconvex Functions
    • 3.5 Log-Concave and Log-Convex Functions
    • 3.6 Convexity with Respect to Generalized Inequalities
  • Unconstrained Minimization
    • 9.1 Unconstrained Minimization Problems
    • 9.2 Descent Methods
    • 9.3 Gradient Descent Method
    • 9.4 Steepest Descent Method
    • 9.5 Newton’s Method
    • 9.6 Self-Concordance
    • 9.7 Implementation

[3] Python/PyTorch Implementations

  • Code for gradient descent, Newton’s method, and verifying autograd computations.
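One way to verify a gradient computation (whether hand-derived or from autograd) is to compare it against central finite differences. A minimal NumPy sketch of that check, using an assumed example objective so it runs without PyTorch:

```python
import numpy as np

def f(x):
    # Assumed example objective: f(x) = sum(x_i^2) + sin(x_0).
    return np.sum(x ** 2) + np.sin(x[0])

def analytic_grad(x):
    # Hand-derived gradient: 2 x, plus cos(x_0) in the first component.
    g = 2.0 * x
    g[0] += np.cos(x[0])
    return g

def numeric_grad(f, x, eps=1e-6):
    """Central differences: (f(x + eps*e_i) - f(x - eps*e_i)) / (2*eps)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

x0 = np.array([0.3, -1.2, 2.0])
```

The same comparison works against a PyTorch autograd gradient (`torch.autograd.grad`) in place of `analytic_grad`.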

This post is licensed under CC BY 4.0 by the author.