Math 560 (Fall 2014). Introduction to smooth nonlinear programming.
This is an introduction to nonlinear optimization. Materials are being shared online at MyCourses. Some past material may be available here.
Unconstrained optimization: optimality conditions; step-size selection; local and global convergence results for steepest descent, Newton, and quasi-Newton methods; Hessian modification; the conjugate gradient method, including improved convergence estimates from Krylov subspace analysis; nonlinear conjugate gradient; trust-region methods and their global convergence results; optimality conditions for the trust-region subproblem. Applications: least-squares problems, stochastic gradient descent, nonlinear equations.
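As a flavor of the first topic block, here is a minimal sketch (not course material) of steepest descent with an Armijo backtracking line search, applied to a small convex quadratic where the exact minimizer is known; all names and parameter values are illustrative choices, not the course's notation:

```python
import numpy as np

def steepest_descent(f, grad, x0, alpha0=1.0, rho=0.5, c=1e-4,
                     tol=1e-8, max_iter=500):
    """Steepest descent with backtracking (Armijo) step-size selection."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # first-order stationarity test
            break
        d = -g                        # steepest-descent direction
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        # f(x + alpha d) <= f(x) + c * alpha * grad(x)^T d
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x = x + alpha * d
    return x

# Convex quadratic f(x) = (1/2) x^T A x - b^T x; its minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x_star = steepest_descent(f, grad, np.zeros(2))
print(x_star)  # close to np.linalg.solve(A, b)
```

On quadratics like this one, the convergence rate of steepest descent is governed by the condition number of A, which is part of the motivation for the Newton, quasi-Newton, and conjugate gradient methods listed above.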
Constrained (smooth) optimization: optimality conditions for convex optimization; derivation of the first-order necessary optimality (KKT) conditions; second-order necessary and sufficient conditions; sensitivity analysis; Lagrange duality; interior-point methods for linear programming; a taxonomy of nonlinear programming algorithms (penalty and Lagrangian methods, sequential quadratic programming); introduction to conic programming.
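To illustrate what the KKT conditions assert (again a sketch, not course material), consider a tiny convex problem whose solution can be worked out by hand: minimize x1^2 + x2^2 subject to x1 + x2 >= 1. By symmetry the constraint is active at x* = (1/2, 1/2), and stationarity of the Lagrangian determines the multiplier; the snippet below just verifies the four KKT conditions numerically at that point:

```python
import numpy as np

# Problem: minimize f(x) = x1^2 + x2^2  subject to  g(x) = 1 - x1 - x2 <= 0.
grad_f = lambda x: 2.0 * x                      # gradient of the objective
grad_g = lambda x: np.array([-1.0, -1.0])       # gradient of the constraint
g = lambda x: 1.0 - x[0] - x[1]

# Candidate primal-dual pair: constraint active at x*, multiplier lam = 1
# so that grad f(x*) + lam * grad g(x*) = 0.
x_star = np.array([0.5, 0.5])
lam = 1.0

stationarity = grad_f(x_star) + lam * grad_g(x_star)  # should vanish
primal_feasible = g(x_star) <= 1e-12                  # g(x*) <= 0
dual_feasible = lam >= 0.0                            # lam >= 0
comp_slackness = abs(lam * g(x_star)) < 1e-12         # lam * g(x*) = 0

print(stationarity, primal_feasible, dual_feasible, comp_slackness)
```

Because the problem is convex and a constraint qualification holds, these conditions are not just necessary but sufficient, so the check certifies that (1/2, 1/2) is the global minimizer.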