KMA/NMOM Numerical Methods of Optimization
Lecturer: Horymír Netuka
Lecture: 2 hours/week + exercise 2 hours/week
Credits: 4
Summer semester
Form of course completion: course credit, exam
- Unconstrained optimization – subject, applications, examples. Introductory definitions and basic concepts.
- First-order necessary optimality conditions. Second-order necessary optimality conditions. Second-order sufficient optimality conditions.
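For reference, the standard statements of these conditions for a twice continuously differentiable $f\colon\mathbb{R}^n\to\mathbb{R}$ (textbook formulations, not copied from the lecture notes):

```latex
% First-order necessary: if x* is a local minimizer, then
\nabla f(x^*) = 0.
% Second-order necessary: additionally,
\nabla^2 f(x^*) \succeq 0 \quad\text{(positive semidefinite)}.
% Second-order sufficient:
\nabla f(x^*) = 0 \;\wedge\; \nabla^2 f(x^*) \succ 0
  \;\Longrightarrow\; x^* \text{ is a strict local minimizer}.
```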
- Univariate minimization. Minimization without using derivatives (comparative method, Fibonacci search method, golden section search method). Methods using derivatives (bisection, Newton method).
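Of the derivative-free methods listed, golden section search is compact enough to sketch. This is an illustrative Python version (interface and tolerance are my choices, not from the course); it assumes f is unimodal on [a, b]:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b] by golden section search."""
    inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi ~ 0.618
    c, d = b - inv_phi * (b - a), a + inv_phi * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol:
        if fc < fd:                           # minimizer lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                                 # minimizer lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return (a + b) / 2
```

The golden ratio makes one probe point reusable in the shrunken interval, so each iteration costs only one new function evaluation.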
- Derivative-free minimization of functions of several variables. Nelder-Mead method. Hooke-Jeeves method.
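Nelder-Mead is too long to sketch here, but the Hooke-Jeeves pattern search fits in a few lines. An illustrative version (parameter names and defaults are my own):

```python
import numpy as np

def explore(f, x, step):
    """Exploratory move: probe +/- step along each coordinate, keep improvements."""
    x, fx = x.copy(), f(x)
    for i in range(len(x)):
        for d in (step, -step):
            trial = x.copy()
            trial[i] += d
            ft = f(trial)
            if ft < fx:
                x, fx = trial, ft
                break
    return x, fx

def hooke_jeeves(f, x0, step=1.0, tol=1e-8, shrink=0.5):
    """Derivative-free Hooke-Jeeves minimization of f starting from x0."""
    base = x0.astype(float).copy()
    fbase = f(base)
    while step > tol:
        x, fx = explore(f, base, step)
        if fx < fbase:
            while True:                       # pattern moves: extrapolate along
                pattern = x + (x - base)      # the last successful direction
                base, fbase = x, fx
                x, fx = explore(f, pattern, step)
                if fx >= fbase:
                    break
        else:
            step *= shrink                    # no improvement: refine the mesh
    return base
```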
- Minimization of quadratic functions using gradient methods – part I. Quadratic function and its properties. Descent methods basics. Method of steepest descent for quadratic functions.
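For the quadratic f(x) = ½xᵀAx − bᵀx with A symmetric positive definite, the steepest descent direction is the residual r = b − Ax and the exact step length is α = rᵀr / rᵀAr. A minimal sketch (my own interface):

```python
import numpy as np

def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=10_000):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A SPD) with exact line search."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        r = b - A @ x                        # residual = -gradient
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))      # exact minimizing step length
        x = x + alpha * r
    return x
```

The minimizer of f is the solution of Ax = b, which the test below checks.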
- Minimization of quadratic functions using gradient methods – part II. Conjugate gradient method. Convergence analysis.
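The conjugate gradient method for the same quadratic f(x) = ½xᵀAx − bᵀx reaches the minimizer in at most n steps in exact arithmetic. An illustrative implementation (naming is mine):

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10):
    """Minimize 0.5 x^T A x - b^T x (A SPD), i.e. solve Ax = b, by CG."""
    x = x0.astype(float).copy()
    r = b - A @ x                            # initial residual
    p = r.copy()                             # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)                # exact step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p            # A-conjugate direction update
        rs = rs_new
    return x
```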
- Line search methods – part I. Fundamental principles. Step length selection using backtracking line search. Armijo condition. Backtracking-Armijo line search algorithm. Convergence analysis.
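The backtracking-Armijo algorithm can be sketched in a few lines: shrink α until the Armijo condition f(x + αp) ≤ f(x) + c₁ α ∇f(x)ᵀp holds (default constants are my choices; p must be a descent direction):

```python
import numpy as np

def backtracking_armijo(f, grad, x, p, alpha0=1.0, c1=1e-4, rho=0.5):
    """Return a step length alpha satisfying the Armijo condition along p."""
    alpha = alpha0
    fx, gp = f(x), grad(x) @ p               # gp < 0 for a descent direction
    while f(x + alpha * p) > fx + c1 * alpha * gp:
        alpha *= rho                          # backtrack: shrink the step
    return alpha
```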
- Line search methods – part II. Wolfe conditions. Inexact line search using Wolfe conditions. Convergence analysis.
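The weak Wolfe conditions add a curvature requirement to the Armijo condition, ruling out steps that are too short. A small checker (constants c₁, c₂ are the usual defaults, chosen by me):

```python
import numpy as np

def satisfies_wolfe(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Check the weak Wolfe conditions for step length alpha along p."""
    gp = grad(x) @ p
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * gp          # sufficient decrease
    curvature = grad(x + alpha * p) @ p >= c2 * gp               # slope has flattened
    return armijo and curvature
```

For f(x) = ‖x‖² with p = −∇f(x), a tiny step passes the Armijo test but fails the curvature test, which is exactly what the second condition is for.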
- Line search methods – part III. Method of steepest descent for nonquadratic functions. Conjugate gradient method for nonquadratic functions and its two main versions.
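The two main versions of nonlinear CG differ only in the formula for β: Fletcher-Reeves uses ‖g₊‖²/‖g‖², Polak-Ribière uses g₊ᵀ(g₊ − g)/‖g‖². A sketch combining both (I pair it with backtracking-Armijo and a steepest-descent restart safeguard; a Wolfe line search would be the more standard choice):

```python
import numpy as np

def nonlinear_cg(f, grad, x0, variant="PR", tol=1e-6, max_iter=500):
    """Nonlinear CG with Fletcher-Reeves ("FR") or Polak-Ribiere+ ("PR") beta."""
    x = x0.astype(float).copy()
    g = grad(x)
    p = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ p >= 0:                       # safeguard: restart with steepest descent
            p = -g
        alpha, fx, gp = 1.0, f(x), g @ p     # backtracking-Armijo step length
        while f(x + alpha * p) > fx + 1e-4 * alpha * gp:
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        if variant == "FR":
            beta = (g_new @ g_new) / (g @ g)                   # Fletcher-Reeves
        else:
            beta = max(0.0, (g_new @ (g_new - g)) / (g @ g))   # Polak-Ribiere+
        p = -g_new + beta * p
        x, g = x_new, g_new
    return x
```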
- Newton's method and its modifications. Classical Newton method. Modifications of Newton's method (damped Newton's method, finite-difference Newton's method).
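The classical Newton step solves ∇²f(xₖ) pₖ = −∇f(xₖ) and sets xₖ₊₁ = xₖ + pₖ (unit step, no globalization). A minimal sketch, assuming the Hessian stays invertible along the iteration:

```python
import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-10, max_iter=50):
    """Classical Newton's method: full steps, no damping or modification."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - np.linalg.solve(hess(x), g)  # Newton step: solve H p = -g
    return x
```

The damped variant mentioned above would scale the step by a line-search factor instead of always taking the full step.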
- Quasi-Newton methods. Principles of quasi-Newton methods. General scheme with B matrices. General scheme with G matrices. Broyden's method, DFP method, BFGS method.
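As one instance of the listed methods, here is a BFGS sketch that maintains an approximation H of the inverse Hessian (roughly the G-matrix scheme); I pair it with simple Armijo backtracking rather than the Wolfe line search that is normally recommended:

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """BFGS with an inverse-Hessian approximation H and Armijo backtracking."""
    n = len(x0)
    x = x0.astype(float).copy()
    H = np.eye(n)                            # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                           # quasi-Newton direction
        alpha = 1.0
        while f(x + alpha * p) > f(x) + 1e-4 * alpha * (g @ p):
            alpha *= 0.5                     # backtracking-Armijo step
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                       # curvature guard keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)       # BFGS inverse update
        x, g = x_new, g_new
    return x
```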
- Solution of systems of nonlinear equations. Multivariate minimization and its connection with nonlinear algebraic equations – Gauss-Newton method. Newton's method. Broyden's method.
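Broyden's method for F(x) = 0 replaces the Jacobian with an approximation updated by a rank-one correction after each step. An illustrative sketch (initializing with the identity is my simplification; a finite-difference Jacobian at x₀ is common in practice):

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    """Solve F(x) = 0 with Broyden's (good) rank-one Jacobian update."""
    x = x0.astype(float).copy()
    Fx = F(x)
    J = np.eye(len(x))                       # initial Jacobian approximation
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(J, -Fx)          # quasi-Newton step: J s = -F(x)
        x_new = x + s
        Fx_new = F(x_new)
        y = Fx_new - Fx
        J += np.outer(y - J @ s, s) / (s @ s)  # Broyden rank-one update
        x, Fx = x_new, Fx_new
    return x
```

In one dimension this reduces to the secant method, which the test exploits.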