Optimization features

The ability to differentiate your code is most often used in combination with a gradient-based optimization algorithm, whose job is to find the state vector x of input variables that minimizes the scalar function J(x) by repeated calculation of the value of J, its first derivative ∂J/∂x (a vector), and optionally its second derivative ∂²J/∂x² (the Hessian matrix). Since version 2.1, Adept has included several minimization algorithms, all of which can be used with or without box constraints on the values of x. Usage is described in Chapter 4 of the User Guide.
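The built-in minimizers are the recommended route, but the underlying pattern is easy to sketch: Adept supplies J and ∂J/∂x at each trial point, and the optimization algorithm decides where to move next. The example below is a minimal illustration of that pattern, using a hand-written steepest-descent update and a made-up quadratic cost function; it is not the interface to Adept's built-in algorithms, which is described in the User Guide.

// Minimal sketch: Adept computes J and dJ/dx by reverse-mode differentiation,
// and a simple hand-written gradient-descent loop uses them. The cost function,
// starting point and step size are purely illustrative.
#include <adept.h>
#include <cmath>
#include <cstdio>

// Illustrative scalar cost function of two inputs, written with adouble so
// that Adept can record its derivative information.
static adept::adouble cost_function(const adept::adouble x[2]) {
  return (x[0] - 3.0) * (x[0] - 3.0) + 10.0 * (x[1] + 1.0) * (x[1] + 1.0);
}

// Return J(x) and fill dJ_dx with the gradient using one reverse pass.
static double cost_and_gradient(const double x_val[2], double dJ_dx[2]) {
  adept::Stack stack;                          // records derivative information
  adept::adouble x[2] = {x_val[0], x_val[1]};
  stack.new_recording();                       // start recording from here
  adept::adouble J = cost_function(x);
  J.set_gradient(1.0);                         // seed the adjoint of J
  stack.compute_adjoint();                     // reverse pass back to x
  dJ_dx[0] = x[0].get_gradient();
  dJ_dx[1] = x[1].get_gradient();
  return J.value();
}

int main() {
  double x[2] = {0.0, 0.0};                    // starting point (illustrative)
  const double step = 0.02;                    // fixed step size (illustrative)
  double grad[2];
  double J = 0.0;
  for (int iter = 0; iter < 1000; ++iter) {
    J = cost_and_gradient(x, grad);
    if (std::sqrt(grad[0]*grad[0] + grad[1]*grad[1]) < 1.0e-8) break;
    x[0] -= step * grad[0];                    // steepest-descent update
    x[1] -= step * grad[1];
  }
  std::printf("J = %g at x = (%g, %g)\n", J, x[0], x[1]);
  return 0;
}

Adept's own minimizers replace the crude fixed-step update above with far more effective search strategies, but they consume J and ∂J/∂x in essentially the same way.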

The video below illustrates three of these algorithms minimizing the two-dimensional Rosenbrock "banana" function using Adept's test_minimizer program, subject to the constraint that the solution cannot lie in the shaded area on the left. The colours indicate the value of the cost function J; Rosenbrock's function is a challenging test of a minimization algorithm because it is easy to find the valley (in yellow) but difficult to find the lowest point within it. Real-world problems are usually of much higher dimensionality, and are therefore more difficult to visualize.
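For reference, in its standard two-dimensional form Rosenbrock's function is J(x₁, x₂) = (1 − x₁)² + 100 (x₂ − x₁²)², whose global minimum J = 0 lies at x = (1, 1), at the far end of the curved valley.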

The three minimization algorithms demonstrated are:

Adept does not include any derivative-free methods at present; if you are using Adept you will invariably have the derivative available, in which case a minimization method that exploits it will converge much faster than a derivative-free method.