Minimization

This module performs the optimization given a step proposal.

Classes Summary

Optimizer(fun, ub, lb[, verbose, options, …])

Performs optimization

Classes

class fides.minimize.Optimizer(fun, ub, lb, verbose=10, options=None, funargs=None, hessian_update=None)[source]

Performs optimization

Variables
  • fun – Objective function

  • funargs – Keyword arguments that are passed to the function

  • lb – Lower optimization boundaries

  • ub – Upper optimization boundaries

  • options – Options that configure convergence checks

  • delta_iter – Trust region radius that was used for the current step

  • delta – Updated trust region radius

  • x – Current optimization variables

  • fval – Objective function value at x

  • grad – Objective function gradient at x

  • x_min – Optimal optimization variables

  • fval_min – Objective function value at x_min

  • grad_min – Objective function gradient at x_min

  • hess – Objective function Hessian (approximation) at x

  • hessian_update – Object that performs Hessian updates

  • starttime – Time at which optimization was started

  • iteration – Current iteration

  • converged – Flag indicating whether optimization has converged

  • exitflag – ExitFlag to indicate reason for termination

  • verbose – Verbosity level for logging

  • logger – logger instance

__init__(fun, ub, lb, verbose=10, options=None, funargs=None, hessian_update=None)[source]

Create an optimizer object

Parameters
  • fun (typing.Callable) – This is the objective function, if no hessian_update is provided, this function must return a tuple (fval, grad), otherwise this function must return a tuple (fval, grad, Hessian)

  • ub (numpy.ndarray) – Upper optimization boundaries. Individual entries can be set to np.inf for respective variable to have no upper bound

  • lb (numpy.ndarray) – Lower optimization boundaries. Individual entries can be set to -np.inf for respective variable to have no lower bound

  • verbose (typing.Optional[int]) – Verbosity level, pick from logging.[DEBUG,INFO,WARNING,ERROR]

  • options (typing.Optional[typing.Dict]) – Options that control termination of optimization. See minimize for details.

  • funargs (typing.Optional[typing.Dict]) – Additional keyword arguments that are to be passed to fun for evaluation

  • hessian_update (typing.Optional[fides.hessian_approximation.HessianApproximation]) – Subclass of fides.hessian_approximation.HessianApproximation that performs the Hessian update in every iteration.
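As an illustration, when a hessian_update scheme is supplied, the callable passed as fun only needs to return a (fval, grad) tuple. The sketch below uses the 2-D Rosenbrock function as a hypothetical objective (not part of fides itself):

```python
import numpy as np

def rosenbrock(x):
    """2-D Rosenbrock objective returning (fval, grad), the tuple format
    expected by Optimizer when a hessian_update scheme is supplied."""
    fval = 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2
    grad = np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])
    return fval, grad
```

Such a function could then be passed together with boundaries and a Hessian approximation, e.g. fides.Optimizer(rosenbrock, ub=np.full(2, np.inf), lb=np.full(2, -np.inf), hessian_update=fides.BFGS()).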

check_continue()[source]

Checks whether minimization should continue based on convergence, iteration count and remaining computational budget

Return type

bool

Returns

flag indicating whether minimization should continue
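The continuation logic can be pictured as a standalone sketch (a hypothetical helper, not the actual method, which reads these quantities from the optimizer state):

```python
import time

def should_continue(converged, iteration, starttime,
                    maxiter=1000, maxtime=float("inf")):
    """Continue while not converged, within the iteration budget, and while
    another iteration is expected to fit into the time budget."""
    if converged or iteration >= maxiter:
        return False
    elapsed = time.time() - starttime
    avg_iter_time = elapsed / max(iteration, 1)
    # stop if the next iteration would be expected to exceed the time budget
    return elapsed + avg_iter_time <= maxtime
```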

check_convergence(fval, x, grad)[source]

Check whether optimization has converged.

Parameters
  • fval – updated objective function value

  • x – updated optimization variables

  • grad – updated objective function gradient

Return type

None

check_finite(grad=None, hess=None)[source]

Checks whether the objective function value, gradient and Hessian (approximation) have finite values so that optimization can continue.

Parameters
  • grad – Gradient to be checked for finiteness; the currently stored gradient is checked if not provided

  • hess – Hessian (approximation) to be checked for finiteness; the currently stored Hessian is checked if not provided

Raises

RuntimeError if any of the variables have non-finite entries

check_in_bounds(x=None)[source]

Checks whether the current optimization variables are all within the specified boundaries

Raises

RuntimeError if any of the variables are not within boundaries

get_affine_scaling()[source]

Computes the scaling vector v and dv, the diagonal of its Jacobian. For the definition of v, see Definition 2 in [ColemanLi1994]

Return type

typing.Tuple[numpy.ndarray, numpy.ndarray]

Returns

v – scaling vector

dv – diagonal of the Jacobian of v with respect to x
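The case distinction from Definition 2 of Coleman & Li can be sketched in plain numpy (an illustrative re-implementation, not the method's actual code):

```python
import numpy as np

def affine_scaling(x, grad, lb, ub):
    """Coleman-Li scaling vector v and the diagonal dv of its Jacobian."""
    v = np.ones_like(x)    # default: g_i >= 0 and l_i = -inf
    dv = np.zeros_like(x)
    for i in range(len(x)):
        if grad[i] < 0:
            if np.isfinite(ub[i]):
                v[i], dv[i] = x[i] - ub[i], 1.0   # distance to upper bound
            else:
                v[i] = -1.0                       # unbounded above
        elif np.isfinite(lb[i]):
            v[i], dv[i] = x[i] - lb[i], 1.0       # distance to lower bound
    return v, dv
```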

log_header()[source]

Prints the header for diagnostic information, should complement Optimizer.log_step().

log_step(accepted, step, fval)[source]

Prints diagnostic information about the current step to the log

Parameters
  • accepted (bool) – flag indicating whether the current step was accepted

  • step (fides.steps.Step) – proposal step

  • fval (float) – new fval if step is accepted

log_step_initial()[source]

Prints diagnostic information about the initial step to the log

make_non_degenerate(eps=2.220446049250313e-14)[source]

Ensures that x is non-degenerate; this should only be necessary for initial points.

Parameters

eps – degeneracy threshold

Return type

None

minimize(x0)[source]

Minimize the objective function using the interior trust-region reflective algorithm described by [ColemanLi1994] and [ColemanLi1996]. Convergence with respect to function value is achieved when \(|f_{k+1} - f_k| <\) options[fatol] \(+\, |f_k| \cdot{}\) options[frtol]. Similarly, convergence with respect to optimization variables is achieved when \(||x_{k+1} - x_k|| <\) options[xatol] \(+\, ||x_k|| \cdot{}\) options[xrtol]. Convergence with respect to the gradient is achieved when \(||g_k|| <\) options[gatol] or \(||g_k|| <\) options[grtol] \(\cdot\, f_k\). Beyond that, optimization terminates when the iteration count exceeds options[maxiter] or when the elapsed time is expected to exceed options[maxtime] on the next iteration.

Parameters

x0 (numpy.ndarray) – initial guess

Returns

fval – final function value

x – final optimization variable values

grad – final gradient

hess – final Hessian (approximation)
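The convergence checks described above follow the usual absolute-plus-relative-tolerance pattern; a minimal sketch (hypothetical helpers with illustrative default tolerances, not fides' actual defaults):

```python
import numpy as np

def f_converged(fval_new, fval, fatol=1e-8, frtol=1e-8):
    # |f_{k+1} - f_k| < fatol + |f_k| * frtol
    return abs(fval_new - fval) < fatol + abs(fval) * frtol

def x_converged(x_new, x, xatol=1e-10, xrtol=0.0):
    # ||x_{k+1} - x_k|| < xatol + ||x_k|| * xrtol
    return np.linalg.norm(x_new - x) < xatol + np.linalg.norm(x) * xrtol

def g_converged(grad, fval, gatol=1e-6, grtol=0.0):
    # ||g_k|| < gatol  or  ||g_k|| < grtol * f_k
    gnorm = np.linalg.norm(grad)
    return gnorm < gatol or gnorm < grtol * fval
```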

track_minimum(x_new, fval_new, grad_new)[source]

Tracks the optimization variables that have the minimal function value, independent of whether the step is accepted.

Parameters
  • x_new (numpy.ndarray) – New optimization variable values

  • fval_new (float) – Objective function value at x_new

  • grad_new (numpy.ndarray) – Objective function gradient at x_new

Return type

None

update(step, x_new, fval_new, grad_new, hess_new=None)[source]

Update self according to employed step

Parameters
  • step (fides.steps.Step) – Employed step

  • x_new (numpy.ndarray) – New optimization variable values

  • fval_new (float) – Objective function value at x_new

  • grad_new (numpy.ndarray) – Objective function gradient at x_new

  • hess_new (typing.Optional[numpy.ndarray]) – (Approximate) objective function Hessian at x_new

Return type

None

update_tr_radius(fval, grad, step, dv)[source]

Update the trust region radius

Parameters
  • fval (float) – new function value if the proposed step is taken

  • grad (numpy.ndarray) – new gradient value if the proposed step is taken

  • step (fides.steps.Step) – step

  • dv (numpy.ndarray) – derivative of scaling vector v wrt x

Return type

bool

Returns

flag indicating whether the proposed step should be accepted
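Schematically, this acceptance decision follows the standard trust-region ratio test, comparing actual to predicted decrease (a generic sketch with illustrative constants; fides' actual rule additionally involves the scaling derivative dv):

```python
def tr_update(delta, fval_old, fval_new, predicted_decrease, step_norm,
              eta=0.25, gamma_dec=0.25, gamma_inc=2.0):
    """Generic trust-region radius update and step acceptance test."""
    actual_decrease = fval_old - fval_new
    ratio = actual_decrease / predicted_decrease if predicted_decrease > 0 else 0.0
    if ratio < eta:
        # poor agreement with the quadratic model: reject and shrink the radius
        return gamma_dec * step_norm, False
    if ratio > 0.75 and step_norm >= 0.9 * delta:
        # good agreement and the step hit the boundary: expand the radius
        delta = gamma_inc * delta
    return delta, True
```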