Minimizers#
zfit supplies wrappers for different minimizers from multiple libraries. Most of them are local minimizers (such as Minuit, Ipyopt, or ScipyLBFGSB), while there are also a few global ones such as NLoptISRES or NLoptStoGO.
While the former are usually faster and preferred, they depend more on the initial values than the latter. Especially in higher dimensions, a global search of the parameters can increase the minimization time drastically and is often infeasible. It is also possible to couple the minimizers by first doing an approximate global minimization and then polishing the minimum found with a local minimizer, as sketched in the NLopt example below.
All minimizers support similar arguments, most notably tol, which denotes the termination value: the minimization stops once the value of the convergence criterion, which defaults to EDM (the same criterion that is also used in Minuit), falls below tol. In addition, each minimizer has a few specific arguments of its own.
They all share the exact same minimization method, minimize(), which takes a loss, parameters, and (optionally) a FitResult from which it can take information for a better start into the minimization.
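As an illustration, here is a minimal sketch of this shared interface using a toy Gaussian fit; the model, parameter values, and data are invented for the example:

```python
import numpy as np
import zfit

# Toy setup: an unbinned Gaussian fit to normally distributed data.
obs = zfit.Space("x", limits=(-10, 10))
mu = zfit.Parameter("mu", 0.5, -5, 5)
sigma = zfit.Parameter("sigma", 1.2, 0.1, 10)
model = zfit.pdf.Gauss(mu=mu, sigma=sigma, obs=obs)

data = zfit.Data.from_numpy(obs=obs, array=np.random.normal(0.0, 1.0, size=1000))
nll = zfit.loss.UnbinnedNLL(model=model, data=data)

# tol is the termination value of the convergence criterion (EDM by default).
minimizer = zfit.minimize.Minuit(tol=1e-3)
result = minimizer.minimize(nll)
print(result.params)
```

The same nll and the resulting FitResult are reused in the sketches further down.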
Minuit#
Minuit is a longstanding and well-proven quasi-Newton, variable-metric algorithm, implemented in iminuit.
Levenberg-Marquardt#
Levenberg-Marquardt minimizer for general non-linear minimization, interpolating between Gauss-Newton and gradient-descent optimization.
Ipyopt#
Ipopt is a gradient-based minimizer that performs large-scale nonlinear optimization of continuous systems.
Scipy#
- Local, gradient-based quasi-Newton algorithm using the BFGS algorithm.
- Local, gradient-based quasi-Newton algorithm using the limited-memory BFGS approximation.
- Trust-region based local minimizer.
- Local minimizer using the modified Powell algorithm.
- Local, gradient-based minimizer using the Sequential Least Squares Programming (SLSQP) algorithm.
- Local, gradient-based minimization algorithm using a truncated Newton method.
- UNSTABLE! Local, gradient-free downhill-simplex-like method with an implicit linear approximation.
- PERFORMS POORLY! Local Newton conjugate gradient trust-region algorithm.
- Minimizer that requires the Hessian and gradient to be provided by the loss itself.
- PERFORMS POORLY! Local, gradient-based (nearly) exact trust-region algorithm using matrix-vector products with the Hessian.
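Since the interface is shared, any of these can be swapped in as a drop-in replacement. A minimal sketch, reusing the nll from above and assuming the L-BFGS-B wrapper is exposed as zfit.minimize.ScipyLBFGSB (the exact class name may carry a version suffix in some releases):

```python
# Same loss as above; only the minimizer changes.
minimizer = zfit.minimize.ScipyLBFGSB(tol=1e-3)
result = minimizer.minimize(nll)
```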
NLopt#
The NLopt wrappers, such as NLoptISRES or NLoptStoGO, are exposed as aliases of their corresponding implementation classes.
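As mentioned above, a global NLopt minimizer can be coupled with a local one. A minimal sketch, assuming the global wrapper is exposed as zfit.minimize.NLoptISRES and that minimize() accepts the previous FitResult via its init argument:

```python
# Rough global search first, then polish with a local minimizer.
global_minimizer = zfit.minimize.NLoptISRES()
rough_result = global_minimizer.minimize(nll)

local_minimizer = zfit.minimize.Minuit()
result = local_minimizer.minimize(nll, init=rough_result)
```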