The loss, also called the “cost”, quantifies the disagreement between the data and the model. Most commonly, the likelihood (or, to be precise, the negative log likelihood) is used, since maximum likelihood estimation has many desirable properties.

Binned losses require both the PDF and the data to be binned.

Extended losses take the expected count (“yield”) of a PDF into account and therefore require the PDF to be extended.

zfit.loss.UnbinnedNLL(model, data[, ...])

Unbinned Negative Log Likelihood.
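As a concrete illustration of what an unbinned NLL computes (a minimal numpy sketch of the underlying formula, not zfit's implementation; the function and variable names here are our own), the loss is minus the sum of the log-densities evaluated at each event:

```python
import numpy as np

def gauss_pdf(x, mu, sigma):
    # Normalized Gaussian probability density.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def unbinned_nll(x, mu, sigma):
    # Unbinned negative log likelihood: minus the sum of per-event log-densities.
    return -np.sum(np.log(gauss_pdf(x, mu, sigma)))

rng = np.random.default_rng(0)
data = rng.normal(0.5, 1.2, size=1000)

# The NLL is smaller (better) at the true parameters than far away from them.
print(unbinned_nll(data, 0.5, 1.2) < unbinned_nll(data, 3.0, 1.2))
```

A minimizer varies the parameters (here mu and sigma) to find the smallest NLL; zfit delegates that step to its minimizers.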

zfit.loss.ExtendedUnbinnedNLL(model, data[, ...])

An Unbinned Negative Log Likelihood with an additional Poisson term for the number of events in the dataset.
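The extra Poisson term penalizes the difference between the observed event count and the expected yield. A numpy sketch of the formula (our own naming, not zfit's code; the constant log(n!) is dropped because it does not depend on the parameters):

```python
import numpy as np

def extended_unbinned_nll(x, mu, sigma, yield_):
    # Shape part: minus the summed log-densities, as in a plain unbinned NLL.
    pdf = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    shape_term = -np.sum(np.log(pdf))
    # Poisson term for the observed count n given the expected yield nu:
    # -log Poisson(n | nu) = nu - n * log(nu) + log(n!), constant dropped.
    n = len(x)
    poisson_term = yield_ - n * np.log(yield_)
    return shape_term + poisson_term

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=1000)

# The Poisson term is minimized when the yield equals the observed count.
print(extended_unbinned_nll(data, 0.0, 1.0, 1000.0)
      < extended_unbinned_nll(data, 0.0, 1.0, 1500.0))
```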

zfit.loss.BinnedNLL(model, data[, ...])

Binned negative log likelihood.

zfit.loss.ExtendedBinnedNLL(model, data[, ...])

Extended binned likelihood using the expected number of events per bin with a Poisson probability.
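In the binned extended case each bin contributes an independent Poisson term, -log P(n_i | nu_i). A numpy sketch of that sum (illustrative only, with the parameter-independent log(n_i!) terms dropped):

```python
import numpy as np

def extended_binned_nll(counts, expected):
    # Sum over bins of the Poisson negative log likelihood,
    # -log P(n_i | nu_i) = nu_i - n_i * log(nu_i)  (constant terms dropped).
    counts = np.asarray(counts, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return np.sum(expected - counts * np.log(expected))

counts = np.array([20.0, 50.0, 30.0])

# Each per-bin term is minimized when the expectation matches the observed count.
print(extended_binned_nll(counts, counts)
      < extended_binned_nll(counts, np.array([10.0, 60.0, 40.0])))
```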

zfit.loss.BinnedChi2(model, data[, ...])

Binned Chi2 loss, using the :math:`N_{tot}` from the data.

zfit.loss.ExtendedBinnedChi2(model, data[, ...])

Binned Chi2 loss, using the :math:`N_{tot}` from the PDF.
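A binned chi2 compares observed counts with the model expectation in each bin, weighted by the bin variance. A numpy sketch of the variant where the Poisson variance is estimated from the data (sigma_i^2 = n_i); this is an illustration of the formula, not zfit's code, and the normalization details (which :math:`N_{tot}` is used) differ between the two classes above:

```python
import numpy as np

def binned_chi2(counts, expected):
    # Chi2 between observed bin counts and the model expectation, with the
    # Poisson variance of each bin estimated from the data: sigma_i^2 = n_i.
    counts = np.asarray(counts, dtype=float)
    expected = np.asarray(expected, dtype=float)
    return np.sum((counts - expected) ** 2 / counts)

obs = np.array([20.0, 50.0, 30.0])

# A perfect match gives chi2 = 0; any mismatch gives a positive value.
print(binned_chi2(obs, obs))
print(binned_chi2(obs, np.array([25.0, 45.0, 30.0])) > 0.0)
```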

zfit.loss.BaseLoss(model, data[, fit_range, ...])

A "simultaneous fit" can be performed by giving one or more model, data, and fit_range to the loss.

zfit.loss.SimpleLoss(func[, params, ...])

Loss from a (function returning a) Tensor.