{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Introduction to zfit\n", "\n", "zfit is a model building and fitting library, mostly for likelihood fits and with advanced features for HEP analyses.\n", "\n", "It aims to define an API and workflow to be the stable core and allow other libraries to be built on top of it.\n", "Furthermore, it uses JIT compilation and autograd (current backend TensorFlow, future JAX (?)) to significantly speed up the computation.\n", "\n", "## Workflow\n", "\n", "There are five mostly independent parts in zfit\n", "![zfit workflow](_static/zfit_workflow.png)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "import matplotlib.pyplot as plt\n", "import mplhep\n", "import numpy as np\n", "import zfit\n", "# znp is a subset of numpy functions with a numpy interface but using actually the zfit backend (currently TF)\n", "import zfit.z.numpy as znp\n", "import zfit_physics as zphysics # extension (for user contribution) containing physics PDFs\n", "from zfit import z # z is the backend, for gradients, JIT etc" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Data\n", "\n", "This component in general plays a minor role in zfit: it is mostly to provide a unified interface for data.\n", "\n", "Preprocessing is therefore not part of zfit and should be done beforehand. Python offers many great possibilities to do so (e.g. Pandas).\n", "\n", "zfit knows unbinned and binned datasets.\n", "\n", "The unbinned `Data` can load data from various sources, most notably from Numpy, Pandas DataFrame, TensorFlow Tensor and ROOT (using uproot). It is also possible, for convenience, to convert it directly `to_pandas`. The constructors are named `from_numpy`, `from_root` etc." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A `Data` needs not only the data itself but also the observables: the human-readable string identifiers of the axes (corresponding to \"columns\" of a Pandas DataFrame). It is convenient to define the `Space` not only with the observable but also with a limit: this can directly be re-used as the normalization range in the PDF.\n", "\n", "First, let's define our observables" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "obs = zfit.Space('obs1', (-5, 10))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This `Space` has limits and offers the following functionality:\n", "- area(): calculate the area (e.g. 
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "size_normal = 10_000\n", "data_normal_np = np.random.normal(size=size_normal, scale=2)\n", "\n", "data_normal = zfit.Data.from_numpy(obs=obs, array=data_normal_np)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "The main functionality is\n", "- nevents: attribute that returns the number of events in the object\n", "- data_range: a `Space` that defines the limits of the data; events outside will be cut\n", "- n_obs: defines the number of dimensions in the dataset\n", "- with_obs: returns a subset of the dataset with only the given obs\n", "- weights: event weights\n", "\n", "Furthermore, `value` returns a Tensor with shape `(nevents, n_obs)`." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(f\"We have {data_normal.nevents} events in our dataset with the minimum of {np.min(data_normal.value())}\")  # remember! The obs cut out some of the data" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "data_normal.n_obs" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data_normal[\"obs1\"]  # indexing Pandas DataFrame style" ] },
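{ "cell_type": "markdown", "metadata": {}, "source": [ "As mentioned above, we can also convert the dataset to a Pandas DataFrame for inspection (a minimal sketch):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "df = data_normal.to_pandas()  # convert to a Pandas DataFrame\n", "df.head()" ] },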
] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "mu = zfit.Parameter('mu', 1, -3, 3, step_size=0.2)\n", "sigma_num = zfit.Parameter('sigma42', 1, 0.1, 10, floating=False)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we have everything to create a Gaussian PDF:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss = zfit.pdf.Gauss(obs=obs, mu=mu, sigma=sigma_num)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Since this holds all the parameters and the observables are well-defined, we can retrieve them" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.n_obs # dimensions" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.obs" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.space" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.norm" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As we've seen, the `obs` we defined is the `space` of Gauss: this acts as the default limits whenever needed (e.g. for sampling). `gauss` also has a `norm`, which equals by default as well to the `obs` given, however, we can explicitly change that." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can also access the parameters of the PDF in two ways, depending on our intention:\n", "either by _name_ (the parameterization name, e.g. `mu` and `sigma`, as defined in the `Gauss`), which is useful if we are interested in the parameter that _describes_ the shape" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.params" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "or to retrieve all the parameters that the PDF depends on. As this now may sounds trivial, we will see later that models can depend on other models (e.g. sums) and parameters on other parameters. There is one function that automatically retrieves _all_ dependencies, `get_params`. It takes three arguments to filter:\n", "- floating: whether to filter only floating parameters, only non-floating or don't discriminate\n", "- is_yield: if it is a yield, or not a yield, or both\n", "- extract_independent: whether to recursively collect all parameters. This, and the explanation for why independent, can be found later on in the `Simultaneous` tutorial.\n", "\n", "Usually, the default is exactly what we want if we look for _all free parameters that this PDF depends on_." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss.get_params()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The difference will also be clear if we e.g. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "The difference will also be clear if we e.g. use the same parameter twice:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "gauss_only_mu = zfit.pdf.Gauss(obs=obs, mu=mu, sigma=mu)\n", "print(f\"params={gauss_only_mu.params}\")\n", "print(f\"get_params={gauss_only_mu.get_params()}\")" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Functionality\n", "\n", "PDFs provide a few useful methods. The main features of a zfit PDF are:\n", "\n", "- `pdf`: the normalized value of the PDF. It takes an argument `norm_range` that can be set to `False`, in which case we retrieve the unnormalized value\n", "- `integrate`: given a certain range, the PDF is integrated. Like `pdf`, it takes a `norm_range` argument that integrates over the unnormalized `pdf` if set to `False`\n", "- `sample`: samples from the pdf and returns a `Data` object" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "integral = gauss.integrate(limits=(-1, 3))  # corresponds to 2 sigma integral\n", "integral" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Tensors\n", "\n", "As we see, many zfit functions return Tensors. This is however no magical thing! If we're outside of models, we can always safely convert them to a numpy array by calling `zfit.run(...)` on them (or any structure containing potentially multiple Tensors) or simply `np.array`. However, this is often not even required! They can be used just like numpy arrays and interact well with Python and Numpy:\n", "\n", "[**Extended tutorial on TensorFlow**](https://zfit-tutorials.readthedocs.io/en/master/tutorials/TensorFlow/HPC_with_TensorFlow.html)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "np.sqrt(integral)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "They also have shapes and dtypes, can be sliced etc., so do not convert them unless you need to. More on this can be seen in the talk later on about zfit and TensorFlow 2.0." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sample = gauss.sample(n=1000)  # default space taken as limits\n", "sample" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sample['obs1'][:10]\n", "# sample.value()[:10, 0]  # \"value\" returns a numpy array-like object" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sample.n_obs" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sample.obs" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "We see that `sample` also returns a zfit `Data` object with the same space as it was sampled in. This can directly be used, e.g." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "probs = gauss.pdf(sample)\n", "probs[:10]" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "**NOTE**: In case you want to do this repeatedly (e.g. for toy studies), there is a more efficient way (see later on)" ] },
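{ "cell_type": "markdown", "metadata": {}, "source": [ "As described above, we can also retrieve the unnormalized value by switching off the normalization (a minimal sketch; depending on the zfit version, the argument is called `norm` or `norm_range`):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "x = znp.linspace(-5, 10, num=5)\n", "print(gauss.pdf(x))  # normalized over the norm of gauss\n", "print(gauss.pdf(x, norm=False))  # unnormalized value f(x)" ] },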
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Plotting\n", "\n", "So far, we have a dataset and a PDF. Before we go for fitting, we can make a plot. This functionality is not _directly_ provided in zfit (but can be added to [zfit-physics](https://github.com/zfit/zfit-physics)). It is, however, simple enough to do ourselves:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "def plot_model(model, data, scale=1, plot_data=True):  # we will use scale later on\n", "\n", "    nbins = 50\n", "\n", "    lower, upper = data.data_range.limit1d\n", "    x = znp.linspace(lower, upper, num=1000)  # np.linspace also works\n", "    # scale the pdf to the number of events in the data and the bin width\n", "    y = model.pdf(x) * data.nevents / nbins * data.data_range.area()\n", "    y *= scale\n", "    plt.plot(x, y, label='model')\n", "    if plot_data:\n", "        # legacy way, also works\n", "        # data_plot = data.value()[:, 0]  # we could also use the `to_pandas` method\n", "        # plt.hist(data_plot, bins=nbins)\n", "        # modern way\n", "        data_binned = data.to_binned(nbins)\n", "        mplhep.histplot(data_binned, label='data')\n", "    plt.legend()" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "plot_model(gauss, data_normal)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "We can of course do better (and will see that later on, continuously improving the plots), but this is quite simple and gives us the full power of matplotlib/mplhep." ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Different models\n", "\n", "zfit offers a selection of predefined models (and extends it with models from zfit-physics that contain physics-specific models such as ARGUS shaped models)." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(zfit.pdf.__all__)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(zphysics.pdf.__all__)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "To create a more realistic model, we can build some components for a mass fit with a\n", "- signal component: CrystalBall\n", "- combinatorial background: Exponential\n", "- partially reconstructed background on the left: Kernel Density Estimation\n", "\n", "[**Extended tutorial on different KDEs in zfit**](https://zfit-tutorials.readthedocs.io/en/master/tutorials/components/13%20-%20Kernel%20Density%20Estimation.html)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Shape fit" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "mass_obs = zfit.Space('mass', (0, 1000))" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "# Signal component\n", "\n", "mu_sig = zfit.Parameter('mu_sig', 400, 100, 600)\n", "sigma_sig = zfit.Parameter('sigma_sig', 50, 1, 100)\n", "alpha_sig = zfit.Parameter('alpha_sig', 200, 100, 400, floating=False)  # won't be used in the fit\n", "npar_sig = zfit.Parameter('n_sig', 4, 0.1, 30, floating=False)\n", "signal = zfit.pdf.CrystalBall(obs=mass_obs, mu=mu_sig, sigma=sigma_sig, alpha=alpha_sig, n=npar_sig)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "# combinatorial background\n", "\n", "lam = zfit.Parameter('lambda', -0.01, -0.05, -0.0001)\n", "comb_bkg = zfit.pdf.Exponential(lam, obs=mass_obs)" ] },
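{ "cell_type": "markdown", "metadata": {}, "source": [ "With the negative `lam` chosen here, the exponential falls with increasing mass; a quick, optional sanity check (a minimal sketch, not needed for the fit):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "xcheck = znp.linspace(0, 1000, num=5)\n", "print(comb_bkg.pdf(xcheck))  # decreasing values, as lambda is negative" ] },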
"code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "part_reco_data = np.random.normal(loc=150, scale=160, size=70_000)\n", "part_reco = zfit.pdf.KDE1DimGrid(obs=mass_obs, data=part_reco_data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Composing models\n", "\n", "We can also compose multiple models together. Here we'll stick to one dimensional models, the extension to multiple dimensions is explained in the \"custom models tutorial\".\n", "\n", "Here we will use a `SumPDF`. This takes pdfs and fractions. If we provide n pdfs and:\n", "- n - 1 fracs: the nth fraction will be 1 - sum(fracs)\n", "- n fracs: no normalization attempt is done by `SumPDf`. If the fracs are not implicitly normalized, this can lead to bad fitting\n", " behavior if there is a degree of freedom too much\n", "\n", "[**Extended tutorial on composed models, including multidimensional products**](https://zfit-tutorials.readthedocs.io/en/master/tutorials/TensorFlow/HPC_with_TensorFlow.html)\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sig_frac = zfit.Parameter('sig_frac', 0.3, 0, 1)\n", "comb_bkg_frac = zfit.Parameter('comb_bkg_frac', 0.25, 0, 1)\n", "model = zfit.pdf.SumPDF([signal, comb_bkg, part_reco], [sig_frac, comb_bkg_frac])" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In order to have a corresponding data sample, we can just create one. Since we want to fit to this dataset later on, we will create it with slightly different values. Therefore, we can use the ability of a parameter to be set temporarily to a certain value with" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(f\"before: {sig_frac}\")\n", "with sig_frac.set_value(0.25):\n", " print(f\"new value: {sig_frac}\")\n", "print(f\"after 'with': {sig_frac}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "While this is useful, it does not fully scale up. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "In order to have a corresponding data sample, we can just create one. Since we want to fit to this dataset later on, we will create it with slightly different values. Therefore, we can use the ability of a parameter to be set temporarily to a certain value with a context manager:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(f\"before: {sig_frac}\")\n", "with sig_frac.set_value(0.25):\n", "    print(f\"new value: {sig_frac}\")\n", "print(f\"after 'with': {sig_frac}\")" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "While this is useful, it does not fully scale up. Therefore, we can use the `zfit.param.set_values` helper.\n", "(_Sidenote: instead of a list of values, we can also pass a `FitResult`; the given parameters then take the values from the result_)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "with zfit.param.set_values([mu_sig, sigma_sig, sig_frac, comb_bkg_frac, lam], [370, 34, 0.08, 0.31, -0.0025]):\n", "    data = model.sample(n=10_000)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "plot_model(model, data);" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Plotting the components is not difficult now: we can either just plot the pdfs separately (as we can still access them) or in a generalized manner by accessing the `pdfs` attribute:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "def plot_comp_model(model, data):\n", "    for mod, frac in zip(model.pdfs, model.params.values()):\n", "        plot_model(mod, data, scale=frac, plot_data=False)\n", "    plot_model(model, data)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "plot_comp_model(model, data)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Now we can add legends etc. By the way, did you notice that the `frac` params are actually zfit `Parameter`s? We just used them as if they were Python scalars, and it works." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(model.params)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Extended PDFs\n", "\n", "So far, we have only looked at normalized PDFs that do contain information about the shape but not about the _absolute_ scale. We can make a PDF extended by adding a yield to it.\n", "\n", "The behavior of the new, extended PDF does **NOT change**: any methods we called before will act the same. The only exception is that some may now require one argument _less_. All the methods we used so far will return the same values. What changes is that the flag `model.is_extended` now returns `True`. Furthermore, we now have a few more methods that we can use which would have raised an error before:\n", "- `get_yield`: returns the yield parameter (notice that the yield is _not_ added to the shape parameters `params`)\n", "- `ext_{pdf,integrate}`: these methods return the same as the versions used before, however, multiplied by the yield\n", "- `sample` is still the same, but does not _require_ the argument `n` anymore. By default, `n` will now be sampled from a Poisson distribution around the yield.\n", "\n", "The `SumPDF` now does not strictly need `fracs` anymore: if _all_ input PDFs are extended, the sum will be as well and use the (normalized) yields as fracs.\n", "\n", "The preferred way to create an extended PDF is to use `extended=yield` in the PDF constructor.\n", "\n", "Alternatively, one can use `extended_pdf = pdf.create_extended(yield)`. (_Since this relies on copying the PDF, which may not work for various reasons, there is also a `pdf.set_yield(yield)` method that sets the yield in-place. This won't lead to ambiguities, as everything is supposed to work the same._)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "yield_model = zfit.Parameter('yield_model', 10000, 0, 20000, step_size=10)\n", "model_ext = model.create_extended(yield_model)" ] },
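{ "cell_type": "markdown", "metadata": {}, "source": [ "A short sketch of the newly available functionality (assuming `ext_integrate` behaves as described above, i.e. returns the integral multiplied by the yield):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(model_ext.is_extended)  # True\n", "print(model_ext.get_yield())  # the yield parameter\n", "print(model_ext.ext_integrate(limits=(0, 1000)))  # roughly the yield value" ] },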
{ "cell_type": "markdown", "metadata": {}, "source": [ "Alternatively, we can create the models as extended and sum them up" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "sig_yield = zfit.Parameter('sig_yield', 2000, 0, 10000, step_size=1)\n", "sig_ext = signal.create_extended(sig_yield)\n", "\n", "comb_bkg_yield = zfit.Parameter('comb_bkg_yield', 6000, 0, 10000, step_size=1)\n", "comb_bkg_ext = comb_bkg.create_extended(comb_bkg_yield)\n", "\n", "part_reco_yield = zfit.Parameter('part_reco_yield', 2000, 0, 10000, step_size=1)\n", "part_reco.set_yield(part_reco_yield)  # unfortunately, `create_extended` does not work here. But no problem, it won't change anything.\n", "part_reco_ext = part_reco" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "model_ext_sum = zfit.pdf.SumPDF([sig_ext, comb_bkg_ext, part_reco_ext])" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "## Loss\n", "\n", "A loss combines the model and the data, for example to build a likelihood. Furthermore, it can contain constraints, which are additions to the likelihood. Currently, if the `Data` has weights, these are automatically taken into account." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "nll_gauss = zfit.loss.UnbinnedNLL(gauss, data_normal)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "The loss has several attributes in order to be transparent to higher-level libraries. We can calculate its value using `value`." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "nll_gauss.value()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Notice that due to graph building, this will take significantly longer on the first run. Rerun the cell above and it will be way faster.\n", "\n" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Furthermore, the loss also provides the possibility to calculate the gradients or, as often needed, the value and the gradients together." ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "nll_gauss.gradient()" ] },
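{ "cell_type": "markdown", "metadata": {}, "source": [ "A minimal sketch of retrieving both at once (assuming the `value_gradient` method, which minimizers use internally, is available):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "value, gradients = nll_gauss.value_gradient()\n", "print(value)\n", "print(gradients)" ] },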
] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "nll_gauss.get_params()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Extended loss\n", "\n", "More interestingly, we can now build a loss for our composite sum model using the sampled data. Since we created an extended model, we can now also create an extended likelihood, taking into account a Poisson term to match the yield to the number of events." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "nll = zfit.loss.ExtendedUnbinnedNLL(model_ext_sum, data)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "nll.get_params()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Minimization\n", "\n", "While a likelihood is interesting, we usually want to minimize it. Therefore, we can use the minimizers in zfit, most notably `Minuit`, a wrapper around the [iminuit minimizer](https://github.com/scikit-hep/iminuit).\n", "\n", "The philosophy is to create a minimizer instance that is mostly _stateless_, e.g. does not remember the position.\n", "\n", "Given that iminuit provides us with a very reliable and stable minimizer, it is usually recommended to use this. Others are implemented as well and could easily be wrapped, however, the convergence is usually not as stable.\n", "\n", "Minuit has a few options:\n", "- `tol`: the Estimated Distance to Minimum (EDM) criteria for convergence (default 1e-3)\n", "- `verbosity`: between 0 and 10, 5 is normal, 7 is verbose, 10 is maximum\n", "- `gradient`: if True, uses the Minuit numerical gradient instead of the TensorFlow gradient. This is usually more stable for smaller fits; furthermore, the TensorFlow gradient _can_ (experience based) sometimes be wrong. Using `\"zfit\"` uses the TensorFlow gradient." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "minimizer = zfit.minimize.Minuit()\n", "# minimizer = zfit.minimize.NLoptLBFGSV1()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Other minimizers\n", "\n", "There are a couple of other minimizers, such as SciPy, NLopt that are wrapped and can be used.\n", "\n", "They all provide a highly standardize interface, good initial values and non-ambiguous docstring, i.e. every method has its own docstring.\n", "\n", "**When to stop**: a crucial problem is the question of convergence. iminuit has the `EDM`, a comparably reliable metric. Other minimizers in zfit use the same stopping criteria, to be able to compare results, but that is often very inefficienc: many minimizer do not use the Hessian, required for the `EDM`, internally, or do not provide it. This means, the Hessian has to be calculated separately (expensive!). Better stopping criteria ideas are welcome!" 
] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "help(zfit.minimize.ScipyNewtonCGV1.__init__)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(zfit.minimize.__all__)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For the minimization, we can call `minimize`, which takes a\n", "- loss as we created above\n", "- optionally: the parameters to minimize\n", "\n", "By default, `minimize` uses all the free floating parameters (obtained with `get_params`). We can also explicitly specify which ones to use by giving them (or better, objects that depend on them) to `minimize`; note however that non-floating parameters, even if given explicitly to `minimize` won 't be minimized." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## iminuit compatibility\n", "\n", "As a sidenote, a zfit loss provides the same API as required by iminuit. Therefore, we can also *directly* invoke iminuit and use it.\n", "\n", "The advantage can be that iminuit offers a few things out-of-the-box, like contour drawings, which zfit does not. Disadvantage is that the zfit worflow is exited." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "import iminuit\n", "\n", "iminuit_min = iminuit.Minuit(nll, nll.get_params())\n", "# iminuit_min.migrad() # this fails sometimes, as iminuit doesn't use the limits and is therefore unstable" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "result = minimizer.minimize(nll)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "plot_comp_model(model_ext_sum, data)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Fit result\n", "\n", "The result of every minimization is stored in a `FitResult`. This is the last stage of the zfit workflow and serves as the interface to other libraries. Its main purpose is to store the values of the fit, to reference to the objects that have been used and to perform (simple) uncertainty estimation.\n", "\n", "[**Extended tutorial on the FitResult**](https://zfit-tutorials.readthedocs.io/en/master/tutorials/components/05%20-%20Exploring%20the%20FitResult.html)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(result)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This gives an overview over the whole result. Often we're mostly interested in the parameters and their values, which we can access with a `params` attribute." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(result.params)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is a `dict` which stores any knowledge about the parameters and can be accessed by the parameter (object) itself:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "result.params[mu_sig]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "'value' is the value at the minimum. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "'value' is the value at the minimum. To obtain other information about the minimization process, `result` contains more attributes:\n", "- **valid**: whether the result is actually valid or not (the most important flag to check)\n", "- fmin: the function minimum\n", "- edm: estimated distance to minimum\n", "- info: contains a lot of information, especially the original information returned by a specific minimizer\n", "- converged: if the fit converged" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "result.fmin" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Info provides additional information on the minimization, which can be minimizer dependent. For example, if iminuit was used, the instance can be retrieved." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(result.info['minuit'])  # the actual instance used in minimization" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Estimating uncertainties\n", "\n", "The `FitResult` has mainly two methods to estimate the uncertainty:\n", "- a profile likelihood method (like MINOS)\n", "- Hessian approximation of the likelihood (like HESSE)\n", "\n", "When using `Minuit`, this uses (currently) its own implementation. However, zfit has its own implementation as well, which can be invoked by changing the method name.\n", "\n", "Hesse also supports the [corrections for weights](https://inspirehep.net/literature/1762842).\n", "\n", "We can explicitly specify which parameters to calculate; by default, it does so for all." ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "result.hesse(name='hesse')" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "# result.hesse(method='hesse_np', name='numpy_hesse')\n", "# result.hesse(method='minuit_hesse', name='iminuit_hesse')" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "We get the result directly returned. It is also added to `result.params` for each parameter and is nicely displayed with an added column" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(result.params)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "errors, new_result = result.errors(params=[sig_yield, part_reco_yield, mu_sig],\n", "                                   name='errors')  # just using three parameters; we could also omit the argument -> all are used" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "errorszfit, new_result_zfit = result.errors(params=[sig_yield, part_reco_yield, mu_sig], method='zfit_errors', name='errorszfit')" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(errors)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "print(result)" ] },
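{ "cell_type": "markdown", "metadata": {}, "source": [ "The uncertainty estimates are stored per parameter under the names given above; e.g. for the Hesse error (the same pattern is used in the hepstats example at the end):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(result.params[mu_sig]['hesse']['error'])" ] },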
{ "cell_type": "markdown", "metadata": {}, "source": [ "#### What is 'new_result'?\n", "\n", "When profiling a likelihood, such as done in the algorithm used in `errors`, a new minimum can be found. If this is the case, the new minimum will be returned; otherwise `new_result` is `None`. Furthermore, the current `result` would be rendered invalid by setting the flag `valid` to `False`. _Note_: this behavior only applies to the zfit internal error estimator." ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "We can also access the covariance matrix of the parameters" ] },
{ "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "result.covariance()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### Bonus: serialization\n", "\n", "Human-readable serialization is experimentally supported in zfit.\n", "zfit aims to follow the [HS3](https://github.com/hep-statistics-serialization-standard/hep-statistics-serialization-standard) standard, which is currently in development.\n", "\n", "[**Extended tutorial on serialization**](https://zfit-tutorials.readthedocs.io/en/latest/tutorials/components/90%20-%20Serialization%20basics.html)" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "model.to_dict()" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "gauss.to_yaml()" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "zfit.hs3.dumps(nll)" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "### End of zfit\n", "\n", "This is where zfit finishes and other libraries take over." ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "# Beginning of hepstats" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "`hepstats` is a library containing statistical tools and utilities for high energy physics. In particular, it lets you perform statistical inference using the models and likelihood functions constructed in `zfit`.\n", "\n", "As a short example, let's compute a confidence interval at 68% confidence level on the mean of the signal fitted above."
] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "from hepstats.hypotests import ConfidenceInterval\n", "from hepstats.hypotests.calculators import AsymptoticCalculator\n", "from hepstats.hypotests.parameters import POIarray" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "calculator = AsymptoticCalculator(input=result, minimizer=minimizer)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "value = result.params[mu_sig][\"value\"]\n", "error = result.params[mu_sig][\"hesse\"][\"error\"]\n", "\n", "mean_scan = POIarray(mu_sig, np.linspace(value - 1.5 * error, value + 1.5 * error, 10))" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "ci = ConfidenceInterval(calculator, mean_scan)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "ci.interval()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "pycharm": { "is_executing": true } }, "outputs": [], "source": [ "from utils import one_minus_cl_plot\n", "\n", "ax = one_minus_cl_plot(ci)\n", "ax.set_xlabel(\"mean\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There will be more of `hepstats` later.\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.11.5" } }, "nbformat": 4, "nbformat_minor": 4 }