
Development PR of v0.1.7 #36

Merged (9 commits, Feb 14, 2024)
6 changes: 3 additions & 3 deletions .zenodo.json
@@ -1,8 +1,8 @@
 {
   "description": "smooth inference for reinterpretation studies",
   "license": "MIT",
-  "title": "SpeysideHEP/spey: v0.1.6",
-  "version": "v0.1.6",
+  "title": "SpeysideHEP/spey: v0.1.7",
+  "version": "v0.1.7",
   "upload_type": "software",
   "creators": [
     {
@@ -29,7 +29,7 @@
   },
   {
     "scheme": "url",
-    "identifier": "https://github.com/SpeysideHEP/spey/tree/v0.1.6",
+    "identifier": "https://github.com/SpeysideHEP/spey/tree/v0.1.7",
     "relation": "isSupplementTo"
   },
   {
12 changes: 6 additions & 6 deletions docs/conf.py
@@ -142,12 +142,12 @@
         "icon": "https://img.shields.io/pypi/v/spey?style=plastic",
         "type": "url",
     },
-    {
-        "name": "PyPI",
-        "url": "https://pypi.org/project/spey/",
-        "icon": "https://img.shields.io/pypi/dm/spey?style=plastic",
-        "type": "url",
-    },
+    # {
+    #     "name": "PyPI",
+    #     "url": "https://pypi.org/project/spey/",
+    #     "icon": "https://img.shields.io/pypi/dm/spey?style=plastic",
+    #     "type": "url",
+    # },
     ],
 }
 
4 changes: 2 additions & 2 deletions docs/plugins.rst
@@ -74,7 +74,7 @@ given as
 
 where :math:`\mu, \theta` are the parameter of interest (signal strength) and nuisance parameters,
 the signal and background yields are given as :math:`n_s^i` and :math:`n_b^i\pm\sigma_b^i` respectively.
-Additionally, absolute uncertainties are modelled as Gaussian distributions. This model can be
+Additionally, absolute uncertainties are modeled as Gaussian distributions. This model can be
 used as follows
 
 .. code-block:: python3
@@ -107,7 +107,7 @@ This particular example implements a two-bin histogram with uncorrelated bins. T
     >>> statistical_model.exclusion_confidence_level()
     >>> # [0.9701795436411219]
 
-For all the properties of :obj:`~spey.StatisticalModel` class we refer the reader to the corresponding
+For all the properties of the :obj:`~spey.StatisticalModel` class, we refer the reader to the corresponding
 API description.
 
 ``'default_pdf.correlated_background'``
3 changes: 3 additions & 0 deletions docs/releases/changelog-v0.1.md
@@ -63,6 +63,9 @@ Specific upgrades for the latest release can be found [here](https://github.com/
 * Typofix during computation of $\sigma_\mu$.
   ([#29](https://github.com/SpeysideHEP/spey/pull/29))
 
+* Bug fix in signal uncertainty synthesizer
+  ([#34](https://github.com/SpeysideHEP/spey/pull/34))
+
 ## Contributors
 
 This release contains contributions from (in alphabetical order):
1 change: 1 addition & 0 deletions docs/tutorials.rst
@@ -8,5 +8,6 @@ Kitchen Sink
     tutorials/functional_tuto
     tutorials/histogram
     tutorials/gradients
+    tutorials/signal_unc
     tuto_plugin
     Introduction to Spey (PyHEP 2023) <https://github.com/SpeysideHEP/PyHEP-2023>
64 changes: 64 additions & 0 deletions docs/tutorials/signal_unc.md
@@ -0,0 +1,64 @@
---
myst:
  html_meta:
    "property=og:title": "Signal Uncertainties"
    "property=og:description": "Modules to extend the likelihood with signal uncertainties."
    "property=og:image": "https://spey.readthedocs.io/en/main/_static/spey-logo.png"
jupytext:
  formats: ipynb,md:myst
  text_representation:
    extension: .md
    format_name: myst
    format_version: 0.12
  jupytext_version: 1.8.2
kernelspec:
  display_name: Python 3
  language: python
  name: python3
---

# Signal Uncertainties

Let us assume that we have a set of signal uncertainties (such as scale and PDF uncertainties). We can extend the likelihood prescription to include these uncertainties as a new set of nuisance parameters.

```{note}
:class: dropdown

Note that theoretical uncertainties admit different interpretations. We can treat them like experimental uncertainties, as we do in this tutorial, or they can be interpreted as uncertainties on the cross section. In the latter case, one should compute the limits by varying the signal yields according to the change in cross section.
```

```{code-cell} ipython3
:tags: [hide-cell]
import spey
```

We can add uncorrelated signal uncertainties just as in the uncorrelated background case:

$$
\mathcal{L}(\mu, \theta) = \prod_{i\in{\rm bins}}{\rm Poiss}(n^i|\mu n_s^i + n_b^i + \theta^i\sigma_b^i + \theta_{(s)}^i\sigma_s^i) \cdot \prod_{j\in{\rm nui}}\mathcal{N}(\theta^j|0, 1) \cdot \prod_{j\in{\rm nui}}\mathcal{N}(\theta_{(s)}^j|0, 1)\ ,
$$

```{code-cell} ipython3
pdf_wrapper = spey.get_backend("default_pdf.uncorrelated_background")
statistical_model = pdf_wrapper(
    signal_yields=[12.0, 15.0],
    background_yields=[50.0, 48.0],
    data=[36, 33],
    absolute_uncertainties=[12.0, 16.0],
    signal_uncertainty_configuration={"absolute_uncertainties": [3.0, 4.0]},
    analysis="example",
    xsection=0.123,
)
```
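To make the likelihood formula above concrete, here is a standalone single-bin numerical check in pure Python. It uses the tutorial's first-bin numbers; the helper name `nll` is ours for illustration and is not part of the spey API:

```python
import math

# Negative log-likelihood for one bin of the model above:
# Poiss(n | mu*ns + nb + theta*sigma_b + theta_s*sigma_s) * N(theta|0,1) * N(theta_s|0,1)
def nll(mu, theta, theta_s, n=36, ns=12.0, nb=50.0, sigma_b=12.0, sigma_s=3.0):
    lam = mu * ns + nb + theta * sigma_b + theta_s * sigma_s
    log_poiss = n * math.log(lam) - lam - math.lgamma(n + 1)
    log_gauss = -0.5 * (theta**2 + theta_s**2)  # standard-normal constraints, up to constants
    return -(log_poiss + log_gauss)

# With 36 observed events and 50 expected background, the background-only
# hypothesis (mu = 0) fits the data better than mu = 1:
print(nll(0.0, 0.0, 0.0) < nll(1.0, 0.0, 0.0))  # True
```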

Similarly, we can construct signal uncertainties using the ``"absolute_uncertainty_envelops"`` keyword, which accepts upper and lower uncertainties as ``[(upper, lower)]``. We can also supply a correlation matrix with the ``"correlation_matrix"`` keyword and third moments with the ``"third_moments"`` keyword. Notice that these are completely independent of the background. Now we can simply compute the limits as follows:

```{code-cell} ipython3
print(f"1 - CLs: {statistical_model.exclusion_confidence_level()[0]:.5f}")
print(f"POI upper limit: {statistical_model.poi_upper_limit():.5f}")
```

```python
1 - CLs: 0.99563
POI upper limit: 0.51504
```
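As a sketch of the envelope option described above, the configuration payload might be assembled like this. The key name follows the text; the numeric envelopes are hypothetical:

```python
# Hypothetical per-bin (upper, lower) signal uncertainty envelopes for a
# two-bin model, passed via the "absolute_uncertainty_envelops" key.
signal_uncertainty_configuration = {
    "absolute_uncertainty_envelops": [(3.5, 2.5), (4.5, 3.5)],
}

# One (upper, lower) pair per bin:
for upper, lower in signal_uncertainty_configuration["absolute_uncertainty_envelops"]:
    print(f"upper={upper}, lower={lower}")
```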
80 changes: 48 additions & 32 deletions src/spey/__init__.py
@@ -14,7 +14,7 @@
 from spey.interface.statistical_model import StatisticalModel, statistical_model_wrapper
 from spey.system import logger
 from spey.system.exceptions import PluginError
-from spey.system.webutils import get_bibtex, check_updates
+from spey.system.webutils import get_bibtex, check_updates, ConnectionError
 
 from ._version import __version__
 from .about import about
@@ -257,7 +257,7 @@ def get_backend_metadata(name: Text) -> Dict[Text, Any]:
     )
 
 
-def get_backend_bibtex(name: Text) -> List[Text]:
+def get_backend_bibtex(name: Text) -> Dict[Text, List[Text]]:
     """
     Retrieve BibTex entry for backend plug-in if available.
 
@@ -273,47 +273,63 @@ def get_backend_bibtex(name: Text) -> List[Text]:
             that prescribes likelihood function.
 
     Returns:
-        ``List[Text]``:
-        BibTex entries for the backend.
+        ``Dict[Text, List[Text]]``:
+        BibTex entries for the backend. Keywords include inspire, doi.org and zenodo.
+
+    .. versionchanged:: 0.1.7
+
+        In the previous version, the function returned ``List[Text]``; it now returns
+        a ``Dict[Text, List[Text]]`` indicating the source of each BibTeX entry.
+
     """
     # pylint: disable=import-outside-toplevel, W1203
-    txt = []
+    out = {"inspire": [], "doi.org": [], "zenodo": []}
     meta = get_backend_metadata(name)
 
-    for arxiv_id in meta.get("arXiv", []):
-        tmp = get_bibtex("inspire/arxiv", arxiv_id)
-        if tmp != "":
-            txt.append(textwrap.indent(tmp, " " * 4))
-        else:
-            log.debug(f"Can not find {arxiv_id} in Inspire")
-    for doi in meta.get("doi", []):
-        tmp = get_bibtex("inspire/doi", doi)
-        if tmp == "":
-            log.debug(f"Can not find {doi} in Inspire, looking at doi.org")
-            tmp = get_bibtex("doi", doi)
-            if tmp != "":
-                txt.append(tmp)
-            else:
-                log.debug(f"Can not find {doi} in doi.org")
-        else:
-            txt.append(textwrap.indent(tmp, " " * 4))
-    for zenodo_id in meta.get("zenodo", []):
-        tmp = get_bibtex("zenodo", zenodo_id)
-        if tmp != "":
-            txt.append(textwrap.indent(tmp, " " * 4))
-        else:
-            log.debug(f"{zenodo_id} is not a valid zenodo identifier")
+    try:
+        for arxiv_id in meta.get("arXiv", []):
+            tmp = get_bibtex("inspire/arxiv", arxiv_id)
+            if tmp != "":
+                out["inspire"].append(textwrap.indent(tmp, " " * 4))
+            else:
+                log.debug(f"Can not find {arxiv_id} in Inspire")
+        for doi in meta.get("doi", []):
+            tmp = get_bibtex("inspire/doi", doi)
+            if tmp == "":
+                log.debug(f"Can not find {doi} in Inspire, looking at doi.org")
+                tmp = get_bibtex("doi", doi)
+                if tmp != "":
+                    out["doi.org"].append(tmp)
+                else:
+                    log.debug(f"Can not find {doi} in doi.org")
+            else:
+                out["inspire"].append(textwrap.indent(tmp, " " * 4))
+        for zenodo_id in meta.get("zenodo", []):
+            tmp = get_bibtex("zenodo", zenodo_id)
+            if tmp != "":
+                out["zenodo"].append(textwrap.indent(tmp, " " * 4))
+            else:
+                log.debug(f"{zenodo_id} is not a valid zenodo identifier")
+    except ConnectionError as err:
+        log.error("Can not connect to the internet. Please check your connection.")
+        log.debug(str(err))
+        return out
 
-    return txt
+    return out
 
 
 def cite() -> List[Text]:
     """Retrieve BibTex information for Spey"""
-    arxiv = textwrap.indent(get_bibtex("inspire/arxiv", "2307.06996"), " " * 4)
-    zenodo = get_bibtex("zenodo", "10156353")
-    linker = re.search("@software{(.+?),\n", zenodo).group(1)
-    zenodo = textwrap.indent(zenodo.replace(linker, "spey_zenodo"), " " * 4)
-    return arxiv + "\n\n" + zenodo
+    try:
+        arxiv = textwrap.indent(get_bibtex("inspire/arxiv", "2307.06996"), " " * 4)
+        zenodo = get_bibtex("zenodo", "10156353")
+        linker = re.search("@software{(.+?),\n", zenodo).group(1)
+        zenodo = textwrap.indent(zenodo.replace(linker, "spey_zenodo"), " " * 4)
+        return arxiv + "\n\n" + zenodo
+    except ConnectionError as err:
+        log.error("Can not connect to the internet. Please check your connection.")
+        log.debug(str(err))
+        return ""
 
 
 if os.environ.get("SPEY_CHECKUPDATE", "ON").upper() != "OFF":
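The BibTeX key-renaming step inside ``cite()`` can be exercised on its own. The entry below is a fabricated placeholder, not a real Zenodo record:

```python
import re
import textwrap

# Mimic cite(): rename an auto-generated BibTeX key to "spey_zenodo" and indent.
zenodo = "@software{some_autogenerated_key,\n  title = {spey},\n}"
linker = re.search("@software{(.+?),\n", zenodo).group(1)  # the original key
zenodo = textwrap.indent(zenodo.replace(linker, "spey_zenodo"), " " * 4)
print(zenodo)
```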
2 changes: 1 addition & 1 deletion src/spey/_version.py
@@ -1,3 +1,3 @@
 """Version number (major.minor.patch[-label])"""
 
-__version__ = "0.1.6"
+__version__ = "0.1.7"
47 changes: 34 additions & 13 deletions src/spey/backends/default_pdf/__init__.py
@@ -115,8 +115,13 @@ def __init__(
         self._config = ModelConfig(
             poi_index=0,
             minimum_poi=minimum_poi,
-            suggested_init=[1.0] * (len(data) + 1),
-            suggested_bounds=[(minimum_poi, 10)] + [(None, None)] * len(data),
+            suggested_init=[1.0] * (len(data) + 1)
+            + (signal_uncertainty_configuration is not None)
+            * ([1.0] * len(signal_yields)),
+            suggested_bounds=[(minimum_poi, 10)]
+            + [(None, None)] * len(data)
+            + (signal_uncertainty_configuration is not None)
+            * ([(None, None)] * len(signal_yields)),
         )
 
     @property
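The boolean-times-list idiom in the hunk above can be checked in isolation. The helper below is a sketch; the name and lengths are illustrative, not part of the PR:

```python
def suggested_init(n_data: int, n_signal: int, has_signal_unc: bool) -> list:
    # A bool multiplies a list like an int: False * [...] == [], True * [...] == [...],
    # so the signal-nuisance entries appear only when the configuration is given.
    return [1.0] * (n_data + 1) + has_signal_unc * ([1.0] * n_signal)

print(suggested_init(2, 2, False))  # [1.0, 1.0, 1.0]
print(suggested_init(2, 2, True))   # [1.0, 1.0, 1.0, 1.0, 1.0]
```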
@@ -188,11 +193,11 @@ def lam(pars: np.ndarray) -> np.ndarray:
             expectation value of the poisson distribution with respect to
             nuisance parameters.
             """
-            return pars[0] * self.signal_yields + A + B * pars[1:]
+            return pars[0] * self.signal_yields + A + B * pars[slice(1, len(B) + 1)]
 
         def constraint(pars: np.ndarray) -> np.ndarray:
             """Compute constraint term"""
-            return A + B * pars[1:]
+            return A + B * pars[slice(1, len(B) + 1)]
 
         jac_constr = jacobian(constraint)
 
@@ -484,19 +489,23 @@ def __init__(
                 {
                     "distribution_type": "normal",
                     "args": [np.zeros(len(self.data)), np.ones(len(B))],
-                    "kwargs": {"domain": slice(1, None)},
+                    "kwargs": {"domain": slice(1, len(B) + 1)},
                 }
             ]
             + self.signal_uncertainty_configuration.get("constraint", [])
         )
 
         def lam(pars: np.ndarray) -> np.ndarray:
             """Compute lambda for Main model"""
-            return self.background_yields + pars[1:] * B + pars[0] * self.signal_yields
+            return (
+                self.background_yields
+                + pars[slice(1, len(B) + 1)] * B
+                + pars[0] * self.signal_yields
+            )
 
         def constraint(pars: np.ndarray) -> np.ndarray:
             """Compute the constraint term"""
-            return self.background_yields + pars[1:] * B
+            return self.background_yields + pars[slice(1, len(B) + 1)] * B
 
         jac_constr = jacobian(constraint)
 
@@ -703,12 +712,20 @@ def lam(pars: np.ndarray) -> np.ndarray:
             expectation value of the poisson distribution with respect to
             nuisance parameters.
             """
-            nI = A + B * pars[1:] + C * np.square(pars[1:])
+            nI = (
+                A
+                + B * pars[slice(1, len(B) + 1)]
+                + C * np.square(pars[slice(1, len(B) + 1)])
+            )
             return pars[0] * self.signal_yields + nI
 
         def constraint(pars: np.ndarray) -> np.ndarray:
             """Compute constraint term"""
-            return A + B * pars[1:] + C * np.square(pars[1:])
+            return (
+                A
+                + B * pars[slice(1, len(B) + 1)]
+                + C * np.square(pars[slice(1, len(B) + 1)])
+            )
 
         jac_constr = jacobian(constraint)
 
@@ -732,7 +749,7 @@ def poiss_lamb(pars: np.ndarray) -> np.ndarray:
                 {
                     "distribution_type": "multivariatenormal",
                     "args": [np.zeros(len(self.data)), corr],
-                    "kwargs": {"domain": slice(1, None)},
+                    "kwargs": {"domain": slice(1, len(B) + 1)},
                 }
             ]
             + self.signal_uncertainty_configuration.get("constraint", [])
@@ -851,19 +868,23 @@ def effective_sigma(pars: np.ndarray) -> np.ndarray:
             return np.sqrt(
                 np.clip(
                     sigma_plus * sigma_minus
-                    + (sigma_plus - sigma_minus) * (pars[1:] - A),
+                    + (sigma_plus - sigma_minus) * (pars[slice(1, len(A) + 1)] - A),
                     1e-10,
                     None,
                 )
             )
 
         def lam(pars: np.ndarray) -> np.ndarray:
             """Compute lambda for Main model"""
-            return A + effective_sigma(pars) * pars[1:] + pars[0] * self.signal_yields
+            return (
+                A
+                + effective_sigma(pars) * pars[slice(1, len(A) + 1)]
+                + pars[0] * self.signal_yields
+            )
 
         def constraint(pars: np.ndarray) -> np.ndarray:
             """Compute the constraint term"""
-            return A + effective_sigma(pars) * pars[1:]
+            return A + effective_sigma(pars) * pars[slice(1, len(A) + 1)]
 
         jac_constr = jacobian(constraint)
         self.constraints.append(
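The repeated `pars[1:]` to `pars[slice(1, len(B) + 1)]` change in the file above can be motivated with a standalone NumPy sketch (the yields here are made-up numbers, not taken from the PR): once signal nuisance parameters are appended to the parameter vector, `pars[1:]` also captures them, so the background constraint term no longer broadcasts against the per-bin uncertainties.

```python
import numpy as np

A = np.array([50.0, 48.0])   # background yields (illustrative)
B = np.array([12.0, 16.0])   # absolute background uncertainties (illustrative)
# Parameter vector: [mu, 2 background nuisances, 2 signal nuisances].
pars = np.array([1.0, 0.5, -0.5, 0.2, 0.1])

try:
    A + B * pars[1:]  # old slicing: length-4 slice cannot broadcast with length-2 B
except ValueError as err:
    print("old slicing fails:", err)

lam = A + B * pars[slice(1, len(B) + 1)]  # new slicing: background nuisances only
print(lam)  # [56. 40.]
```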