Commit 7c57c66
feat: Add tutorial skeleton from LHC Reinterpretation Forum Workshop 2021 (#1)
* Copy tutorial structure from LHC Reinterpretation Forum Workshop 2021 tutorial
* Use pyhf v0.6.1
* Add pre-commit hooks and linting
1 parent 8e24a70 commit 7c57c66

33 files changed: +21973 −2 lines
Lines changed: 42 additions & 0 deletions

```yaml
name: Deploy Jupyter Book

on:
  push:
    branches:
      - main
  pull_request:
  workflow_dispatch:

jobs:
  deploy-book:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2

      - name: Set up Python 3.8
        uses: actions/setup-python@v2
        with:
          python-version: 3.8

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip setuptools wheel
          python -m pip install --no-cache-dir -r binder/requirements.txt
          python -m pip install --no-cache-dir -r book/requirements.txt

      - name: Build the book
        run: |
          jupyter-book build book/

      - name: Deploy Jupyter book to GitHub pages
        if: success() && github.event_name == 'push' && github.ref == 'refs/heads/main'
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          publish_dir: book/_build/html
          force_orphan: true
          user_name: 'github-actions[bot]'
          user_email: 'github-actions[bot]@users.noreply.github.com'
          commit_message: Deploy to GitHub pages
```

.github/workflows/lint.yml

Lines changed: 30 additions & 0 deletions

```yaml
name: Lint

on:
  push:
    branches:
      - main
  pull_request:
  workflow_dispatch:

jobs:
  lint:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: [ '3.8' ]

    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v1
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip setuptools wheel
          python -m pip install pre-commit
          pre-commit install
      - name: Lint with pre-commit
        run: |
          pre-commit run --all-files
```

.pre-commit-config.yaml

Lines changed: 18 additions & 0 deletions

```yaml
repos:
  - repo: https://github.com/psf/black
    rev: 20.8b1
    hooks:
      - id: black

  - repo: https://github.com/asottile/pyupgrade
    rev: v2.11.0
    hooks:
      - id: pyupgrade

  - repo: https://github.com/nbQA-dev/nbQA
    rev: 0.5.9
    hooks:
      - id: nbqa-black
        additional_dependencies: [black==20.8b1]
      - id: nbqa-pyupgrade
        additional_dependencies: [pyupgrade==2.11.0]
```

Makefile

Lines changed: 9 additions & 0 deletions

```make
all: build

default: build

build:
	jupyter-book build book/

clean: book/_build
	rm -rf book/_build
```

README.md

Lines changed: 30 additions & 2 deletions

````diff
-# pyhf-tutorial
-Updating tutorial on pyhf
+# `pyhf` Tutorial
+
+Tutorial last given at the [March 2021 PyHEP topical meeting](https://indico.cern.ch/event/985425/).
+
+**The tutorial is based off of [`pyhf` `v0.6.1`](https://pypi.org/project/pyhf/0.6.1/)**
+
+[![Deploy Jupyter Book](https://github.com/pyhf/pyhf-tutorial/workflows/Deploy%20Jupyter%20Book/badge.svg?branch=main)](https://pyhf.github.io/pyhf-tutorial/)
+[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/pyhf/pyhf-tutorial/main)
+
+## Setup
+
+In a Python virtual environment run the following
+
+```
+python -m pip install -r binder/requirements.txt
+python -m pip install -r book/requirements.txt
+```
+
+## Build
+
+To build the book after setup simply run
+
+```
+make build
+```
+
+## Past tutorials
+
+- [LHC Reinterpretation Forum Workshop 2021](https://indico.cern.ch/event/982553/contributions/4219487/)
````

binder/apt.txt

Lines changed: 1 addition & 0 deletions

```
jq
```

binder/requirements.txt

Lines changed: 5 additions & 0 deletions

```
pyhf[xmlio,minuit,contrib]==0.6.1
# visualization
ipywidgets~=7.5
pandas~=1.0
altair~=4.1
```

book/Combinations.ipynb

Lines changed: 171 additions & 0 deletions

# Performing a Combination

We'll demonstrate how a combination works by combining everything we've learned so far.

## Loading the Workspace

To do so, we'll use a simple workspace to demonstrate the functionality of combinations.

```python
import json
import pyhf
```

```python
with open("data/2-bin_1-channel.json") as serialized:
    spec = json.load(serialized)

workspace = pyhf.Workspace(spec)
```

## Combine Workspaces

Let's just try to combine naively right now.

```python
# cell tagged "raises-exception": this call is expected to fail
pyhf.Workspace.combine(workspace, workspace)
```

As we can see, we can't just combine a workspace with itself if it has some channel names in common. We try very hard in `pyhf` to make sure a combination "makes sense".

Let's go ahead and rename the channel (as well as the measurement), then try to combine.

```python
other_workspace = workspace.rename(
    channels={"singlechannel": "othersinglechannel"},
    modifiers={"uncorr_bkguncrt": "otheruncorr_bkguncrt"},
    measurements={"Measurement": "OtherMeasurement"},
)

combined_workspace = pyhf.Workspace.combine(workspace, other_workspace)
```

And did we combine?

```python
print(f"    channels: {combined_workspace.channels}")
print(f"       nbins: {combined_workspace.channel_nbins}")
print(f"     samples: {combined_workspace.samples}")
print(f"   modifiers: {combined_workspace.modifiers}")
print(f"  parameters: {combined_workspace.parameters}")
print(f"measurements: {combined_workspace.measurement_names}")
```

Indeed. And at this point, we can just use all the same functionality we expect of `pyhf`, such as performing a fit:

```python
model = workspace.model()
data = workspace.data(model)
test_poi = 1.0

pyhf.infer.hypotest(test_poi, data, model, test_stat="qtilde")
```

```python
other_model = other_workspace.model()
other_data = other_workspace.data(other_model)

pyhf.infer.hypotest(test_poi, other_data, other_model, test_stat="qtilde")
```

```python
combined_model = combined_workspace.model()
combined_data = combined_workspace.data(combined_model)

pyhf.infer.hypotest(test_poi, combined_data, combined_model, test_stat="qtilde")
```
