42 changes: 42 additions & 0 deletions .github/workflows/deploy-jupyter-book.yml
@@ -0,0 +1,42 @@
name: Deploy Jupyter Book

on:
push:
branches:
- main
pull_request:
workflow_dispatch:

jobs:

deploy-book:
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v2

- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: 3.8

- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools wheel
python -m pip install --no-cache-dir -r binder/requirements.txt
python -m pip install --no-cache-dir -r book/requirements.txt

- name: Build the book
run: |
jupyter-book build book/

- name: Deploy Jupyter book to GitHub pages
if: success() && github.event_name == 'push' && github.ref == 'refs/heads/main'
uses: peaceiris/actions-gh-pages@v3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: book/_build/html
force_orphan: true
user_name: 'github-actions[bot]'
user_email: 'github-actions[bot]@users.noreply.github.com'
commit_message: Deploy to GitHub pages
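
Because the workflow declares a `workflow_dispatch` trigger, the deployment can also be kicked off by hand. A minimal sketch using the GitHub CLI (assuming `gh` is installed and authenticated against this repository):

```
gh workflow run deploy-jupyter-book.yml --ref main
```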
30 changes: 30 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,30 @@
name: Lint

on:
push:
branches:
- main
pull_request:
workflow_dispatch:

jobs:
lint:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: [ '3.8' ]

steps:
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip setuptools wheel
python -m pip install pre-commit
pre-commit install
- name: Lint with pre-commit
run: |
pre-commit run --all-files
18 changes: 18 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,18 @@
repos:
- repo: https://github.com/psf/black
rev: 20.8b1
hooks:
- id: black

- repo: https://github.com/asottile/pyupgrade
rev: v2.11.0
hooks:
- id: pyupgrade

- repo: https://github.com/nbQA-dev/nbQA
rev: 0.5.9
hooks:
- id: nbqa-black
additional_dependencies: [black==20.8b1]
- id: nbqa-pyupgrade
additional_dependencies: [pyupgrade==2.11.0]
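
These are the same hooks the Lint workflow runs in CI; to run them locally before pushing, a sketch of the equivalent commands (assuming a working Python environment) is:

```
python -m pip install pre-commit
pre-commit install            # set up the git hook so the checks run on each commit
pre-commit run --all-files    # or run everything once over the whole repository
```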
9 changes: 9 additions & 0 deletions Makefile
@@ -0,0 +1,9 @@
all: build

default: build

build:
jupyter-book build book/

clean:
rm -rf book/_build
32 changes: 30 additions & 2 deletions README.md
@@ -1,2 +1,30 @@
# pyhf-tutorial
Updating tutorial on pyhf
# `pyhf` Tutorial


Tutorial last given at the [March 2021 PyHEP topical meeting](https://indico.cern.ch/event/985425/).

**The tutorial is based on [`pyhf` `v0.6.1`](https://pypi.org/project/pyhf/0.6.1/).**

[![Deploy Jupyter Book](https://github.com/pyhf/pyhf-tutorial/workflows/Deploy%20Jupyter%20Book/badge.svg?branch=main)](https://pyhf.github.io/pyhf-tutorial/)
[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/pyhf/pyhf-tutorial/main)

## Setup

In a Python virtual environment, run the following:

```
python -m pip install -r binder/requirements.txt
python -m pip install -r book/requirements.txt
```
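
If you do not already have a virtual environment, a minimal sketch for creating one first (the directory name `venv` here is just an example) is:

```
python -m venv venv
source venv/bin/activate
python -m pip install --upgrade pip
```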

## Build

To build the book after setup, simply run:

```
make build
```
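
`make build` is a thin wrapper around `jupyter-book build book/` (see the `Makefile`); the rendered HTML ends up in `book/_build/html`, the same directory the deploy workflow publishes to GitHub Pages. To preview the result locally you can, for example, serve that directory with Python's built-in HTTP server:

```
jupyter-book build book/
python -m http.server --directory book/_build/html
```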

## Past tutorials

- [LHC Reinterpretation Forum Workshop 2021](https://indico.cern.ch/event/982553/contributions/4219487/)
1 change: 1 addition & 0 deletions binder/apt.txt
@@ -0,0 +1 @@
jq
5 changes: 5 additions & 0 deletions binder/requirements.txt
@@ -0,0 +1,5 @@
pyhf[xmlio,minuit,contrib]==0.6.1
# visualization
ipywidgets~=7.5
pandas~=1.0
altair~=4.1
171 changes: 171 additions & 0 deletions book/Combinations.ipynb
@@ -0,0 +1,171 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Performing a Combination\n",
"\n",
"We'll demonstrate how a combination works by combining everything we've learned so far.\n",
"\n",
"## Loading the Workspace\n",
"\n",
"To do so, we'll use a simple workspace to demonstrate functionality of combinations."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"import pyhf"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"with open(\"data/2-bin_1-channel.json\") as serialized:\n",
" spec = json.load(serialized)\n",
"\n",
"workspace = pyhf.Workspace(spec)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Combine Workspaces\n",
"\n",
"Let's just try to combine naively right now."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"tags": [
"raises-exception"
]
},
"outputs": [],
"source": [
"pyhf.Workspace.combine(workspace, workspace)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As we can see, we can't just combine a workspace with itself if it has some channel names in common. We try very hard in `pyhf` to make sure a combination \"makes sense\".\n",
"\n",
"Let's go ahead and rename the channel (as well as the measurement). Then try to combine."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"other_workspace = workspace.rename(\n",
" channels={\"singlechannel\": \"othersinglechannel\"},\n",
" modifiers={\"uncorr_bkguncrt\": \"otheruncorr_bkguncrt\"},\n",
" measurements={\"Measurement\": \"OtherMeasurement\"},\n",
")\n",
"\n",
"combined_workspace = pyhf.Workspace.combine(workspace, other_workspace)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"And did we combine?"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"print(f\" channels: {combined_workspace.channels}\")\n",
"print(f\" nbins: {combined_workspace.channel_nbins}\")\n",
"print(f\" samples: {combined_workspace.samples}\")\n",
"print(f\" modifiers: {combined_workspace.modifiers}\")\n",
"print(f\" parameters: {combined_workspace.parameters}\")\n",
"print(f\"measurements: {combined_workspace.measurement_names}\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Indeed. And at this point, we can just use all the same functionality we expect of pyhf, such as performing a fit:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"model = workspace.model()\n",
"data = workspace.data(model)\n",
"test_poi = 1.0\n",
"\n",
"pyhf.infer.hypotest(test_poi, data, model, test_stat=\"qtilde\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"other_model = other_workspace.model()\n",
"other_data = other_workspace.data(other_model)\n",
"\n",
"pyhf.infer.hypotest(test_poi, other_data, other_model, test_stat=\"qtilde\")"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"combined_model = combined_workspace.model()\n",
"combined_data = combined_workspace.data(combined_model)\n",
"\n",
"pyhf.infer.hypotest(test_poi, combined_data, combined_model, test_stat=\"qtilde\")"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.7"
}
},
"nbformat": 4,
"nbformat_minor": 4
}