
Nest markdown section headings below the main heading #511


Merged: 1 commit, Nov 25, 2024
8 changes: 4 additions & 4 deletions notebooks/quickstart.ipynb
@@ -2086,7 +2086,7 @@
"id": "40c02f81-7e0d-44f3-8311-88867083f6f3",
"metadata": {},
"source": [
"# Storage\n",
"## Storage\n",
"\n",
"You've run you workflow and want to close your notebook for the day -- but you don't want to lose your data state. No problem! Just `.save()` your workflow. We'll go back and use our e-mail workflow as an example:"
]
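A minimal sketch of the `.save()` pattern described in the cell above, assuming the `Workflow.wrap.as_function_node` decorator used elsewhere in the quickstart; the tiny `Total` node is an illustrative stand-in for the e-mail workflow, and exact names may differ by version:

```python
from pyiron_workflow import Workflow


# Illustrative stand-in for the e-mail workflow (hypothetical node)
@Workflow.wrap.as_function_node
def Total(a=1, b=2):
    total = a + b
    return total


wf = Workflow("saved_example")
wf.sum = Total(a=10, b=32)
wf.run()

wf.save()  # persist the workflow's data state so the notebook can be closed
```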
@@ -2312,7 +2312,7 @@
"id": "53d72537-6db7-4b42-b729-fa8a16cc5815",
"metadata": {},
"source": [
"# Failure recovery\n",
"## Failure recovery\n",
"\n",
"If a graph raises an exception, the default behaviour is to save (pickle) a copy of the node into its directoy under the filename \"recovery\". These can then be manually re-loaded to investigate the failure state:"
]
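Since the recovery file is just a pickle of the failed node, inspecting it by hand needs only the standard library. A minimal sketch, assuming a workflow working directory named `my_workflow` and a failed node labelled `broken`; the directory layout is an assumption based on the description above:

```python
import pickle
from pathlib import Path

# Hypothetical layout: <workflow directory>/<node label>/recovery
recovery_file = Path("my_workflow") / "broken" / "recovery"

with recovery_file.open("rb") as f:
    failed_node = pickle.load(f)  # the node as it was at the moment of failure

print(failed_node.inputs)  # inspect the input state that led to the exception
```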
@@ -2784,7 +2784,7 @@
"id": "4c7f5d84-d9f6-4eb5-9cf0-56494c53110c",
"metadata": {},
"source": [
"# Parallelization\n",
"## Parallelization\n",
"\n",
"`pyiron_workflow` actually splits apart data channels and the flow of data from \"signal\" channels and the flow of execution. This is important for while-loop flows, and if you want to learn more go check out the `deepdive.ipynb`. Most of the time, workflows form a Directed Acyclic Graph (DAG), and this execution flow can be completely automated -- you only need to define the flow of data.\n",
"\n",
@@ -2855,7 +2855,7 @@
"id": "5a66e5fe-7fdc-4c31-93ff-b1e21e9dd86c",
"metadata": {},
"source": [
"# For-loops\n",
"## For-loops\n",
"\n",
"You can quickly iterate over node instances to get a `pandas.DataFrame` linking looped input to output (non-looped input is accessible elsewhere, like in the node's input channels). This comes in two flavours: nested loops with `iter`:"
]
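A minimal sketch of the `iter` flavour, assuming a single looped input channel; the keyword-per-channel call and the `pandas.DataFrame` return follow the description above and may differ in detail:

```python
from pyiron_workflow import Workflow


# Hypothetical node with one input channel to loop over
@Workflow.wrap.as_function_node
def Square(x=0):
    x_squared = x * x
    return x_squared


# Nested (outer-product) iteration: one DataFrame row per looped input value
df = Square().iter(x=[1, 2, 3])
print(df)  # columns link the looped `x` values to the resulting `x_squared`
```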