Commit 68afb24

Author: Github Actions (committed)
Message: eddiebergman: Merge HOTFIX master 0.14.3 into dev
Parent: 5208dc2

File tree

96 files changed

+2357
-2913
lines changed


development/.buildinfo

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 # Sphinx build info version 1
 # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: 5cb66ffaf1a93d48276b86c4c7778f5e
+config: 1281a8de7e72cc26a3b0fce1416780a7
 tags: 645f666f9bcd5a90fca523b33c5a78b7
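The `config:` values in this file are 32-character hex digests, consistent with MD5 fingerprints of the build configuration: when any setting changes, the hash changes and a full rebuild is triggered. A minimal sketch of how such a fingerprint could be computed (the keys and serialization scheme here are illustrative assumptions, not Sphinx's actual internals):

```python
import hashlib


def config_hash(config: dict) -> str:
    # Serialize the settings deterministically (sorted keys), then
    # fingerprint them; any changed value yields a different digest,
    # which signals that a full rebuild is needed.
    serialized = ",".join(f"{k}={config[k]}" for k in sorted(config))
    return hashlib.md5(serialized.encode("utf-8")).hexdigest()


old = config_hash({"project": "auto-sklearn", "version": "0.14.2"})
new = config_hash({"project": "auto-sklearn", "version": "0.14.3"})
print(old != new)  # bumping the version changes the fingerprint
```

This explains why the version bump in this commit rewrites `.buildinfo`: the hash is an opaque summary, so even a one-character configuration change replaces it entirely.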

development/_downloads/6507de4916ad481ba20cb60e6dc3309e/example_parallel_manual_spawning_python.py

Lines changed: 0 additions & 153 deletions
This file was deleted.

development/_downloads/986156f70b0b7eee4d669b791b35bd6d/example_parallel_manual_spawning_python.ipynb

Lines changed: 0 additions & 90 deletions
This file was deleted.

development/_downloads/baf53fc945368668a0cd202acebc6220/example_parallel_manual_spawning_cli.py

Lines changed: 8 additions & 2 deletions

@@ -11,11 +11,17 @@
 This example shows how to start the dask scheduler and spawn
 workers for *Auto-sklearn* manually from the command line. Use this example
 as a starting point to parallelize *Auto-sklearn* across multiple
-machines. If you want to start everything manually from within Python
-please see :ref:`sphx_glr_examples_60_search_example_parallel_manual_spawning_python.py`.
+machines.
+
 To run *Auto-sklearn* in parallel on a single machine check out the example
 :ref:`sphx_glr_examples_60_search_example_parallel_n_jobs.py`.
 
+If you want to start everything manually from within Python
+please see ``:ref:sphx_glr_examples_60_search_example_parallel_manual_spawning_python.py``.
+
+**NOTE:** Above example is disabled due to issue https://github.com/dask/distributed/issues/5627
+
+
 You can learn more about the dask command line interface from
 https://docs.dask.org/en/latest/setup/cli.html.

development/_downloads/c6746d1b897496495baebd219e94d74e/example_parallel_manual_spawning_cli.ipynb

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "\n# Parallel Usage: Spawning workers from the command line\n\n*Auto-sklearn* uses\n`dask.distributed <https://distributed.dask.org/en/latest/index.html>`_\nfor parallel optimization.\n\nThis example shows how to start the dask scheduler and spawn\nworkers for *Auto-sklearn* manually from the command line. Use this example\nas a starting point to parallelize *Auto-sklearn* across multiple\nmachines. If you want to start everything manually from within Python\nplease see `sphx_glr_examples_60_search_example_parallel_manual_spawning_python.py`.\nTo run *Auto-sklearn* in parallel on a single machine check out the example\n`sphx_glr_examples_60_search_example_parallel_n_jobs.py`.\n\nYou can learn more about the dask command line interface from\nhttps://docs.dask.org/en/latest/setup/cli.html.\n\nWhen manually passing a dask client to Auto-sklearn, all logic\nmust be guarded by ``if __name__ == \"__main__\":`` statements! We use\nmultiple such statements to properly render this example as a notebook\nand also allow execution via the command line.\n\n## Background\n\nTo run Auto-sklearn distributed on multiple machines we need to set\nup three components:\n\n1. **Auto-sklearn and a dask client**. This will manage all workload, find new\n configurations to evaluate and submit jobs via a dask client. As this\n runs Bayesian optimization it should be executed on its own CPU.\n2. **The dask workers**. They will do the actual work of running machine\n learning algorithms and require their own CPU each.\n3. **The scheduler**. It manages the communication between the dask client\n and the different dask workers. As the client and all workers connect\n to the scheduler it must be started first. This is a light-weight job\n and does not require its own CPU.\n\nWe will now start these three components in reverse order: scheduler,\nworkers and client. Also, in a real setup, the scheduler and the workers should\nbe started from the command line and not from within a Python file via\nthe ``subprocess`` module as done here (for the sake of having a self-contained\nexample).\n"
+    "\n# Parallel Usage: Spawning workers from the command line\n\n*Auto-sklearn* uses\n`dask.distributed <https://distributed.dask.org/en/latest/index.html>`_\nfor parallel optimization.\n\nThis example shows how to start the dask scheduler and spawn\nworkers for *Auto-sklearn* manually from the command line. Use this example\nas a starting point to parallelize *Auto-sklearn* across multiple\nmachines.\n\nTo run *Auto-sklearn* in parallel on a single machine check out the example\n`sphx_glr_examples_60_search_example_parallel_n_jobs.py`.\n\nIf you want to start everything manually from within Python\nplease see ``:ref:sphx_glr_examples_60_search_example_parallel_manual_spawning_python.py``.\n\n**NOTE:** Above example is disabled due to issue https://github.com/dask/distributed/issues/5627\n\n\nYou can learn more about the dask command line interface from\nhttps://docs.dask.org/en/latest/setup/cli.html.\n\nWhen manually passing a dask client to Auto-sklearn, all logic\nmust be guarded by ``if __name__ == \"__main__\":`` statements! We use\nmultiple such statements to properly render this example as a notebook\nand also allow execution via the command line.\n\n## Background\n\nTo run Auto-sklearn distributed on multiple machines we need to set\nup three components:\n\n1. **Auto-sklearn and a dask client**. This will manage all workload, find new\n configurations to evaluate and submit jobs via a dask client. As this\n runs Bayesian optimization it should be executed on its own CPU.\n2. **The dask workers**. They will do the actual work of running machine\n learning algorithms and require their own CPU each.\n3. **The scheduler**. It manages the communication between the dask client\n and the different dask workers. As the client and all workers connect\n to the scheduler it must be started first. This is a light-weight job\n and does not require its own CPU.\n\nWe will now start these three components in reverse order: scheduler,\nworkers and client. Also, in a real setup, the scheduler and the workers should\nbe started from the command line and not from within a Python file via\nthe ``subprocess`` module as done here (for the sake of having a self-contained\nexample).\n"
    ]
   },
   {

development/_modules/autosklearn/estimators.html

Lines changed: 2 additions & 2 deletions

@@ -4,7 +4,7 @@
 <head>
 <meta charset="utf-8" />
 <meta name="viewport" content="width=device-width, initial-scale=1.0" />
-<title>autosklearn.estimators &#8212; AutoSklearn 0.14.2 documentation</title>
+<title>autosklearn.estimators &#8212; AutoSklearn 0.14.3 documentation</title>
 <link rel="stylesheet" type="text/css" href="../../_static/pygments.css" />
 <link rel="stylesheet" type="text/css" href="../../_static/bootstrap-sphinx.css" />
 <link rel="stylesheet" type="text/css" href="../../_static/sg_gallery.css" />

@@ -54,7 +54,7 @@
 </button>
 <a class="navbar-brand" href="../../index.html">
 auto-sklearn</a>
-<span class="navbar-text navbar-version pull-left"><b>0.14.2</b></span>
+<span class="navbar-text navbar-version pull-left"><b>0.14.3</b></span>
 </div>

 <div class="collapse navbar-collapse nav-collapse">

0 commit comments