2 files changed: 17 additions, 9 deletions

First file (README badges):

@@ -18,9 +18,11 @@
 </p>
 
 
+[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/pytorch-lightning)](https://pypi.org/project/pytorch-lightning/)
 [![PyPI Status](https://badge.fury.io/py/pytorch-lightning.svg)](https://badge.fury.io/py/pytorch-lightning)
 [![PyPI Status](https://pepy.tech/badge/pytorch-lightning)](https://pepy.tech/project/pytorch-lightning)
 [![Conda](https://img.shields.io/conda/v/conda-forge/pytorch-lightning?label=conda&color=success)](https://anaconda.org/conda-forge/pytorch-lightning)
+[![Conda](https://img.shields.io/conda/dn/conda-forge/pytorch-lightning?color=blue&label=conda%20downloads)](https://anaconda.org/conda-forge/pytorch-lightning)
 [![DockerHub](https://img.shields.io/docker/pulls/pytorchlightning/pytorch_lightning.svg)](https://hub.docker.com/r/pytorchlightning/pytorch_lightning)
 [![codecov](https://codecov.io/gh/PyTorchLightning/pytorch-lightning/branch/master/graph/badge.svg)](https://codecov.io/gh/PyTorchLightning/pytorch-lightning)
 
Second file (LightningModule docs, reStructuredText):

@@ -897,6 +897,12 @@ Pointer to the trainer
 
 ------------
 
+use_amp
+~~~~~~~
+True if using Automatic Mixed Precision (AMP)
+
+------------
+
 use_ddp
 ~~~~~~~
 True if using ddp
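For context, the hunk above documents boolean flags that the Trainer sets on the module. A minimal sketch of how user code might branch on them; `MockModule` and its `describe_setup` method are illustrative stand-ins, not part of the Lightning API:

```python
# Sketch only: a stand-in for the LightningModule flags documented above.
# The real flags are populated by the Trainer; this mock just shows the
# kind of branching user code might do on them.

class MockModule:
    def __init__(self, use_amp=False, use_ddp=False):
        self.use_amp = use_amp  # True if using Automatic Mixed Precision (AMP)
        self.use_ddp = use_ddp  # True if using ddp

    def describe_setup(self):
        parts = []
        if self.use_amp:
            parts.append("16-bit (AMP)")
        if self.use_ddp:
            parts.append("DistributedDataParallel")
        return " + ".join(parts) or "single-process fp32"

print(MockModule().describe_setup())          # single-process fp32
print(MockModule(use_amp=True).describe_setup())  # 16-bit (AMP)
```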
@@ -1021,7 +1027,7 @@ configure_ddp
 configure_sync_batchnorm
 ~~~~~~~~~~~~~~~~~~~~~~~~
 
-.. autofunction:: pytorch_lightning.core.lightning.LightningModule.configure_ddp
+.. autofunction:: pytorch_lightning.core.lightning.LightningModule.configure_sync_batchnorm
    :noindex:
 
 get_progress_bar_dict
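The hunk above fixes the autodoc target for `configure_sync_batchnorm`. As a sketch of the general shape of that hook (assuming it receives the model and returns a possibly wrapped model; in real code the body would typically call `torch.nn.SyncBatchNorm.convert_sync_batchnorm`, replaced here by a tagging stand-in so the example runs without PyTorch):

```python
# Sketch of the override pattern for the hook documented above.
# `convert_sync_batchnorm` below is a stand-in for the real PyTorch helper.

class Model:
    def __init__(self):
        self.synced = False

def convert_sync_batchnorm(model):
    # Stand-in: the real helper replaces BatchNorm layers with SyncBatchNorm.
    model.synced = True
    return model

class MyLightningModule:
    def configure_sync_batchnorm(self, model):
        # Hook shape assumed here: take the model, return the (wrapped) model.
        return convert_sync_batchnorm(model)

model = MyLightningModule().configure_sync_batchnorm(Model())
print(model.synced)  # True
```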
@@ -1153,28 +1159,28 @@ on_pretrain_routine_end
 .. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_pretrain_routine_end
    :noindex:
 
-on_test_epoch_start
+on_test_batch_start
 ~~~~~~~~~~~~~~~~~~~
 
-.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_epoch_start
+.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_batch_start
    :noindex:
 
-on_test_epoch_end
+on_test_batch_end
 ~~~~~~~~~~~~~~~~~
 
-.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_epoch_end
+.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_batch_end
    :noindex:
 
-on_test_batch_start
+on_test_epoch_start
 ~~~~~~~~~~~~~~~~~~~
 
-.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_batch_start
+.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_epoch_start
    :noindex:
 
-on_test_batch_end
+on_test_epoch_end
 ~~~~~~~~~~~~~~~~~
 
-.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_batch_end
+.. autofunction:: pytorch_lightning.core.hooks.ModelHooks.on_test_epoch_end
    :noindex:
 
 on_train_batch_start
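The hunk above only reorders the documentation entries for these hooks; their runtime contract is unchanged: the epoch hooks fire once per test epoch and the batch hooks fire around each batch. A mock loop illustrating that ordering (this is a sketch, not Lightning's actual trainer code):

```python
# Mock test loop recording when the hooks documented above fire.

class RecordingHooks:
    def __init__(self):
        self.calls = []

    def on_test_epoch_start(self):
        self.calls.append("on_test_epoch_start")

    def on_test_batch_start(self, batch_idx):
        self.calls.append(f"on_test_batch_start[{batch_idx}]")

    def on_test_batch_end(self, batch_idx):
        self.calls.append(f"on_test_batch_end[{batch_idx}]")

    def on_test_epoch_end(self):
        self.calls.append("on_test_epoch_end")

def run_test_epoch(hooks, batches):
    hooks.on_test_epoch_start()
    for i, _batch in enumerate(batches):
        hooks.on_test_batch_start(i)
        # ... test_step would run here ...
        hooks.on_test_batch_end(i)
    hooks.on_test_epoch_end()

hooks = RecordingHooks()
run_test_epoch(hooks, ["batch0", "batch1"])
print(hooks.calls)
```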