---
title: "Training Callbacks"
output: rmarkdown::html_vignette
vignette: >
%\VignetteIndexEntry{Training Callbacks}
%\VignetteEngine{knitr::rmarkdown}
%\VignetteEncoding{UTF-8}
---
```{r setup, include = FALSE}
library(keras)
knitr::opts_chunk$set(comment = NA, eval = FALSE)
```
## Overview
A callback is a set of functions to be applied at given stages of the training procedure. You can use callbacks to get a view on internal states and statistics of the model during training. You can pass a list of callbacks (as the keyword argument `callbacks`) to the `fit()` function. The relevant methods of the callbacks will then be called at each stage of the training.
For example:
```{r}
library(keras)

# generate dummy training data
data <- matrix(rexp(1000 * 784), nrow = 1000, ncol = 784)
labels <- matrix(round(runif(1000 * 10, min = 0, max = 9)), nrow = 1000, ncol = 10)

# create model
model <- keras_model_sequential()

# add layers and compile
model %>%
  layer_dense(32, input_shape = c(784)) %>%
  layer_activation('relu') %>%
  layer_dense(10) %>%
  layer_activation('softmax') %>%
  compile(
    loss = 'categorical_crossentropy',
    optimizer = optimizer_sgd(),
    metrics = 'accuracy'
  )

# fit with callbacks (a validation split is needed so "val_loss" is available)
model %>% fit(data, labels, validation_split = 0.2, callbacks = list(
  callback_model_checkpoint("checkpoints.h5"),
  callback_reduce_lr_on_plateau(monitor = "val_loss", factor = 0.1)
))
```
## Built-in Callbacks
The following built-in callbacks are available as part of Keras:
<table class="ref-index">
<tbody>
<tr>
<td>
`callback_progbar_logger()`
</td>
<td><p>Callback that prints metrics to stdout.</p></td>
</tr><tr>
<td>
`callback_model_checkpoint()`
</td>
<td><p>Save the model after every epoch.</p></td>
</tr><tr>
<td>
`callback_early_stopping()`
</td>
<td><p>Stop training when a monitored quantity has stopped improving.</p></td>
</tr><tr>
<td>
`callback_remote_monitor()`
</td>
<td><p>Callback used to stream events to a server.</p></td>
</tr><tr>
<td>
`callback_learning_rate_scheduler()`
</td>
<td><p>Learning rate scheduler.</p></td>
</tr><tr>
<td>
`callback_tensorboard()`
</td>
<td><p>TensorBoard basic visualizations.</p></td>
</tr><tr>
<td>
`callback_reduce_lr_on_plateau()`
</td>
<td><p>Reduce learning rate when a metric has stopped improving.</p></td>
</tr><tr>
<td>
`callback_csv_logger()`
</td>
<td><p>Callback that streams epoch results to a CSV file.</p></td>
</tr><tr>
<td>
`callback_lambda()`
</td>
<td><p>Create a custom callback.</p></td>
</tr>
</tbody>
</table>
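Several of these callbacks are commonly combined in a single `fit()` call. Here is a minimal sketch, assuming a compiled `model` and the `data`/`labels` from the example above; the file name, patience, and learning-rate schedule are all illustrative choices, not recommendations:

```{r}
# early stopping, CSV logging, and a simple learning rate schedule together
callbacks <- list(
  callback_early_stopping(monitor = "val_loss", patience = 5),
  callback_csv_logger("training_log.csv"),
  callback_learning_rate_scheduler(function(epoch, lr) {
    # halve the learning rate every 10 epochs (illustrative schedule)
    if (epoch > 0 && epoch %% 10 == 0) lr / 2 else lr
  })
)

model %>% fit(
  data, labels,
  validation_split = 0.2,  # required so that "val_loss" is available
  epochs = 50,
  callbacks = callbacks
)
```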
## Custom Callbacks
You can create a custom callback by creating a new [R6 class](https://cran.r-project.org/web/packages/R6/vignettes/Introduction.html) that inherits from the `KerasCallback` class.
Here's a simple example saving a list of losses over each batch during training:
```{r}
library(keras)

# define custom callback class
LossHistory <- R6::R6Class("LossHistory",
  inherit = KerasCallback,
  public = list(
    losses = NULL,
    on_batch_end = function(batch, logs = list()) {
      self$losses <- c(self$losses, logs[["loss"]])
    }
  )
)

# define model
model <- keras_model_sequential()

# add layers and compile
model %>%
  layer_dense(units = 10, input_shape = c(784)) %>%
  layer_activation(activation = 'softmax') %>%
  compile(
    loss = 'categorical_crossentropy',
    optimizer = 'rmsprop'
  )

# create history callback object and use it during training
history <- LossHistory$new()
model %>% fit(
  X_train, Y_train,
  batch_size = 128, epochs = 20, verbose = 0,
  callbacks = list(history)
)

# print the accumulated losses
history$losses
```
```
[1] 0.6604760 0.3547246 0.2595316 0.2590170 ...
```
### Fields
Custom callback objects have access to the current model and its training parameters via the following fields:
`self$params`
: Named list with training parameters (e.g. verbosity, batch size, number of epochs, ...).
`self$model`
: Reference to the Keras model being trained.
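For example, `self$model` can be used to halt training from inside a callback. A minimal sketch, where the class name and threshold value are hypothetical:

```{r}
library(keras)

# stop training once the loss falls below a fixed threshold
StopAtLoss <- R6::R6Class("StopAtLoss",
  inherit = KerasCallback,
  public = list(
    threshold = NULL,
    initialize = function(threshold = 0.1) {
      self$threshold <- threshold
    },
    on_epoch_end = function(epoch, logs = list()) {
      if (!is.null(logs[["loss"]]) && logs[["loss"]] < self$threshold) {
        # self$model is a reference to the model being trained
        self$model$stop_training <- TRUE
      }
    }
  )
)
```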
### Methods
Custom callback objects can implement one or more of the following methods:
`on_epoch_begin(epoch, logs)`
: Called at the beginning of each epoch.
`on_epoch_end(epoch, logs)`
: Called at the end of each epoch.
`on_batch_begin(batch, logs)`
: Called at the beginning of each batch.
`on_batch_end(batch, logs)`
: Called at the end of each batch.
`on_train_begin(logs)`
: Called at the beginning of training.
`on_train_end(logs)`
: Called at the end of training.
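For simple cases that don't warrant a full R6 class, `callback_lambda()` from the table above accepts these same methods as plain functions. A minimal sketch (the callback name is illustrative):

```{r}
library(keras)

# print the loss at the end of every epoch
print_loss <- callback_lambda(
  on_epoch_end = function(epoch, logs) {
    cat("epoch", epoch, "- loss:", logs[["loss"]], "\n")
  }
)

# pass it to fit() like any other callback, e.g.:
# model %>% fit(data, labels, callbacks = list(print_loss))
```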