`lr` and `decay` are deprecated in the Keras optimizers; using `learning_rate` clears the warning. The corrected code for the cell, I presume, is:
```python
model = vgg6(input_shape=image_shape, n_classes=n_classes)

# Set up the optimizer
if optimizer == 'adam':
    optimizer = tf.keras.optimizers.Adam(learning_rate=3e-4, beta_1=0.9,
                                         beta_2=0.999, epsilon=1e-7,
                                         amsgrad=False)
elif optimizer == 'sgd':
    # `decay` is deprecated as well; InverseTimeDecay reproduces the old
    # lr / (1 + decay * step) behaviour of decay=1e-6
    lr_schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
        initial_learning_rate=0.01, decay_steps=1, decay_rate=1e-6)
    optimizer = tf.keras.optimizers.SGD(learning_rate=lr_schedule,
                                        momentum=0.9, nesterov=True)
else:
    print('Could not recognize optimizer, using Adam')
    optimizer = tf.keras.optimizers.Adam(learning_rate=3e-4, beta_1=0.9,
                                         beta_2=0.999, epsilon=1e-7,
                                         amsgrad=False)

model.compile(optimizer=optimizer, loss=loss, metrics=['accuracy'])
model.summary()  # summary() prints itself and returns None
```
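For reference, the legacy `decay` argument implemented a simple inverse-time schedule, `lr_t = lr / (1 + decay * t)`, where `t` is the number of update steps; that is the formula `InverseTimeDecay` reproduces above. A minimal pure-Python sketch (no TensorFlow needed) shows how slowly `decay=1e-6` actually shrinks the rate:

```python
def inverse_time_lr(initial_lr, decay_rate, step):
    """Learning rate after `step` updates under Keras' legacy `decay`."""
    return initial_lr / (1.0 + decay_rate * step)

# With decay=1e-6 the rate barely moves even after thousands of steps
print(inverse_time_lr(0.01, 1e-6, 0))     # 0.01
print(inverse_time_lr(0.01, 1e-6, 1000))  # ~0.00999
```

If you want a faster or differently shaped decay, `tf.keras.optimizers.schedules` also offers `ExponentialDecay` and `PiecewiseConstantDecay`, which plug into `learning_rate` the same way.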