Description
System information.
TensorFlow version (you are using): TF 2.11
Are you willing to contribute it (Yes/No) : Yes
Describe the feature and the current behavior/state.
Some people write custom loss functions from scratch that are not differentiable. This then leads to `None` gradients during training. Warning users directly that their loss function is not differentiable would be a useful feature, instead of only reporting that no gradients were provided for any variable.
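As an illustration of the failure mode (a minimal sketch, not taken from the linked posts), a loss that passes the predictions through a non-differentiable op such as tf.round typically produces the error:

```python
import numpy as np
import tensorflow as tf

# Non-differentiable custom loss: the gradient of tf.round is None,
# so no gradient can flow back to the model's trainable variables.
def rounded_mse(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - tf.round(y_pred)))

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="sgd", loss=rounded_mse)

x = np.random.rand(8, 4).astype("float32")
y = np.random.rand(8, 1).astype("float32")

# Expected to fail with:
#   ValueError: No gradients provided for any variable ...
model.fit(x, y, epochs=1)
```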
There are many Stack Overflow posts about this; I'll include some of them:
https://stackoverflow.com/questions/63874265/keras-custom-loss-function-error-no-gradients-provided
https://stackoverflow.com/questions/73197501/raise-valueerror-no-gradients-provided-for-any-variables-custom-loss-function
https://stackoverflow.com/questions/59292992/tensorflow-2-custom-loss-no-gradients-provided-for-any-variable-error
https://stackoverflow.com/questions/65619581/no-gradients-provided-for-any-variable-for-custom-loss-function
https://stackoverflow.com/questions/70537503/custom-loss-function-error-valueerror-no-gradients-provided-for-any-variable
https://datascience.stackexchange.com/questions/116645/custom-loss-function-for-binary-classificatio-in-keras-gets-error-no-gradients
https://stackoverflow.com/questions/74074934/error-no-gradients-provided-for-any-variable-while-using-custom-loss
https://stackoverflow.com/questions/75738678/gradienttape-returning-none-with-custom-csi-loss-function
https://stackoverflow.com/questions/72259489/valueerror-no-gradients-provided-for-any-variable-custom-loss-function
...
Will this change the current API? How?
This will change the current API by adding some checks on the loss function before training starts; an error or warning can be raised if the loss is not differentiable.
- Do you want to contribute a PR? (yes/no): Yes
- Briefly describe your candidate solution (if contributing): Probably add some checks using GradientTape in: https://github.com/keras-team/keras/blob/17af3fcb1d21f950fff097e0534a6ae56bd25a46/keras/engine/compile_utils.py#L104
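A rough sketch of the kind of check this could be (the helper name and its placement are hypothetical, not existing Keras API): probe the loss on one batch under a GradientTape and warn if it yields no gradients for any trainable variable.

```python
import tensorflow as tf

def _warn_if_loss_not_differentiable(model, loss_fn, x_sample, y_sample):
    """Hypothetical helper: run the loss once under a GradientTape and
    warn the user if every gradient comes back as None."""
    with tf.GradientTape() as tape:
        y_pred = model(x_sample, training=True)
        loss = loss_fn(y_sample, y_pred)
    grads = tape.gradient(loss, model.trainable_variables)
    if grads and all(g is None for g in grads):
        tf.print(
            "WARNING: the provided loss function produced no gradients for "
            "any trainable variable; it is likely not differentiable "
            "(e.g. it rounds, casts, or takes argmax of the predictions)."
        )
```

This would only need to run once, on the first batch (or on dummy data shaped from the model's inputs), before training proceeds, so the overhead should be negligible.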