Description
When trying to execute forward_pass of a multi-output model with a loss, the loss calculation fails because of a call to squeeze_or_expand_to_same_rank inside LossWrapper: the parameters passed are of type tuple and list rather than a single tensor, so the function fails when it accesses `.shape` on them:
keras/keras/src/losses/loss.py, line 109 in bce176f:
`def squeeze_or_expand_to_same_rank(x1, x2, expand_rank_1=True):`
Loss's implementation initially calls tf.convert_to_tensor before the squeeze call, while LossWrapper omits this step. Since the wrapper runs first, losses can no longer handle multiple outputs:
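A minimal, self-contained sketch of the failure mode. The helper below is a simplified stand-in for the Keras function (not the actual implementation), using NumPy arrays in place of backend tensors:

```python
import numpy as np

def squeeze_or_expand_to_same_rank(x1, x2):
    # Simplified stand-in for the Keras helper: it reads `.shape`
    # on both arguments immediately.
    x1_rank = len(x1.shape)
    x2_rank = len(x2.shape)
    if x1_rank == x2_rank + 1 and x1.shape[-1] == 1:
        return np.squeeze(x1, axis=-1), x2
    if x2_rank == x1_rank + 1 and x2.shape[-1] == 1:
        return x1, np.squeeze(x2, axis=-1)
    return x1, x2

# Single-output case: plain tensors align fine.
a, b = squeeze_or_expand_to_same_rank(np.ones((4, 1)), np.ones((4,)))

# Multi-output case: y_true/y_pred arrive as a tuple/list of
# per-output tensors, which have no `.shape` attribute.
y_true = (np.ones((4,)), np.ones((4,)))
y_pred = [np.ones((4,)), np.ones((4,))]
err = None
try:
    squeeze_or_expand_to_same_rank(y_true, y_pred)
except AttributeError as e:
    err = str(e)
print(err)  # 'tuple' object has no attribute 'shape'
```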
keras/keras/src/losses/losses.py, line 1291 in bce176f:
`>>> loss = keras.losses.mean_squared_error(y_true, y_pred)`
There are two straightforward ways to fix this:
- add a convert_to_tensor call first in LossWrapper
- or remove the squeeze_or_expand_to_same_rank call altogether and rely on each loss function implementing rank handling itself
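A sketch of the first option, using NumPy as a stand-in for the backend (the wrapper function name here is illustrative, not the actual Keras internals):

```python
import numpy as np

def loss_wrapper_call(loss_fn, y_true, y_pred):
    # Fix option 1: convert inputs to tensors before any rank
    # alignment, mirroring what Loss's own __call__ path does.
    # np.asarray stands in for the backend convert_to_tensor.
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # `.shape` is now always defined, so rank alignment cannot
    # raise AttributeError on tuple/list inputs.
    return loss_fn(y_true, y_pred)

mse = lambda t, p: np.mean((t - p) ** 2, axis=-1)

# Tuple/list inputs (as produced by a multi-output model) now
# pass through cleanly instead of crashing:
result = loss_wrapper_call(mse, (1.0, 2.0, 3.0), [1.0, 2.0, 5.0])
print(result)  # 1.3333333333333333
```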