
Loss calculation fails with multiple outputs due to the LossWrapper #20373

@markomitos

Description


When executing the forward_pass of a multi-output model with a loss, the loss calculation fails because of a call to squeeze_or_expand_to_same_rank inside the LossWrapper: the parameters passed are of type tuple and list instead of a single Tensor, and the function fails when it accesses shape on them:

def squeeze_or_expand_to_same_rank(x1, x2, expand_rank_1=True):
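A minimal sketch that should reproduce the failure (this is a simplified stand-in for the multi-output forward_pass, not my exact setup, and it assumes a Keras build where the wrapper still behaves as described):

```python
import numpy as np
import keras

# Multi-output style targets/predictions arrive as a tuple/list of
# per-output values rather than a single tensor.
y_true = ([[0.0, 1.0]], [[1.0, 0.0]])
y_pred = ([[0.1, 0.9]], [[0.8, 0.2]])

# The standalone loss function converts its inputs to tensors first,
# so it works once the caller stacks the structure into an array.
print(keras.losses.mean_squared_error(np.array(y_true), np.array(y_pred)))

# The wrapped loss reaches squeeze_or_expand_to_same_rank before any
# conversion, so shape is read from a tuple and it raises, e.g.
# AttributeError: 'tuple' object has no attribute 'shape'.
loss_obj = keras.losses.MeanSquaredError()
loss_obj(y_true, y_pred)
```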

The loss function's own implementation calls tf.convert_to_tensor before the squeeze call, while the LossWrapper omits this step. Since the wrapper is executed first, losses can no longer handle multiple outputs:

>>> loss = keras.losses.mean_squared_error(y_true, y_pred)
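For context, a paraphrased sketch of the two code paths (based on my reading of the Keras source; the exact bodies may differ between versions):

```python
# Paraphrased from keras/src/losses/ (not a runnable drop-in; `ops`,
# `squeeze_or_expand_to_same_rank`, `self.fn` and `self._fn_kwargs` refer
# to the existing Keras internals).

# Standalone loss function: converts its inputs to tensors before the
# rank adjustment, so lists/tuples/arrays are handled.
def mean_squared_error(y_true, y_pred):
    y_pred = ops.convert_to_tensor(y_pred)
    y_true = ops.convert_to_tensor(y_true, dtype=y_pred.dtype)
    y_true, y_pred = squeeze_or_expand_to_same_rank(y_true, y_pred)
    return ops.mean(ops.square(y_true - y_pred), axis=-1)

# Wrapper: goes straight to the rank adjustment, so a tuple/list reaches
# squeeze_or_expand_to_same_rank and the shape access fails.
def call(self, y_true, y_pred):
    y_true, y_pred = squeeze_or_expand_to_same_rank(y_true, y_pred)
    return self.fn(y_true, y_pred, **self._fn_kwargs)
```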

There are two ways to fix this problem, neither of which should be too disruptive:

  • add convert_to_tensor first in LossWrapper (a sketch of this option follows the list)
  • or remove the squeeze_or_expand_to_same_rank call altogether and rely on each loss function's own implementation
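A rough sketch of the first option, assuming the wrapper's call() has the shape shown above (ops.convert_to_tensor is used here as the backend-agnostic counterpart of tf.convert_to_tensor):

```python
# Hypothetical patch to the wrapper's call(), mirroring what the standalone
# loss functions already do before touching shape.
def call(self, y_true, y_pred):
    y_pred = ops.convert_to_tensor(y_pred)
    y_true = ops.convert_to_tensor(y_true, dtype=y_pred.dtype)
    y_true, y_pred = squeeze_or_expand_to_same_rank(y_true, y_pred)
    return self.fn(y_true, y_pred, **self._fn_kwargs)
```

The second option would instead drop the squeeze_or_expand_to_same_rank call from the wrapper entirely and leave rank handling to each loss function, which already performs the conversion itself.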
