
Trying to save a model compiled with tfa.MultiOptimizer gives error saying TypeError: ('Not JSON Serializable:', ... #2771

@maifeeulasad

Description

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Kaggle Kernel
  • TensorFlow version and how it was installed (source or binary): binary (pip)
  • TensorFlow-Addons version and how it was installed (source or binary): 0.14.0, binary (pip)
  • Python version: 3.7.12
  • Is GPU used? (yes/no): tried both

Describe the bug

I'm trying to save a model that is built and compiled with the MultiOptimizer provided by tensorflow-addons, but it keeps failing with:

TypeError: ('Not JSON Serializable:', ...

I have tried different models, environments, and versions; the error persists.

Code to reproduce the issue
Kernel: https://www.kaggle.com/code/maifeeulasad/tfa-multioptimizer-model-save?scriptVersionId=108636090
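In case the kernel is not accessible, roughly the following standalone snippet hits the same error for me. The layer sizes, names, and the random data below are illustrative placeholders, not copied from the kernel:

```python
import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa

# Small functional model; the exact architecture does not seem to matter.
inputs = tf.keras.Input(shape=(3,))
dense_1 = tf.keras.layers.Dense(12, activation="relu", name="dense_1")
dense_2 = tf.keras.layers.Dense(1, name="dense_2")
outputs = dense_2(dense_1(inputs))
model = tf.keras.Model(inputs, outputs, name="model_w_multioptimizer")

# One optimizer per layer (or per group of layers), wrapped by MultiOptimizer.
optimizers_and_layers = [
    (tf.keras.optimizers.Adam(learning_rate=1e-3), dense_1),
    (tf.keras.optimizers.Adam(learning_rate=1e-4), dense_2),
]
model.compile(optimizer=tfa.optimizers.MultiOptimizer(optimizers_and_layers),
              loss="mse")

xs = np.random.rand(64, 3).astype("float32")
ys = np.random.rand(64, 1).astype("float32")
model.fit(xs, ys, epochs=1, batch_size=32)

# Saving to HDF5 with the optimizer included raises
# TypeError: ('Not JSON Serializable:', <tf.Tensor ...>)
model.save("model_w_multioptimizer.h5", overwrite=True, include_optimizer=True)
```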

Other info / logs

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
/tmp/ipykernel_20/1141714396.py in <module>
     24         batch_size=32,
     25       callbacks=callbacks,
---> 26         validation_data=(valid_xs, valid_ys))

/opt/conda/lib/python3.7/site-packages/keras/engine/training.py in fit(self, x, y, batch_size, epochs, verbose, callbacks, validation_split, validation_data, shuffle, class_weight, sample_weight, initial_epoch, steps_per_epoch, validation_steps, validation_batch_size, validation_freq, max_queue_size, workers, use_multiprocessing)
   1228           epoch_logs.update(val_logs)
   1229 
-> 1230         callbacks.on_epoch_end(epoch, epoch_logs)
   1231         training_logs = epoch_logs
   1232         if self.stop_training:

/opt/conda/lib/python3.7/site-packages/keras/callbacks.py in on_epoch_end(self, epoch, logs)
    411     logs = self._process_logs(logs)
    412     for callback in self.callbacks:
--> 413       callback.on_epoch_end(epoch, logs)
    414 
    415   def on_train_batch_begin(self, batch, logs=None):

/tmp/ipykernel_20/3193528328.py in on_epoch_end(self, epoch, logs)
      3         print('name: ' + self.model._name)
      4         self.model.save('epoch-' + str(epoch + 1) + '-' + self.model._name + '.h5',overwrite=True,
----> 5     include_optimizer=True,)
      6 
      7 callbacks = [ModelSaverCallback()]

/opt/conda/lib/python3.7/site-packages/keras/engine/training.py in save(self, filepath, overwrite, include_optimizer, save_format, signatures, options, save_traces)
   2144     # pylint: enable=line-too-long
   2145     save.save_model(self, filepath, overwrite, include_optimizer, save_format,
-> 2146                     signatures, options, save_traces)
   2147 
   2148   def save_weights(self,

/opt/conda/lib/python3.7/site-packages/keras/saving/save.py in save_model(model, filepath, overwrite, include_optimizer, save_format, signatures, options, save_traces)
    144           'or using `save_weights`.')
    145     hdf5_format.save_model_to_hdf5(
--> 146         model, filepath, overwrite, include_optimizer)
    147   else:
    148     with generic_utils.SharedObjectSavingScope():

/opt/conda/lib/python3.7/site-packages/keras/saving/hdf5_format.py in save_model_to_hdf5(model, filepath, overwrite, include_optimizer)
    112       if isinstance(v, (dict, list, tuple)):
    113         f.attrs[k] = json.dumps(
--> 114             v, default=json_utils.get_json_type).encode('utf8')
    115       else:
    116         f.attrs[k] = v

/opt/conda/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw)
    236         check_circular=check_circular, allow_nan=allow_nan, indent=indent,
    237         separators=separators, default=default, sort_keys=sort_keys,
--> 238         **kw).encode(obj)
    239 
    240 

/opt/conda/lib/python3.7/json/encoder.py in encode(self, o)
    197         # exceptions aren't as detailed.  The list call should be roughly
    198         # equivalent to the PySequence_Fast that ''.join() would do.
--> 199         chunks = self.iterencode(o, _one_shot=True)
    200         if not isinstance(chunks, (list, tuple)):
    201             chunks = list(chunks)

/opt/conda/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot)
    255                 self.key_separator, self.item_separator, self.sort_keys,
    256                 self.skipkeys, _one_shot)
--> 257         return _iterencode(o, 0)
    258 
    259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr,

/opt/conda/lib/python3.7/site-packages/keras/saving/saved_model/json_utils.py in get_json_type(obj)
    140     return obj.value
    141 
--> 142   raise TypeError('Not JSON Serializable:', obj)

TypeError: ('Not JSON Serializable:', <tf.Tensor 'gradient_tape/model_w_multioptimizer/dense_2/MatMul:0' shape=(3, 12) dtype=float32>)

Ref: tensorflow/tensorflow#58184
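
For what it's worth, the error does not occur if the callback skips serializing the optimizer, since the HDF5 saver only calls json.dumps on the optimizer config when include_optimizer=True. A sketch of that workaround, based on the callback shown in the traceback (it loses the MultiOptimizer state, so it is not a real fix):

```python
import tensorflow as tf

class ModelSaverCallback(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        print("name: " + self.model.name)
        # Skipping the optimizer avoids the json.dumps call that raises
        # TypeError: ('Not JSON Serializable:', ...), at the cost of not
        # checkpointing the MultiOptimizer state.
        self.model.save("epoch-" + str(epoch + 1) + "-" + self.model.name + ".h5",
                        overwrite=True,
                        include_optimizer=False)
        # Alternatively, save weights only:
        # self.model.save_weights("epoch-" + str(epoch + 1) + "-" + self.model.name + "-weights.h5")

callbacks = [ModelSaverCallback()]
```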
