
TypeError: Cannot read property 'backend' of undefined #4296

Closed

loretoparisi opened this issue Nov 23, 2020 · 21 comments

@loretoparisi

I'm using the latest version of tfjs-node on npm:

{
  "peerDependencies": {
    "@tensorflow/tfjs-core": "^2.4.0"
  },
  "dependencies": {
    "@tensorflow/tfjs-converter": "^2.4.0",
    "@tensorflow/tfjs-node": "^2.4.0"
  }
}

I get this error when loading a saved model with loadSavedModel:

TypeError: Cannot read property 'backend' of undefined
    at Engine.moveData (/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3280:31)
    at DataStorage.get (/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:115:28)
    at NodeJSKernelBackend.getInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:153:43)
    at NodeJSKernelBackend.getMappedInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1487:30)
    at NodeJSKernelBackend.runSavedModel (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1506:66)
    at TFSavedModel.predict (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:362:52)
    at /lib/tests/models/audio.js:44:22
Here is the code:

const tf = require('@tensorflow/tfjs-node');

(function () {

    const modelPath='/root/saved_model/';
    
    // load model
    tf.node.loadSavedModel(modelPath)
    .then(model => {
        // the JSON file holds the waveform of the audio file
        const data = require('fs').readFileSync('/root/test.json');
        const waveform = JSON.parse(data).data;
        const inputTensor = tf.tensor2d(waveform, [ waveform.length, 1], 'float32' );
        const inputs = {
            audio_id: '',
            mix_spectrogram: null,
            mix_stft: null,
            waveform: inputTensor
        };
        return model.predict(inputs);
    })
    .then(output => {
        console.dir(output, { depth: null, maxArrayLength: null });
    })
    .catch(error => {
        console.error(error);
    })

    // load model metadata
    tf.node.getMetaGraphsFromSavedModel(modelPath)
    .then(modelInfo => {
        console.dir(modelInfo[0].signatureDefs.serving_default.outputs, { depth: null, maxArrayLength: null });
        console.dir(modelInfo[0].signatureDefs.serving_default.inputs, { depth: null, maxArrayLength: null });
    })
    .catch(error => {
        console.error(error);
    })

}).call(this);
@rthadur
Contributor

rthadur commented Nov 23, 2020

I see you are using version 2.4; can you try the latest version, 2.7, and let us know?

@rthadur rthadur self-assigned this Nov 23, 2020
@loretoparisi
Author

@rthadur thanks, I have updated to

{
  "peerDependencies": {
    "@tensorflow/tfjs-core": "^2.7.0"
  },
  "dependencies": {
    "@tensorflow/tfjs-converter": "^2.7.0",
    "@tensorflow/tfjs-node": "^2.7.0"
  }
}

and I get exactly the same error. I have also verified that the installed version was 2.7.0:

[screenshot verifying the installed version is 2.7.0]

@rthadur
Contributor

rthadur commented Nov 23, 2020

Can you please check if this issue is related.
cc @tafsiri

@loretoparisi
Author

loretoparisi commented Nov 23, 2020

I'm not sure it's related, because I'm not consuming data (as in that example), so there should be no problem yielding those tensor data; but that's just my intuition, and it may not be correct. In the meantime I dug a bit into the offending code here:

 Engine.prototype.moveData = function (backend, dataId) {
        console.log(backend, dataId);
        var info = this.state.tensorInfo.get(dataId);
        var srcBackend = info.backend; // <--- line 3280: TypeError: Cannot read property 'backend' of undefined
        var values = this.readSync(dataId);

and dumped the backend and dataId:

NodeJSKernelBackend {
  binding: {},
  isGPUPackage: false,
  isUsingGpuDevice: false,
  tensorMap: DataStorage {
    backend: [Circular],
    dataMover: Engine {
      ENV: [Environment],
      registry: [Object],
      registryFactory: [Object],
      pendingBackendInitId: 0,
      state: [EngineState],
      backendName: 'tensorflow',
      backendInstance: [Circular],
      profiler: [Profiler]
    },
    data: WeakMap { <items unknown> },
    dataIdsCount: 1
  }
} undefined

Hope this helps!

@loretoparisi
Author

loretoparisi commented Nov 23, 2020

More info: looking at the getMetaGraphsFromSavedModel output, modelInfo[0].signatureDefs.serving_default.inputs reports the following shape for the waveform input (effectively [-1, 2]):

 {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'Placeholder:0',
    shape: [
      {
        wrappers_: null,
        messageId_: undefined,
        arrayIndexOffset_: -1,
        array: [ -1 ],
        pivot_: 1.7976931348623157e+308,
        convertedPrimitiveFields_: {}
      },
      {
        wrappers_: null,
        messageId_: undefined,
        arrayIndexOffset_: -1,
        array: [ 2 ],
        pivot_: 1.7976931348623157e+308,
        convertedPrimitiveFields_: {}
      }
    ]
  }

@rthadur rthadur assigned lina128 and unassigned rthadur Nov 23, 2020
@tafsiri tafsiri added comp:node.js, type:bug labels and removed type:others label Nov 23, 2020
@tafsiri tafsiri self-assigned this Nov 23, 2020
@tafsiri
Contributor

tafsiri commented Nov 23, 2020

The two issues are related, though there may be different paths that produce the same behaviour. @loretoparisi, are you able to share the model you are using with us?

@loretoparisi
Author

loretoparisi commented Nov 23, 2020

@tafsiri Yes, you can download the saved model from here

@lina128 lina128 removed their assignment Nov 23, 2020
@loretoparisi
Author

Hi @tafsiri, do you need any additional info to check this issue? Thanks a lot!

@tafsiri
Contributor

tafsiri commented Nov 30, 2020

Not yet, just got back from a long holiday in the US so will be able to take a closer look this week.

@tafsiri
Contributor

tafsiri commented Dec 3, 2020

So the cause of the error you are seeing is that you are passing null values as inputs to the model. Your code has:

const inputs = {
	audio_id: '',
	mix_spectrogram: null,
	mix_stft: null,
	waveform: inputTensor
};

All of those inputs need to be tensors. Here is the code I used to get past that point (just creating some random data):

tf.node.loadSavedModel(modelPath, ['serve'], 'serving_default')
  .then(model => {
      const inputs = {
          audio_id: tf.tensor(['id']),
          mix_spectrogram: tf.randomNormal([2, 512, 1024, 2]),
          mix_stft: tf.randomNormal([2, 2049, 2]),
          waveform: tf.randomNormal([2, 2])
      };
      return model.predict(inputs);
  })
  .then(output => {
      console.dir(output, { depth: null, maxArrayLength: null });
  })
  .catch(error => {
      console.error(error);
  })

Beyond that I still ran into an issue: Error: Session fail to run with error: Placeholder_1:0 is both fed and fetched. Looking at some of the signature def info (see below), I noticed that audio_id is both an input and an output and refers to the same placeholder. From what I can tell this isn't allowed; it appears one should at least call tf.identity on the input placeholder if you are trying to pass it through. Have you been able to execute this model successfully in Python (or otherwise know that it works in Python)?

inputs {
  audio_id: {
    dtype: 'string',
    tfDtype: 'DT_STRING',
    name: 'Placeholder_1:0',
    shape: []
  },
  mix_spectrogram: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_3:0',
    shape: [ [Object], [Object], [Object], [Object] ]
  },
  mix_stft: {
    dtype: 'complex64',
    tfDtype: 'DT_COMPLEX64',
    name: 'transpose_1:0',
    shape: [ [Object], [Object], [Object] ]
  },
  waveform: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'Placeholder:0',
    shape: [ [Object], [Object] ]
  }
}
outputs {
  accompaniment: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_23:0',
    shape: [ [Object], [Object] ]
  },
  audio_id: {
    dtype: 'string',
    tfDtype: 'DT_STRING',
    name: 'Placeholder_1:0',
    shape: []
  },
  vocals: {
    dtype: 'float32',
    tfDtype: 'DT_FLOAT',
    name: 'strided_slice_13:0',
    shape: [ [Object], [Object] ]
  }
}

@loretoparisi
Author

loretoparisi commented Dec 3, 2020

@tafsiri thanks for your analysis. Yes, the model works fine in Python 3.7 with TensorFlow 1.15. Looking at the code, at least one of mix_spectrogram, mix_stft, or waveform is mandatory (that's the audio source), so in the Python version one among them is required. I have then tried this, as you did:

const inputTensor = tf.tensor2d(waveform.data, [waveform.data.length, 1], 'float32');
const inputs = {
    audio_id: tf.tensor(['id']),
    mix_spectrogram: null,
    mix_stft: null,
    waveform: inputTensor
};

where waveform.data contains a waveform with shape [88175, 2] of float32,

but now I get

TypeError: Cannot read property 'dataId' of null
    at NodeJSKernelBackend.getInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:153:58)
    at NodeJSKernelBackend.getMappedInputTensorIds (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1487:30)
    at NodeJSKernelBackend.runSavedModel (/node_modules/@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js:1506:66)
    at TFSavedModel.predict (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:362:52)

so this time the error is here (@tensorflow/tfjs-node/dist/nodejs_kernel_backend.js, line 153), in the getInputTensorIds function of the Node.js backend:

    // Prepares Tensor instances for Op execution.
    NodeJSKernelBackend.prototype.getInputTensorIds = function (tensors) {
        console.log(tensors)
        var ids = [];
        for (var i = 0; i < tensors.length; i++) {
            if (tensors[i] instanceof int64_tensors_1.Int64Scalar) {
                // Then `tensors[i]` is a Int64Scalar, which we currently represent
                // using an `Int32Array`.
                var value = tensors[i].valueArray;
                var id = this.binding.createTensor([], this.binding.TF_INT64, value);
                ids.push(id);
            }
            else {
                var info = this.tensorMap.get(tensors[i].dataId);
                // TODO - what about ID in this case? Handle in write()??
                if (info.values != null) {
                    // Values were delayed to write into the TensorHandle. Do that before
                    // Op execution and clear stored values.
                    info.id =
                        this.binding.createTensor(info.shape, info.dtype, info.values);
                    info.values = null;
                }
                ids.push(info.id);
            }
        }
        return ids;
    };

In my example, the tensors list is the following:

0 Tensor {
  kept: false,
  isDisposedInternal: false,
  shape: [ 1 ],
  dtype: 'string',
  size: 1,
  strides: [],
  dataId: {},
  id: 0,
  rankType: '1'
}
1 null

and in fact we get null for tensors[1].
So when I did as you tried:

const inputTensor = tf.tensor2d(waveform.data, [waveform.data.length, 1], 'float32');
// const inputTensor = tf.tensor1d(waveform, 'float32')
const inputs = {
    audio_id: tf.tensor(['id']),
    mix_spectrogram: tf.randomNormal([2, 2]),
    mix_stft: tf.randomNormal([2, 2]),
    waveform: inputTensor
};

I come out with exactly the same Error: Session fail to run with error: Placeholder_1:0 is both fed and fetched., even with a correct input (at least for waveform).

Question: could this be related to how the model has been saved from the original checkpoint? We used this script for converting, which makes use of tf.estimator.Estimator.export_saved_model:

def to_predictor(estimator, directory=DEFAULT_EXPORT_DIRECTORY):
    """ Exports given estimator as predictor into the given directory
    and returns associated tf.predictor instance.

    :param estimator: Estimator to export.
    :param directory: (Optional) path to write exported model into.
    """
    def receiver():
        shape = (None, estimator.params['n_channels'])
        features = {
            'waveform': tf.compat.v1.placeholder(tf.float32, shape=shape),
            'audio_id': tf.compat.v1.placeholder(tf.string)}
        return tf.estimator.export.ServingInputReceiver(features, features)

    estimator.export_saved_model(directory, receiver)
    versions = [
        model for model in Path(directory).iterdir()
        if model.is_dir() and 'temp' not in str(model)]
    latest = str(sorted(versions)[-1])
    return predictor.from_saved_model(latest)

I think the error is here: tf.estimator.export.ServingInputReceiver(features, features) passes features as both arguments, which I think is not correct, since the signature is

tf.estimator.export.ServingInputReceiver(
    features, receiver_tensors, receiver_tensors_alternatives=None
)
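
If that is the problem, a sketch of a receiver that avoids feeding and fetching the same placeholder (my assumption, using the tf.identity approach @tafsiri mentioned above; untested) would be:

def receiver():
    shape = (None, estimator.params['n_channels'])
    # raw placeholders that the client feeds
    receiver_tensors = {
        'waveform': tf.compat.v1.placeholder(tf.float32, shape=shape),
        'audio_id': tf.compat.v1.placeholder(tf.string)}
    # pass identities through as features, so no fetched output
    # aliases a fed placeholder
    features = {key: tf.identity(value) for key, value in receiver_tensors.items()}
    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)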

@tafsiri
Contributor

tafsiri commented Dec 3, 2020

It is possible that tf.Estimator adds some preprocessing that is outside of the saved model (and which, for example, allows you to pass just one of the 3 audio-related inputs). We don't directly support estimator models in tfjs-node; there are some parts of that API that are not part of the C++ API used under the hood (they exist only in the Python layer). You might be able to modify tf.estimator.export.ServingInputReceiver(features, features) to tweak the SavedModel to have just the features you plan to use, but I don't really know if that will work/is possible.

If you are able to execute the saved model in python without instantiating an estimator instance, that may suggest a path to using this model with tfjs-node.
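
For example, a minimal sketch of loading the SavedModel in a plain TF1 session (the export path here is a placeholder):

import tensorflow as tf

export_dir = '/root/saved_model'  # wherever the SavedModel lives
with tf.compat.v1.Session(graph=tf.Graph()) as sess:
    # load the graph and variables tagged for serving
    meta_graph = tf.compat.v1.saved_model.load(sess, ['serve'], export_dir)
    signature = meta_graph.signature_def['serving_default']
    print(signature.inputs)   # tensor names to feed, e.g. Placeholder:0
    print(signature.outputs)  # tensor names to fetch

If that succeeds (and sess.run with the signature's tensor names returns output), the model itself is usable outside the estimator.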

Also going to cc @pyu10055 who may know about estimator compatibility.

@loretoparisi
Author

@tafsiri thank you, I appreciate it! So we made a step forward, using the checkpoint directly. @shoegazerstella is trying it this way:

import os
import tensorflow as tf
trained_checkpoint_prefix = 'pretrained_models/2stems/model'
export_dir = os.path.join('export_dir', '0')
graph = tf.Graph()
with tf.compat.v1.Session(graph=graph) as sess:
    loader = tf.compat.v1.train.import_meta_graph(trained_checkpoint_prefix + '.meta')
    loader.restore(sess, trained_checkpoint_prefix)
    builder = tf.compat.v1.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(sess,
                                         [tf.saved_model.TRAINING, tf.saved_model.SERVING],
                                         strip_default_attrs=True)
    builder.save()  

At this point it should be done, but we now get an Error: The SavedModel does not have signature: serving_default

2020-12-04 08:08:46.918697: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
Error: The SavedModel does not have signature: serving_default
    at getSignatureDefEntryFromMetaGraphInfo (/node_modules/@tensorflow/tfjs-node/dist/saved_model.js:210:23)
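
Looking at it again, the builder call above never registers a signature_def_map, so no serving_default signature gets written into the SavedModel. A sketch of what adding one might look like (the tensor names below are copied from the signature dump earlier in this thread and may not match the restored graph exactly):

signature_key = tf.compat.v1.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY
# build a serving_default signature from the graph's input/output tensors
signature = tf.compat.v1.saved_model.signature_def_utils.predict_signature_def(
    inputs={'waveform': graph.get_tensor_by_name('Placeholder:0'),
            'audio_id': graph.get_tensor_by_name('Placeholder_1:0')},
    outputs={'vocals': graph.get_tensor_by_name('strided_slice_13:0'),
             'accompaniment': graph.get_tensor_by_name('strided_slice_23:0')})
builder.add_meta_graph_and_variables(
    sess, [tf.saved_model.SERVING],
    signature_def_map={signature_key: signature},
    strip_default_attrs=True)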

Thank you!

@google-ml-butler

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you.

@google-ml-butler

Closing as stale. Please @mention us if this needs more attention.


@pistolla

pistolla commented Jun 5, 2022

I'm facing the same issue with "@tensorflow/tfjs-core": "^3.18.0" when using it with React. None of the above responses have resolved the issue.

@pistolla

pistolla commented Jun 5, 2022

[screenshot of the error]

@pistolla

pistolla commented Jun 5, 2022

[screenshots showing the tensor data stored in WeakMaps]

I wonder how you are accessing these WeakMaps.

@waqardongre

@pistolla, if you still have this issue, this topic may help you: #4684 (comment)

@ClaytonSmith

booleanMaskAsync does not work inside of tf.tidy(() => { ... }); it does work outside of tidy.
