Error in solubility.py in Chapter 4 #22

Closed
tuffwave opened this issue Sep 7, 2020 · 3 comments
tuffwave commented Sep 7, 2020

I just copied and pasted the code provided, but after executing `model.fit(train_dataset, nb_epoch=100)` I got the error shown below.
Please fix it; I would really appreciate it.

```python
import deepchem as dc

tasks, datasets, transformers = dc.molnet.load_delaney(featurizer='GraphConv')
train_dataset, valid_dataset, test_dataset = datasets

model = dc.models.GraphConvModel(n_tasks=1, mode='regression', dropout=0.2)
model.fit(train_dataset, nb_epoch=100)
```

```
TypeError Traceback (most recent call last)
in
----> 1 model.fit(train_dataset, nb_epoch=100)

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/deepchem/models/keras_model.py in fit(self, dataset, nb_epoch, max_checkpoints_to_keep, checkpoint_interval, deterministic, restore, variables, loss, callbacks, all_losses)
    328     dataset, epochs=nb_epoch,
    329     deterministic=deterministic), max_checkpoints_to_keep,
--> 330     checkpoint_interval, restore, variables, loss, callbacks, all_losses)
    331
    332   def fit_generator(self,

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/deepchem/models/keras_model.py in fit_generator(self, generator, max_checkpoints_to_keep, checkpoint_interval, restore, variables, loss, callbacks, all_losses)
    414     inputs = inputs[0]
    415
--> 416     batch_loss = apply_gradient_for_batch(inputs, labels, weights, loss)
    417     current_step = self._global_step.numpy()
    418

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in call(self, *args, **kwds)
    455
    456     tracing_count = self._get_tracing_count()
--> 457     result = self._call(*args, **kwds)
    458     if tracing_count == self._get_tracing_count():
    459       self._call_counter.called_without_tracing()

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in _call(self, *args, **kwds)
    501     # This is the first call of call, so we have to initialize.
    502     initializer_map = object_identity.ObjectIdentityDictionary()
--> 503     self._initialize(args, kwds, add_initializers_to=initializer_map)
    504     finally:
    505     # At this point we know that the initialization is complete (or less

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in _initialize(self, args, kwds, add_initializers_to)
    406     self._concrete_stateful_fn = (
    407     self._stateful_fn._get_concrete_function_internal_garbage_collected(  # pylint: disable=protected-access
--> 408     *args, **kwds))
    409
    410   def invalid_creator_scope(*unused_args, **unused_kwds):

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _get_concrete_function_internal_garbage_collected(self, *args, **kwargs)
   1846     if self.input_signature:
   1847       args, kwargs = None, None
-> 1848     graph_function, _, _ = self._maybe_define_function(args, kwargs)
   1849     return graph_function
   1850

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _maybe_define_function(self, args, kwargs)
   2148     graph_function = self._function_cache.primary.get(cache_key, None)
   2149     if graph_function is None:
-> 2150       graph_function = self._create_graph_function(args, kwargs)
   2151       self._function_cache.primary[cache_key] = graph_function
   2152     return graph_function, args, kwargs

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/function.py in _create_graph_function(self, args, kwargs, override_flat_arg_shapes)
   2039     arg_names=arg_names,
   2040     override_flat_arg_shapes=override_flat_arg_shapes,
-> 2041     capture_by_value=self._capture_by_value),
   2042     self._function_attributes,
   2043     # Tell the ConcreteFunction to clean up its graph once it goes out of

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/framework/func_graph.py in func_graph_from_py_func(name, python_func, args, kwargs, signature, func_graph, autograph, autograph_options, add_control_dependencies, arg_names, op_return_value, collections, capture_by_value, override_flat_arg_shapes)
    913     converted_func)
    914
--> 915     func_outputs = python_func(*func_args, **func_kwargs)
    916
    917     # invariant: func_outputs contains only Tensors, CompositeTensors,

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/eager/def_function.py in wrapped_fn(*args, **kwds)
    356     # wrapped allows AutoGraph to swap in a converted function. We give
    357     # the function a weak reference to itself to avoid a reference cycle.
--> 358     return weak_wrapped_fn().wrapped(*args, **kwds)
    359   weak_wrapped_fn = weakref.ref(wrapped_fn)
    360

~/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages/tensorflow_core/python/framework/func_graph.py in wrapper(*args, **kwargs)
    903     except Exception as e:  # pylint:disable=broad-except
    904       if hasattr(e, "ag_error_metadata"):
--> 905         raise e.ag_error_metadata.to_exception(e)
    906       else:
    907         raise

TypeError: in converted code:
relative to /Users/hankilee/opt/anaconda3/envs/home-dd/lib/python3.7/site-packages:

    deepchem/models/keras_model.py:482 apply_gradient_for_batch  *
        grads = tape.gradient(batch_loss, vars)
    tensorflow_core/python/eager/backprop.py:1014 gradient
        unconnected_gradients=unconnected_gradients)
    tensorflow_core/python/eager/imperative_grad.py:76 imperative_grad
        compat.as_str(unconnected_gradients.value))
    tensorflow_core/python/eager/backprop.py:138 _gradient_function
        return grad_fn(mock_op, *out_grads)
    tensorflow_core/python/ops/math_grad.py:455 _UnsortedSegmentMaxGrad
        return _UnsortedSegmentMinOrMaxGrad(op, grad)
    tensorflow_core/python/ops/math_grad.py:432 _UnsortedSegmentMinOrMaxGrad
        _GatherDropNegatives(op.outputs[0], op.inputs[1])

TypeError: 'NoneType' object is not subscriptable
```
peastman (Contributor) commented Sep 7, 2020

The code in this repository is written to work with DeepChem 2.3. It looks like you're using more recent development code? As soon as we get 2.4 released (hopefully very soon!), we'll update the examples in this repository to work with it.
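Since the examples target DeepChem 2.3, a runtime guard can surface a version mismatch before training even starts. The helper below is a hypothetical sketch (not part of the book's code): `version_tuple` is a rough parser for plain dotted versions, not a substitute for `packaging.version`.

```python
def version_tuple(version):
    """Turn a dotted version string such as '2.3.0' into a comparable tuple.

    Non-numeric pieces (e.g. a '.dev' suffix) are treated as 0; this is a
    rough parser intended only for this illustrative check.
    """
    parts = []
    for piece in version.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)


def check_deepchem(required="2.3"):
    """Raise if the installed DeepChem's major.minor differs from `required`."""
    import deepchem as dc  # imported lazily so the check only needs DeepChem when run
    installed = version_tuple(dc.__version__)[:2]
    expected = version_tuple(required)[:2]
    if installed != expected:
        raise RuntimeError(
            "These examples target DeepChem %s; found %s."
            % (required, dc.__version__)
        )
```

Calling `check_deepchem()` at the top of the script would have turned the opaque gradient-time `TypeError` into an explicit version message.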

tuffwave (Author) commented Sep 8, 2020

> The code in this repository is written to work with DeepChem 2.3. It looks like you're using more recent development code? As soon as we get 2.4 released (hopefully very soon!), we'll update the examples in this repository to work with it.

I appreciate your kind response. I'm looking forward to the update.

tuffwave (Author) commented Sep 8, 2020

I fixed this problem myself by reinstalling with Python 3.7 instead of Python 3.8.
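For anyone hitting the same error, that fix amounts to creating an environment pinned to Python 3.7 with DeepChem 2.3. The commands below are an environment-setup sketch assuming conda is available; the environment name and exact pins are illustrative, not from this thread:

```shell
# Illustrative: fresh environment pinned to the versions the book's examples target.
conda create -n deepchem-book python=3.7
conda activate deepchem-book
pip install deepchem==2.3.0
```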

tuffwave closed this as completed Sep 8, 2020