Upgrade nnvm to use automatic gradient correspondence guessing #3973
Conversation
In src/executor/graph_executor.cc:

-      head_grad_entry_.emplace_back(NodeEntry{nnvm::Node::Create(), 0, 0});
-      head_grad_map_[head_grad_entry_.back().node.get()] = i;
+      NodeEntry ngrad{nnvm::Node::Create(), 0, 0};
+      head_grad_entry_.emplace_back(AttrHint(ngrad, g.outputs[i]));
What's this for?
This is for enforcing shape constraints on the gradient input. It is needed because some backward nodes are simply forward ops: for example, the gradient of add returns an identity of ograd, so we need the shape of the gradient input variable to start with.
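To make that concrete, here is a minimal, self-contained sketch of the idea (the `Entry` struct and `InferShapes` loop below are hypothetical toys, not the nnvm API): an identity-style backward gives the head-gradient variable no shape of its own, so tying the entry to the forward output's shape is what lets inference fill it in.

```cpp
// Toy illustration (not nnvm): a head-gradient variable has no shape by
// itself; the backward of `add` is just an identity, so the only way to
// infer the gradient's shape is to hint that it matches the forward output.
#include <cassert>
#include <iostream>
#include <vector>

using Shape = std::vector<int>;

struct Entry {
  Shape shape;                 // empty == unknown
  const Entry* same_shape_as;  // optional hint: "has the same shape as ..."
  explicit Entry(const Entry* hint = nullptr) : same_shape_as(hint) {}
};

// One pass of a toy shape-inference rule: copy shapes along the hints.
void InferShapes(std::vector<Entry*>& entries) {
  for (Entry* e : entries) {
    if (e->shape.empty() && e->same_shape_as && !e->same_shape_as->shape.empty()) {
      e->shape = e->same_shape_as->shape;
    }
  }
}

int main() {
  Entry fwd_out;                      // forward output of `add`, shape known from the graph
  fwd_out.shape = {2, 3};

  Entry head_grad_no_hint;            // plain variable: identity backward gives it nothing
  Entry head_grad_hinted(&fwd_out);   // hinted to match the forward output

  std::vector<Entry*> entries = {&fwd_out, &head_grad_no_hint, &head_grad_hinted};
  InferShapes(entries);

  std::cout << "no hint : " << (head_grad_no_hint.shape.empty() ? "unknown" : "known") << "\n";
  std::cout << "hinted  : " << (head_grad_hinted.shape.empty() ? "unknown" : "known") << "\n";
  assert(head_grad_hinted.shape == fwd_out.shape);
  return 0;
}
```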
Remind me: why did we need to register is_backward on the op instead of marking it in the node?
There is no is_backward field in the node. We used to have it, but concluded that it was too artificial. A normal operator can also occur in the gradient path; in that case is_backward is not registered and the normal shape inference function is used.
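For illustration, registering the flag on the operator rather than on each node looks roughly like the sketch below. `_backward_foo` is a placeholder name, and the exact header paths and the `TIsBackward` attribute spelling are recalled from the nnvm/MXNet registries of that era, so treat them as assumptions rather than the definitive API.

```cpp
// Sketch of the op-attribute approach: the flag lives once in the operator
// registry, not on every graph node.
#include <nnvm/op.h>
#include <nnvm/op_attr_types.h>

// A dedicated backward op registers the flag once.
NNVM_REGISTER_OP(_backward_foo)
.set_num_inputs(1)
.set_num_outputs(1)
.set_attr<nnvm::TIsBackward>("TIsBackward", true);

// The gradient of `add`, by contrast, can reuse a normal forward op (an
// identity/copy), which never registers TIsBackward, so the regular shape
// and type inference functions apply to it on the gradient path.

// Query side (roughly what an executor does when it needs to special-case
// gradient inputs): look the flag up per-op, defaulting to false.
inline bool IsBackwardNode(const nnvm::Op* op) {
  static const auto& is_backward =
      nnvm::Op::GetAttr<nnvm::TIsBackward>("TIsBackward");
  return op != nullptr && is_backward.get(op, false);
}
```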
* specific op for gradient aggregation
* Upgrade nnvm to use automatic gradient correspondence guessing
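As a side note on the first commit ("specific op for gradient aggregation"): when a tensor feeds several consumers, its total gradient is the element-wise sum of the gradients coming back from each of them. The sketch below only illustrates that aggregation step; `AggregateGradients` is a hypothetical stand-in, not the actual operator added by the commit.

```cpp
// Toy gradient aggregation: sum the gradients arriving from each consumer.
#include <cassert>
#include <vector>

std::vector<float> AggregateGradients(const std::vector<std::vector<float>>& grads) {
  assert(!grads.empty());
  std::vector<float> total(grads[0].size(), 0.0f);
  for (const auto& g : grads) {
    assert(g.size() == total.size());
    for (size_t i = 0; i < g.size(); ++i) total[i] += g[i];
  }
  return total;
}

int main() {
  // Two consumers send back gradients for the same tensor.
  std::vector<std::vector<float>> grads = {{1.f, 2.f, 3.f}, {0.5f, 0.5f, 0.5f}};
  std::vector<float> total = AggregateGradients(grads);
  assert(total[0] == 1.5f && total[1] == 2.5f && total[2] == 3.5f);
  return 0;
}
```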
@piiswrong