Allow more options for Gradient calculation #215
Conversation
Codecov Report

@@            Coverage Diff            @@
##           master    #215      +/-  ##
=========================================
- Coverage   62.92%   62.9%   -0.03%
=========================================
  Files          48      48
  Lines        3361    3367       +6
=========================================
+ Hits         2115    2118       +3
- Misses       1246    1249       +3

Continue to review full report at Codecov.
Not sure why coverage is decreasing.
@@ -1370,15 +1370,37 @@ Base.haskey(graph::Graph, name) = isnull(get_node_by_name(graph, name))
node_name(::Void) = nothing
node_name(xs::AbstractVector) = node_name.(xs)
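For context, a hedged sketch of how the two added methods behave, using a minimal stand-in `Tensor` type (the real one is `TensorFlow.Tensor`; the field and names here are illustrative only, and `Void` is the Julia 0.5/0.6 name, renamed `Nothing` in 1.0):

```julia
# Minimal stand-in for TensorFlow.Tensor, for illustration only.
struct Tensor
    name::String
end

node_name(t::Tensor) = t.name                    # scalar case
node_name(::Void) = nothing                      # a missing argument passes through
node_name(xs::AbstractVector) = node_name.(xs)   # broadcast element-wise over vectors

node_name(Tensor("x"))                  # "x"
node_name([Tensor("a"), Tensor("b")])   # ["a", "b"]
node_name(nothing)                      # nothing
```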
Couldn't you just call node_name.(x)
wherever you expect an array? On 0.6, at least, that should behave the right way for a single tensor.
Can you? I'll test this.
Tested it; it doesn't work out,
because we don't always know when to expect an array.
gradients(y, x, grad_y)
can have any of its 3 parameters as an array of Tensors or as a single Tensor (though if y
is an array, grad_y
must also be).
And on 0.5 at least, node_name.(x)
for x::Tensor
comes back with:
MethodError: no method matching start(::TensorFlow.Tensor{Int32})
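To illustrate the failure mode above: on 0.5, dot-broadcasting falls back to iterating its argument, so a scalar type with no `start` method raises exactly this `MethodError`, whereas dispatching on the argument type works on both 0.5 and 0.6. A rough sketch with a stand-in type (names assumed, not the real TensorFlow.jl definitions):

```julia
struct Tensor{T} end          # stand-in for TensorFlow.Tensor{T}

node_name(t::Tensor) = "node"                    # scalar method handles a lone Tensor
node_name(xs::AbstractVector) = node_name.(xs)   # vector method broadcasts explicitly

# On 0.5, node_name.(Tensor{Int32}()) tries to iterate the Tensor and
# fails with: MethodError: no method matching start(::Tensor{Int32}).
# Dispatch-based normalization avoids that:
node_name(Tensor{Int32}())                       # hits the scalar method
node_name([Tensor{Int32}(), Tensor{Int32}()])    # hits the vector method
```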
So with that trialed, I will merge this now, as is.
Yes, this would only work on 0.6.
It may be a good idea to drop 0.5 fairly quickly, but that's up to you / Jon.
Yesterday I struggled to get this working and thought I'd commented here – but now I've tried again and it worked first time ¯\_(ツ)_/¯
@malmaud seems like you may be busy, so I hope you don't mind if I approve this. If you don't object, I may also take a look at some of Lyndon's other PRs.
Ya sure, thanks. I got busy with end-of-semester things, although I am
still plugging away at my autoops branch (it's a pretty big change).
Solves #212
Possible issue with implementation:
node_name
is now applied element-wise to AbstractVectors.
It makes the implementation short and simple, but I'm not sure how I feel about it.