Mark initialisations nograd, restrict signatures #1908

Merged: 5 commits merged into FluxML:master from mcabbott:nograd_init on Mar 20, 2022

Conversation

mcabbott (Member)

Inspired by this:
https://discourse.julialang.org/t/mutating-arrays-not-supported-toy-problem/78057
this PR adds @non_differentiable to all initialisation functions.
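For context, a minimal sketch of the approach, assuming ChainRulesCore's @non_differentiable macro and a hypothetical initialiser my_init standing in for glorot_uniform, kaiming_normal, etc.:

using ChainRulesCore

# Hypothetical initialiser, standing in for Flux's built-in ones.
my_init(dims::Integer...) = randn(Float32, dims...) .* 0.01f0

# Mark it non-differentiable so Zygote returns no gradient for the call
# instead of trying to differentiate through the array construction.
ChainRulesCore.@non_differentiable my_init(::Any...)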

test/utils.jl Outdated
@test size(init(3)) == (3,)
end
@test size(init(3, 4)) == (3, 4)
@test_skip size(init((3, 4, 5))) == (3, 4, 5)
mcabbott (Member, Author):

Not all of them accept a size tuple like this. Should they?

Or perhaps they should all only accept init(size::Integer...)?
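To make the second option concrete, a hedged sketch (my_init is a hypothetical stand-in) of the init(size::Integer...) form:

# Accept only a splat of integers, not a tuple of dimensions.
my_init(dims::Integer...) = randn(Float32, dims...)

my_init(3, 4)       # 3×4 matrix
# my_init((3, 4))   # would now throw a MethodError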

Member:

+1 for the latter.

mcabbott (Member, Author):

Done in 9d1f5d5

I guess that makes this breaking, hence v0.13

ToucheSir (Member):

Is there a use case for having init functions succeed in a gradient context? Could we throw a more informative error in the rrule instead?
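Presumably something like the following sketch (my_init is hypothetical and the error text is made up for illustration): instead of marking the function non-differentiable, define an rrule that fails loudly whenever AD encounters the initialiser.

using ChainRulesCore

my_init(dims::Integer...) = randn(Float32, dims...)

# Alternative to @non_differentiable: an rrule that throws an
# informative error if the initialiser is hit during differentiation.
function ChainRulesCore.rrule(::typeof(my_init), dims::Integer...)
    error("my_init is an initialisation function and has no gradient; " *
          "call it outside the code being differentiated.")
end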

mcabbott (Member, Author) commented Mar 20, 2022:

I can't think of a use. I guess the argument here is that they are variants of rand / randn with different distributions, and those are nograd:

julia> gradient(x -> sum(rand(3) .* x), 4)
(2.438304105970437,)
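With this PR the initialisers behave the same way; a rough sketch (the exact value depends on the random draw, and Flux re-exports Zygote's gradient):

using Flux

gradient(x -> sum(Flux.glorot_uniform(3) .* x), 4.0)
# (sum of the drawn values,)  -- no error, and no gradient for the draw itself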

codecov-commenter commented Mar 20, 2022:

Codecov Report

Merging #1908 (9d1f5d5) into master (2aa2a26) will increase coverage by 0.41%.
The diff coverage is 96.42%.

@@            Coverage Diff             @@
##           master    #1908      +/-   ##
==========================================
+ Coverage   86.23%   86.64%   +0.41%     
==========================================
  Files          18       18              
  Lines        1438     1445       +7     
==========================================
+ Hits         1240     1252      +12     
+ Misses        198      193       -5     
Impacted Files | Coverage | Δ
src/utils.jl | 95.67% <96.42%> | +3.42% ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

mcabbott changed the title from "Mark initialisations nograd" to "Mark initialisations nograd, restrict signatures" on Mar 20, 2022
mcabbott added this to the v0.13 milestone on Mar 20, 2022
mcabbott merged commit b6dbefb into FluxML:master on Mar 20, 2022
mcabbott deleted the nograd_init branch on Mar 20, 2022 at 13:23