
Do gradient via mutating and unmutating cell #59


Merged: 3 commits into master on Feb 2, 2020

Conversation

@oxinabox (Member) commented Feb 1, 2020

We can avoid copying every element at every step if we mutate just the element we want to perturb in the original array, then mutate it back afterwards, so we still have the original for the next step.

My micro benchmarks show this helps a bit.
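
For concreteness, here is a minimal sketch of the mutate/unmutate idea (illustrative only, not the PR's actual diff; the name grad_mutating is made up for this sketch):

using FiniteDifferences

# Sketch: perturb one entry of the original array in place, differentiate f
# with respect to it, then restore the entry so the array is unchanged for
# the next index. Assumes x has floating-point elements.
function grad_mutating(fdm, f, x::AbstractArray{<:AbstractFloat})
    dx = similar(x)
    for k in eachindex(x)
        original = x[k]
        # derivative of f with respect to x[k], evaluated at its original value
        dx[k] = fdm(t -> (x[k] = t; f(x)), original)
        x[k] = original   # unmutate: put the original value back
    end
    return dx
end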

@oxinabox (Member, Author) commented Feb 1, 2020

I think something similar could be done for the Dict case, but I don't really understand how that works.
https://github.com/JuliaDiff/FiniteDifferences.jl/pull/59/files#diff-94e14cf43e653c4b17f49cc18895018cR33-R44

@longemen3000

With the Dict:

function grad(fdm, f, d::Dict{K, V}) where {K, V}
    ∇d = Dict{K, V}()
    for (k, v) in d
        temp = d[k]
        function f′(x)
            d[k] = x      # mutate the entry in place
            return f(d)
        end
        ∇d[k] = grad(fdm, f′, v)[1]
        d[k] = temp       # restore the original entry
    end
    return (∇d, )
end
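
Rough usage, assuming (as the snippet does) that a scalar grad method exists and returns a tuple; the function g and the values are made up for illustration:

using FiniteDifferences

d = Dict(:a => 1.0, :b => 2.0)
g(d) = d[:a]^2 + sin(d[:b])
grad(central_fdm(5, 1), g, d)   # roughly (Dict(:a => 2.0, :b => cos(2.0)),)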

@oxinabox (Member, Author) commented Feb 1, 2020

Benchmarks

using BenchmarkTools
using FiniteDifferences

const _fdm = central_fdm(2,1);
const xs = collect(1:0.1:200);
f(x) = sum(sin, x);
@btime grad($_fdm, $f, $xs);

Before:

222.774 ms (154790 allocations: 4.28 MiB)

With this PR:

212.711 ms (150807 allocations: 4.20 MiB)

So only a very modest improvement, but still a step towards not allocating at all (beyond what the primal does).

@codecov-io commented Feb 1, 2020

Codecov Report

Merging #59 into master will decrease coverage by 0.55%.
The diff coverage is 92.85%.


@@            Coverage Diff             @@
##           master      #59      +/-   ##
==========================================
- Coverage    98.8%   98.24%   -0.56%     
==========================================
  Files           4        4              
  Lines         167      171       +4     
==========================================
+ Hits          165      168       +3     
- Misses          2        3       +1
Impacted Files Coverage Δ
src/grad.jl 96.66% <92.85%> (-1.01%) ⬇️


@willtebbutt (Member) left a comment:
Looks reasonable to me.

@@ -8,33 +8,40 @@ Approximate the gradient of `f` at `xs...` using `fdm`. Assumes that `f(xs...)`
 """
 function grad end
 
-function grad(fdm, f, x::AbstractArray{T}) where T <: Number
+function _grad(fdm, f, x::AbstractArray{T}) where T <: Number
willtebbutt (Member) commented on the diff:

Doesn't technically matter, but maybe change or remove the type constraint on x, since we know it's going to be an Array by virtue of the fact that it's called from grad.

oxinabox (Member, Author) replied:

Could be a few things actually. But yes, the constraints are not needed.

oxinabox (Member, Author) replied:

I am going to keep it because the T is useful
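
A hypothetical illustration (not the actual PR body) of what keeping the T buys, namely a name for the element type when allocating the output:

function _grad(fdm, f, x::AbstractArray{T}) where T <: Number
    dx = similar(x, float(T))   # T names the element type directly, no eltype(x) call
    # ... dx is then filled entry by entry via the mutate/unmutate loop ...
    return dx
end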

@oxinabox mentioned this pull request on Feb 2, 2020
@oxinabox merged commit 573d645 into master on Feb 2, 2020