
Conversation

@pshashk (Contributor) commented Sep 6, 2018

No description provided.

@MikeInnes (Member)

Looks good, can you add a couple of gradient checks? It'd also be good to support cumsum(x) (i.e. no dims provided).

There's a risk with the UpperTriangular support that in-place gradients will error out, but I think that's acceptable (we may have discussed this on slack already?).
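The kind of gradient check being requested can be sketched outside Tracker.jl. A minimal NumPy finite-difference check of the cumsum adjoint (the reverse/cumsum/reverse identity); all names here are illustrative, not Tracker.jl API:

```python
import numpy as np

def cumsum_grad(delta, axis=0):
    # Adjoint of cumsum: reverse, cumsum, reverse along the same axis.
    return np.flip(np.cumsum(np.flip(delta, axis), axis=axis), axis)

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
delta = rng.standard_normal(5)   # incoming gradient
v = rng.standard_normal(5)       # perturbation direction
eps = 1e-6

# Directional derivative of sum(cumsum(x) * delta) with respect to x,
# estimated by central differences.
numeric = (np.sum(np.cumsum(x + eps * v) * delta)
           - np.sum(np.cumsum(x - eps * v) * delta)) / (2 * eps)

# Same quantity via the claimed adjoint.
analytic = np.dot(cumsum_grad(delta), v)
```

Since cumsum is linear, the two values agree up to floating-point error; the same pattern generalizes to the `dims` variant by fixing `axis`.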

@pshashk (Contributor) commented Sep 6, 2018

Tried to add tests, but they all failed. Not sure what I'm doing wrong.

@pshashk (Contributor) commented Sep 7, 2018

  • From the stack trace, it seems the internals of the reverse and cumsum methods (the ones with a dims argument) are being fed Tracked* values.
  • Despite their similar definitions, the gradtest for vector cumsum fails while the one for vector reverse passes.
  • UpperTriangular and LowerTriangular throw: TypeError: in TrackedArray, in A, expected A<:AbstractArray{T,N}, got Type{TrackedArray{…,Array{Float64,2}}}

I will be grateful to receive advice on how to fix these issues.
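For reference, the adjoints of the other two operations under discussion are simple enough to check the same way. A hedged NumPy sketch of the underlying identities (illustrative names, not Tracker.jl code): the adjoint of reverse is reverse again, and the adjoint of extracting the upper triangle is zeroing the strict lower triangle of the incoming gradient.

```python
import numpy as np

def reverse_grad(delta, axis=0):
    # Adjoint of reverse: reversing the incoming gradient along the same axis.
    return np.flip(delta, axis)

def triu_grad(delta):
    # Adjoint of taking the upper triangle: keep only the upper triangle of delta.
    return np.triu(delta)

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
delta = rng.standard_normal((4, 4))
v = rng.standard_normal((4, 4))
eps = 1e-6

# Central-difference estimate of the directional derivative of
# sum(triu(A) * delta) with respect to A.
numeric = (np.sum(np.triu(A + eps * v) * delta)
           - np.sum(np.triu(A - eps * v) * delta)) / (2 * eps)
analytic = np.sum(triu_grad(delta) * v)
```

Both operations are linear, so the finite-difference and adjoint values agree to floating-point precision.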

@MikeInnes (Member)

Sorry for the annoying churn, but would it be convenient to split this into a separate PR for each piece? It would be quicker to review that way.

@pshashk (Contributor) commented Mar 26, 2019

Thanks for the reminder. I'll open separate Tracker.jl PRs for Upper/LowerTriangular and reverse. What about cumsum? There is an implementation with a correct gradient, passing tests, and optimizations in #524.

@pshashk closed this Mar 26, 2019
@MikeInnes (Member)

If that looks good to you then feel free to leave it and we'll merge that instead. Thanks!
