Hi,

How would you build a bidirectional RNN with Flux? Drawing from a few examples, I've written:
```julia
using Pkg; Pkg.add("Flux")

using Flux

# Bidirectional RNN: a forward RNN, a backward RNN, and a Dense layer
# that combines their concatenated hidden states.
struct BRNN{L,D}
    forward  :: L
    backward :: L
    dense    :: D
end

Flux.@functor BRNN

function BRNN(in::Integer, hidden::Integer, out::Integer, σ = relu)
    return BRNN(
        RNN(in, hidden, σ),    # forward pass
        RNN(in, hidden, σ),    # backward pass
        Dense(2hidden, out, σ)
    )
end

# Run the backward RNN on the reversed input, re-reverse its output so it
# lines up with the forward pass, then feed the concatenation to the Dense layer.
function (m::BRNN)(xs)
    m.dense(vcat(m.forward(xs), reverse(m.backward(reverse(xs)))))
end

inSize = 5
hiddenSize = 3
outSize = 1

trn = [(rand(inSize), rand(outSize)) for _ in 1:8]
@info "trn", summary(trn)

m = BRNN(inSize, hiddenSize, outSize)

loss(x, y) = Flux.mse(m(x), y)
ps = Flux.params(m)
opt = ADAM()

Flux.train!(loss, ps, trn, opt)
```

This errors with:

```
ERROR: LoadError: Mutating arrays is not supported
```
Flux does not like reversing the input data in `reverse(m.backward(reverse(xs)))`.
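One workaround that might apply here is registering an explicit adjoint for `reverse` with Zygote, since the pullback of a reversal is just the reversed cotangent. This is only a sketch; your Zygote version may already ship such an adjoint:

```julia
using Zygote

# Hand-written pullback for reverse on vectors: the gradient of a
# reversed array is the reversed gradient. A sketch for Zygote
# versions that lack a built-in adjoint for `reverse`.
Zygote.@adjoint reverse(x::AbstractVector) = reverse(x), Δ -> (reverse(Δ),)
```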
Swapping that pattern for `Flux.flip(m.backward, xs)` also avoids the mutation error, but it does not produce a compatible output shape for this example.
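For reference, `Flux.flip` is essentially `reverse(f.(reverse(xs)))`: it broadcasts `f` over a reversed *sequence* and re-reverses the result, so it expects each sample to be a vector of timestep vectors rather than the single vectors in `trn` above. A minimal sketch of a forward pass under that sequence-shaped assumption (the data shape here is hypothetical, not what `trn` currently contains):

```julia
# Sketch: assumes each sample is a sequence of timestep vectors,
# which is the shape Flux.flip operates on.
function (m::BRNN)(xs::AbstractVector{<:AbstractVector})
    fwd = m.forward.(xs)             # forward hidden state at each timestep
    bwd = Flux.flip(m.backward, xs)  # backward pass, re-aligned with forward time order
    return m.dense.(vcat.(fwd, bwd)) # per-timestep concatenation and readout
end

# Hypothetical sequence-shaped sample for this sketch:
seqLen = 4
xs = [rand(Float32, inSize) for _ in 1:seqLen]
ys = m(xs)  # seqLen outputs, each of length outSize
```

With this shape, the loss and training data would need the same per-timestep treatment (e.g. aggregating `Flux.mse` over the sequence), and the recurrent state should be reset between samples with `Flux.reset!`.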
How would you approach it?