Description
In support of #155.
This will impact the `forward` and `backward` methods in:

- `network` type
- `layer` type
- `dense_layer` type
- `conv2d_layer` type
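To illustrate, here is a minimal sketch of what a batched `forward` on a dense layer could look like. The derived type, its components (`weights`, `biases`, `z`, `output`), and the sigmoid activation are assumptions for illustration only, not the actual library interfaces:

```fortran
module dense_batched_sketch
  implicit none

  ! Hypothetical dense layer type; component names are assumptions.
  type :: dense_layer
    real, allocatable :: weights(:,:) ! (inputs, outputs)
    real, allocatable :: biases(:)    ! (outputs)
    real, allocatable :: z(:,:)       ! pre-activations, (outputs, batch)
    real, allocatable :: output(:,:)  ! activations, (outputs, batch)
  end type dense_layer

contains

  elemental function sigmoid(x) result(res)
    real, intent(in) :: x
    real :: res
    res = 1 / (1 + exp(-x))
  end function sigmoid

  pure subroutine forward(self, input)
    ! Batched forward pass: input is rank 2, (inputs, batch_size),
    ! rather than a single rank-1 sample.
    type(dense_layer), intent(in out) :: self
    real, intent(in) :: input(:,:)
    ! One matmul covers the whole batch:
    ! (outputs, inputs) x (inputs, batch) -> (outputs, batch)
    self % z = matmul(transpose(self % weights), input) &
             + spread(self % biases, dim=2, ncopies=size(input, dim=2))
    self % output = sigmoid(self % z)
  end subroutine forward

end module dense_batched_sketch
```

The key point is that the per-sample loop disappears: a single `matmul` on a rank-2 input computes the pre-activations for the entire batch.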
Effectively, rather than looping over samples in a batch inside of `network % train`, we will pass batches of data all the way down to the lowest level, that is, the `forward` and `backward` methods of the `dense_layer` and `conv2d_layer` types. Lowering the loop over samples in a batch will also allow the implementation of a `batchnorm_layer`, which needs to see an entire batch at once to compute its normalization statistics.
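To make the change concrete, here is a rough before-and-after sketch of the relevant part of `network % train`; the variable names (`input_data`, `output_data`, `batch_size`) and loop structure are assumptions for illustration, not the actual implementation:

```fortran
integer :: i

! Before: layers see one sample at a time
do i = 1, batch_size
  call self % forward(input_data(:, i))
  call self % backward(output_data(:, i))
end do

! After: layers receive the whole batch in one call; any remaining
! per-sample work moves into the lowest-level layer methods
call self % forward(input_data(:, 1:batch_size))
call self % backward(output_data(:, 1:batch_size))
```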
It will also potentially allow more efficient `matmul`s in dense and conv layers if we replace the stock `matmul` with a more specialized and efficient `sgemm` or similar from some flavor of BLAS or MKL.
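For reference, here is a small self-contained sketch of the `matmul`-to-`sgemm` swap in the batched dense forward pass; the array names and shapes are illustrative assumptions, and `sgemm` computes `C := alpha*op(A)*op(B) + beta*C`:

```fortran
program sgemm_sketch
  implicit none
  integer, parameter :: n_inputs = 4, n_outputs = 3, n_batch = 8
  real :: weights(n_inputs, n_outputs), input(n_inputs, n_batch)
  real :: z_ref(n_outputs, n_batch), z_blas(n_outputs, n_batch)
  external :: sgemm

  call random_number(weights)
  call random_number(input)

  ! Reference result using the stock intrinsic
  z_ref = matmul(transpose(weights), input)

  ! Same product via single-precision BLAS GEMM:
  ! z_blas = 1.0 * transpose(weights) * input + 0.0 * z_blas
  call sgemm('T', 'N', n_outputs, n_batch, n_inputs, 1.0, &
             weights, n_inputs, input, n_inputs, 0.0, z_blas, n_outputs)

  print *, 'max abs difference:', maxval(abs(z_blas - z_ref))
end program sgemm_sketch
```

Link against any BLAS implementation to run it, e.g. `gfortran sgemm_sketch.f90 -lblas`; the printed difference should be at or near machine epsilon.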