
Refactor forward and backward methods to allow passing a batch of data instead of one sample at a time #156

Open
@milancurcic

Description

In support of #155.

This will impact the forward and backward methods in:

  • network type
  • layer type
  • dense_layer type
  • conv2d_layer type

Effectively, rather than looping over samples in a batch inside network % train, we will pass batches of data all the way down to the lowest level, that is, to the forward and backward methods of the dense_layer and conv2d_layer types. Pushing the per-sample loop down the call stack will also allow implementing a batchnorm_layer, which needs to see the whole batch at once to compute its normalization statistics. A rough sketch of a batched dense forward pass follows.
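Here is a minimal, self-contained sketch of what the batched computation could look like. The shapes and names are assumptions for illustration (samples stored along the second dimension, weights stored as input_size x output_size), not the library's actual API:

```fortran
! Hypothetical sketch of a batched dense forward pass.
program batched_dense_forward
  implicit none
  integer, parameter :: input_size = 3, output_size = 2, batch_size = 4
  real :: weights(input_size, output_size)  ! assumed storage order
  real :: biases(output_size)
  real :: input(input_size, batch_size)     ! batch along dim 2 (assumed)
  real :: z(output_size, batch_size)

  call random_number(weights)
  call random_number(biases)
  call random_number(input)

  ! One matmul over the whole rank-2 batch replaces the per-sample
  ! loop that currently lives in network % train.
  z = matmul(transpose(weights), input) &
    + spread(biases, dim=2, ncopies=batch_size)

  print *, 'z shape:', shape(z)  ! output_size x batch_size
end program batched_dense_forward
```

With the whole batch available in one array, a batchnorm_layer could then compute its mean and variance along dim 2 of the same rank-2 activations.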

It may also allow more efficient matmuls in the dense and conv layers if we replace the intrinsic matmul with a more specialized and efficient routine such as sgemm from some flavor of BLAS or MKL.
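For example, under the same assumed shapes as the sketch above, the intrinsic matmul could be swapped for a BLAS sgemm call. This is a hypothetical illustration (link against a BLAS library such as OpenBLAS or MKL), not existing library code:

```fortran
! Hypothetical sketch: the same batched product via BLAS sgemm,
! which computes C := alpha*op(A)*op(B) + beta*C.
program batched_dense_sgemm
  implicit none
  integer, parameter :: input_size = 3, output_size = 2, batch_size = 4
  real :: weights(input_size, output_size)
  real :: biases(output_size)
  real :: input(input_size, batch_size)
  real :: z(output_size, batch_size)
  external :: sgemm

  call random_number(weights)
  call random_number(biases)
  call random_number(input)

  ! Pre-load z with the broadcast biases, then let sgemm accumulate
  ! into it with beta = 1.0, folding the bias add into the same call:
  ! z := transpose(weights) * input + z
  z = spread(biases, dim=2, ncopies=batch_size)
  call sgemm('T', 'N', output_size, batch_size, input_size, 1.0, &
             weights, input_size, input, input_size, 1.0, z, output_size)

  print *, 'z shape:', shape(z)
end program batched_dense_sgemm
```

Whether sgemm actually beats the intrinsic matmul will depend on the compiler and the BLAS implementation, so it is worth benchmarking before committing to it.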

Labels

enhancement (New feature or request)
