
[API] [Layers] ParametrizedLayer/TrainableLayer interfaces to represent presence of parameters and trainability of the layer #217

Closed

Description

@knok16

Hello, what about moving some aspects of Layer logic into separate interfaces, to slim down the Layer class?

For example, introduce ParametrizedLayer/TrainableLayer interfaces to encapsulate the isTrainable/paramCount logic; they would be implemented only by layers that actually have variables and can be trained:

public interface ParametrizedLayer {
    /**
     * Layer's variables.
     */
    public val variables: List<Variable>
}

/**
 * Number of parameters in this layer.
 */
public val ParametrizedLayer.paramCount: Int
    get() = variables.sumOf { it.shape.numElements() }.toInt()

public interface TrainableLayer : ParametrizedLayer {
    /**
     * True if the layer's weights can be updated during training;
     * when false, the weights are frozen and will not change.
     */
    public var isTrainable: Boolean
}

Benefits:

  • The Layer class itself becomes thinner and hence easier to grasp
  • Layers that are untrainable by nature (for example Flatten, Input, ActivationLayer) will not even have an isTrainable flag or a weights field
  • The main idea is that code like Flatten().isTrainable = true logically doesn't make sense, and with ParametrizedLayer it will not even compile
  • Instead of marking layers with NoGradient, it becomes possible simply not to implement TrainableLayer, which works as whitelisting for training (instead of the current blacklisting) and is more explicit
  • It becomes possible to remove the KGraph parameter from Layer::build
  • It becomes possible to move the weight setter/getter to the model (example: knok16@4dc0286)
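The whitelisting idea above can be sketched end-to-end. Note that `Shape`, `Variable`, `Dense`, and `Flatten` below are simplified stand-ins, not the actual KotlinDL classes; they exist only to show how the proposed interfaces compose:

```kotlin
// Simplified stand-ins for the real KotlinDL types (assumptions, not the
// library's actual API), just enough to exercise the proposed interfaces.
class Shape(private val dims: LongArray) {
    fun numElements(): Long = dims.fold(1L) { acc, d -> acc * d }
}

class Variable(val shape: Shape)

interface ParametrizedLayer {
    /** Layer's variables. */
    val variables: List<Variable>
}

/** Number of parameters in this layer. */
val ParametrizedLayer.paramCount: Int
    get() = variables.sumOf { it.shape.numElements() }.toInt()

interface TrainableLayer : ParametrizedLayer {
    /** True if the layer's weights can be updated during training. */
    var isTrainable: Boolean
}

// A Dense-like layer owns variables and opts in to training.
class Dense(inputs: Int, outputs: Int) : TrainableLayer {
    override val variables = listOf(
        Variable(Shape(longArrayOf(inputs.toLong(), outputs.toLong()))), // kernel
        Variable(Shape(longArrayOf(outputs.toLong())))                   // bias
    )
    override var isTrainable: Boolean = true
}

// Flatten has no parameters, so it implements neither interface;
// `Flatten().isTrainable = true` is now a compile error, not a silent no-op.
class Flatten

// Whitelisting in action: only layers that implement TrainableLayer
// (and are not frozen) contribute variables to the optimizer.
fun trainableVariables(layers: List<Any>): List<Variable> =
    layers.filterIsInstance<TrainableLayer>()
        .filter { it.isTrainable }
        .flatMap { it.variables }

fun main() {
    val layers = listOf(Flatten(), Dense(inputs = 4, outputs = 3))
    val dense = layers.filterIsInstance<Dense>().single()
    println(dense.paramCount)                // 4 * 3 + 3 = 15
    println(trainableVariables(layers).size) // 2: kernel and bias
    dense.isTrainable = false                // freeze the layer
    println(trainableVariables(layers).size) // 0
}
```

Training code then never needs to special-case parameterless layers: they are simply invisible to `filterIsInstance<TrainableLayer>()`.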

Reference: knok16@f5244b6

Metadata

Labels: enhancement (New feature or request)