
Releases: matlab-deep-learning/constrained-deep-learning

Monotonic Battery State of Charge Example

28 Apr 13:43
dad9fa0

New example "Battery State of Charge Estimation Using Monotonic Neural Networks" added in BatteryStateOfChargeEstimationUsingMonotonicNeuralNetworks.md

This example shows how to train two monotonic neural networks to estimate the state of charge (SOC) of a battery: one models the charging behavior and the other models the discharging behavior. In this example, you train the networks to predict the rate of change of the SOC and force the output to be positive for the charging network and negative for the discharging network. In this way, you enforce monotonicity of the battery SOC by constraining the sign of its derivative.
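The derivative-sign idea can be sketched numerically. The Python snippet below is an illustration only, not the repository's MATLAB code: a hypothetical "charging network" output is passed through a softplus so the predicted rate of change is strictly positive, and integrating those rates yields an SOC trajectory that is monotonically increasing by construction.

```python
import numpy as np

def softplus(x):
    # Smooth, strictly positive transform: log(1 + e^x) > 0 for all x.
    return np.log1p(np.exp(x))

# Hypothetical raw network outputs for a charging trajectory (any sign).
raw_rates = np.array([-1.2, 0.3, -0.5, 2.0, 0.1])

# Constrain the predicted dSOC/dt to be positive (charging network).
pos_rates = softplus(raw_rates)

# Integrate the rates: the SOC is then monotonically increasing.
soc = 0.2 + np.cumsum(pos_rates) * 0.01  # initial SOC 0.2, time step 0.01
assert np.all(np.diff(soc) > 0)
```

A discharging network would use the mirrored transform (for example, negating the softplus) so the integrated SOC is monotonically decreasing.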

Fully Input Convex CNNs Example

13 Jan 10:02

The convex image classification example, TrainICNNOnCIFAR10Example.md, has been updated. The example now uses the convolutional architecture built by buildConvexCNN, instead of the fully-connected architecture from buildConstrainedNetwork.

Fully Input Convex CNNs

11 Nov 09:54

The new function buildConvexCNN builds fully input convex convolutional neural networks. The trainConstrainedNetwork function has been updated to support training these networks.

More Flexible Convex MLPs

25 Sep 12:39
d53bfc2
  1. The convex architecture produced by the buildConstrainedNetwork function has been modified. The positivity constraint on the weights of the skip connections has been removed, as it was unnecessary for maintaining convexity.

  2. The PositiveNonDecreasingActivationFunction name-value argument in the buildConstrainedNetwork function has been renamed to ConvexNonDecreasingActivation.
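Why the skip-connection constraint is unnecessary can be checked numerically. The Python sketch below uses a hypothetical two-hidden-layer input-convex network (it is not the toolbox's MATLAB API): with non-negative hidden-to-hidden weights and a convex non-decreasing activation, the network remains convex in its input even when the skip weights take negative values, because the skip terms enter only affinely.

```python
import numpy as np

rng = np.random.default_rng(1)
relu = lambda z: np.maximum(z, 0.0)  # convex and non-decreasing

# Hypothetical input-convex network: hidden-to-hidden weights are
# constrained non-negative; skip weights A1 may take any sign.
W0 = rng.normal(size=(6, 3))
W1 = np.abs(rng.normal(size=(6, 6)))  # non-negative hidden weights
A1 = rng.normal(size=(6, 3))          # unconstrained skip weights
w2 = np.abs(rng.normal(size=6))       # non-negative output weights
a2 = rng.normal(size=3)

def icnn(x):
    z1 = relu(W0 @ x)
    z2 = relu(W1 @ z1 + A1 @ x)  # skip term A1 @ x is affine in x
    return w2 @ z2 + a2 @ x

# Midpoint convexity check on random pairs: f((x+y)/2) <= (f(x)+f(y))/2.
for _ in range(200):
    x, y = rng.normal(size=3), rng.normal(size=3)
    assert icnn((x + y) / 2) <= (icnn(x) + icnn(y)) / 2 + 1e-9
```

The check passes for any sign pattern in A1: a non-negative combination of convex functions plus an affine term is convex, and composing it with a convex non-decreasing activation preserves convexity.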

Initial Release

25 Sep 12:30

This initial release of the Constrained Deep Learning repository includes examples that demonstrate how to design and train multilayer perceptrons (MLPs) under the following constraints:

  1. Convexity
  2. Monotonicity
  3. Lipschitz Continuity
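
The Lipschitz constraint can be illustrated with a small numerical sketch. The Python snippet below uses a hypothetical two-layer MLP, not the repository's MATLAB API: with a 1-Lipschitz activation such as ReLU, the product of the layers' spectral norms bounds the network's Lipschitz constant, so rescaling each weight matrix by its spectral norm caps that constant at 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP with ReLU (a 1-Lipschitz activation).
W1 = rng.normal(size=(8, 4))
W2 = rng.normal(size=(1, 8))

def spectral_norm(W):
    return np.linalg.norm(W, 2)  # largest singular value

# Rescale each layer to spectral norm 1; the product bound
# ||W2||_2 * ||W1||_2 then caps the Lipschitz constant at 1.
W1 /= spectral_norm(W1)
W2 /= spectral_norm(W2)

def mlp(x):
    return W2 @ np.maximum(W1 @ x, 0.0)

# Empirical check: |f(x) - f(y)| <= ||x - y|| for random pairs.
for _ in range(100):
    x, y = rng.normal(size=4), rng.normal(size=4)
    assert abs(float(mlp(x)[0] - mlp(y)[0])) <= np.linalg.norm(x - y) + 1e-9
```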

For further details, please refer to the README.