DeepLearning.scala is a DSL for creating complex neural networks.
With the help of DeepLearning.scala, an ordinary programmer is able to build complex neural networks from simple code. You still write code as usual; the only difference is that code written with DeepLearning.scala is differentiable, which lets the code evolve itself and modify its parameters continuously.
Like Theano and other deep learning toolkits, DeepLearning.scala allows you to build neural networks from mathematical formulas. It handles floats, doubles, and GPU-accelerated N-dimensional arrays, and calculates the derivatives of the weights in the formulas.
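To give a feel for what a differentiable formula means, here is a minimal conceptual sketch in plain Scala. It is not DeepLearning.scala's actual API: it uses forward-mode dual numbers, and `Dual`, `constant`, and `weight` are names invented for this illustration.

```scala
// A minimal sketch of a "differentiable double": each value carries its
// derivative with respect to one chosen weight (forward-mode autodiff).
// This is NOT DeepLearning.scala's API, only an illustration of the idea.
final case class Dual(value: Double, derivative: Double) {
  def +(that: Dual): Dual =
    Dual(value + that.value, derivative + that.derivative)
  def *(that: Dual): Dual = // product rule: (uv)' = u'v + uv'
    Dual(value * that.value, derivative * that.value + value * that.derivative)
}

object Dual {
  def constant(x: Double): Dual = Dual(x, 0.0) // d(constant)/dw = 0
  def weight(w: Double): Dual   = Dual(w, 1.0) // d(w)/dw = 1
}

object FormulaExample extends App {
  // y = w * x + b, differentiated with respect to the weight w
  val x = Dual.constant(3.0)
  val b = Dual.constant(1.0)
  val w = Dual.weight(2.0)
  val y = w * x + b
  println(y.value)      // 7.0
  println(y.derivative) // 3.0, i.e. dy/dw = x
}
```

A real neural-network library trains many weights at once via reverse-mode differentiation (backpropagation); the dual-number version above only shows how a formula and its derivative can be evaluated together.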
Neural networks created by DeepLearning.scala are able to handle ADT data structures (e.g. HList and Coproduct), and calculate derivatives through these data structures.
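For reference, here is what the underlying shapeless data structures look like on their own. This sketch uses plain shapeless with no differentiation; the differentiable wrappers the library puts around them are not shown here.

```scala
import shapeless._

object AdtExample extends App {
  // An HList is a heterogeneous list: every element keeps its own static type.
  val record: Double :: Int :: HNil = 1.5 :: 42 :: HNil

  // A Coproduct is a typed "one of": the value is exactly one alternative.
  type DoubleOrInt = Double :+: Int :+: CNil
  val choice: DoubleOrInt = Coproduct[DoubleOrInt](1.5)

  println(record.head)           // 1.5
  println(choice.select[Double]) // Some(1.5)
}
```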
Neural networks created by DeepLearning.scala may contain control flow such as `if`/`else` and `switch`/`case`. Combined with ADT data structures, you can implement arbitrary algorithms inside neural networks, and train the variables used in those algorithms.
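Building on the hypothetical `Dual` sketch above (again, not the library's API), plain Scala control flow can choose between differentiable branches, and only the derivative of the branch actually taken flows through. The `relu` helper is an invented name for this example.

```scala
object ControlFlowExample extends App {
  // `relu` is a hypothetical helper built on the Dual sketch above.
  def relu(x: Dual): Dual =
    if (x.value > 0) x       // derivative passes through this branch
    else Dual.constant(0.0)  // derivative is zero on this branch

  println(relu(Dual.weight(2.0)).derivative)  // 1.0
  println(relu(Dual.weight(-2.0)).derivative) // 0.0
}
```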
Neural networks created by DeepLearning.scala are composable. You can create larger networks by combining smaller ones. If two larger networks share some sub-networks, the weights in a shared sub-network trained with one network also affect the other network.
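Conceptually (this sketch is not the library's API), sharing works because both networks reference the same weight object, so a training update made through one network changes the other's behavior:

```scala
object SharingExample extends App {
  // Conceptual sketch only: two networks close over the same mutable
  // weight, so updating it while training one affects the other.
  final class Weight(var value: Double)

  val shared = new Weight(2.0)

  def subNetwork(x: Double): Double = shared.value * x
  def networkA(x: Double): Double   = subNetwork(x) + 1.0
  def networkB(x: Double): Double   = subNetwork(x) * 3.0

  println(networkB(1.0)) // 6.0

  // One naive gradient-descent step on networkA with respect to `shared`:
  // d(networkA)/d(shared) = x, so with x = 1.0 and learning rate 0.1:
  shared.value -= 0.1 * 1.0

  println(networkB(1.0)) // 5.7: changed, because the weight is shared
}
```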
All of the above features are statically type-checked.
Version 1.0 is the current version, with all of the above features. The final version will be released in January 2017.
Version 2.0, planned for March 2017, will add:

- Support `for`/`while` and other higher-order functions on differentiable `Seq`s.
- Support `for`/`while` and other higher-order functions on GPU-accelerated differentiable N-dimensional arrays.
Version 3.0, planned for late 2017, will add:

- Support using custom `case class`es inside neural networks.
- Support distributed models and distributed training on Spark.
DeepLearning.scala is heavily inspired by my colleague @MarisaKirisame.
@milessabin's shapeless provides a solid foundation for the type-level programming used in DeepLearning.scala.