TypedFlow is a typed, higher-order frontend to TensorFlow and a high-level library for deep learning.
The main design principles are:
- To make the parameters of layers explicit. This choice makes parameter sharing explicit and makes it possible to implement “layers” as pure functions.
- To provide types that are as precise as possible. Functions are explicit about the shapes and element types of the tensors that they manipulate (though they are often polymorphic in both); see the sketch after this list.
- To keep combinators as transparent as possible. If a NN layer is a simple tensor transformation, it is exposed as such.
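As a rough illustration of the first two principles, here is a minimal, self-contained sketch (not TypedFlow's actual API; the `Tensor` and `Dense` types below are simplified stand-ins): layer parameters are ordinary values, a layer is a pure function, and shapes appear in the types.

```haskell
{-# LANGUAGE DataKinds, KindSignatures #-}
-- Illustrative only: simplified stand-ins, not TypedFlow's real types.
import GHC.TypeLits (Nat)

-- A placeholder tensor indexed by its shape (a type-level list of
-- dimensions) and its element type.
data Tensor (shape :: [Nat]) elem = Tensor

-- The parameters of a dense layer, made explicit as an ordinary value.
data Dense m n = Dense
  { weights :: Tensor '[m, n] Float
  , bias    :: Tensor '[n]    Float
  }

-- A "layer" is then just a pure function from input to output;
-- sharing parameters means passing the same Dense value twice.
dense :: Dense m n -> Tensor '[batch, m] Float -> Tensor '[batch, n] Float
dense _ _ = Tensor  -- a real implementation would do matmul + bias here
```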
In this version, the interface to TensorFlow is done via Python code generation and a suitable runtime system.
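The overall shape of that workflow is roughly as follows (a hypothetical sketch, not TypedFlow's actual entry points): the Haskell program emits a Python source file describing the model, and the runtime system then runs it under TensorFlow.

```haskell
-- Hypothetical sketch only; TypedFlow's real generation functions differ.
module Main where

main :: IO ()
main = do
  -- In a real project this string would be produced by the library from
  -- a typed model description, not written by hand.
  let generatedPython = unlines
        [ "import tensorflow as tf"
        , "# ... graph construction emitted from the Haskell model ..."
        ]
  writeFile "model.py" generatedPython
```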
The compiled documentation can be found on Hackage.
TypedFlow comes with two examples of neural networks:
- An adaptation of the MNIST TensorFlow tutorial
- A simple sequence-to-sequence model which attempts to learn to translate pre-order into post-order (see the small example after this list).
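For reference, this is what that task means on a tiny expression tree (plain Haskell, independent of TypedFlow): the network reads the pre-order serialization as input and must produce the post-order one.

```haskell
-- A tiny binary tree and its pre-order / post-order serializations,
-- just to illustrate the seq2seq task.
data Tree = Leaf Char | Node Char Tree Tree

preOrder, postOrder :: Tree -> String
preOrder  (Leaf c)     = [c]
preOrder  (Node c l r) = c : preOrder l ++ preOrder r
postOrder (Leaf c)     = [c]
postOrder (Node c l r) = postOrder l ++ postOrder r ++ [c]

example :: Tree
example = Node '+' (Leaf 'a') (Node '*' (Leaf 'b') (Leaf 'c'))

-- preOrder  example == "+a*bc"
-- postOrder example == "abc*+"
```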
The examples can be run like so:
```shell
nix-env -iA nixpkgs.haskellPackages.styx
nix-env -iA nixpkgs.cabal2nix
styx configure
cd examples/seq2seq
make
```