A parallel neural net microframework. Read the paper here.
- Dense, fully connected neural layers
- Convolutional and max-pooling layers (experimental, forward propagation only)
- Stochastic and mini-batch gradient descent for back-propagation
- Data-based parallelism
- Several activation functions
| Layer type | Constructor name | Supported input layers | Rank of output array | Forward pass | Backward pass |
|---|---|---|---|---|---|
| Input (1-d and 3-d) | `input` | n/a | 1, 3 | n/a | n/a |
| Dense (fully-connected) | `dense` | `input` (1-d) | 1 | ✅ | ✅ |
| Convolutional (2-d) | `conv2d` | `input` (3-d), `conv2d`, `maxpool2d` | 3 | ✅ | ❌ |
| Max-pooling (2-d) | `maxpool2d` | `input` (3-d), `conv2d`, `maxpool2d` | 3 | ✅ | ❌ |
| Flatten | `flatten` | `input` (3-d), `conv2d`, `maxpool2d` | 1 | ✅ | ✅ |
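For a flavor of how these constructors fit together, here is a minimal sketch that builds a small dense network and takes a few steps of plain stochastic gradient descent. The `forward`/`backward`/`update` method names and the learning-rate argument are assumptions based on typical usage; see the examples below for the exact API.

```fortran
program dense_sketch
  ! Minimal, hedged sketch: build a dense network from the
  ! constructors in the table above and take a few SGD steps.
  ! Method names (forward, backward, update) and the learning-rate
  ! argument are assumptions; consult the examples for the real API.
  use nf, only: network, input, dense
  implicit none
  type(network) :: net
  real :: x(3), y(1)
  integer :: n

  ! 3-element 1-d input, one hidden dense layer, 1-neuron output
  net = network([input(3), dense(5), dense(1)])

  do n = 1, 1000
    call random_number(x)
    y = sum(x) / 3.            ! toy target in [0, 1]
    call net % forward(x)      ! forward pass
    call net % backward(y)     ! back-propagate the error
    call net % update(1.)      ! one SGD step, learning rate 1.0
  end do
end program dense_sketch
```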
Get the code:

```
git clone https://github.com/modern-fortran/neural-fortran
cd neural-fortran
```
Dependencies:
- Fortran 2018-compatible compiler
- OpenCoarrays (optional, for parallel execution, GFortran only)
- BLAS or MKL (optional, for accelerated `matmul` calls)
Compilers tested include:
- gfortran-9.4.0
- ifort-2021.4
- ifx-2021.4
Build with fpm:

```
fpm build
```

By default, without a build profile specified, fpm builds neural-fortran with debug compiler flags and no optimization. To build optimized code, build with the release profile:

```
fpm build --profile release
```
If you're using GFortran, make sure to also pass an additional flag:

```
fpm build --profile release --flag "-fno-frontend-optimize"
```

The `-fno-frontend-optimize` flag disables some frontend optimizations that may be harmful when building neural-fortran.
If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers `caf` and `cafrun` to build and execute in parallel, respectively:

```
fpm build --compiler caf --profile release --flag "-fno-frontend-optimize"
```
Run the tests with:

```
fpm test
```

For the time being, you need to pass the same compiler flags to `fpm test` as you did to `fpm build`, so that fpm knows it should use the same build profile.
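For example, if you built with GFortran using the release profile and flag shown above, the matching test command would be:

```
fpm test --profile release --flag "-fno-frontend-optimize"
```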
See Fortran Package Manager for more info on fpm.
Build with CMake:

```
mkdir build
cd build
cmake .. -DSERIAL=1
make
```

The `-DSERIAL=1` option builds neural-fortran in serial mode.
Tests and examples will be built in the `bin/` directory.
If you use GFortran and want to run neural-fortran in parallel, you must first install OpenCoarrays. Once installed, use the compiler wrappers `caf` and `cafrun` to build and execute in parallel, respectively:

```
FC=caf cmake ..
make
cafrun -n 4 bin/mnist # run MNIST example on 4 cores
```
If you want to build with a different compiler, such as Intel Fortran, specify `FC` when issuing `cmake`:

```
FC=ifort cmake ..
```
To use an external BLAS or MKL library for `matmul` calls, run cmake like this:

```
cmake .. -DBLAS=-lblas
```

where the value of `-DBLAS` should point to the desired BLAS implementation, which has to be available in the linking path. This option is currently available only with GFortran.
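For example, to link against OpenBLAS instead (assuming `libopenblas` is installed and visible to the linker):

```
cmake .. -DBLAS=-lopenblas
```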
To build with debugging flags enabled, type:

```
cmake .. -DCMAKE_BUILD_TYPE=debug
```
Run the tests with:

```
ctest
```
The easiest way to get a sense of how to use neural-fortran is to look at examples, in increasing level of complexity:
- simple: Approximating a simple, constant data relationship
- sine: Approximating a sine function
- mnist: Hand-written digit recognition using the MNIST dataset
- cnn: Creating and running forward a simple CNN using `input`, `conv2d`, `maxpool2d`, `flatten`, and `dense` layers (see the sketch below).
The examples also show you the extent of the public API that's meant to be used in applications, i.e. anything from the `nf` module.
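As a rough illustration of what the cnn example does, the sketch below builds a small CNN and runs a single forward pass. The constructor keyword arguments (`filters`, `kernel_size`, `pool_size`) and the `predict` method are assumptions here; the example source is authoritative.

```fortran
program cnn_sketch
  ! Hedged sketch: build a small CNN from the layer constructors
  ! above and run one forward pass. Keyword arguments and the
  ! predict method are assumptions; see the cnn example source.
  use nf, only: network, input, conv2d, maxpool2d, flatten, dense
  implicit none
  type(network) :: net
  real :: x(3, 32, 32)    ! 3 channels, 32 x 32 image
  real, allocatable :: y(:)

  net = network([ &
    input(3, 32, 32), &
    conv2d(filters=8, kernel_size=3), &
    maxpool2d(pool_size=2), &
    flatten(), &
    dense(10) &
  ])

  call random_number(x)
  y = net % predict(x)    ! forward only; conv2d and maxpool2d
                          ! do not yet support the backward pass
  print *, y
end program cnn_sketch
```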
The MNIST example uses curl to download the dataset, so make sure you have it installed on your system. Most Linux OSs have it out of the box. The dataset will be downloaded only the first time you run the example in any given directory.
If you're using Windows or don't have curl for any other reason, download mnist.tar.gz directly and unpack it in the directory in which you will run the example program.
API documentation can be generated with FORD. Assuming you have FORD installed on your system, run

```
ford ford.md
```

from the neural-fortran top-level directory to generate the API documentation in `doc/html`. Point your browser to `doc/html/index.html` to read it.
Thanks to all open-source contributors to neural-fortran.