Clone the repository with:
git clone --recurse-submodules git@github.com:alessandropedone/real-function-optimization.git
Once you are in the repository directory, you can install the muparserx library (used to parse the user's input) and TBB (used for parallel execution) by running the provided script:
./install_pkgs.sh
You can find more information about TBB on Wikipedia and about muparserx on its website.
You can uninstall all the required packages by running the other bash script:
./uninstall_pkgs.sh
Now you are ready to go: compile the code by typing:
make
To run the executable, type:
./main
You can modify the input parameters inside data.txt.
We implemented the following optimization algorithms.
Basic method:
- Gradient Descent (GD)
Momentum-based methods:
- Heavy Ball (or Momentum) Method (HB)
- Nesterov Accelerated Gradient (NAG)
Adaptive methods:
- ADAM Method (it combines the benefits of momentum with the adaptiveness of the RMSProp method)
Second-order methods:
Derivative-Free Optimization (DFO) methods:
- Powell's Method
- Nelder-Mead Method (Downhill Simplex)
The implemented methods were validated on the following test function (which has a very steep gradient, even near its minimum):
For each execution of a method, the following information is printed:
- number of iterations to convergence (if the method converged before reaching the maximum number of iterations specified);
- final values of $x$ (the estimated minimum) and $f(x)$ (the value that the function takes at the estimated minimum);
- norm of the gradient at the minimum point;
- method-specific parameters used during the run.
Two options are available for computing the gradient:
- Exact Gradient (Analytical): its expression must be specified in data.txt.
- Finite Differences (FD): it is available in the following three variations.
- Forward Differences
- Backward Differences
- Centered Differences
- Solvers Design: each solver is implemented as a functor and its state corresponds to the method's parameters, allowing for modular and reusable design.
- Template Usage: template programming is used to manage method choices and gradient computation strategies efficiently, as the set of choices is finite and limited in size.
data.txt is used to take all input from the user, combining GetPot and muparserx.
The explanation of all input parameters is provided directly inside data.txt as comments.
This structured file facilitates testing of different optimization scenarios (without recompiling anything).
It's an interface developed to parse functions (including vector-valued functions) of an arbitrary number of variables. It's adapted from muParserInterface inside the pacs-examples repository.
These two files contain parallel implementations of the gradient and the Hessian matrix, computed with finite differences.
- Marta Pignatelli (@martapignatelli)
- Alessandro Pedone (@alessandropedone)